[ale] Need a big external drive quick. Suggestions?
    Greg Freemyer 
    greg.freemyer at gmail.com
       
    Fri Jun  2 14:50:20 EDT 2006
    
    
  
On 6/2/06, Bob Toxen <transam at verysecurelinux.com> wrote:
> You probably should do that dd on a per partition basis, i.e.:
>
>      tcsh
>      foreach i (1 2 3 4 5)
>         echo Doing $i
>         dd bs=10240k if=/dev/hda$i of=/dev/hdb$i
>      end
>
> I find that sometimes just doing:
>
>      dd bs=10240k if=/dev/hda of=/dev/hdb
>
> doesn't work, probably due to architecture.
>
> Another advantage to doing it by partition is that after each partition
> is copied you can mount the new partition on /dev/hdb and test it to
> see if it worked.  You don't want to discover Sunday morning that something
> didn't work.
>
> Of course be REAL careful that you copy in the right direction.  First
> backing up the old drive (if possible) or, at least, its most critical
> data is a real good idea.
>
> Bob Toxen
Thanks Bob, but doing forensics I do this sort of thing all the time.
The only big deal here is the sheer size.  Never had to capture a
700GB volume before.  Have always been able to simply buy a big enough
single disk to hold the image.
Sounds like you do dd captures fairly often as well.  On the dd, you
should add conv=noerror,sync.  That lets dd continue past a disk read
error, with sync padding the unreadable block with zeros so the rest
of the image stays offset-aligned.
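To make the effect of those options concrete, here is a minimal
sketch.  Device names like /dev/hda are replaced with throwaway temp
files so it can run safely; on a real capture the if=/of= arguments
would be the source and destination devices.

```shell
# conv=noerror keeps dd going past read errors; sync pads any
# short/failed block with zeros so offsets in the image stay
# aligned with the source.
src=$(mktemp); dst=$(mktemp)
dd if=/dev/urandom of="$src" bs=1k count=64 2>/dev/null
dd if="$src" of="$dst" bs=4k conv=noerror,sync 2>/dev/null
cmp -s "$src" "$dst" && echo "copy verified"
```

Note that with conv=sync a trailing partial block gets zero-padded
out to bs, so the image can end up slightly larger than the source
unless the source size is a multiple of the block size.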
We actually normally use dcfldd (DoD Computer Forensics Lab dd).  It
generates an MD5 as it goes.  We have it generate an MD5 for each 2 GB
segment and keep the 2 GB segments in separate files.  We then verify
that both the source and the dest have the same MD5 for each 2 GB
segment as we recorded in the capture pass.  It surprises me how often
a single 2 GB segment's MD5 disagrees.  We then just re-run the
capture for any segments that fail.
Obviously most of the above is scripted.
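A rough sketch of that segment-and-verify pass, using stock split and
md5sum instead of dcfldd (which does the segmenting and hashing in one
pass).  SEG would be 2G in practice; it is shrunk here so the demo runs
on a small temp file rather than a real drive.

```shell
# Sketch of the per-segment verify pass with stock coreutils.
SEG=8k                       # would be 2G on a real capture
src=$(mktemp); work=$(mktemp -d)
dd if=/dev/urandom of="$src" bs=1k count=32 2>/dev/null
# "Capture": cut the source into fixed-size segments, hashing each.
split -b "$SEG" "$src" "$work/seg."
( cd "$work" && md5sum seg.* > capture.md5 )
# "Verify": re-hash the segments and compare against the capture log.
# Any segment whose MD5 disagrees would be re-captured on its own.
( cd "$work" && md5sum -c --quiet capture.md5 ) && echo "all segments verified"
```

The win of per-segment hashes is exactly what is described above: a
mismatch costs you a 2 GB re-read, not a 700 GB one.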
Greg
-- 
Greg Freemyer
The Norcross Group
Forensics for the 21st Century