[TriLUG] Copying Files from a Dying HDD...with a twist

Sean Korb via TriLUG trilug at trilug.org
Tue Jul 3 11:52:03 EDT 2018


This is almost completely a non sequitur, but a decade ago I worked with a
boutique filesystem (no longer available, so you should be safe) that
attempted to recover from a double disk failure on its array of RAID
arrays.  It was not successful, and while file sizes and metadata were fully
retained, I discovered that some characters had been replaced with nulls.  The
corruption was very difficult to detect.  Though we had a cache of md5sums,
many of the individual files were over 6GB, and I determined it would take
years to fully scrub everything for a coherent recovery.
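
For what it's worth, the scrub itself would have been a one-liner; it was the
run time at over 6GB a file that killed it.  Assuming the cached checksums were
in the usual "hash  path" format that md5sum writes, something like this (the
list path is made up) is all it would have taken:

# --quiet prints only the files whose checksums no longer match
md5sum --quiet -c /allmylogs/md5sums.txt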

My solution, for what is not quite the issue you are investigating: if you
want to find nulls specifically in bits that have rotted away, you can use
something like

# /dev/null as an extra argument makes grep prefix each hit with the file name
for file in /allmyprecious/*; do
    echo "Grep for nulls in '$file'"
    find "$file" -type f -name "*.txt" -exec grep -Pan '\x00' /dev/null {} \; \
        >> /allmylogs/logs/grepfornulls-2000-02-04
done

That gave me a list of files to restore from a legacy mass storage
system.  Once you have a list, it's not so tough to sort out.
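
If it helps, here is a rough, untested follow-up to turn that log back into
restored files; the backup mount and prefix below are placeholders, not what
we actually used:

# untested sketch: pull the unique file names out of the grep log and copy
# replacements over from a backup mount; both paths are placeholders
cut -d: -f1 /allmylogs/logs/grepfornulls-2000-02-04 | sort -u |
while read -r f; do
    cp "/mnt/legacy_backup${f#/allmyprecious}" "$f"
done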

sean


On Tue, Jul 3, 2018 at 10:01 AM, Brian via TriLUG <trilug at trilug.org> wrote:

> Hi Gang,
>
> I know there're several choices available to me when it comes to copying
> bunches of files from one place to another (under Linux, if that didn't go
> without saying.. :-) ).  I have a bit of a twist I want to put on it.
>
> The source drive was dropped while it was running, and although it still
> functions, there are now collections of errors in certain areas (my guess
> being that they correlate to where the heads were when the drive was dropped).
>
> What I'd like to do is something that will mirror a directory structure,
> but if a read error is encountered in any given file, that file winds up in
> a different location.  In other words, I want to replicate the directory
> structure in two places; one that gets filled up with error-free files, the
> other that gets filled up with error-containing files.  That way I'll have
> an easy, organized way of seeing what files and directories were actually
> lost/corrupted.  e.g.
>
> /good_files/<original_directory_structure>
> /bad_files/<original_directory_structure>
>
> So the question is, is there a command that'll already do this? Something
> like rsync, with an option to give a different target path for files when
> errors occur.  I could always script something, but I feel like
> file-by-file invocations of cp would be much less efficient than some
> command that would handle it internally.
>
> Thanks!
> -Brian
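
For the actual question above, I don't have a single command to offer, but a
rough, untested sketch of the file-by-file approach Brian mentions might look
like the following; SRC, GOOD, and BAD are placeholders, not paths from his
setup:

#!/bin/sh
# untested sketch of the good/bad split described above; cp exits non-zero
# when it hits a read error, so any file that fails the copy is shunted
# into the "bad" tree instead
SRC=/mnt/dying_disk
GOOD=/good_files
BAD=/bad_files

find "$SRC" -type f | while read -r f; do
    rel=${f#$SRC/}
    mkdir -p "$GOOD/$(dirname "$rel")" "$BAD/$(dirname "$rel")"
    if ! cp "$f" "$GOOD/$rel"; then
        # move the partial copy (or leave an empty marker) in the bad tree
        mv -f "$GOOD/$rel" "$BAD/$rel" 2>/dev/null || : > "$BAD/$rel"
    fi
done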




-- 
Sean Korb spkorb at spkorb.org http://www.spkorb.org
'65 Suprang,'68 Cougar,'78 R100/7,'60 Metro,'59 A35,'71 Pantera #1382
"The more you drive, the less intelligent you get" --Miller
"Computers are useless.  They can only give you answers." -P. Picasso

