data shredder

Thomas K Gamble tkg at lanl.gov
Mon Dec 21 13:24:52 UTC 2009


On Monday 21 December 2009 02:04:49 am Gilles Gravier wrote:
> Hi!
> 
> On 21/12/2009 09:55, Amedee Van Gasse (ub) wrote:
> > On Mon, December 21, 2009 04:28, jesse stephen wrote:
> >> I'm looking for a data shredder for ubuntu 9.10
> >
> > The other suggestions are good, and if you want a low-tech solution:
> >
> > 1) delete your files with rm as usual
> > 2) overwrite the empty disk space with zeroes or random data
> > Use either one of these commands:
> >
> > dd if=/dev/zero of=nullfile bs=1M
> > dd if=/dev/random of=randomfile bs=1M
> >
> > They will create a file called 'nullfile' or 'randomfile', filling all
> > the empty space on your disk. The dd command will automatically abort
> > when all free disk space is used.
> > Please note that this can take a *long* time, depending on the amount of
> > free disk space. Also, /dev/random is a special device that generates
> > "entropy" (= random data); with this method you use up all the available
> > entropy, so it will sometimes stall until enough new entropy has been
> > gathered.
> >
> > When it's done, rm nullfile or rm randomfile.
> > If you're really paranoid, repeat the procedure a couple of times.
> 
> The problem with these commands is that you're not really helping...
> Forensics tools will read below one or more levels of re-write. You need
> to do this several times in a row... and, more importantly, you need to
> use special data patterns that will actually make reading shadows of
> former data harder if not impossible. There are standards for that. And
> they do not involve writing random data or zeros, but actual specific
> patterns.
> 
> Gilles.
> 
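
As an aside, if you'd rather overwrite individual files than fill up the free
space, GNU shred (part of coreutils, so already on a 9.10 install) can do it.
A sketch, with the file name just a placeholder, using a single random pass
plus a final pass of zeros:

shred -v -n 1 -z -u sensitive-file.txt

The -u removes the file once the overwrite is done; drop it if you only want
the overwrite. Note the caveat in shred's man page: on journaling or
copy-on-write filesystems the old blocks may survive elsewhere. The
secure-delete package (sudo apt-get install secure-delete) also provides sfill
for wiping free space, if you'd prefer a purpose-built tool to the dd trick.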

From Wikipedia: http://en.wikipedia.org/wiki/Data_remanence

Overwriting

A common method used to counter data remanence is to overwrite the storage 
medium with new data. This is often called wiping or shredding a file or disk. 
Because such methods can often be implemented in software alone, and may be 
able to selectively target only part of a medium, it is a popular, low-cost 
option for some applications. Overwriting is generally an acceptable method of 
clearing, as long as the media is writable and not damaged.

The simplest overwrite technique writes the same data everywhere—often just a 
pattern of all zeros. At a minimum, this will prevent the data from being 
retrieved simply by reading from the medium again using standard system 
functions.

To counter more advanced data recovery techniques, specific overwrite patterns 
are often prescribed. These may be generic patterns intended to eradicate any 
trace signatures. For example, writing repeated, alternating patterns of ones 
and zeros may be more effective than zeros alone. Combinations of patterns are 
frequently specified.

One challenge with an overwrite is that some areas of the disk may be 
inaccessible, due to media degradation or other errors. Software overwrite may 
also be problematic in high-security environments which require stronger 
controls on data commingling than can be provided by the software in use. The 
use of advanced storage technologies may also make file-based overwrite 
ineffective.

Feasibility of recovering overwritten data

Peter Gutmann investigated data recovery from nominally overwritten media in 
the mid-1990s. He suggested magnetic force microscopy may be able to recover 
such data, and developed specific patterns, for specific drive technologies, 
designed to counter such. These patterns have come to be known as the Gutmann 
method.

Daniel Feenberg, an economist at the private National Bureau of Economic 
Research, claims that the chances of overwritten data being recovered from a 
modern hard drive amount to "urban legend". He also points to the "18½ minute 
gap" Rose Mary Woods created on a tape of Richard Nixon discussing the 
Watergate break-in. Erased information in the gap has not been recovered, and 
Feenberg claims doing so would be an easy task compared to recovery of a 
modern high density digital signal.

As of November 2007, the United States Department of Defense considers 
overwriting acceptable for clearing magnetic media within the same security 
area/zone, but not as a sanitization method. Only degaussing or physical 
destruction is acceptable for the latter.

On the other hand, according to the 2006 NIST Special Publication 800-88 (p. 
7): "Studies have shown that most of today’s media can be effectively cleared 
by one overwrite" and "for ATA disk drives manufactured after 2001 (over 15 
GB) the terms clearing and purging have converged." An analysis by Wright et 
al. of recovery techniques, including magnetic force microscopy, also 
concludes that a single wipe is all that is required for modern drives. They 
point out that the long time required for multiple wipes "has created a 
situation where many organizations ignore the issue all together – resulting 
in data leaks and loss."
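
In concrete terms, if a single pass really is enough, wiping an entire drive
before disposal comes down to one line. Treat this as a sketch only: /dev/sdX
is a placeholder for the target disk, and getting the device name wrong will
destroy the wrong drive, so check it twice.

dd if=/dev/zero of=/dev/sdX bs=1M

shred -v -n 0 -z /dev/sdX from coreutils does the same single zero pass with a
progress readout, and if you do want the alternating one/zero patterns the
article mentions, badblocks -w -t 0xaa -t 0x55 /dev/sdX will write exactly
those bit patterns across the whole device (destructively) as part of its
write-mode test.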


I especially like the comment that the chances of overwritten data being 
recovered from a modern hard drive amount to "urban legend".

If the idea that a single overwrite is sufficient is to be believed, then we 
must assume that in the stories of the FBI and other law enforcement agencies 
recovering data from hard drives, the data had not been overwritten, or at 
least not thoroughly overwritten.

-- 
Thomas K. Gamble
Research Technologist, System/Network Administrator
Chemical Diagnostics and Engineering (C-CDE)
Los Alamos National Laboratory
MS-E543,p:505-665-4323 f:505-665-4267

There cannot be a crisis next week. My schedule is already full.
    Henry Kissinger



