2014-01-04

How to Securely Delete a Hard Disk

File systems usually do not really delete data on the hard disk when a file is deleted by the operating system. Instead, the sectors that were used by the file are marked as no longer in use, but the data stays on the disk and can be recovered. Physical deletion requires the sectors to be overwritten.

But even after sectors have been overwritten, there may still be ways to extract old data. That can be an expensive process requiring a professional lab, but some data is worth the effort.

Therefore, secure data deletion requires more than overwriting data once: the sectors should be overwritten multiple times, ideally with special data patterns.

A Linux tool that can achieve that is scrub. Scrub overwrites hard disks, files, and other devices with special data patterns to make data recovery more difficult or maybe even impossible. It can overwrite the disk with different pattern sequences, e.g., according to U.S. NNSA Policy Letter NAP-14.1-C, U.S. DoD 5220.22-M, Roy Pfitzner's 7-random-pass method, or the pattern defined by the German Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik, BSI).

Install Scrub on Debian Linux:

sudo apt-get install scrub

Erasing Disks With Scrub

Scrub requires a file or device name and optionally the parameter -p (pattern) to select a certain pattern algorithm. In the example below, the hard disk /dev/sdc is overwritten with patterns defined by DoD 5220.22-M.

sudo scrub -p dod /dev/sdc

As you can see, secure data deletion is an easy task. The only drawback is the large amount of time required for overwriting a disk multiple times.
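Conceptually, each pass does what a single dd run with random data does; scrub's value is running several such passes with special patterns. As a minimal sketch of one overwrite pass (on a scratch file created here for illustration, not a disk), this could look like:

```shell
# One overwrite pass with random data on a scratch file (illustration only;
# scrub performs several passes with special patterns instead)
f=$(mktemp)
printf 'confidential payload' > "$f"
size=$(wc -c < "$f")
# Overwrite in place with random bytes; conv=notrunc keeps the file size
dd if=/dev/urandom of="$f" bs=1 count="$size" conv=notrunc 2>/dev/null
```

Note that overwriting a file in place does not reliably reach the old sectors on journaling or copy-on-write file systems, which is one reason scrub can also operate on whole devices.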

Further reading:

[1] Peter Gutmann, Secure Deletion of Data from Magnetic and Solid-State Memory
[2] scrub(1) - Linux man page

2013-12-25

Firefox TLS 1.2 Support

Since release 24, Firefox supports Transport Layer Security (TLS) 1.2. However, with its default settings Firefox does not yet enable TLS 1.2 and only uses the old Secure Sockets Layer (SSL) 3.0 and TLS 1.0 standards, which are vulnerable to the so-called BEAST attack.

You can change that behaviour via the following settings on the about:config page of Firefox:

security.tls.version.min 0
security.tls.version.max 3

The numbers are codes for the different SSL/TLS versions:

0 - SSL 3.0
1 - TLS 1.0
2 - TLS 1.1
3 - TLS 1.2

Even though the older SSL 3.0 and TLS 1.0 standards are vulnerable, it is usually not advisable to disable them, as many web servers do not yet support the newer TLS 1.1 or TLS 1.2 standards. If security.tls.version.min were set to 2 or 3, it would no longer be possible to connect to such sites.
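The same values can also be kept in a user.js file in the Firefox profile directory (the exact profile path varies per installation; the path in the comment is an assumption), which re-applies them at every start:

```javascript
// user.js in the Firefox profile directory
// (e.g. ~/.mozilla/firefox/<profile>/user.js on Linux)
user_pref("security.tls.version.min", 0);  // allow SSL 3.0 as minimum
user_pref("security.tls.version.max", 3);  // enable up to TLS 1.2
```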

2013-09-18

Global Firefox Settings on Linux

Today, I got a request to enforce various Firefox settings globally for all users on several Linux hosts.

All you have to do is create two files in your Firefox installation directory (usually /usr/lib/firefox):

File: $FIREFOX_INSTALL_DIR/defaults/pref/000-local-config

// The file must start with a comment line
pref("general.config.filename", "local-config.js");
pref("general.config.obscure_value", 0);

File: $FIREFOX_INSTALL_DIR/local-config.js

// The file must start with a comment line

// Enforce the use of a Web Proxy Autodiscovery file
lockPref("network.proxy.autoconfig_url", "http://wpad.somedomain.com/wpad.dat");
lockPref("network.proxy.type", 2);

// Disable Firefox's Per-Site Download Directory Feature
pref("browser.download.lastDir.savePerSite", false);

// Disable Firefox Displaying History in New Tabs
pref("browser.newtabpage.enabled", false);
pref("browser.newtabpage.url", "about:blank");

Settings set using the lockPref() function cannot be changed by users (they are greyed out in about:config). Settings made using the pref() function act as defaults and remain valid unless the user overrides them.

This procedure has been tested with Firefox 23.

2013-08-20

Trouble with Android and OpenBSD Access Points

This year started with trouble: I discovered that my Android tablet permanently lost the connection to my OpenBSD 5.2 access point (athn(4)). The connection was unusable, and I was frustrated that it did not work with my new gadget. I tried to find a solution and spent a lot of time studying all kinds of online forums, without success. So, I decided to use an off-the-shelf Linux-based access point instead, and my OpenBSD box went to my storage room.

Last week I gave OpenBSD a second chance. After a few minutes of searching the Internet, I found a post about the Android issue. It seems that OpenBSD 5.2 did not support the power-saving features used by many mobile devices. However, OpenBSD has been improved in the meantime, and the update to 5.3 solved the issue.

Unfortunately, one point is still open: the connection hangs when Android wakes from sleep and the Wi-Fi sleep policy (keep Wi-Fi on during sleep) is set to "always". Set it to "never" instead. With that setting, Android turns Wi-Fi off when it goes to sleep and automatically reconnects to the access point after wake-up. This way, my Android tablet works fine with OpenBSD 5.3.


2012-05-13

Tip: Steve McConnell's Software Estimation Book

A few months ago, my boss asked me to provide some estimates for our software work. I realized that not just I but also most of my colleagues had a hard time dealing with this issue. A structured approach was needed. After spending some time with Google, I found the book Software Estimation: Demystifying the Black Art by Steve McConnell.

In his book, Steve McConnell starts by building up a basic understanding of the background of software estimation (e.g., the cone of uncertainty). Then he describes various approaches for effort, size, and schedule estimates (e.g., expert estimates, t-shirt sizing). McConnell treats all estimates as probability distributions. This approach lets you describe the impact of setting a milestone a few weeks earlier or later in a very comfortable way. Finally, the book ends with a section about presenting estimates to managers. Especially this last chapter is a must-read for every engineer who will be asked to provide estimates.

The book offered me a much improved view of how to deal with estimates. McConnell created a very practice-oriented book that is easy to understand and implement. A little background knowledge about statistics may be helpful for a finer grasp.

Finally, after reading this book, I have a very good understanding of what must be done to create good estimates and where we can optimize our processes. Therefore, I want to recommend it to all my readers.

http://www.stevemcconnell.com/est.htm

2010-08-09

Find Out Which Files Are Duplicates

I wrote a little script to figure out which files are duplicated inside a directory tree. It also shows how many bytes are wasted by storing data twice.
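The script decides that two files are duplicates when their checksums match. The core idea, reduced to a small shell sketch with md5sum (the file names here are temporary scratch files, not part of the script below):

```shell
# Two files with identical content produce the identical digest
a=$(mktemp); b=$(mktemp)
printf 'same content\n' > "$a"
printf 'same content\n' > "$b"
ha=$(md5sum < "$a" | awk '{print $1}')
hb=$(md5sum < "$b" | awk '{print $1}')
[ "$ha" = "$hb" ] && echo "duplicate"
rm -f "$a" "$b"
```

The script additionally computes a SHA digest per file, so a collision would have to occur in both hash functions at once before two different files were misreported as duplicates.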


#!/usr/bin/perl -w

# Permission is hereby granted, free of charge,
# to any person obtaining a copy of this software
# and associated documentation files (the
# "Software"), to deal in the Software without
# restriction, including without limitation the
# rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of
# the Software, and to permit persons to whom
# the Software is furnished to do so, subject
# to the following conditions:
#
# This permission notice shall be included in
# all copies or substantial portions of
# the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT
# WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
# INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
# PURPOSE AND NONINFRINGEMENT. IN NO EVENT
# SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR
# IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.

use strict;
use File::Find;
use Digest::MD5;
use Digest::SHA;


# Debug
my $debug = 0;

# ARGV contains the basis directories
@ARGV = qw(.) unless @ARGV;
if(@ARGV > 1) {
    print "Base directories: @ARGV\n\n";
} else {
    print "Base directory: @ARGV\n\n";
}

# Save data in this structure
my %md5_sha_data;

# Process files
find(\&process_file, @ARGV);

# Create report
report();


sub process_file {
    my $file = $_;
    my $fullfile = $File::Find::name;
    print "Processing: $file\n" if $debug;

    # Skip directories and symbolic links
    if(-d $file or -l $file) {
        print "skipped\n\n" if $debug;
        return;
    }

    open(my $fh, '<', $file) or die "Error opening $fullfile: $!\n";
    binmode($fh);
    my $md5 = Digest::MD5->new->addfile($fh)->hexdigest;
    # Rewind before the second digest; otherwise SHA would hash an
    # already exhausted stream
    seek($fh, 0, 0);
    my $sha = Digest::SHA->new->addfile($fh)->hexdigest;
    close($fh);
    my $size = -s $file;
    print "$md5 $sha $size\n\n" if $debug;

    save_data($fullfile, $md5, $sha, $size);
}


sub save_data {
    my ($file, $md5, $sha, $size) = @_;
    my $hash = "$md5$sha";

    if(!exists($md5_sha_data{$hash})) {
        $md5_sha_data{$hash}{"count"} = 1;
        $md5_sha_data{$hash}{"size"} = $size;
    } else {
        $md5_sha_data{$hash}{"count"} += 1;
    }
    push @{$md5_sha_data{$hash}{"filenames"}}, $file;
}


sub report {
    my $total = 0;
    for my $hash ( sort keys %md5_sha_data ) {
        if($md5_sha_data{$hash}{"count"} > 1) {
            print "Multiple copies of equal file\n";
            foreach (@{$md5_sha_data{$hash}{"filenames"}}) {
                print "  $_\n";
            }
            # Wasted space: every copy beyond the first
            my $sum = ($md5_sha_data{$hash}{"count"} - 1) *
                      $md5_sha_data{$hash}{"size"};
            $total += $sum;
            print "$sum Bytes\n\n";
        }
    }

    my $KBytes = sprintf("%.0f", $total / 1024);
    my $MBytes = sprintf("%.0f", $total / 1024 / 1024);

    print "$total Bytes ($KBytes KiB, $MBytes MiB) wasted space\n\n";
}

2010-08-03

Automatic FlightGear Scenario Downloader

FlightGear is a very nice open-source flight simulator. A major component of any flight simulator is the world scenery. The FlightGear project provides scenery files for the complete world on their FTP server. Unfortunately, the world scenery consists of more than 500 files, each approximately 50 to 70 megabytes in size.

A manual download of these files would require much endurance, as the server permits only four parallel downloads at a time.

Therefore, I developed a script for automatic scenery downloading. Now my computer can download the whole world scenery without any user interaction.

The script provides two interesting features:

It extracts a download list from the MD5SUM file that is available on the server. Thus, the downloader automatically detects which files are available. Later, the MD5 sums are used to verify whether each download was successful.

Furthermore, the script checks whether files have already been downloaded in the past and skips them to prevent downloading files twice. This way, the user can easily interrupt and resume the downloading process.
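The verification step relies on the MD5SUM file format, where each line holds a digest followed by a file name. As a small aside, GNU md5sum can re-check such lines directly with -c, as this sketch shows (the scratch file is created here for illustration only):

```shell
# Build an MD5SUM-style line for a scratch file and let md5sum verify it
tmp=$(mktemp)
printf 'hello\n' > "$tmp"
line=$(md5sum "$tmp")          # "digest  filename" format
result=$(echo "$line" | md5sum -c -)
rm -f "$tmp"
```

The script below performs the equivalent comparison manually with cut and awk.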


#!/bin/bash

# Permission is hereby granted, free of charge,
# to any person obtaining a copy of this software
# and associated documentation files (the
# "Software"), to deal in the Software without
# restriction, including without limitation the
# rights to use, copy, modify, merge, publish,
# distribute, sublicense, and/or sell copies of
# the Software, and to permit persons to whom
# the Software is furnished to do so, subject
# to the following conditions:
#
# This permission notice shall be included in
# all copies or substantial portions of
# the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT
# WARRANTY OF ANY KIND, EXPRESS OR IMPLIED,
# INCLUDING BUT NOT LIMITED TO THE WARRANTIES
# OF MERCHANTABILITY, FITNESS FOR A PARTICULAR
# PURPOSE AND NONINFRINGEMENT. IN NO EVENT
# SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
# LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
# TORT OR OTHERWISE, ARISING FROM, OUT OF OR
# IN CONNECTION WITH THE SOFTWARE OR THE USE
# OR OTHER DEALINGS IN THE SOFTWARE.


URL="ftp://ftp.ibiblio.org/pub/mirrors/flightgear/ftp/Scenery-1.0.1/"
MD5SUM="MD5SUM"

WGET="/usr/bin/wget"
MD5="/usr/bin/md5sum"

if [ ! -f "$MD5SUM" ]
then
    echo "### Downloading MD5SUM file"
    "$WGET" "$URL$MD5SUM"
fi

if [ -f "$MD5SUM" ]
then
    FILELIST=$(awk '{print $2}' "$MD5SUM")
    MD5LIST=$(awk '{print $1}' "$MD5SUM")
    CNT=1

    for FILE in $FILELIST
    do
        if [ ! -f "$FILE" ]
        then
            echo "### Downloading $FILE"
            "$WGET" "$URL$FILE"

            # $MD5LIST is deliberately unquoted so the newline-separated
            # list collapses into one space-separated line for cut
            MD5ORIG=$(echo $MD5LIST | cut -d ' ' -f $CNT)
            MD5FILE=$("$MD5" "$FILE" | awk '{print $1}')

            if [ "$MD5ORIG" = "$MD5FILE" ]
            then
                echo "### Success"
            else
                echo "### Error"
            fi
        else
            echo "### File $FILE exists!"
        fi

        let CNT=$CNT+1
    done
else
    echo "### ERROR: Can not find file $MD5SUM"
fi


If you are not interested in FlightGear, you may use this script to download other files based on an MD5SUM file provided on a web or FTP site.