I think I can speak from experience on the topic of archiving image
data. I have ~40,000 DVD-R disks in my office. This represents:
6 years of data collection at ALS 8.3.1
60 TB of data (>99% of everything collected, 2 copies)
10,000 data sets
318 PDB entries
$4000 of media
The purposes of this archive are:
1) an "eternal" off-site backup for users of ALS 8.3.1
2) a potential source of "interesting" data sets for methods developers
I define "interesting" as: a data set from a known structure that cannot
be solved by conventional methods. After all, if you are developing new
methods then data that can be solved by existing methods is of limited
utility. However, it is also difficult to develop and test a new
algorithm if you don't know what the "right answer" is. For this
reason, I think the most useful data sets to make available are the
"early" data sets from a structure project (the ones you couldn't
solve). Almost by definition, these are the most relevant data sets for
developing new methods. We all would like to get a structure solved
sooner rather than later.

The problem is getting permission. Yes, it is perhaps "legal" for me
to give away people's data if they were collected
at a DOE facility (without a proprietary research agreement in place).
However, I would like to keep the few friends I have. It is important
to clear such transactions with the scientist who collected the data.
The difficulty is in connecting 10,000 image data sets to one of 40,000
PDB entries (which are generally deposited 1-2 years after data
collection) and then connecting interested parties to one of 500 users
who collected the data. This is not an insurmountable logistics
problem, but I'm afraid it is going to take me a while to do it all in
my "spare time".
IMHO we do not need a universal image format; we need WELL DOCUMENTED
image formats! Once that is done, then writing a converter to-and-from
imgCIF and any other format will be possible, and then imgCIF can start
to take hold. I believe a pivotal step toward a universal image format
is to have every generator of images (beamlines, diffractometer
manufacturers, etc.) make a lysozyme data set available in a very
public place, preferably with instructions on how to process it
properly. I will now volunteer the following web page as the place to
put all these example data sets (for now). To put my money where my
mouth is, a lysozyme data set from ALS 8.3.1 is available here:
http://bl831.als.lbl.gov/example_data_sets/
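For the curious: 8.3.1 writes ADSC/SMV images, which are about as
simple as a documented format gets: an ASCII header of KEY=value;
pairs inside braces, padded out to HEADER_BYTES, followed by raw pixel
data. Here is a minimal reader in Python (a sketch only -- it assumes
the usual little-endian unsigned-short pixels, and the axis order may
need checking against your detector):

    # Minimal SMV (ADSC) image reader -- a sketch, not a validated parser.
    import numpy as np

    def read_smv(path):
        with open(path, "rb") as f:
            # The first 512 bytes always contain HEADER_BYTES itself.
            head = f.read(512).decode("ascii", errors="replace")
            header_bytes = int(head.split("HEADER_BYTES=")[1].split(";")[0])
            f.seek(0)
            text = f.read(header_bytes).decode("ascii", errors="replace")
            meta = {}
            for line in text.splitlines():
                if "=" in line:
                    key, _, val = line.partition("=")
                    meta[key.strip()] = val.rstrip(";").strip()
            nx, ny = int(meta["SIZE1"]), int(meta["SIZE2"])
            order = meta.get("BYTE_ORDER", "little_endian")
            dtype = "<u2" if order == "little_endian" else ">u2"
            data = np.frombuffer(f.read(2 * nx * ny), dtype=dtype)
        return meta, data.reshape(ny, nx)  # axis order: check your detector

Once every format is documented to at least this level, the converters
to and from imgCIF more or less write themselves.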
WRT archiving in general, I think it is also important to point out that
RAID only protects you against the total failure of one (or maybe two)
hard disk drives. RAID does NOT generally protect against anything
else, such as failures of RAID controller cards, flaky drive cables, bad
sectors or other "subtle" drive failures, filesystem corruption, power
surges, disgruntled sysadmins, etc. I have experienced all of these.
So, even if your stuff is on a RAID, always back up often.
I chose DVD-R because it is the cheapest media with a long rated
shelf-life (~100 years if you don't leave them in the sun). That, and
our astronomer colleagues (who also have the problem of storing large
amounts of digital image data) chose DVD-R as the storage medium for Sky
Survey 2 a few years back.
I recently checked on the prices of alternative media. Here is what I
came up with:
media     $/GB       (unit price / capacity)   rated lifetime
DVD-R     $0.056/GB  ( $0.25 / 4.5 GB )        100 year?
LTO-3     $0.063/GB  ( $50 / 800 GB )          3-5 year
LTO-2     $0.063/GB  ( $25 / 400 GB )          3-5 year
DLT IV    $0.200/GB  ( $16 / 80 GB )           3-5 year
HDD       $0.424/GB  ( $318 / 750 GB )         3-5 year
BD-R      $0.640/GB  ( $16 / 25 GB )           100 year?
CD-R      $0.771/GB  ( $0.54 / 0.7 GB )        100 year?
8mm       $1.000/GB  ( $2.50 / 2.5 GB )        3-5 year
ZIP       $16.00/GB  ( $4 / 0.25 GB )          3-5 month
floppy    $90.28/GB  ( $0.13 / 1.44 MB )       3-5 min
clay      $2700./GB  ( $0.3/lb, 1 bit/mm^3 )   >30,000 year
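(The $/GB column is just the unit price divided by the capacity; a
throwaway script to recompute or update the table, using the 2007
prices quoted above:)

    # Recompute $/GB from unit price and capacity -- update as prices move.
    media = [
        ("DVD-R", 0.25, 4.5), ("LTO-3", 50.0, 800), ("LTO-2", 25.0, 400),
        ("DLT IV", 16.0, 80), ("HDD", 318.0, 750), ("BD-R", 16.0, 25),
        ("CD-R", 0.54, 0.7), ("8mm", 2.50, 2.5), ("ZIP", 4.0, 0.25),
        ("floppy", 0.13, 1.44e-3),
    ]
    for name, price, gb in sorted(media, key=lambda m: m[1] / m[2]):
        print(f"{name:8s} ${price / gb:7.3f}/GB  ( ${price} / {gb} GB )")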
Note that hard disk drives cost nearly 8x more per gigabyte than
DVD-R. Blu-Ray disks (BD-R) are more expensive than hard drives!
Storing all this data on
hard drives would cost ~$50k, with an additional $5k/year for the 2
GB/hr we routinely collect. Storing all the data from all 30 PX
beamlines in the world would amount to ~$150k/year of hard drives.
The ~100 year lifetime of optical "-R" media has, of course, yet to be
proven historically. Early CDs did have a problem: the glue used for
the label was slightly acidic and corroded the aluminum reflective
layer that encodes the data (which, BTW, is directly under the label!).
Modern media no longer have this problem, and there are watchdog
agencies you can find with Google that simulate the long-term effects
of time on any media you like. I can certainly say I have had problems
with worn-out DVD-R drives starting to make bad disks that pass
verification in the writer but not in a low-end reader. The solution
seems to be to read each disk back in a known-good DVD-R writer drive.
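In other words, don't trust the burner's own verify pass: check the
disk against a checksum on a second drive. A minimal sketch in Python
(the ISO filename and the /dev/sr1 device path are placeholders):

    # Verify a burned DVD-R against the source ISO by comparing
    # checksums.  Read the disk back on a *different* drive than the
    # one that burned it.
    import hashlib, os

    def sha256_of(path, length=None, block=1 << 20):
        # Stop after `length` bytes: a raw device read can return
        # padding past the end of the ISO image.
        h, remaining = hashlib.sha256(), length
        with open(path, "rb") as f:
            while remaining is None or remaining > 0:
                n = block if remaining is None else min(block, remaining)
                chunk = f.read(n)
                if not chunk:
                    break
                h.update(chunk)
                if remaining is not None:
                    remaining -= len(chunk)
        return h.hexdigest()

    iso = "archive_0001.iso"                      # placeholder filename
    ok = sha256_of(iso) == sha256_of("/dev/sr1",  # placeholder device
                                     length=os.path.getsize(iso))
    print("disk verifies OK" if ok else "disk is BAD -- burn it again")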
One must balance media lifetime with cost. The longest proven storage
lifetime of any medium is that of clay tablets. In fact, the lifetime
of data stored on this medium is perhaps the definition of "recorded
history". I estimate that the maximum practical data density on clay
tablets would be 1 bit represented by a 0.5 mm diameter pit in the wet
clay, spaced on a 1x1 mm grid in a slab 1 mm thick. However, this
thickness is probably pushing it if you want the media to stand up to
30 millennia of earthquakes and global warming. Note that I have only
estimated the cost of the media, not the kiln to fire it or the
subsequent FTEs to curate the archive.
The problem I have now is similar. I have 40,000 DVDs with
progressively sketchier computerized records of what is on them as I go
back in time. (In the beginning, I was burning them all myself and
printing out sticky labels from a Word document). It is perhaps
relevant to mention here that this project is essentially unfunded. I
have twice applied for grants to make the image archive accessible to
the methods development community, but no luck so far. I mention this
because the cost of curating a public database far exceeds the cost of
the storage medium you use.
I also have a question:
It would seem that LTO-3, or maybe the upcoming LTO-4, is not a bad
short-term alternative to DVD-R. Does anybody out there have experience
with one of these tape drives? I would really appreciate the input.
-James Holton
MAD Scientist
Winter, G (Graeme) wrote:
Hi,
On the question of a "uniform format" for this data, I believe that the
imgCIF project has been working towards this end for a number of years.
As a very vocal supporter of this, I would like to say that it is an
ideal archival format for the following reasons:
- the terms are clearly defined (or are currently in the process of
being defined)
- the images are compressed, typically by a factor of 2-3 (one such
scheme is sketched below)
- some data reduction packages (Mosflm, XDS) can read them in this
compressed form
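As an illustration of why the compressed form is still cheap to read:
one of the CBF compression schemes, "byte_offset", stores each pixel
as a small delta from the previous one. A sketch of a decoder, from my
reading of the imgCIF/CBF documentation (unvalidated against real
files, and it omits the rarely needed 64-bit level):

    # Decode CBF "byte_offset" compressed pixel values -- a sketch
    # from the imgCIF/CBF documentation, not a validated decoder.
    import struct

    def byte_offset_decode(buf, npixels):
        out, cur, i = [], 0, 0
        while len(out) < npixels:
            delta = struct.unpack_from("b", buf, i)[0]  # signed 8-bit delta
            i += 1
            if delta == -128:        # 0x80 flag: 16-bit delta follows
                delta = struct.unpack_from("<h", buf, i)[0]
                i += 2
                if delta == -32768:  # 0x8000 flag: 32-bit delta follows
                    delta = struct.unpack_from("<i", buf, i)[0]
                    i += 4
            cur += delta
            out.append(cur)          # running sum reconstructs pixel values
        return out

Because most neighboring pixels differ by only a few counts, most
deltas fit in one byte, which is where the factor of 2-3 comes from.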
Now, I would be telling lies if I said that this was all finished, but
I think it is fair to say that it is already a long way down the path.
As soon as I am convinced that you can go losslessly to and from
imgCIF, and that the data reduction programs will give precisely the
same results, I will convert, thus freeing up 5 firewire disks.
For more information on this take a look at medsbio.org.
Cheers,
Graeme
-----Original Message-----
From: CCP4 bulletin board [mailto:[EMAIL PROTECTED] On Behalf Of
Mischa Machius
Sent: 17 August 2007 15:07
To: CCP4BB@JISCMAIL.AC.UK
Subject: [ccp4bb] Depositing Raw Data
Since there are several sub-plots in that mammoth thread, I thought
branching out would be a good idea.
I think working out the technicalities of how to publicly archive raw
data is fairly simple compared to the bigger picture.
1. Indeed, all the required meta-data will need to be captured just like
for refined coordinates. This will be an additional burden for the
depositor, but it's clearly necessary, and I do consider it trivial.
Trivial, as in the sense of "straightforward", i.e., there is no
fundamental problem blocking progress. As mentioned, current data
processing software captures most of the pertinent information already,
although that could be improved. I am sure that the beamlines,
diffraction-system manufacturers, and authors of data-processing
software can be convinced to cooperate appropriately, if the community
needs these features.
2. More tricky is the issue of a unified format for the images, which
would be very helpful. There have been attempts at creating unified
image formats, but - to my knowledge - they haven't gotten anywhere.
However, I am also convinced that such formats can be designed, and that
detector manufacturers will have no problems implementing them,
considering that their detectors may not be purchased if they don't
comply with requirements defined by the community.
3. The hardware required to store all those data, even in a highly
redundant way, is clearly a trivial matter.
4. The biggest problem I can see in the short run is the burden on the
databank when thousands of investigators start transferring gigabytes of
images, all at the same time.
5. I think the NSA might go bonkers over that traffic, although it
certainly has enough storage space. Imagine, they let their decoders go
wild on all those images. They might actually find interesting things in
them...
So, what's the hold-up?
Best - MM
On Aug 17, 2007, at 3:23 AM, Winter, G (Graeme) wrote:
Storing all the images *is* expensive but it can be done - the JCSG do
this and make available a good chunk of their raw diffraction data. The
cost is, however, in preparing this to make the data useful for the
person who downloads it.
If we are going to store and publish the raw experimental measurements
(e.g. the images), which I think would be spectacular, we will also
need to define a minimum amount of metadata which should be supplied
with this to allow a reasonable chance of reproduction of the results.
This is clearly not trivial, but there is probably enough information
in the harvest and log files from e.g. CCP4, HKL2000, Phenix to allow
this.
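To make "minimum metadata" concrete, I would imagine something like one
record per sweep, along these lines (the field names and values here
are illustrative only, not any agreed standard):

    # An illustrative per-sweep minimum-metadata record -- the field
    # names are invented for this example, not an agreed standard.
    minimum_metadata = {
        "detector": "ADSC Q210",           # make/model (example value)
        "image_template": "lyso_1_###.img",
        "wavelength_A": 1.11587,           # incident wavelength
        "distance_mm": 250.0,              # crystal-to-detector distance
        "beam_centre_mm": (105.1, 101.3),  # direct-beam position
        "two_theta_deg": 0.0,              # detector swing angle
        "osc_start_deg": 0.0,              # phi at first image
        "osc_range_deg": 1.0,              # rotation per image
        "exposure_s": 1.0,                 # exposure time per image
        "pixel_size_mm": 0.1024,           # (example value)
    }

Anything beyond this (beam divergence, polarization, detector gain)
helps, but the harvest and log files already carry most of it.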
The real problem will be in getting people to dig out the tape or DVD
with the images on it, prepare the required metadata, and "deposit" this
information somewhere. Actually storing it is a smaller challenge,
though this is a long way from being trivial.
As an aside - firewire disks are indeed a very cheap way of storing
the data. There is a good reason why they are much cheaper than the
equivalent RAID array. They fail. Ever lost 500GB of data in one go?
Ouch. ;o)
Just MHO.
Cheers,
Graeme
-----Original Message-----
From: CCP4 bulletin board [mailto:[EMAIL PROTECTED] On Behalf Of
Phil Evans
Sent: 16 August 2007 15:13
To: CCP4BB@JISCMAIL.AC.UK
Subject: Re: [ccp4bb] The importance of USING our validation tools
What do you count as raw data? Rawest are the images - everything
beyond that is modelling - but archiving images is _expensive_!
Unmerged intensities are probably more manageable.
Phil
On 16 Aug 2007, at 15:05, Ashley Buckle wrote:
Dear Randy
These are very valid points, and I'm so glad you've taken the
important step of initiating this. For now I'd like to respond to one
of them, as it concerns something my colleagues in Australia and I
are doing:
The more information that is available, the easier it will be to
detect fabrication (because it is harder to make up more information
convincingly). For instance, if the diffraction data are deposited,
we can check for consistency with the known properties of real
macromolecular crystals, e.g. that they contain disordered solvent
and not vacuum. As Tassos Perrakis has discovered, there are
characteristic ways in which the standard deviations depend on the
intensities and the resolution. If unmerged data are deposited, there
will probably be evidence of radiation damage, weak effects from
intrinsic anomalous scatterers, etc. Raw images are probably even
harder to simulate convincingly.
After the recent Science retractions we realised that it's about time
raw data were made available. So, we have set about creating the
necessary IT and software to do this for our diffraction data, and are
encouraging Australian colleagues to do the same. We are about a week
away from launching a web-accessible repository for our recently
published (e.g. deposited in PDB) data, and this should coincide with
an upcoming publication describing a new structure from our labs. The
aim is that publication occurs simultaneously with release in the PDB
as well as of the raw diffraction data on our website.
We hope to house as much of our data as possible, as well as data from
other Australian labs, but obviously the potential dataset will be
huge, so we are trying to develop, and make freely available to the
community, software tools that allow others to easily set up their own
repositories. After brief discussion with the PDB, the plan is that the
PDB include links from coordinates/SFs to the raw data using a simple
handle that can be incorporated into a URL. We would hope that we can
convince the journals that raw data must be made available at the time
of publication, in the same way as coordinates and structure factors.
Of course, we realise that there will be many hurdles along the way,
but we are convinced that simply making the raw data available ASAP is
a 'good thing'.
We are happy to share more details of our IT plans with the CCP4BB, so
that they can be improved, and we look forward to hearing feedback.

cheers
--------------------------------------------------------------------------------
Mischa Machius, PhD
Associate Professor
UT Southwestern Medical Center at Dallas
5323 Harry Hines Blvd.; ND10.214A
Dallas, TX 75390-8816; U.S.A.
Tel: +1 214 645 6381
Fax: +1 214 645 6353