Do you need to recover files individually? If so, then image backup (at
least on its own) won't be a good option. One thing you could do is tar up
chunks of files (maybe a million per chunk) and archive/backup those
chunks. Keep a catalog (ideally a database with indexes) of which files
are in which tarballs; then when you go to restore, you only have to
recover 1/80,000 of your data to get one file.
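
A minimal sketch of that idea in Python, assuming file paths arrive one
per line on stdin (e.g. piped from find) and using SQLite as the catalog;
the chunk size, tarball naming, and schema are just placeholders:

    #!/usr/bin/env python3
    # Sketch: pack files into ~1M-file tarballs and record which file
    # landed in which tarball in a SQLite catalog. CHUNK_SIZE, the
    # tarball names, and the table schema are illustrative only.
    import sys
    import sqlite3
    import tarfile

    CHUNK_SIZE = 1_000_000  # roughly a million files per tarball

    db = sqlite3.connect("catalog.db")
    db.execute("CREATE TABLE IF NOT EXISTS files"
               " (path TEXT PRIMARY KEY, tarball TEXT)")

    def flush(paths, seq):
        """Write one tarball and catalog every file it contains."""
        name = "chunk-%06d.tar" % seq
        with tarfile.open(name, "w") as tar:
            for p in paths:
                tar.add(p)
        db.executemany("INSERT INTO files VALUES (?, ?)",
                       [(p, name) for p in paths])
        db.commit()

    chunk, seq = [], 0
    for line in sys.stdin:
        chunk.append(line.rstrip("\n"))
        if len(chunk) >= CHUNK_SIZE:
            flush(chunk, seq)
            chunk, seq = [], seq + 1
    if chunk:
        flush(chunk, seq)

The tarballs (plus catalog.db) then get archived instead of the
individual files, so TSM's database only tracks ~80,000 objects. To
restore one file, look up its tarball (SELECT tarball FROM files WHERE
path = ?), recall just that one object, and extract the file from it.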

On Fri, Jan 20, 2017 at 02:18:04PM +0000, Bo Nielsen wrote:
> Hi all,
>
> I need advice.
> I must archive 80 billion small files, but as I see it that isn't
> possible, since it would fill about 73 TB of TSM's database.
> The filespace is mounted on a Linux server.
> Is there a way to pack/zip the files so there's a smaller number of
> files? Has anybody tried this?
>
> Regards,
>
> Bo Nielsen
>
> IT Service
> Technical University of Denmark
> Frederiksborgvej 399
> Building 109
> DK-4000 Roskilde
> Denmark
>
> Mobile +45 2337 0271
> boa...@dtu.dk

--
-- Skylar Thompson (skyl...@u.washington.edu)
-- Genome Sciences Department, System Administrator
-- Foege Building S046, (206)-685-7354
-- University of Washington School of Medicine
