Thanks for the quick replies. Some more info: I am backing up 22 SCO boxes
that have 9 GB of total disk space each. I also have 6+ Linux servers with
75 GB drives. The SCO master backups are not 9 GB but more like 4 GB, since
the system was originally built to run on a 1 GB drive.
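
For rough sizing, here is a back-of-the-envelope estimate in Python. The
retention count and the Linux used-space figure are assumptions for
illustration, not numbers from this thread:

# Rough capacity estimate for the rebuilt backup server.
# ASSUMPTIONS (not from the thread): two master generations kept per box,
# and the Linux boxes average about 40 GB used of their 75 GB drives.
sco_boxes = 22
sco_master_gb = 4        # actual master size, not the 9 GB disk size
linux_boxes = 6
linux_used_gb = 40       # assumed average used space per 75 GB drive
generations = 2          # assumed number of master generations retained

masters_gb = (sco_boxes * sco_master_gb
              + linux_boxes * linux_used_gb) * generations
with_headroom_gb = masters_gb * 1.3   # 30% slack for differentials/growth
print(f"masters: {masters_gb} GB, with headroom: {with_headroom_gb:.0f} GB")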

I have to stay with SAS drives since the current RAID enclosure is SCSI,
but I will rethink the RAID level. I will have to dig out the manual for my
LSI card to see what levels it supports.
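
If it turns out to be a MegaRAID-family card, something along these lines
might save digging out the manual. This is only a sketch: it assumes the
MegaCli utility is installed (it may be named MegaCli64 or live under
/opt/MegaRAID on your system), and the output format varies by firmware.

#!/usr/bin/env python3
# Sketch: ask a MegaRAID-family LSI controller which RAID levels it supports.
import subprocess

# "MegaCli -AdpAllInfo -aALL" dumps adapter capabilities, including a
# "RAID Level Supported" line; adjust the binary name/path for your install.
out = subprocess.run(["MegaCli", "-AdpAllInfo", "-aALL"],
                     capture_output=True, text=True, check=True).stdout

for line in out.splitlines():
    if "RAID Level Supported" in line:
        print(line.strip())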


On Fri, May 16, 2014 at 4:03 PM, Doug Hughes <d...@will.to> wrote:

> If you have a 24x7 work cycle, avoid the green drives. They won't last.
> Red drives are reasonable and there are a lot of articles out there on the
> web about people using the red drives in production for 24x7. They
> generally have a 3 year warranty. If you expect to be doing active caching
> or other high-intensity workload, I'd go with the black drives.
>
>
>
> On Fri, May 16, 2014 at 3:36 PM, Dan Ritter <d...@randomstring.org> wrote:
>
>> On Fri, May 16, 2014 at 03:23:13PM -0400, john boris wrote:
>> > I have to rebuild my remote backup server (a place for my servers to
>> > back up to hard drives). I currently have a Linux system with a RAID 5
>> > array of 300GB SAS drives (1.5TB total), which has to be increased to
>> > 4TB or larger (sort of depends on the cost). We are virtualizing 22 of
>> > my servers, which currently use on-board tapes for master backups and
>> > then send differentials to my backup server over our WAN each night.
>> > Because this server will now be holding master backups, I need to grow
>> > the space. I don't have the money or budget for a dedupe system or some
>> > other appliance.
>> >
>> > I started looking for drives and hit the smorgasbord of Green/Red and
>> > 32MB/16MB cache options, and the prices are all over the place. I can
>> > get a 1TB drive for about $133, but the 500GB drive is $270 with a
>> > smaller cache, from the same manufacturer and distributor. My RAID
>> > system has 8 slots, all of which I want to use. There will be two
>> > arrays in the system: RAID 1 (mirrored, for the OS) and the rest of the
>> > slots for RAID 5. Currently it is all one big array and also uses LVM,
>> > which I am not going to use when I rebuild this thing.
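
For what it's worth, price per GB makes that comparison plainer (using the
rough prices quoted above):

# Price per GB for the two drives mentioned above (approximate prices).
for name, price, size_gb in [("1TB drive", 133, 1000),
                             ("500GB drive", 270, 500)]:
    print(f"{name}: ${price / size_gb:.3f}/GB")
# 1TB drive: $0.133/GB
# 500GB drive: $0.540/GB
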
>> >
>> > So I am looking for some insight on what performance specs I should be
>> > looking for in such a project. I must confess that as I type this I
>> > will probably also look at a NAS unit of comparable size and cost.
>> >
>> > My systems use a backup program from Microlite (BackupEdge) which
>> > works just fine for what we do and hasn't failed me yet (looking for
>> > some real wood to knock on).
>>
>> You have a backup server, it has 8 3.5" SAS/SATA disk slots, and
>> you want to back up 22 machines nightly?
>>
>> You don't want RAID5. Use RAID10. 6 x 3TB will get you 9TB of
>> usable space, and a rebuild on a failed drive will involve much
>> less overhead. Your NICs are likely to be the bottleneck.
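
Quick sketch of that math, generalized so other drive counts and sizes can
be plugged in. This is standard RAID capacity arithmetic, not anything
specific to this controller:

# Usable capacity for the two layouts under discussion, given n drives of
# size_tb each. RAID5 gives up one drive to parity; RAID10 gives up half to
# mirroring, but a rebuild only copies one mirror instead of re-reading
# every surviving drive.
def raid5_usable(n, size_tb):
    return (n - 1) * size_tb

def raid10_usable(n, size_tb):
    return (n // 2) * size_tb

for n in (6, 8):
    print(f"{n} x 3TB  RAID5: {raid5_usable(n, 3)} TB"
          f"  RAID10: {raid10_usable(n, 3)} TB")
# 6 x 3TB  RAID5: 15 TB  RAID10: 9 TB
# 8 x 3TB  RAID5: 21 TB  RAID10: 12 TB
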
>>
>> The per-disk cache is not going to be interesting, because backups
>> are large contiguous writes and large contiguous reads. SAS vs SATA
>> isn't going to help unless you're buying dual-port drives for controller
>> redundancy, which you didn't mention. 7200RPM SATA disks with a 5-year
>> warranty run about $200 each. (WD Black at NewEgg.)
>>
>> -dsr-
>
>


-- 
John J. Boris, Sr.
Online Services
www.onlinesvc.com
_______________________________________________
Tech mailing list
Tech@lists.lopsa.org
https://lists.lopsa.org/cgi-bin/mailman/listinfo/tech
This list provided by the League of Professional System Administrators
 http://lopsa.org/
