Michael wrote:
> On Thursday, 6 June 2024 04:54:41 BST Dale wrote:
>
>> I was digging around eBay.  Ran up on a used combo and then had a crazy
>> idea.  I found an ASUS B550-Plus AC-HES mobo that is AM4.  I took that
>> idea and started building a combo with new parts.  CPU, Ryzen 7 5800X,
>> and my little 4-port video card in case the CPU has no video support.
>> AMD says it does.
> The Ryzen 7 5800X has 8 CPU cores, but no graphics cores:
>
> https://www.amd.com/en/products/processors/desktops/ryzen/5000-series/amd-ryzen-7-5800x.html
>
> If you take a look at the manual for the MoBo you'll see the 5000/3000 series
> CPUs will be able to support up to four PCIe Gen 4 SSDs on the top PCIe x16
> slot, if you install e.g. a Hyper M.2 x16 expansion card:
>
> https://www.asus.com/uk/motherboards-components/motherboards/accessories/hyper-m-2-x16-card-v2/
>
> but the same PCIe x16 slot will only be able to support up to three PCIe Gen 3
> SSDs with an expansion card, if you install a Ryzen 5000/3000 G-Series
> processor, which has integrated graphics cores.
>

I'm planning to use the 4-port video card I already have anyway.  That
will let me add another monitor for when I'm processing files.  It's not
much of a card, but I don't need much anyway.


>> The mobo has three PCIe x1 slots and an x4 slot.
> Actually it is more complicated, because bandwidth is shared across the PCIe 
> x16 slots:
>
> https://www.asus.com/motherboards-components/motherboards/prime/prime-b550-plus-ac-hes/
>
> It has four PCIe x16 slots (the top being PCIe Gen 4 and the rest PCIe Gen 3),
> but if any of the bottom 3 slots are occupied, the top slot will *only*
> run in x1 instead of x4 mode.  So you'll end up with four PCIe x16 slots, all
> running in x1 mode.
>
> In addition, if you decide to plug an NVMe card into the second M.2 port, then
> 2 out of the 6 SATA ports will be disabled - their bandwidth eaten up by the
> second M.2 port.
>

This is still an improvement over the other mobo, which didn't have the
slots at all, shared or otherwise.  That's the one thing I did not
like about the other mobo.  The rest of it was fine.  The lack of PCIe
slots just became a deal breaker for me.
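
Just to get my head around what that sharing costs, here's a
back-of-the-envelope in Python.  The per-lane rates are the standard
PCIe Gen 3/Gen 4 figures; the x1/x4 behaviour is just Michael's
description plugged in, so treat it as a sketch, not gospel:

    # Rough one-direction PCIe throughput per lane, in GB/s,
    # after 128b/130b encoding overhead.
    # Gen 3: 8 GT/s  -> ~0.985 GB/s per lane
    # Gen 4: 16 GT/s -> ~1.969 GB/s per lane
    PER_LANE_GBPS = {3: 0.985, 4: 1.969}

    def slot_bandwidth(gen: int, lanes: int) -> float:
        """Approximate one-direction bandwidth of a PCIe slot."""
        return PER_LANE_GBPS[gen] * lanes

    # Top slot alone: Gen 4 running in x4 mode.
    print(f"top slot, x4 mode: {slot_bandwidth(4, 4):.1f} GB/s")
    # Top slot once any bottom slot is occupied: drops to x1.
    print(f"top slot, x1 mode: {slot_bandwidth(4, 1):.1f} GB/s")
    # Bottom slots: Gen 3 at x1 each.
    print(f"bottom slots, x1:  {slot_bandwidth(3, 1):.1f} GB/s")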

I plan to use the M.2 thingy that is closest to the CPU for the OS.
From my reading it is not shared, but even if it is, it will still be
faster than my current SATA II drive.  Yes, SATA II.  I don't plan to use
the one that is shared with the PCIe x1 slot.  I may lose two SATA mobo
ports, but with a card I can have either 10 or 12 ports.  Slower, but
still fast enough.
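
For a sense of scale, here are the nominal interface ceilings side by
side.  Real drives come in well under these numbers, but the gap is
what matters:

    # Nominal one-direction interface ceilings, in MB/s.
    # SATA II:  3 Gb/s, 8b/10b encoding -> ~300 MB/s
    # SATA III: 6 Gb/s, 8b/10b encoding -> ~600 MB/s
    # NVMe on PCIe Gen 4 x4: ~1969 MB/s per lane * 4 lanes
    interfaces = {
        "SATA II (current drive)": 300,
        "SATA III": 600,
        "NVMe, PCIe Gen 4 x4": 1969 * 4,
    }

    base = interfaces["SATA II (current drive)"]
    for name, mbps in interfaces.items():
        print(f"{name:>24}: {mbps:5d} MB/s ({mbps / base:.0f}x SATA II)")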


>> A SAS card
>> would work, just slower than in an x8 slot, but the little PCIe 10 or 12
>> port cards would work fine.  Gives me 30 drives total at least on the
>> cards, plus the four on the mobo; two go away with using one of the
>> PCIe slots most likely.  Bifurcation, I think they call it, when they
>> share roadways.
> I can't find a PCIe x8 slot on the above MoBo ...  :-/
>

When I was discussing SAS or HBA cards, they were all x8 cards, but Rich
reminded me that they would work in an x4 slot, just slower.  So, I'll
likely just use the x1 SATA cards I already have extras of and take
whatever speed they can manage.  If I had an x8 slot and an HBA card that
could handle, say, 20 drives, then I'd go that route.  Since I really only
have an x4 slot, well, why bother.
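
For what it's worth, the math on an x1 card feeding 10 spinning drives
isn't as bad as it sounds.  A quick sketch; the link generation and the
per-drive rate are my assumptions, not anything off a spec sheet:

    # How much of a PCIe x1 link do N spinning drives actually need?
    LINK_MBPS = 985      # assumed PCIe Gen 3 x1, one direction
    DRIVE_MBPS = 250     # optimistic sustained rate per hard drive
    DRIVES = 10

    demand = DRIVES * DRIVE_MBPS
    print(f"aggregate demand: {demand} MB/s vs link: {LINK_MBPS} MB/s")
    print(f"per drive if all {DRIVES} stream at once: "
          f"{LINK_MBPS / DRIVES:.0f} MB/s")
    # In practice only a few drives are busy at once, so the x1 link
    # mostly bites during scrubs, rebuilds, or big bulk copies.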


>> My thinking: build the above now.  In a year or two, I can build either
>> the rig I was working on a lot cheaper, or an even newer rig that is even
>> faster if, say, AM6 socket CPUs have arrived.  Then the rig above, with
>> some hard drive options, can become the new NAS box.  I can then move
>> some drives out of the main rig, the newer one a year or so down the road,
>> and not need so many PCIe slots in the main rig, hopefully anyway.  I
>> may even warm up to the idea of using USB for hard drives.  I'm
>> surprised hard drives don't come with USB connections instead of SATA
>> already.
>>
>> The only downside: the NAS box will have to run 24/7 as well.  Then I
>> have two puters running all the time.  To offset that, the combo above
>> does pull a lot less power than my current rig.  Not a huge difference,
>> but a fair amount.  Odds are the build a year or so down the road will
>> also pull less power than the current rig.  I could end up with the same
>> amount of power usage or less, even with two running instead of one.
>>
>> I said it was a crazy idea.  LOL.  This time, though, I'm sort of planning
>> ahead instead of just coming up with a temporary fix all the time.  This
>> is also a little cheaper but still faster.  Another big thing: it's newer
>> as well.  My current rig is about 10 or 11 years old.  It may run another 5
>> or so years but could go out anytime.  At least I'll have a newer rig
>> not likely to let the smoke out.  Plus I'll have a path to a more sane
>> setup.  I just need to get one of those Chia harvester cases that holds
>> 40 or so hard drives.  ROFLMBO
>>
>> Dale
>>
>> :-)  :-) 
> I think it was mentioned, but for your assumed requirements I suggest you take
> a look at refurbished workstations or tower servers.  They are designed to
> run 24/7, can host a huge number of drives, have obscenely large amounts of
> quad/octa-channel RAM (ECC too), come with the full variety of expansion
> options, disk caddies and adapters, receive OEM BIOS updates forever and a
> day, and seek to minimise power consumption.  They won't scream their heads
> off in GHz, but more than compensate with multiple cores, dual CPUs, and more
> than enough PCIe lanes.
>
> It seems to me you are currently looking to buy the equivalent of a Ferrari in
> terms of speed and technology, but intend to load it with rubble and use it as
> if it were a truck.  I'm exaggerating here, only to make a point.  It ought to
> be better overall to buy a tool designed for the job you intend to use it for.


I've looked at some of those.  They are very expensive.  I might run
across a good deal one day, but given they are already used, who knows how
much life is left in the thing.  It is a good idea, just too large a bite
money-wise.  I found a board once that I liked, just the mobo.  I could
build two, almost three, new rigs for what that thing costs used.  I'd
still have the CPU and memory to buy.  It was a bare mobo.

I get the analogy.  I am getting better, though.  At least with this mobo,
I can re-purpose it later on.  The other mobo I was looking at would be
good for a desktop where what comes on the mobo is all you need, but
that's about the end of it.  I wish I had $2,000 or $3,000 to spend
on a used rig that is built to do exactly what I'm doing, but I just
don't have it.

The stuff is on the way.  I'll post when I get it built, or start a
thread if I get hung up with the install.  I've never even seen an EFI
system, so that is all new to me.  The M.2 thing is new too.  Both have
good wiki pages, so hopefully following those will get me going.
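
One handy sanity check once it boots: the kernel exposes whether it was
started in EFI mode under /sys/firmware/efi, so a couple of lines of
Python (or a plain ls of that path) will confirm it before you set up
the bootloader:

    from pathlib import Path

    # If this directory exists, the kernel was started by UEFI
    # firmware; if it's missing, the box booted in legacy BIOS mode.
    if Path("/sys/firmware/efi").is_dir():
        print("Booted in EFI mode - install the bootloader to the ESP.")
    else:
        print("Booted in legacy BIOS mode - check the firmware settings.")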

Now I get to wait.  ;-) 

Dale

:-)  :-) 
