No switch - just a direct cable at present, so it's point-to-point.
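(Worth noting for anyone following along: the negotiated link state and
speed can be confirmed on the OI side with fcinfo - roughly like this:)

    # confirm state and speed on the target-mode port
    fcinfo hba-port
    # look at State, Current Speed and port Type (N-port vs L-port)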
On 6/09/2014 5:55 a.m., Liam Slusser wrote:
Mark -
No, I didn't change any of the HBA BIOS or qlt parameters. Are you going
into a Fibre Channel switch?
thanks,
liam
On Fri, Sep 5, 2014 at 1:06 AM, Mark <mark0...@gmail.com> wrote:
Thanks Liam,
No SSDs.
Did you change any HBA BIOS or qlt parameters?
Unfortunately the only Emulex adapter I have is unsupported - an LPe1150.
I tried a different "brand" of 4Gb QLogic adapter. After adding a driver
alias for it to even be seen, the performance was the same: around
200 Mb/sec read and under 30 Mb/sec write.
I do have access to an 8Gb one I could try next.
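(For the record, the alias step was along these lines - the PCI ID below
is illustrative only; pull the real one for the card from prtconf -pv
before binding:)

    # rebind the HBA from the initiator driver (qlc) to the target driver (qlt)
    # pciex1077,2432 is just an example ID - substitute what your card reports
    update_drv -d -i '"pciex1077,2432"' qlc
    update_drv -a -i '"pciex1077,2432"' qlt
    # reboot, then confirm the port shows up with: fcinfo hba-port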
The LUNs will eventually be used for CommVault storage, with a 64k block
size. Trying different ZFS block sizes hasn't made a significant
difference to the Windows throughput.
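(For a zvol-backed LU the relevant block size is volblocksize, which is
fixed at creation, so each test means recreating the LU - roughly like
this; the pool and volume names below are just placeholders:)

    # recreate the LU with a 64k volblocksize (tank/cvlun0 is a placeholder)
    zfs create -V 2T -o volblocksize=64k tank/cvlun0
    sbdadm create-lu /dev/zvol/rdsk/tank/cvlun0
    stmfadm add-view <GUID-from-sbdadm-output>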
Mark.
On 4/09/2014 9:31 p.m., Liam Slusser wrote:
I have a few ZFS servers (OI and OmniOS) that are Fibre Channel targets.
I use the QLogic 2562 8Gb dual-port FC HBA card in all of them with great
success.
One of my systems is similar to yours, with 2 x (12 x 4TB SAS) attached
via an LSI 9207-8e SAS HBA. I am able to saturate both 8Gb Fibre Channel
connections with a single host.
The host on the other end is a Sun/Oracle T4 system running Oracle
Solaris 10, also with a QLogic 2562. I generally use a small block size
(8k) since I only use these servers for an Oracle RDBMS 12c database,
with Oracle ASM on that side.
I've never tried mounting to a Windows server with my setup. You didn't
mention whether you have an SSD for the ZIL, but you might want to try
disabling the ZIL and seeing if that helps your performance.
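(On current ZFS the per-dataset way to do that is sync=disabled, rather
than the old global tunable - strictly a test setting, since acknowledged
writes can be lost on a crash:)

    # TEST ONLY: disable synchronous-write semantics on the dataset
    # backing the LU (dataset name is a placeholder)
    zfs set sync=disabled tank/cvlun0
    # and put it back when done:
    zfs set sync=standard tank/cvlun0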
You can also monitor what the ZIL is doing with a great little program
written by Richard Elling:
http://www.richardelling.com/Home/scripts-and-programs-1/zilstat
With it you can watch the ZIL writes - it should help you figure out
whether or not that is your problem.
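(Typical invocation is something like the following - check the script's
usage text, but interval and count arguments work along these lines:)

    # sample ZIL activity once per second, ten samples, while the test runs
    ./zilstat 1 10
    # sustained non-zero ops/bytes during the slow writes point at sync-write load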
Good luck!
thanks,
liam
On Thu, Sep 4, 2014 at 1:29 AM, Mark <mark0...@gmail.com> wrote:
Does anyone have experience with COMSTAR as a Fibre Channel target?
I built one some years ago, which ran really well, but my latest effort
is disappointing.
Throughput is abysmal, only reaching about 20 Mbytes/sec depending on
block size.
The zpool is raidz with 30 x 4TB SAS disks attached to a 6Gb SAS IT-mode
controller.
Local disk throughput reaches over 400 Mbytes/sec.
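(A crude way to get that sort of local number is a plain sequential dd,
something like the below - the path is arbitrary, and note that zeros
compress, so this overstates throughput if compression is on:)

    # crude sequential write test against the pool (path is a placeholder)
    dd if=/dev/zero of=/tank/ddtest bs=1024k count=8192
    # beware: /dev/zero overstates throughput badly if compression is enabled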
The initiator is Windows 2012 with a QLogic 8Gb adapter, and a QLogic 4Gb
at the OI end, connected with a straight cable.
It works, with no errors that I can find, but performance just sucks.
In desperation, I'm about to try swapping the QLogic for an Emulex HBA.
Can anyone offer suggestions on identifying possible causes?
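(For anyone asking what I've already checked: the target side looks
clean as far as these show, which is part of why I'm stuck:)

    # target-side sanity checks
    stmfadm list-target -v    # port should be Online at the expected speed
    stmfadm list-lu -v        # LU state and backing store
    fcinfo hba-port           # negotiated link speed on the qlt port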
Mark.
_______________________________________________
openindiana-discuss mailing list
openindiana-discuss@openindiana.org
http://openindiana.org/mailman/listinfo/openindiana-discuss