
StarWind iSCSI + ESXi IOPS Bottleneck

Posted: Mon May 20, 2013 2:17 pm
by sladky
We have the free StarWind 6 iSCSI target + Windows Server 2012 + ESXi 5.1 + 10 Gbit LAN.

Storage configuration:
2x Xeon E5-2609 + 192 GB RAM
RevoDrive 3 480 GB PCIe SSD card
Intel X540 10 Gbit NIC

ESXi host configuration:
2x Xeon E5-2690 + 384 GB RAM
No HDD
Intel X540 10 Gbit NIC

Dell 8024 10 Gbit Switch

Tested performance of the equipment:
More than 7,000,000 packets per second
More than 1.2 GB per second

Machines used for testing:
1. Physical: Xeon E5-2687 + 64 GB RAM + SSD + IOMETER software + Microsoft iSCSI initiator
2. Virtual: 2 vCPU + 8 GB RAM + SSD (from datastore) + IOMETER software

IOMETER Results:
1. Physical machine:
- 10 GB iSCSI drive from RevoDrive 3 (1-way, 4k blocks, random read) = 40,000 IOPS
- 10 GB iSCSI drive from RevoDrive 3 (64-way, 4k blocks, random read) = 42,000 IOPS
- 10 GB iSCSI drive from RAM drive (1-way, 4k blocks, random read) = 90,000 IOPS
- 10 GB iSCSI drive from RAM drive (64-way, 4k blocks, random read) = 90,000 IOPS
CPU load = low

2. Virtual machine:
- 10 GB iSCSI drive from RevoDrive 3 (1-way, 4k blocks, random read) = 1,800 IOPS
- 10 GB iSCSI drive from RevoDrive 3 (64-way, 4k blocks, random read) = 42,000 IOPS
- 10 GB iSCSI drive from RAM drive (1-way, 4k blocks, random read) = 4,400 IOPS
- 10 GB iSCSI drive from RAM drive (64-way, 4k blocks, random read) = 65,000 IOPS
CPU load = 25%
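
A quick back-of-the-envelope sketch of these numbers (mine, not a measurement; it assumes "1-way" / "64-way" mean 1 and 64 outstanding I/Os, so Little's Law gives per-I/O latency ≈ outstanding I/Os / IOPS):

```python
# Little's Law sketch: per-I/O latency ~= outstanding_ios / IOPS.
# The IOPS figures are taken from the results above; treating
# "1-way" as one outstanding I/O is an assumption.

def implied_latency_ms(iops: float, outstanding_ios: int) -> float:
    """Per-I/O latency (ms) implied by a measured IOPS figure."""
    return outstanding_ios / iops * 1000.0

measurements = {
    "physical, RAM drive, 1-way": (90_000, 1),
    "physical, RevoDrive, 1-way": (40_000, 1),
    "virtual,  RAM drive, 1-way": (4_400, 1),
    "virtual,  RevoDrive, 1-way": (1_800, 1),
}

for name, (iops, qd) in measurements.items():
    print(f"{name}: ~{implied_latency_ms(iops, qd):.3f} ms per I/O")

# Output (approx.):
#   physical, RAM drive, 1-way: ~0.011 ms per I/O
#   physical, RevoDrive, 1-way: ~0.025 ms per I/O
#   virtual,  RAM drive, 1-way: ~0.227 ms per I/O
#   virtual,  RevoDrive, 1-way: ~0.556 ms per I/O
```

So the virtual path looks like it adds roughly 0.2-0.5 ms per I/O, which a queue depth of 64 can hide but a single outstanding I/O cannot.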

Where is the bottleneck in the virtualised case?
- virtual machine IOPS limit?
- ESXi software iSCSI initiator limit?
- something else?

Thank you.

Re: StarWind iSCSI + ESXi IOPS Bottleneck

Posted: Tue May 21, 2013 2:51 pm
by sladky
Hm, I think the problem is on the ESXi side...

IOMETER latency depends on the CPU load of the ESXi host.
For example:
0% CPU load = 0.25 ms latency (4000 IOPS)
80% CPU load = 0.5 ms latency (2000 IOPS)
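
Just to show the arithmetic behind those two lines (a sketch assuming a single outstanding I/O, so IOPS = 1 / latency):

```python
# Sanity check: with one outstanding I/O, IOPS = 1 / per-I/O latency.
for load, latency_ms in [("0% host CPU", 0.25), ("80% host CPU", 0.50)]:
    print(f"{load}: {1000.0 / latency_ms:.0f} IOPS")  # -> 4000, then 2000
```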

Re: StarWind iSCSI + ESXi IOPS Bottleneck

Posted: Thu May 23, 2013 10:39 am
by Anatoly (staff)
Keep us updated please.
Here is a document that may be useful for you:
http://www.starwindsoftware.com/starwin ... ice-manual