Tuesday, March 3, 2009

Benchmarking your EON ZFS NAS

Being able to test the performance of your storage unit is always important. Creating real-world application loads and recording accurate statistics is not easy. Or is it? Sun has a great tool for this called Filebench, a framework for simulating application workloads on file systems. So let's use Filebench to test our EON ZFS NAS. There is a wide range of tests that can be performed, and a detailed howto (see the example varmail run) is here. Download filebench_opensolaris-1.3.4_x86_pkg.tar.gz here and unpack it on your zpool:
gzip -dc filebench_opensolaris-1.3.4_x86_pkg.tar.gz | tar -xf -
Create the necessary links:
(cd /usr ; ln -s ../ZPOOL/filebench/reloc  benchmarks)
(cd /opt ; ln -s ../ZPOOL/filebench/reloc/filebench filebench)
That's it. We are ready to test.
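Before running a full suite, you can try a single workload interactively. This is only a minimal sketch, assuming the 1.3.4 package installs the interpreter as /opt/filebench/bin/go_filebench and the workload definitions under /opt/filebench/workloads; the test directory and pool names are placeholders, so adjust them to your setup:
cd /opt/filebench/bin
./go_filebench
filebench> load varmail              # loads workloads/varmail.f
filebench> set $dir=/ZPOOL/fbtest    # hypothetical test directory on your pool
filebench> run 60                    # build the fileset and run for 60 seconds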

Testing my Dell 4100 (Pentium III, 1GHz, 512MB RAM, 2GB swap) with the 3x36GB raidz1 pool royal produced the following:
::::::::::::::
copyfiles.stats
::::::::::::::
Flowop totals:
closefile2 997ops/s 0.0mb/s 0.0ms/op 11us/op-cpu
closefile1 997ops/s 0.0mb/s 0.0ms/op 19us/op-cpu
writefile2 997ops/s 15.0mb/s 0.2ms/op 230us/op-cpu
createfile2 997ops/s 0.0mb/s 0.3ms/op 304us/op-cpu
readfile1 998ops/s 15.0mb/s 0.1ms/op 109us/op-cpu
openfile1 998ops/s 0.0mb/s 0.1ms/op 113us/op-cpu

IO Summary: 6002 ops 5983.2 ops/s, 998/997 r/w 29.9mb/s, 4373uscpu/op
::::::::::::::
createfiles.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
closefile1 189ops/s 0.0mb/s 0.6ms/op 19us/op-cpu
writefile1 189ops/s 2.9mb/s 34.1ms/op 229us/op-cpu
createfile1 189ops/s 0.0mb/s 44.1ms/op 367us/op-cpu

IO Summary: 149974 ops 566.2 ops/s, 0/189 r/w 2.9mb/s, 49326uscpu/op
::::::::::::::
deletefiles.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
deletefile1 2725ops/s 0.0mb/s 3.9ms/op 140us/op-cpu

IO Summary: 50000 ops 2725.3 ops/s, 0/0 r/w 0.0mb/s, 0uscpu/op
::::::::::::::
mongo.stats
::::::::::::::
Flowop totals:
deletefile1 499ops/s 0.0mb/s 0.2ms/op 204us/op-cpu
closefile2 500ops/s 0.0mb/s 0.0ms/op 13us/op-cpu
readfile1 500ops/s 7.0mb/s 0.1ms/op 115us/op-cpu
openfile2 500ops/s 0.0mb/s 0.1ms/op 105us/op-cpu
closefile1 500ops/s 0.0mb/s 0.0ms/op 18us/op-cpu
appendfilerand1 500ops/s 4.0mb/s 0.3ms/op 292us/op-cpu
openfile1 500ops/s 0.0mb/s 0.1ms/op 84us/op-cpu

IO Summary: 7006 ops 3496.4 ops/s, 500/500 r/w 11.0mb/s, 4771uscpu/op
::::::::::::::
multistreamread.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread4 2ops/s 1.8mb/s 455.4ms/op 13636us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread3 2ops/s 2.0mb/s 428.7ms/op 24988us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread2 2ops/s 1.7mb/s 473.3ms/op 27942us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread1 1ops/s 1.4mb/s 556.6ms/op 27728us/op-cpu

IO Summary: 83 ops 7.2 ops/s, 7/0 r/w 6.9mb/s, 1332321uscpu/op
::::::::::::::
multistreamreaddirect.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread4 3ops/s 2.7mb/s 348.3ms/op 16887us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread3 3ops/s 3.2mb/s 269.0ms/op 20442us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread2 3ops/s 2.6mb/s 270.5ms/op 18601us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread1 3ops/s 2.9mb/s 303.6ms/op 23271us/op-cpu

IO Summary: 128 ops 11.8 ops/s, 12/0 r/w 11.4mb/s, 825365uscpu/op
::::::::::::::
multistreamwrite.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite4 6ops/s 5.9mb/s 160.7ms/op 8830us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite3 5ops/s 5.4mb/s 174.0ms/op 8889us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite2 6ops/s 6.0mb/s 157.6ms/op 8765us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite1 6ops/s 6.2mb/s 150.0ms/op 9035us/op-cpu

IO Summary: 248 ops 23.8 ops/s, 0/24 r/w 23.4mb/s, 332054uscpu/op
::::::::::::::
multistreamwritedirect.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite4 6ops/s 5.4mb/s 170.1ms/op 9011us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite3 6ops/s 5.5mb/s 167.2ms/op 8884us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite2 6ops/s 6.2mb/s 148.6ms/op 8877us/op-cpu
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite1 6ops/s 5.4mb/s 167.6ms/op 9260us/op-cpu

IO Summary: 249 ops 22.9 ops/s, 0/23 r/w 22.5mb/s, 347543uscpu/op
::::::::::::::
randomread.stats
::::::::::::::
Flowop totals:
rand-rate 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
rand-read1 11269ops/s 88.0mb/s 0.1ms/op 67us/op-cpu

IO Summary: 112852 ops 11269.1 ops/s, 11269/0 r/w 88.0mb/s, 863uscpu/op
::::::::::::::
randomwrite.stats
::::::::::::::
Flowop totals:
rand-rate 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
rand-write1 8569ops/s 66.9mb/s 0.1ms/op 95us/op-cpu

IO Summary: 85813 ops 8569.1 ops/s, 0/8569 r/w 66.9mb/s, 1139uscpu/op
::::::::::::::
singlestreamread.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqread 45ops/s 45.1mb/s 21.3ms/op 7863us/op-cpu

IO Summary: 472 ops 45.2 ops/s, 45/0 r/w 45.1mb/s, 216977uscpu/op
::::::::::::::
singlestreamwrite.stats
::::::::::::::
Flowop totals:
limit 0ops/s 0.0mb/s 0.0ms/op 0us/op-cpu
seqwrite 25ops/s 25.2mb/s 36.5ms/op 8801us/op-cpu

IO Summary: 280 ops 25.3 ops/s, 0/25 r/w 25.2mb/s, 341069uscpu/op
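
The per-workload .stats summaries above appear to have been paged together; the :::::::::::::: headers are the separators Solaris more prints between multiple files. Assuming the summaries land in a stats directory on the pool (the path below is hypothetical), you can review them the same way after your own run:
cd /ZPOOL/filebench/stats
more *.stats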

1 comment:

Brandi said...

You can also test your network performance with iperf (graphical frontend: jperf). Just a note :-)

Cheers

Thomas
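
As Thomas suggests, iperf is handy for confirming that the network, rather than the pool, is the limiting factor. A minimal sketch, assuming iperf is installed on both ends (the hostname eon-nas is a placeholder):
# on the EON NAS, start the iperf server
iperf -s
# on a client, run a 30-second test with 4 parallel streams
iperf -c eon-nas -t 30 -P 4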