Hello zfs-discuss,
I have an NFS server using ZFS as a local file server. The system is snv_39 on SPARC, with six raid-z pools (p1-p6). The problem is that I see no heavy traffic on the network interfaces or in zpool iostat, yet plain old iostat shows MUCH more traffic going to the local disks - something like 10x more. For example, zpool iostat shows ~6MB/s of reads while iostat shows ~50MB/s. So who's lying? As the server is not performing well, I suspect iostat is the more accurate of the two. Or maybe zpool iostat shows only 'application data' being transferred while iostat shows the 'real' I/Os hitting the disks - but could checksums (and what else?) account for that big a difference? On the other hand, the traffic I see on the network interfaces is much closer to what zpool iostat reports, so maybe ZFS really does introduce that much overhead and zpool iostat shows only the application data being transferred.

Clients mount the shares using NFSv3 over TCP. nfsd is set to 2048 threads, and all of them are in use for most of the day. Below are iostat and zpool iostat outputs, both run at the same time in separate terminals.
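To put a single number next to iostat's aggregated c4 line, I sum the per-pool bandwidth columns of one zpool iostat interval with a small awk helper (my own script, not a zpool feature; it normalizes the K/M suffixes to KB/s):

```shell
# Sum the read/write bandwidth columns ($6/$7) of one `zpool iostat`
# interval, so the pool-level total can be compared with iostat's c4 line.
# Assumes zpool iostat's usual K/M suffixes; plain numbers are bytes.
sum_bw() {
  awk '
    function kb(s,  n) {              # "1.15M" -> 1177.6, "920K" -> 920
      n = s + 0
      if (s ~ /M$/) return n * 1024
      if (s ~ /K$/) return n
      return n / 1024                 # plain byte counts
    }
    $1 ~ /^p[0-9]+$/ { r += kb($6); w += kb($7) }
    END { printf "read %.0f KB/s, write %.0f KB/s\n", r, w }
  '
}

# First zpool iostat interval from the output below:
sum_bw <<'EOF'
p1          751G  64.6G     79     72  1.15M   920K
p2          738G  78.2G     64     92  1.06M  1.27M
p3          733G  83.1G     61     98  1.12M  1.28M
p4          665G  82.7G      5     11  51.8K  55.4K
p5          704G  43.9G     80     61  1.09M   873K
p6          697G  51.2G     73     67  1.04M   935K
EOF
# -> read 5643 KB/s, write 5395 KB/s
```

That first interval totals roughly 5.5MB/s of reads across all six pools, against the ~50-60MB/s iostat reports for c4 - the 10x gap I am asking about.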
bash-3.00# iostat -xnzC 1 | egrep "devic| c4$"
                    extended device statistics
    r/s    w/s    kr/s    kw/s wait  actv wsvc_t asvc_t  %w   %b device
 1095.5 1390.7 64252.7 13742.5  0.0 137.5    0.0   55.3   0 1051 c4
  470.2 4394.0 28882.4  3462.0  0.0 748.7    0.0  153.9   0 5388 c4
  893.7 3262.0 55206.1  3124.8  0.0 680.4    0.0  163.7   0 6391 c4
  965.6 3043.7 61801.2  2727.4  0.0 358.0    0.0   89.3   0 5119 c4
 1162.8 2422.9 73277.0  5953.1  0.0 506.9    0.0  141.4   0 5390 c4
 1693.1 1292.4 98599.2  1806.1  0.0 538.3    0.0  180.3   0 5204 c4
 1551.7 1343.3 99808.4  1142.5  0.0 899.3    0.0  310.6   0 6300 c4
  624.2 4002.8 39899.0  3435.7  0.0 429.7    0.0   92.9   0 4048 c4
 1017.1 2735.7 65866.1  5809.7  0.0 325.9    0.0   86.8   0 4425 c4
 1038.9 2817.9 66914.1  4276.2  0.0 212.3    0.0   55.0   0 4241 c4
  784.4 3410.0 48851.0  9078.4  0.0 349.9    0.0   83.4   0 4579 c4
  732.3 3542.8 46408.7  8075.1  0.0 526.4    0.0  123.1   0 4075 c4
  928.1 3108.3 54917.9  7490.8  0.0 811.8    0.0  201.1   0 5750 c4
  931.0 2943.1 55627.1 10331.7  0.0 846.0    0.0  218.4   0 5795 c4
^C
bash-3.00#
bash-3.00# zpool iostat 1
               capacity     operations    bandwidth
pool         used  avail   read  write   read  write
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     79     72  1.15M   920K
p2           738G  78.2G     64     92  1.06M  1.27M
p3           733G  83.1G     61     98  1.12M  1.28M
p4           665G  82.7G      5     11  51.8K  55.4K
p5           704G  43.9G     80     61  1.09M   873K
p6           697G  51.2G     73     67  1.04M   935K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     13    128   276K   767K
p2           738G  78.2G     16    129  1.47M   704K
p3           733G  83.1G     10    192   388K   683K
p4           665G  82.7G     16      3  37.1K  5.24K
p5           704G  43.9G     11    172  34.2K   617K
p6           697G  51.2G     12     35  31.4K   140K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G      5     87  27.2K   739K
p2           738G  78.2G     15     93  39.6K   391K
p3           733G  83.1G     15    151  51.5K   298K
p4           665G  82.7G     73     27  1.07M   118K
p5           704G  43.9G     41     62  1.85M   317K
p6           697G  51.2G     16    152  75.8K   879K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     39     83   211K   505K
p2           738G  78.2G     27     76   562K   396K
p3           733G  83.1G     38     77   109K   276K
p4           665G  82.7G      0      1      0  6.67K
p5           704G  43.9G     30     78  83.4K   596K
p6           697G  51.2G     29     85   110K   702K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G    394     39  1018K  3.09M
p2           738G  78.2G     12    157  29.0K   274K
p3           733G  83.1G      2    109  12.8K   844K
p4           665G  82.7G      3      4  14.3K  20.0K
p5           704G  43.9G     32     44  85.2K   527K
p6           697G  51.2G     62     47  3.93M   365K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G    159      0   421K  3.81K
p2           738G  78.2G     18     86   174K   407K
p3           733G  83.1G     28     89   121K   230K
p4           665G  82.7G     94     17  7.27M  43.3K
p5           704G  43.9G     25      0   225K      0
p6           697G  51.2G     80     28  7.10M   810K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G    287      4   736K  34.2K
p2           738G  78.2G      2     81  8.06K   389K
p3           733G  83.1G      9     57  19.0K   493K
p4           665G  82.7G     62     17  5.38M  70.6K
p5           704G  43.9G     28     18   315K   152K
p6           697G  51.2G     70      3  7.26M   133K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     24     20  75.3K   239K
p2           738G  78.2G     36    228   576K   477K
p3           733G  83.1G      0    220  4.74K   662K
p4           665G  82.7G     33     12   323K  35.1K
p5           704G  43.9G      9    311  26.6K  1.28M
p6           697G  51.2G     31      2  87.4K  11.4K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     28      1   148K  11.4K
p2           738G  78.2G     47    109  1.42M  1.11M
p3           733G  83.1G     24    243  73.3K   661K
p4           665G  82.7G      0      0  4.28K      0
p5           704G  43.9G     32    234  95.6K  1.32M
p6           697G  51.2G     66     24   177K  2.23M
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     12      6  29.4K  64.5K
p2           738G  78.2G     27     98  80.1K  1.46M
p3           733G  83.1G     21     71   171K   795K
p4           665G  82.7G     58      5  2.77M  19.4K
p5           704G  43.9G     23     92  61.1K   470K
p6           697G  51.2G    209      9   561K   428K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     18      1  56.7K   125K
p2           738G  78.2G     17     76  48.2K  4.76M
p3           733G  83.1G     29     95  88.5K  1.07M
p4           665G  82.7G     10    112   146K   197K
p5           704G  43.9G     16    368  45.5K   815K
p6           697G  51.2G     26    129  79.6K   844K
----------  -----  -----  -----  -----  -----  -----
p1           751G  64.6G     28     49  76.2K  4.49M
p2           738G  78.2G     37      9   107K   209K
p3           733G  83.1G     27    138   198K   938K
p4           665G  82.7G     17    164   415K   258K
p5           704G  43.9G      7    223  29.9K   930K
p6           697G  51.2G     21    132   905K   458K
----------  -----  -----  -----  -----  -----  -----
^C
bash-3.00#

Example full iostat output (all disks, iostat -xnz 1):

                    extended device statistics
    r/s    w/s    kr/s    kw/s wait  actv wsvc_t asvc_t  %w   %b device
 1032.2 2751.8 64054.4  7491.7  0.0 682.4    0.0  180.3   0 5182 c4
   21.2   45.5  1158.6   214.8  0.0   3.6    0.0   54.0   0   84 c4t500000E0118AC370d0
   28.3   38.4  1494.2   208.8  0.0   4.8    0.0   71.7   0   84 c4t500000E0118B0390d0
   11.1   64.7   711.7   302.8  0.0  10.9    0.0  143.2   0  100 c4t500000E0118F1FD0d0
    1.0    0.0    64.7     0.0  0.0   0.1    0.0   74.5   0    8 c4t500000E011C19D60d0
    1.0    0.0    64.7     0.0  0.0   0.1    0.0   76.2   0    8 c4t500000E0118C3220d0
   47.5    2.0  3105.7    35.4  0.0   6.2    0.0  125.6   0   71 c4t500000E011902FA0d0
    8.1   64.7   517.6   296.7  0.0   9.9    0.0  135.8   0  100 c4t500000E0118F2190d0
    0.0   62.7     0.0    45.5  0.0   6.4    0.0  102.9   0   81 c4t500000E0119091E0d0
   54.6    3.0  3558.6    35.4  0.0   8.3    0.0  143.2   0   80 c4t500000E011903120d0
   10.1   61.7   647.0   271.4  0.0   6.6    0.0   92.2   0   94 c4t500000E0118F2350d0
    1.0   61.7    64.7    45.5  0.0   6.4    0.0  101.8   0   84 c4t500000E0119032A0d0
   47.5    2.0  3170.4    35.4  0.0   6.6    0.0  133.8   0   70 c4t500000E011903260d0
    2.0   64.7   129.4    48.0  0.0   6.5    0.0   97.9   0   91 c4t500000E011909320d0
    0.0   41.4     0.0    35.9  0.0   4.5    0.0  108.5   0   55 c4t500000E011903300d0
    2.0   63.7   129.4    47.5  0.0   6.5    0.0   99.5   0   87 c4t500000E011909300d0
    2.0   43.5   129.4    36.4  0.0   4.6    0.0  102.2   0   62 c4t500000E011903340d0
   45.5    2.0  3041.0    35.4  0.0   6.4    0.0  135.0   0   70 c4t500000E011903320d0
    1.0   62.7    64.7    46.5  0.0   6.4    0.0  100.8   0   84 c4t500000E0119095A0d0
   22.2   45.5  1223.3   214.8  0.0   3.6    0.0   52.6   0   86 c4t500000E01192B420d0
   37.4    2.0  2523.4    35.4  0.0   4.5    0.0  115.0   0   59 c4t500000E01190E6D0d0
    9.1   55.6   582.3   283.6  0.0   5.3    0.0   81.4   0   86 c4t500000E01190E6B0d0
   57.6    2.0  3752.7    35.4  0.0   8.9    0.0  149.6   0   82 c4t500000E01190E750d0
   43.5    2.0  2846.9    35.4  0.0   5.3    0.0  116.4   0   66 c4t500000E01190E7F0d0
   56.6    2.0  3752.7    35.4  0.0   8.4    0.0  144.0   0   81 c4t500000E01190E730d0
   29.3   44.5  1558.9   212.8  0.0   5.0    0.0   67.1   0   96 c4t500000E01192B540d0
    9.1   65.7   582.3   313.9  0.0  11.2    0.0  149.4   0  100 c4t500000E0118EDB20d0
    0.0   78.9     0.0    75.3  0.0  35.0    0.0  443.8   0  100 c4t500000E0119495A0d0
    1.0    0.0    64.7     0.0  0.0   0.1    0.0   85.8   0    9 c4t500000E01194A6F0d0
   28.3   46.5  1611.5   213.8  0.0   5.0    0.0   67.0   0   97 c4t500000E01194A610d0
   10.1   63.7   711.7   300.3  0.0   8.5    0.0  114.9   0  100 c4t500000E0118EDCC0d0
   12.1   61.7   776.4   300.8  0.0  11.0    0.0  149.7   0  100 c4t500000E0118EDCA0d0
    0.0   78.9     0.0    76.3  0.0  35.0    0.0  443.8   0  100 c4t500000E01194A750d0
    0.0   78.9     0.0    76.3  0.0  35.0    0.0  443.8   0  100 c4t500000E01194A710d0
    0.0   78.9     0.0    79.4  0.0  35.0    0.0  443.8   0  100 c4t500000E01194A730d0
    0.0   78.9     0.0    75.8  0.0  35.0    0.0  443.8   0  100 c4t500000E01194A810d0
   25.3   42.5  1300.1   208.3  0.0   4.4    0.0   64.3   0   85 c4t500000E0118C3230d0
    9.1   58.6   582.3   262.8  0.0   5.9    0.0   86.4   0   88 c4t500000E0118F2060d0
   42.5    3.0  2782.2    35.4  0.0   5.7    0.0  125.7   0   66 c4t500000E011902FB0d0
   43.5    2.0  2846.9    35.4  0.0   5.3    0.0  115.7   0   65 c4t500000E0119030D0d0
    2.0   64.7   129.4    47.5  0.0   6.7    0.0   99.9   0   87 c4t500000E011903030d0
   11.1   61.7   711.7   303.3  0.0   9.6    0.0  131.7   0   98 c4t500000E0118F21C0d0
    7.1   59.6   452.9   270.9  0.0   5.2    0.0   78.4   0   90 c4t500000E0118F2180d0
   43.5    2.0  2846.9    35.4  0.0   5.6    0.0  122.3   0   68 c4t500000E0119030F0d0
   60.7    2.0  3946.8    35.4  0.0  10.1    0.0  160.7   0   85 c4t500000E0119031B0d0
    1.0   41.4    64.7    33.9  0.0   4.6    0.0  108.7   0   56 c4t500000E011903190d0
    3.0   58.6   194.1    43.5  0.0   6.4    0.0  104.1   0   80 c4t500000E0119032D0d0
    1.0   62.7    64.7    46.5  0.0   6.5    0.0  102.7   0   83 c4t500000E011903350d0
    3.0   59.6   194.1    44.0  0.0   6.5    0.0  103.6   0   82 c4t500000E011903370d0
   27.3   37.4  1429.5   207.2  0.0   4.6    0.0   70.7   0   80 c4t500000E01192B150d0
    1.0    0.0    64.7     0.0  0.0   0.1    0.0   80.8   0    8 c4t500000E01192B2F0d0
    1.0    0.0    64.7     0.0  0.0   0.1    0.0   83.2   0    8 c4t500000E0118ABA70d0
    2.0    0.0   129.4     0.0  0.0   0.2    0.0   96.4   0   19 c4t500000E01192B390d0
   30.3   44.5  1623.6   212.8  0.0   4.9    0.0   65.8   0   96 c4t500000E01192B3B0d0
   28.3   41.4  1494.2   211.3  0.0   4.7    0.0   67.7   0   86 c4t500000E0118ABC50d0
    2.0    0.0   129.4     0.0  0.0   0.2    0.0   94.1   0   19 c4t500000E0118B7B10d0
   25.3   37.4  1417.4   209.3  0.0   4.0    0.0   63.4   0   80 c4t500000E0119494D0d0
    8.1   53.6   517.6   264.4  0.0   4.5    0.0   72.8   0   82 c4t500000E0118EDA10d0
    0.0   77.8     0.0    78.3  0.0  35.0    0.0  449.6   0  100 c4t500000E011949570d0
   22.2   37.4  1223.3   209.3  0.0   3.5    0.0   59.0   0   75 c4t500000E01194A620d0
    0.0   77.8     0.0    77.8  0.0  35.0    0.0  449.6   0  100 c4t500000E01194A660d0
    0.0   77.8     0.0    76.3  0.0  35.0    0.0  449.6   0  100 c4t500000E011949630d0
   28.3   44.5  1611.5   211.8  0.0   4.4    0.0   60.3   0   95 c4t500000E01194A740d0
    0.0   77.8     0.0    74.3  0.0  35.0    0.0  449.6   0  100 c4t500000E01194A780d0
    0.0   77.8     0.0    76.3  0.0  35.0    0.0  449.6   0  100 c4t500000E01194A720d0
    1.0    0.0    64.7     0.0  0.0   0.1    0.0   77.5   0    8 c4t500000E01194A760d0
    2.0    0.0   129.4     0.0  0.0   0.2    0.0   91.1   0   18 c4t500000E01194A8A0d0
    0.0   77.8     0.0    74.3  0.0  35.0    0.0  449.6   0  100 c4t500000E01194A8C0d0
    0.0    0.0     0.0     0.0  0.0   2.0    0.0    0.0   0  100 c4t500000E01194A840d0
^C
bash-3.00#

All pools have atime set to off and sharenfs set; everything else is at the defaults.

bash-3.00# zfs list
NAME       USED  AVAIL  REFER  MOUNTPOINT
p1         752G  51.6G    53K  /p1
p1/d5201   383G  17.0G   383G  /p1/d5201
p1/d5202   368G  31.5G   368G  /p1/d5202
p2         738G  65.4G    53K  /p2
p2/d5203   376G  24.2G   376G  /p2/d5203
p2/d5204   362G  38.1G   362G  /p2/d5204
p3         733G  70.3G    53K  /p3
p3/d5205   366G  33.8G   366G  /p3/d5205
p3/d5206   367G  33.4G   367G  /p3/d5206
p4         665G  71.1G    53K  /p4
p4/d5207   328G  71.1G   328G  /p4/d5207
p4/d5208   337G  62.9G   337G  /p4/d5208
p5         704G  32.2G    53K  /p5
p5/d5209   310G  32.2G   310G  /p5/d5209
p5/d5210   393G  6.52G   393G  /p5/d5210
p6         697G  39.5G    53K  /p6
p6/d5211   394G  5.76G   394G  /p6/d5211
p6/d5212   302G  39.5G   302G  /p6/d5212
bash-3.00#
bash-3.00# zpool status
  pool: p1
 state: ONLINE
 scrub: none requested
config:

        NAME                       STATE     READ WRITE CKSUM
        p1                         ONLINE       0     0     0
          raidz                    ONLINE       0     0     0
            c4t500000E011909320d0  ONLINE       0     0     0
            c4t500000E011909300d0  ONLINE       0     0     0
            c4t500000E011903030d0  ONLINE       0     0     0
            c4t500000E011903300d0  ONLINE       0     0     0
            c4t500000E0119091E0d0  ONLINE       0     0     0
            c4t500000E0119032D0d0  ONLINE       0     0     0
            c4t500000E011903370d0  ONLINE       0     0     0
            c4t500000E011903190d0  ONLINE       0     0     0
            c4t500000E011903350d0  ONLINE       0     0     0
            c4t500000E0119095A0d0  ONLINE       0     0     0
            c4t500000E0119032A0d0  ONLINE       0     0     0
            c4t500000E011903340d0  ONLINE       0     0     0

errors: No known data errors

  pool: p2
 state: ONLINE
 scrub: none requested
config:

        NAME                       STATE     READ WRITE CKSUM
        p2                         ONLINE       0     0     0
          raidz                    ONLINE       0     0     0
            c4t500000E011902FB0d0  ONLINE       0     0     0
            c4t500000E0119030F0d0  ONLINE       0     0     0
            c4t500000E01190E730d0  ONLINE       0     0     0
            c4t500000E01190E7F0d0  ONLINE       0     0     0
            c4t500000E011903120d0  ONLINE       0     0     0
            c4t500000E01190E750d0  ONLINE       0     0     0
            c4t500000E0119031B0d0  ONLINE       0     0     0
            c4t500000E0119030D0d0  ONLINE       0     0     0
            c4t500000E011903260d0  ONLINE       0     0     0
            c4t500000E011903320d0  ONLINE       0     0     0
            c4t500000E011902FA0d0  ONLINE       0     0     0
            c4t500000E01190E6D0d0  ONLINE       0     0     0

errors: No known data errors

  pool: p3
 state: ONLINE
 scrub: none requested
config:

        NAME                       STATE     READ WRITE CKSUM
        p3                         ONLINE       0     0     0
          raidz                    ONLINE       0     0     0
            c4t500000E01194A620d0  ONLINE       0     0     0
            c4t500000E0119494D0d0  ONLINE       0     0     0
            c4t500000E0118ABC50d0  ONLINE       0     0     0
            c4t500000E0118B0390d0  ONLINE       0     0     0
            c4t500000E01194A610d0  ONLINE       0     0     0
            c4t500000E01194A740d0  ONLINE       0     0     0
            c4t500000E01192B3B0d0  ONLINE       0     0     0
            c4t500000E0118C3230d0  ONLINE       0     0     0
            c4t500000E0118AC370d0  ONLINE       0     0     0
            c4t500000E01192B420d0  ONLINE       0     0     0
            c4t500000E01192B540d0  ONLINE       0     0     0
            c4t500000E01192B150d0  ONLINE       0     0     0

errors: No known data errors

  pool: p4
 state: ONLINE
 scrub: none requested
config:

        NAME                       STATE     READ WRITE CKSUM
        p4                         ONLINE       0     0     0
          raidz                    ONLINE       0     0     0
            c4t500000E01192B2F0d0  ONLINE       0     0     0
            c4t500000E01194A760d0  ONLINE       0     0     0
            c4t500000E01192B290d0  ONLINE       0     0     0
            c4t500000E011C19D60d0  ONLINE       0     0     0
            c4t500000E0118C3220d0  ONLINE       0     0     0
            c4t500000E0118ABA70d0  ONLINE       0     0     0
            c4t500000E01194A6F0d0  ONLINE       0     0     0
            c4t500000E01192B390d0  ONLINE       0     0     0
            c4t500000E01194A840d0  ONLINE       0     0     0
            c4t500000E0118B7B10d0  ONLINE       0     0     0
            c4t500000E01194A8A0d0  ONLINE       0     0     0

errors: No known data errors

  pool: p5
 state: ONLINE
 scrub: none requested
config:

        NAME                       STATE     READ WRITE CKSUM
        p5                         ONLINE       0     0     0
          raidz                    ONLINE       0     0     0
            c4t500000E0118EDCC0d0  ONLINE       0     0     0
            c4t500000E0118EDCA0d0  ONLINE       0     0     0
            c4t500000E0118F2060d0  ONLINE       0     0     0
            c4t500000E0118F2350d0  ONLINE       0     0     0
            c4t500000E0118F2180d0  ONLINE       0     0     0
            c4t500000E0118F2190d0  ONLINE       0     0     0
            c4t500000E0118EDB20d0  ONLINE       0     0     0
            c4t500000E0118EDA10d0  ONLINE       0     0     0
            c4t500000E01190E6B0d0  ONLINE       0     0     0
            c4t500000E0118F21C0d0  ONLINE       0     0     0
            c4t500000E0118F1FD0d0  ONLINE       0     0     0

errors: No known data errors

  pool: p6
 state: ONLINE
 scrub: none requested
config:

        NAME                       STATE     READ WRITE CKSUM
        p6                         ONLINE       0     0     0
          raidz                    ONLINE       0     0     0
            c4t500000E01194A810d0  ONLINE       0     0     0
            c4t500000E01194A780d0  ONLINE       0     0     0
            c4t500000E01194A710d0  ONLINE       0     0     0
            c4t500000E011949630d0  ONLINE       0     0     0
            c4t500000E01194A730d0  ONLINE       0     0     0
            c4t500000E01194A660d0  ONLINE       0     0     0
            c4t500000E0119495A0d0  ONLINE       0     0     0
            c4t500000E01194A720d0  ONLINE       0     0     0
            c4t500000E01194A750d0  ONLINE       0     0     0
            c4t500000E01194A8C0d0  ONLINE       0     0     0
            c4t500000E011949570d0  ONLINE       0     0     0

errors: No known data errors
bash-3.00#

--
Best regards,
 Robert          mailto:[EMAIL PROTECTED]
                 http://milek.blogspot.com

_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss