Set B 1msec Packet Inter-Arrival Time
The following section compares, for a 1msec packet inter-arrival time, allowing the interface to detect the line speed automatically against setting the line speed explicitly.
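The exact commands used to switch between the two configurations are not recorded here. As an illustrative sketch only, the Python wrapper below shows how the two modes could be applied on a Linux host via ethtool; the interface name eth0 and the function names are assumptions, not part of the original test setup.

    import subprocess

    IFACE = "eth0"  # hypothetical interface name; substitute the NIC under test

    def set_explicit(iface: str, speed_mbit: int) -> None:
        """Force the link to a fixed speed, disabling autonegotiation."""
        subprocess.run(
            ["ethtool", "-s", iface,
             "speed", str(speed_mbit), "duplex", "full", "autoneg", "off"],
            check=True,
        )

    def set_automatic(iface: str) -> None:
        """Re-enable autonegotiation so the interface detects the line speed."""
        subprocess.run(["ethtool", "-s", iface, "autoneg", "on"], check=True)

    # Explicit 10Mbit/sec configuration, then back to automatic detection.
    set_explicit(IFACE, 10)
    set_automatic(IFACE)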
Explicitly setting the line speed to 10Mbit/sec resulted in packet time stamps that were, surprisingly, less accurate than when the line speed was detected automatically. Figure 15 shows two outliers further from the mean than those in Figure 14, which account for this. In the 100Mbit/sec tests, however, explicitly setting the line speed did improve the time stamps, although outliers still occurred.
Figure 14: Automatic 10Mbit/sec
Figure 15: Explicit 10Mbit/sec
Figure 16: Automatic 100Mbit/sec
Figure 17: Explicit 100Mbit/sec
We can see from Tables 13 to 16 that the slight improvement in variance and standard deviation from explicitly setting the line speed occurred only in the 100Mbit/sec test. Outliers were, however, still present in all tests.
Table 13: Automatic 10Mbit/sec (usec)
  Mean      Variance  Standard Deviation
  1083.294  1.318     1.148

Table 14: Explicit 10Mbit/sec (usec)
  Mean      Variance  Standard Deviation
  1083.294  1.321     1.149

Table 15: Automatic 100Mbit/sec (usec)
  Mean      Variance  Standard Deviation
  1008.408  1.811     1.346

Table 16: Explicit 100Mbit/sec (usec)
  Mean      Variance  Standard Deviation
  1008.408  1.264     1.124
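The statistics in Tables 13 to 16 are straightforward to reproduce from a packet trace. The sketch below is a minimal illustration, assuming arrival time stamps in microseconds and population variance (whether the original analysis used population or sample variance is not stated); the function name and the sample time stamps are invented for the example.

    import math

    def interarrival_stats(timestamps_usec):
        """Return (mean, variance, standard deviation) of the packet
        inter-arrival times, given arrival time stamps in usec."""
        gaps = [b - a for a, b in zip(timestamps_usec, timestamps_usec[1:])]
        n = len(gaps)
        mean = sum(gaps) / n
        variance = sum((g - mean) ** 2 for g in gaps) / n  # population variance
        return mean, variance, math.sqrt(variance)

    # Illustrative time stamps only: packets nominally 1msec apart with jitter.
    ts = [0.0, 1082.9, 2166.5, 3249.8, 4333.4, 5416.6]
    mean, var, std = interarrival_stats(ts)
    print(f"Mean {mean:.3f}  Variance {var:.3f}  Std Dev {std:.3f}")

The means themselves are consistent with per-packet serialisation delay on top of the nominal 1msec spacing: the excess over 1000usec is roughly ten times larger at 10Mbit/sec (83.294usec) than at 100Mbit/sec (8.408usec), matching the ratio of the two line speeds.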