Set A: 1msec Packet Inter-Arrival Time
This section compares letting the interface auto-detect its line speed with explicitly setting the line speed, for packets sent with a 1msec inter-arrival time.
Explicitly setting the line speed to 10Mbit/sec or 100Mbit/sec brings the packet time stamps closer to the theoretically ideal time stamps. When the interface was left to auto-detect the line speed, the time stamps once again showed a few outliers.
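For readers repeating the experiment, the line speed of an interface can be forced on Linux with the ethtool utility. The sketch below wraps that call in Python; note that ethtool and the interface name eth0 are assumptions for illustration, as the original setup does not name the configuration tool used.

    import subprocess

    def set_line_speed(interface: str, speed_mbps: int) -> None:
        """Force a fixed line speed and disable auto-negotiation
        using the Linux ethtool utility (requires root)."""
        subprocess.run(
            ["ethtool", "-s", interface,
             "speed", str(speed_mbps),
             "duplex", "full",
             "autoneg", "off"],
            check=True,
        )

    # "Explicit 10Mbit/sec" run: pin the NIC to 10 Mbit/sec.
    # "eth0" is a hypothetical interface name.
    set_line_speed("eth0", 10)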
[Figure 6: Automatic 10Mbit/sec]
[Figure 7: Explicit 10Mbit/sec]
[Figure 8: Automatic 100Mbit/sec]
[Figure 9: Explicit 100Mbit/sec]
The tables below show that the variance and standard deviation improved only slightly when the line speed was explicitly configured, due to the elimination of the few far outliers.
Table 5: Automatic 10Mbit/sec (usec)
    Mean        Variance    Standard Deviation
    1083.294    1.238       1.113

Table 6: Explicit 10Mbit/sec (usec)
    Mean        Variance    Standard Deviation
    1083.294    1.074       1.036

Table 7: Automatic 100Mbit/sec (usec)
    Mean        Variance    Standard Deviation
    1008.408    0.933       0.966

Table 8: Explicit 100Mbit/sec (usec)
    Mean        Variance    Standard Deviation
    1008.408    0.882       0.939
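The statistics in Tables 5-8 can be reproduced from the recorded packet arrival time stamps in a few lines. The following is a minimal sketch, assuming time stamps in microseconds; the original report does not publish its analysis code, so the function and variable names here are illustrative.

    import statistics

    def interarrival_stats(timestamps_usec: list[float]):
        """Compute mean, variance and standard deviation of packet
        inter-arrival times (usec) from arrival time stamps."""
        gaps = [b - a for a, b in zip(timestamps_usec, timestamps_usec[1:])]
        return (statistics.mean(gaps),
                statistics.variance(gaps),  # sample variance
                statistics.stdev(gaps))     # sample standard deviation

    # Hypothetical trace: nominal 1000 usec spacing plus jitter.
    ts = [0.0, 1001.2, 2000.9, 3002.4, 4001.7]
    mean, var, std = interarrival_stats(ts)
    print(f"mean={mean:.3f} usec, variance={var:.3f}, stddev={std:.3f}")

Note that the standard deviation is simply the square root of the variance, which is consistent with the figures reported in the tables (e.g. sqrt(1.238) is approximately 1.113).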