Set D: 1msec Packet Inter-Arrival Time
The following section compares two interface configurations for the 1msec packet inter-arrival time tests: automatically allowing the interface to detect (autonegotiate) the line speed, and explicitly setting the line speed.
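The report does not state which operating system or tool was used to configure the interfaces, but on a Linux host the explicit settings could be applied with something like the following (the interface name eth0 is an assumption):

```shell
# Force the line speed rather than autonegotiating it.
# Requires root; exact driver support varies.
ethtool -s eth0 speed 10 duplex full autoneg off    # explicit 10Mbit/sec
ethtool -s eth0 speed 100 duplex full autoneg off   # explicit 100Mbit/sec

# Confirm the currently configured speed and autonegotiation state.
ethtool eth0
```

This is a sketch only; other platforms (e.g. FreeBSD's ifconfig media options) express the same configuration differently.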
We can see that explicitly setting the line speed reduced time stamp outliers only in the 100Mbit/sec test. Explicitly setting the line speed to 10Mbit/sec actually resulted in four far outliers.
Figure 30: Automatic 10Mbit/sec
Figure 31: Explicit 10Mbit/sec
Figure 32: Automatic 100Mbit/sec
Figure 33: Explicit 100Mbit/sec
We can see from Tables 29 to 32 that explicitly setting the line speed in the 10Mbit/sec test did not improve time stamp accuracy. In the 100Mbit/sec test, however, explicit line speed configuration eliminated the far outliers.
Table 29: Automatic 10Mbit/sec (usec)
Mean        Variance    Standard Deviation
1083.294    1.221       1.105
Table 30: Explicit 10Mbit/sec (usec)
Mean        Variance    Standard Deviation
1083.293    1.300       1.140
Table 31: Automatic 100Mbit/sec (usec)
Mean        Variance    Standard Deviation
1008.407    1.684       1.298
Table 32: Explicit 100Mbit/sec (usec)
Mean        Variance    Standard Deviation
1008.407    1.226       1.107
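The statistics in Tables 29 to 32 can be reproduced from a capture trace by differencing consecutive packet time stamps. A minimal sketch (the timestamp values here are synthetic, not the report's data; whether the report used population or sample variance is an assumption):

```python
import statistics

def interarrival_stats(timestamps_usec):
    """Return (mean, variance, std dev) of packet inter-arrival gaps.

    timestamps_usec: packet arrival time stamps in microseconds,
    in capture order. Uses population variance (an assumption).
    """
    gaps = [b - a for a, b in zip(timestamps_usec, timestamps_usec[1:])]
    mean = statistics.mean(gaps)
    var = statistics.pvariance(gaps)
    return mean, var, var ** 0.5

# Synthetic trace: nominal 1msec (1000 usec) spacing with small jitter.
ts = [0, 1001, 2000, 3002, 3999, 5000]
mean, var, sd = interarrival_stats(ts)
```

Far outliers, as plotted in Figures 30 to 33, would then be the gaps lying many standard deviations from the mean.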
Last Updated: Thursday 13-May-2004 09:54:49 AEST
URL:
Maintained by: Ana Pavlicic firstname.lastname@example.org
Authorised by: Grenville Armitage email@example.com