Set C 1msec Packet Inter-Arrival Time
The following section shows the difference between allowing the interface to detect the line speed automatically and setting the line speed explicitly, for a 1msec packet inter-arrival time.
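As a concrete illustration of forcing a line speed, a modern Linux host can disable autonegotiation with ethtool. This is only a hedged sketch: the tool, the interface name (eth0) and the duplex setting are assumptions, not the commands used in these experiments.

```shell
# Force the NIC to 10Mbit/sec full duplex with autonegotiation off
# (eth0 is a placeholder interface name).
ethtool -s eth0 speed 10 duplex full autoneg off

# Force 100Mbit/sec instead:
ethtool -s eth0 speed 100 duplex full autoneg off

# Verify the settings that are now in effect:
ethtool eth0
```

Leaving autonegotiation on instead corresponds to the "Automatic" cases below.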
We can see that the outliers in Figure 23, where the line speed was explicitly set to 10Mbit/sec, are no longer as far from the theoretically ideal time-stamp as those in Figure 22. Figures 24 and 25 show the same effect: the outlier distance is reduced by explicitly setting the line speed to 100Mbit/sec.
Figure 22: Automatic 10Mbit/sec
Figure 23: Explicit 10Mbit/sec
Figure 24: Automatic 100Mbit/sec
Figure 25: Explicit 100Mbit/sec
We can see from Tables 21 and 22 that although the far outliers are now closer to the theoretically ideal time-stamp, the variance and standard deviation have in fact increased, though only slightly. Tables 23 and 24 show that explicitly setting the line speed to 100Mbit/sec instead of allowing automatic detection slightly reduced the variance and standard deviation, due to the reduction in far outliers.
Table 21: Automatic 10Mbit/sec (usec)
Mean: 1083.294    Variance: 1.247    Standard Deviation: 1.117

Table 22: Explicit 10Mbit/sec (usec)
Mean: 1083.294    Variance: 1.264    Standard Deviation: 1.124

Table 23: Automatic 100Mbit/sec (usec)
Mean: 1008.408    Variance: 1.606    Standard Deviation: 1.267

Table 24: Explicit 100Mbit/sec (usec)
Mean: 1008.408    Variance: 1.440    Standard Deviation: 1.200
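The statistics in the tables above can be reproduced from a raw trace of arrival timestamps. The following is a minimal Python sketch; the timestamp values are made up for illustration and are not taken from the experiment data.

```python
import statistics

# Hypothetical arrival timestamps in microseconds; real values would come
# from the capture traces behind Tables 21-24.
timestamps = [0.0, 1083.1, 2166.5, 3250.2, 4333.0, 5416.4]

# Inter-arrival times are the gaps between consecutive timestamps.
gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]

mean = statistics.mean(gaps)
variance = statistics.pvariance(gaps)  # population variance
stddev = statistics.pstdev(gaps)       # population standard deviation

print(f"Mean: {mean:.3f} usec")
print(f"Variance: {variance:.3f} usec^2")
print(f"Standard Deviation: {stddev:.3f} usec")
```

Far outliers inflate the variance quadratically, which is why removing them (by forcing the 100Mbit/sec line speed) reduces the variance and standard deviation even when the mean is unchanged.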
Last Updated: Thursday 13-May-2004 09:54:39 AEST
URL:
Maintained by: Ana Pavlicic firstname.lastname@example.org
Authorised by: Grenville Armitage email@example.com