Set A 500usec Packet Inter-Arrival Time
The following section compares allowing the interface to automatically detect (auto-negotiate) the line speed with explicitly setting the line speed, for a 500usec packet inter-arrival time.
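As one illustration of the "explicit" configuration, a Linux interface's speed and auto-negotiation can be forced with ethtool. This is a hedged sketch, not the exact commands used in the experiment; the interface name eth0 is an assumption.

```shell
# Check the current (possibly auto-negotiated) link settings.
# (eth0 is a placeholder interface name.)
ethtool eth0

# Explicitly force 10Mbit/sec full duplex, disabling auto-negotiation.
ethtool -s eth0 speed 10 duplex full autoneg off

# Equivalent explicit setting for the 100Mbit/sec case.
ethtool -s eth0 speed 100 duplex full autoneg off
```

To return to automatic detection, auto-negotiation can be re-enabled with `ethtool -s eth0 autoneg on`.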
The following figures show that for both the 10Mbit/sec and 100Mbit/sec line speeds, explicitly setting the line speed caused the time stamp values to be slightly more consistent than when allowing the interface to detect the line speed. This was particularly true for the 10Mbit/sec line speed: the outliers seen in Figure 2 are no longer present in Figure 3.
Figure 2: Automatic 10Mbit/sec
Figure 3: Explicit 10Mbit/sec
Figure 4: Automatic 100Mbit/sec
Figure 5: Explicit 100Mbit/sec
The tables below show the small improvement in variance and standard deviation occurring when the line speed was explicitly set to either 10Mbit/sec or 100Mbit/sec.
Table 1: Automatic 10Mbit/sec (usec)
Mean     Variance  Standard Deviation
583.251  1.012     1.006

Table 2: Explicit 10Mbit/sec (usec)
Mean     Variance  Standard Deviation
583.251  0.982     0.991

Table 3: Automatic 100Mbit/sec (usec)
Mean     Variance  Standard Deviation
508.364  0.973     0.986

Table 4: Explicit 100Mbit/sec (usec)
Mean     Variance  Standard Deviation
508.364  0.927     0.963
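The statistics in the tables above can be derived from the recorded inter-arrival time stamps in the usual way. A minimal sketch in Python, assuming population variance and illustrative sample values (not the measured data from the tables):

```python
import math

# Hypothetical inter-arrival times in usec -- illustrative values only,
# not the data behind Tables 1-4.
samples = [583.1, 584.0, 582.9, 583.5, 582.8, 583.6]

n = len(samples)
mean = sum(samples) / n

# Population variance: mean squared deviation from the mean.
variance = sum((x - mean) ** 2 for x in samples) / n

# Standard deviation is the square root of the variance,
# which is why e.g. 1.006 ~= sqrt(1.012) in Table 1.
std_dev = math.sqrt(variance)

print(f"Mean: {mean:.3f}  Variance: {variance:.3f}  Std Dev: {std_dev:.3f}")
```

A smaller variance (and hence standard deviation) for the explicit setting corresponds to the tighter clustering of time stamps seen in the figures.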
Last Updated: Thursday 13-May-2004 09:55:09 AEST
URL:
Maintained by: Ana Pavlicic email@example.com
Authorised by: Grenville Armitage firstname.lastname@example.org