This page is part of the BART project.
Set A 125usec Packet Inter-Arrival Time
The following section discusses the difference between allowing the interface to automatically detect the line speed and explicitly setting the line speed of the interface, for the 125usec packet inter-arrival time experiments of Set A.
The following figures show that, for both the 10Mbit/sec and 100Mbit/sec settings, explicitly setting the line speed produces more consistent time stamps than allowing the interface to detect the line speed automatically. Figures 2 and 4 show several instances where a packet time stamp is an outlier, at just below 7msec or at 5 or 6usec. This contrasts sharply with a maximum time stamp error of about 40usec when the line speed is statically set (as seen in Figures 3 and 5). A short sketch of how such inter-arrival times can be extracted from a tcpdump trace follows the figure captions.
Figure 2: Automatic 10Mbit/sec
Figure 3: Explicit 10Mbit/sec
Figure 4: Automatic 100Mbit/sec
Figure 5: Explicit 100Mbit/sec
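The inter-arrival times plotted in these figures can be reconstructed from a tcpdump capture with a few lines of Python. The sketch below is illustrative only; the trace file name and the -tt (absolute seconds.microseconds) capture option are assumptions, not details taken from this experiment.

    # Minimal sketch: extract packet inter-arrival times (in usec) from a
    # tcpdump text capture taken with absolute timestamps, e.g.
    #   tcpdump -tt -i <iface> > trace.txt
    # The file name and capture options are illustrative, not from this page.

    def inter_arrival_usec(path):
        """Return the list of inter-arrival times in microseconds."""
        times = []
        with open(path) as f:
            for line in f:
                fields = line.split()
                if not fields:
                    continue
                try:
                    # First field of each packet line is seconds.microseconds
                    times.append(float(fields[0]))
                except ValueError:
                    continue  # skip hex-dump or continuation lines
        return [(b - a) * 1e6 for a, b in zip(times, times[1:])]

    if __name__ == "__main__":
        gaps = inter_arrival_usec("trace.txt")
        print(len(gaps), "inter-arrival samples")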
The following tables show that configuring the interface to automatically detect the line speed makes the variance and standard deviation very large; the outliers in the data are responsible for this wide spread. Note that since the time stamps are taken by tcpdump, the mean should always be the inter-arrival time (in this case 125usec) plus the time it takes the packet to pass over the wire (a quick check of this is sketched after the tables).
Table 1: Automatic 10Mbit/sec (usec)
Mean     Variance   Standard Deviation
208.00   4074.14    63.83

Table 2: Explicit 10Mbit/sec (usec)
Mean     Variance   Standard Deviation
208.00   1.28       1.13

Table 3: Automatic 100Mbit/sec (usec)
Mean     Variance   Standard Deviation
133.32   2712.97    52.09

Table 4: Explicit 100Mbit/sec (usec)
Mean     Variance   Standard Deviation
133.32   1.22       1.10
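As a rough consistency check on the means above, the expected mean can be reconstructed as the 125usec sending interval plus the frame serialization time at the configured line rate. The sketch below assumes an on-wire frame size of about 104 bytes, which is inferred from the reported means (208usec at 10Mbit/sec, 133usec at 100Mbit/sec) rather than stated on this page.

    # Back-of-the-envelope check of the table means: expected mean inter-arrival
    # time = 125 usec sending interval + frame serialization time.
    # FRAME_BYTES is an assumption inferred from the reported means.

    from statistics import mean, pvariance, pstdev

    INTERVAL_USEC = 125.0
    FRAME_BYTES = 104          # assumed on-wire size, including overheads

    for rate_mbps in (10, 100):
        # rate in Mbit/s equals bits per microsecond
        serialization_usec = FRAME_BYTES * 8 / rate_mbps
        print("%d Mbit/s: expected mean ~= %.2f usec"
              % (rate_mbps, INTERVAL_USEC + serialization_usec))

    def summarize(gaps_usec):
        """Mean, variance and standard deviation as reported in Tables 1-4."""
        return mean(gaps_usec), pvariance(gaps_usec), pstdev(gaps_usec)

Running the loop prints an expected mean of about 208.20usec at 10Mbit/sec and 133.32usec at 100Mbit/sec, matching the tables.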