This page is part of the BART project.
Set A 500usec Packet Inter-Arrival Time
The following section discusses the difference between allowing the interface to detect the line speed automatically and setting the line speed explicitly, for the 500usec packet inter-arrival time of Set A.
The following figures show that, again for both the 10Mbit/sec and 100Mbit/sec settings, explicitly setting the line speed produces more consistent time stamp values than allowing the interface to detect the line speed. Figures 7 and 9 clearly show that packets sent on the 10Mbit/sec line require an additional 83usec to propagate, while packets sent on the 100Mbit/sec line require only 8.3usec.
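The tenfold gap between those two propagation figures (83usec versus 8.3usec) is what serialization delay predicts: the time to clock a packet onto the wire scales inversely with line speed. A minimal sketch, assuming a hypothetical frame size of roughly 104 bytes (the original text does not state the packet size used):

```python
def serialization_delay_usec(packet_bytes: int, line_speed_bps: float) -> float:
    """Time to clock a packet of packet_bytes onto a line of the
    given speed, in microseconds: bits / (bits per second)."""
    return packet_bytes * 8 / line_speed_bps * 1e6

# Hypothetical 104-byte frame (an assumption, not from the measurements):
d10 = serialization_delay_usec(104, 10e6)    # 10 Mbit/sec line
d100 = serialization_delay_usec(104, 100e6)  # 100 Mbit/sec line
print(d10, d100)
```

With these assumed values the 10Mbit/sec delay comes out near 83usec and the 100Mbit/sec delay near 8.3usec, matching the difference visible in Figures 7 and 9.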
Figure 6: Automatic 10Mbit/sec
Figure 7: Explicit 10Mbit/sec
Figure 8: Automatic 100Mbit/sec
Figure 9: Explicit 100Mbit/sec
The tables below confirm that configuring the receiving PC's interface to detect the line speed automatically greatly increases the variance and standard deviation.
Table 5: Automatic 10Mbit/sec (usec)
  Mean                583.21
  Variance            9577.29
  Standard Deviation  97.86
Table 6: Explicit 10Mbit/sec (usec)
  Mean                583.21
  Variance            1.08
  Standard Deviation  1.04
Table 7: Automatic 100Mbit/sec (usec)
  Mean                508.33
  Variance            8279.87
  Standard Deviation  90.99
Table 8: Explicit 100Mbit/sec (usec)
  Mean                508.33
  Variance            1.07
  Standard Deviation  1.04
Last Updated: Tuesday 2-Dec-2003 09:46:31 AEDT
Maintained by: Ana Pavlicic email@example.com
Authorised by: Grenville Armitage firstname.lastname@example.org