Suppose a TCP client needs to send 3 packets to a TCP
server. Before sending the first packet, the estimated RTT is
50 ms and the estimated deviation of the sample RTT is 10 ms. The
parameters are α = 0.1 and β = 0.2. The measured sample RTTs for the
three packets are 60 ms, 70 ms, and 40 ms, respectively. Compute
the timeout value that is set for each packet right after it is
transmitted.
Solution:
Given,
=>EstimatedRTT_prev = 50 ms
=>DevRTT_prev = 10 ms
=>α = 0.1, β = 0.2
=>SampleRTT1 = 60 ms, SampleRTT2 = 70 ms, SampleRTT3 = 40 ms
Explanation:
Calculating EstimatedRTT after SampleRTT1:
=>EstimatedRTT = (1-α)*EstimatedRTT_prev + α*SampleRTT1
=>EstimatedRTT = (1-0.1)*50 ms + 0.1*60 ms
=>EstimatedRTT = 0.9*50 ms + 6 ms
=>EstimatedRTT = 45 ms + 6 ms
=>EstimatedRTT = 51 ms
Calculating DevRTT after SampleRTT1:
=>DevRTT = (1-β)*DevRTT_prev + β*|SampleRTT1 - EstimatedRTT|
=>DevRTT = (1-0.2)*10 ms + 0.2*|60 ms - 51 ms|
=>DevRTT = 0.8*10 ms + 0.2*9 ms
=>DevRTT = 8 ms + 1.8 ms
=>DevRTT = 9.8 ms
Calculating Timeout after SampleRTT1:
=>Timeout = EstimatedRTT + 4*DevRTT
=>Timeout = 51 ms + 4*9.8 ms
=>Timeout = 51 ms + 39.2 ms
=>Timeout = 90.2 ms
Calculating EstimatedRTT after SampleRTT2:
=>EstimatedRTT = (1-α)*EstimatedRTT_prev + α*SampleRTT2
=>EstimatedRTT = (1-0.1)*51 ms + 0.1*70 ms
=>EstimatedRTT = 0.9*51 ms + 7 ms
=>EstimatedRTT = 45.9 ms + 7 ms
=>EstimatedRTT = 52.9 ms
Calculating DevRTT after SampleRTT2:
=>DevRTT = (1-β)*DevRTT_prev + β*|SampleRTT2 - EstimatedRTT|
=>DevRTT = (1-0.2)*9.8 ms + 0.2*|70 ms - 52.9 ms|
=>DevRTT = 0.8*9.8 ms + 0.2*17.1 ms
=>DevRTT = 7.84 ms + 3.42 ms
=>DevRTT = 11.26 ms
Calculating Timeout after SampleRTT2:
=>Timeout = EstimatedRTT + 4*DevRTT
=>Timeout = 52.9 ms + 4*11.26 ms
=>Timeout = 52.9 ms + 45.04 ms
=>Timeout = 97.94 ms
Calculating EstimatedRTT after SampleRTT3:
=>EstimatedRTT = (1-α)*EstimatedRTT_prev + α*SampleRTT3
=>EstimatedRTT = (1-0.1)*52.9 ms + 0.1*40 ms
=>EstimatedRTT = 0.9*52.9 ms + 4 ms
=>EstimatedRTT = 47.61 ms + 4 ms
=>EstimatedRTT = 51.61 ms
Calculating DevRTT after SampleRTT3:
=>DevRTT = (1-β)*DevRTT_prev + β*|SampleRTT3 - EstimatedRTT|
=>DevRTT = (1-0.2)*11.26 ms + 0.2*|40 ms - 51.61 ms|
=>DevRTT = 0.8*11.26 ms + 0.2*11.61 ms
=>DevRTT = 9.008 ms + 2.322 ms
=>DevRTT = 11.33 ms
Calculating Timeout after SampleRTT3:
=>Timeout = EstimatedRTT + 4*DevRTT
=>Timeout = 51.61 ms + 4*11.33 ms
=>Timeout = 51.61 ms + 45.32 ms
=>Timeout = 96.93 ms
In summary, the timeout values set after the three transmissions are 90.2 ms, 97.94 ms, and 96.93 ms.
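The three update steps above can be checked with a short script. This is a minimal sketch of the exponentially weighted moving-average RTT estimator, following the same update order as the derivation (EstimatedRTT first, then DevRTT using the new estimate); the function name `update` is just an illustrative choice.

```python
# EWMA RTT estimation and timeout computation, using the exercise's
# parameters alpha = 0.1 (for EstimatedRTT) and beta = 0.2 (for DevRTT).

def update(est, dev, sample, alpha=0.1, beta=0.2):
    """Return updated (EstimatedRTT, DevRTT, Timeout) after one SampleRTT."""
    est = (1 - alpha) * est + alpha * sample          # EstimatedRTT update
    dev = (1 - beta) * dev + beta * abs(sample - est) # DevRTT update (uses new est)
    timeout = est + 4 * dev                           # Timeout = EstimatedRTT + 4*DevRTT
    return est, dev, timeout

est, dev = 50.0, 10.0           # initial EstimatedRTT and DevRTT in ms
for sample in (60, 70, 40):     # measured SampleRTTs in ms
    est, dev, timeout = update(est, dev, sample)
    print(f"SampleRTT {sample} ms -> EstimatedRTT {est:.2f} ms, "
          f"DevRTT {dev:.3f} ms, Timeout {timeout:.2f} ms")
```

Running it reproduces the hand-computed timeouts of 90.2 ms, 97.94 ms, and 96.93 ms.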