Suppose Host A wants to send a large file to Host B. The path from Host A to Host B has three links, of rates R1 = 200 kbps, R2 = 2 Mbps, and R3 = 1 Mbps.
a) Assuming no other traffic in the network, what is the average throughput for the file transfer?
b) Suppose the file is 4 million bytes. Roughly how long will it take to transfer the file to Host B?
c) Repeat (a) and (b), but now with R2 reduced to 100 kbps.
Solution
a)
Consider the given data:
R1 = 200 kbps
R2 = 2 Mbps
R3 = 1 Mbps
The throughput for the file transfer
= min{R1, R2, R3}
= min{200 kbps, 2 Mbps, 1 Mbps}
= 200 kbps
So, the throughput for the file transfer = 200 kbps.
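The same bottleneck rule can be checked with a short Python sketch (a minimal illustration, not part of the original solution; the variable names are mine, and the rates are the ones given above expressed in bps):

```python
# End-to-end throughput with no competing traffic equals the rate of the
# slowest (bottleneck) link on the path. All rates in bits per second.
link_rates_bps = {"R1": 200_000, "R2": 2_000_000, "R3": 1_000_000}

throughput_bps = min(link_rates_bps.values())
print(f"Bottleneck throughput: {throughput_bps} bps")  # 200000 bps = 200 kbps
```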
b)
Consider given data:
The file size = 4 million bytes = 4,000,000 bytes.
Converting bytes to bits (1 byte = 8 bits):
4,000,000 bytes × 8 bits/byte = 32,000,000 bits.
From (a),
throughput for the file transfer = 200 kbps
= 200,000 bps
Dividing the file size by the throughput gives roughly how long it will take to transfer the file to Host B:
Time = file size / throughput
= 32,000,000 bits / 200,000 bps
= 160 seconds
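The part (b) arithmetic as a minimal Python sketch (same assumptions as above; the variable names are illustrative):

```python
# Transfer time = file size in bits / end-to-end throughput in bps.
file_size_bits = 4_000_000 * 8   # 4 million bytes -> 32,000,000 bits
throughput_bps = 200_000         # bottleneck rate from part (a)

transfer_time_s = file_size_bits / throughput_bps
print(f"Transfer time: {transfer_time_s:.0f} seconds")  # 160 seconds
```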
c)
Consider the given data:
Repeat (a) and (b), but now with R2 reduced to 100 kbps.
That means
R2 = 100 kbps
R1 = 200 kbps
R3 = 1 Mbps
The throughput for the file transfer
= min{R1, R2, R3}
= min{200 kbps, 100 kbps, 1 Mbps}
= 100 kbps
So, the throughput for the file transfer = 100 kbps.
Dividing the file size by the throughput gives roughly how long it will take to transfer the file to Host B:
Time = file size / throughput
= 32,000,000 bits / 100,000 bps
= 320 seconds
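Repeating both steps with R2 reduced to 100 kbps (a sketch of the part (c) calculation; only the bottleneck link changes, so the transfer time doubles):

```python
# Same two-step calculation with R2 = 100 kbps: new bottleneck, new time.
link_rates_bps = [200_000, 100_000, 1_000_000]   # R1, R2, R3 in bps
file_size_bits = 4_000_000 * 8                   # 32,000,000 bits

throughput_bps = min(link_rates_bps)
print(f"Throughput: {throughput_bps} bps")                        # 100000 bps
print(f"Transfer time: {file_size_bits / throughput_bps:.0f} s")  # 320 seconds
```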