A message consists of 8,000,000 bytes. The sender's network adapter can transmit at a rate of 100,000 bytes per second. The receiver's network adapter can receive 40,000 bytes per second. It takes 20 ms for data to move across the network. Assuming no data is lost on the way, how long does it take for the message to be read in by the receiver?
Hi, happy to help with this question. Let us first lay out what we need to calculate, and then move on to the calculations:
We need to work out the transmission delay at the sender's adapter and at the receiver's adapter. Because the sender pushes data out faster than the receiver can read it in (and no data is lost, so the excess is buffered), the two delays overlap and the slower receiver adapter is the bottleneck; the given propagation delay of 20 ms is then added on top.
Now,
Sender's transmission delay = data size / bandwidth = 8,000,000 bytes / 100,000 bytes/sec = 80 sec = 80,000 ms
Receiver's transmission delay = 8,000,000 bytes / 40,000 bytes/sec = 200 sec = 200,000 ms
Propagation delay = distance / propagation speed = 20 ms (given)
The sender finishes transmitting after 80 seconds, long before the receiver has read everything in, so the two delays overlap rather than add; the 200-second receiver delay is the bottleneck. Total time for the receiver to read in the full message = 20 ms + 200,000 ms = 200,020 ms (about 200 seconds).
That is the complete calculation for the question above.
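If you want to double-check the arithmetic, here is a minimal Python sketch of the same calculation (the variable names are just illustrative, not from the question; it assumes the bottleneck model described above):

# Quick check of the delay arithmetic (units: bytes, bytes/sec, ms).
MESSAGE_BYTES = 8_000_000
SENDER_RATE = 100_000      # sender adapter, bytes per second
RECEIVER_RATE = 40_000     # receiver adapter, bytes per second
PROPAGATION_MS = 20        # given network latency

sender_delay_ms = MESSAGE_BYTES / SENDER_RATE * 1000      # 80,000 ms
receiver_delay_ms = MESSAGE_BYTES / RECEIVER_RATE * 1000   # 200,000 ms

# The sender and receiver work concurrently, so the slower of the two
# (the bottleneck) determines when the last byte is read in.
total_ms = PROPAGATION_MS + max(sender_delay_ms, receiver_delay_ms)

print(f"Sender transmission delay:   {sender_delay_ms:,.0f} ms")
print(f"Receiver transmission delay: {receiver_delay_ms:,.0f} ms")
print(f"Total time to read message:  {total_ms:,.0f} ms")  # 200,020 ms

Running it prints the two individual delays and confirms the 200,020 ms total.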