Suppose an application generates a 60-byte chunk of data every
200 msec. Assume that each chunk is put into a TCP segment and that
the TCP segment is put into an IP packet. What percentage of overhead
is added because of the TCP and IP headers combined?
1) 40%
2) 10%
3) 20%
4) 70%
Solution:
Given:
=>Data size = 60 bytes
=>The data passes through TCP first and then through IP.
The correct answer is option:
(1) 40%
Explanation:
=>Header size of TCP = 20 bytes to 60 bytes
=>Header size of IP = 20 bytes to 60 bytes
Calculating overhead:
=>By default, the minimum header size is taken into consideration. Hence TCP header size = 20 bytes and IP header size = 20 bytes.
Packet layout:
| IP header | TCP header | Application data |
| 20 bytes  | 20 bytes   | 60 bytes         |
=>Overhead in the packet is the IP header plus the TCP header = 20 + 20 = 40 bytes.
=>Total packet size = 20 + 20 + 60 = 100 bytes.
=>Overhead percentage = (total header size / total packet size) * 100
=>Overhead percentage = (40/100) * 100
=>Overhead percentage = 40%
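As a quick check, here is a minimal Python sketch of the same calculation; the function name and default arguments are illustrative assumptions, not part of the original question:

def overhead_percentage(payload_bytes: int, tcp_header: int = 20, ip_header: int = 20) -> float:
    # Header overhead as a percentage of the total packet size.
    # Defaults assume minimum (option-free) TCP and IP headers of
    # 20 bytes each, matching the solution above.
    total_headers = tcp_header + ip_header
    total_packet = total_headers + payload_bytes
    return total_headers / total_packet * 100

print(overhead_percentage(60))  # prints 40.0, i.e. 40% overhead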
=>Hence, on the basis of the above calculation, option (1) is correct, and the other options are incorrect because the overhead percentage is 40%, not 10%, 20%, or 70%. Note that the 200 msec generation interval does not affect the overhead percentage, since overhead is a ratio of sizes, not rates.
Each step of the calculation is shown in the statements above.