How can I calculate this?
A file of size F = 8 Gbits needs to be distributed to 10 peers. Suppose the server has an upload rate of u = 68 Mbps, and that the 10 peers have upload rates of u1 = 20 Mbps, u2 = 22 Mbps, u3 = 12 Mbps, u4 = 19 Mbps, u5 = 25 Mbps, u6 = 24 Mbps, u7 = 18 Mbps, u8 = 11 Mbps, u9 = 14 Mbps, u10 = 30 Mbps, and download rates of d1 = 28 Mbps, d2 = 30 Mbps, d3 = 12 Mbps, d4 = 20 Mbps, d5 = 18 Mbps, d6 = 15 Mbps, d7 = 14 Mbps, d8 = 34 Mbps, d9 = 20 Mbps, d10 = 34 Mbps. What is the minimum time needed to distribute this file from the central server to the 10 peers using the client-server model (rounded to the nearest second)?
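For reference, the standard client-server formula (as in Kurose and Ross) gives the minimum distribution time as D_cs = max(N*F/u_s, F/d_min), where N is the number of peers, F the file size, u_s the server upload rate, and d_min the slowest peer download rate. A minimal Python sketch of that calculation, assuming the usual convention that 1 Gbit = 1000 Mbits:

    # Minimum distribution time in the client-server model:
    #   D_cs = max(N*F/u_s, F/d_min)
    # Assumption: 1 Gbit = 1000 Mbits (decimal units).

    F = 8 * 1000          # file size in Mbits
    N = 10                # number of peers
    u_s = 68              # server upload rate in Mbps
    d = [28, 30, 12, 20, 18, 15, 14, 34, 20, 34]  # peer download rates in Mbps

    d_min = min(d)                      # the slowest peer bounds the download side
    D_cs = max(N * F / u_s, F / d_min)  # server upload bound vs. slowest-download bound
    print(round(D_cs))                  # -> 1176 seconds under these assumptions

Note that the peer upload rates do not enter the client-server calculation; they only matter for the P2P variant of this problem.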