In: Computer Science
Consider a 1 Mbps transmission channel. The clock at the receiver has a drift of 1 second in one year. How long a sequence of bits (or frame) can be sent before the clock drift could cause a problem? Assume that the sender and receiver are synchronized at the beginning of each frame and that they cannot resynchronize during the frame. Also, assume that the receiver samples the received signal at the middle of each bit duration to detect whether it is a 0 or a 1.
A conversation forms a bi-directional communication link; there is a measure of symmetry between the two nodes, and messages pass to and fro in the form of request and acknowledgement.
The transmission of a stream of bits from one device to another
across a transmission link involves a great deal of cooperation and
agreement between the two sides. One of the most fundamental
requirements is synchronization.
The receiver must know the rate at which bits are being received so that it can sample the line at appropriate intervals to determine the value of each received bit.
Two techniques are in common use for this purpose.
1. In asynchronous transmission,
each character of data is treated independently. Each character
begins with a start
bit that alerts the receiver that a character is arriving. The
receiver samples
each bit in the character and then looks for the beginning of the
next character.
This technique would not work well for long blocks of data because
the receiver’s
clock might eventually drift out of synchronization with the
transmitter’s clock.
2. Sending data in large blocks is more efficient than sending
data one
character at a time. For large blocks, synchronous transmission is
used.
Each block of data is formatted as a frame that includes a starting
and ending flag. Some form of synchronization, such as the use of
Manchester encoding, is employed. Error detection is performed by
calculating an error-detecting code
that is a function of the bits being transmitted. The code is
appended to the transmitted bits. The receiver calculates the code
based on the incoming bits and compares it to the incoming code to
check for errors. Error correction
operates in a fashion similar to error detection but is capable of
correcting some errors in a transmitted bit stream.
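The append-and-compare mechanism described above can be sketched in a few lines. Real links typically use a CRC; the byte-wise XOR checksum below is an assumption made purely for illustration, but it shows the same structure: the sender appends a code computed over the data, and the receiver recomputes and compares.

```python
def xor_checksum(data: bytes) -> int:
    # Error-detecting code as a function of the transmitted bits
    # (illustrative XOR checksum; real links typically use a CRC).
    code = 0
    for b in data:
        code ^= b
    return code

def make_frame(payload: bytes) -> bytes:
    # Sender: append the error-detecting code to the transmitted bits.
    return payload + bytes([xor_checksum(payload)])

def check_frame(frame: bytes) -> bool:
    # Receiver: recompute the code over the incoming bits and compare
    # it to the incoming code.
    payload, received_code = frame[:-1], frame[-1]
    return xor_checksum(payload) == received_code

frame = make_frame(b"hello")
assert check_frame(frame)                               # intact frame passes
corrupted = bytes([frame[0] ^ 0x01]) + frame[1:]
assert not check_frame(corrupted)                       # single-bit error detected
```

Note that a simple XOR checksum misses some multi-bit error patterns, which is one reason practical framing uses CRCs instead.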
Given: data rate = 1 Mbps, so the bit time is 1/10^6 s = 1 μs.
The receiver samples at the middle of each bit, so a bit is misread once the accumulated drift reaches half a bit time, i.e. 0.5 μs.
The clock drifts 1 s per year, and one year = 365 × 24 × 3600 s = 31,536,000 s, so the drift rate is 1/31,536,000 s of drift per second of elapsed time.
The drift reaches 0.5 μs after 0.5 × 10^-6 × 31,536,000 ≈ 15.77 s.
At 1 Mbps this corresponds to about 15.77 × 10^6 ≈ 1.58 × 10^7 bits, so a frame of at most roughly 15.77 million bits can be sent before the drift causes a sampling error.
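Under the figures given in the problem statement (1 Mbps link, 1 s of clock drift per year, sampling at the middle of each bit), the frame-length limit can be checked numerically:

```python
# Frame-length limit before receiver clock drift causes a sampling error.
SECONDS_PER_YEAR = 365 * 24 * 3600      # 31,536,000 s
drift_rate = 1 / SECONDS_PER_YEAR       # seconds of drift per elapsed second
bit_rate = 1_000_000                    # bits per second (1 Mbps)
bit_time = 1 / bit_rate                 # 1 microsecond per bit

# Mid-bit sampling fails once cumulative drift reaches half a bit time.
max_seconds = (bit_time / 2) / drift_rate   # seconds of transmission
max_bits = max_seconds * bit_rate           # frame length in bits

print(f"{max_bits:,.0f} bits")  # 15,768,000 bits, about 15.77 million
```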