In: Physics
In the game of Scrabble™, the letters of the English alphabet are inscribed on tiles, and a prescribed number of tiles is provided for each letter. Consider an ensemble of Scrabble™ tiles with a probability distribution defined by the frequency of tiles in the box. Explain how to calculate the information entropy of this ensemble. It is not necessary to actually compute the entropy, but you can if you wish.
Taking a microscopic view, entropy can be defined as a measure of randomness at the level of the particles, with respect to the distribution of energy, velocity, and so on. When heat is added to ice, the ice first melts into water and then vaporizes. Throughout this process, the randomness of the system increases because heat is added; hence the entropy of the system increases.
As the randomness of a system increases, our ignorance about its exact state also increases. Ignorance is closely related to information, so entropy is indirectly related to information. In a computer, information is stored in digital form: the more uniform (predictable) the information, the fewer digits are required to represent it, while non-uniform information requires more digits. Thus, information entropy can be regarded as a measure of the randomness or uncertainty in choosing a symbol to identify an individual state of the particles in the system.
Information entropy can be mathematically represented as follows:

σ = −Σ_i p_i ln p_i …… (1)

Here, σ is the information entropy and p_i is the probability of state i according to the probability distribution in the given ensemble of states.
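As a concrete illustration, equation (1) can be evaluated directly for the tile ensemble. This is a minimal sketch; the tile counts below are the standard 100-tile English Scrabble set (including two blanks), stated here as an assumption about the contents of the box.

```python
import math

# Assumed tile counts for the standard 100-tile English Scrabble set,
# including 2 blank tiles ('_'); each tile type is one state i.
tile_counts = {
    'E': 12, 'A': 9, 'I': 9, 'O': 8, 'N': 6, 'R': 6, 'T': 6,
    'L': 4, 'S': 4, 'U': 4, 'D': 4, 'G': 3,
    'B': 2, 'C': 2, 'M': 2, 'P': 2, 'F': 2, 'H': 2,
    'V': 2, 'W': 2, 'Y': 2, 'K': 1, 'J': 1, 'X': 1, 'Q': 1, 'Z': 1,
    '_': 2,
}

def information_entropy(counts):
    """Equation (1): sigma = -sum_i p_i ln p_i, with p_i = n_i / N."""
    total = sum(counts.values())
    return -sum((n / total) * math.log(n / total) for n in counts.values())

sigma = information_entropy(tile_counts)
print(f"sigma = {sigma:.4f} nats")
```

Because the distribution is non-uniform, the result is strictly less than the maximum possible value ln 27 (27 equally likely tile types).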
In the game of Scrabble, a certain number of lettered tiles is given in jumbled order, and meaningful words must be formed from those letters.
To calculate the information entropy of the outcomes of the game, apply the laws of probability. If all n letters are different, the number of distinct arrangements (words) that can be formed is n!. Hence the probability of each arrangement is 1/n!.
Substitute 1/n! for p_i in equation (1). Since every arrangement has the same probability, the summation consists of n! identical terms and collapses to a single product:

σ = −n! · (1/n!) ln(1/n!) = ln(n!)
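The collapse of the summation can be checked numerically. The sketch below (an illustration, not part of the original solution) sums equation (1) term by term over the n! equally likely arrangements and compares the result with the closed form ln(n!).

```python
import math

def entropy_equal_outcomes(n):
    """Entropy of the n! equally likely arrangements of n distinct
    letters, computed by direct summation of equation (1)."""
    m = math.factorial(n)   # number of distinct arrangements
    p = 1.0 / m             # probability of each arrangement
    return -sum(p * math.log(p) for _ in range(m))

# The direct sum should agree with sigma = ln(n!)
for n in range(1, 7):
    assert math.isclose(entropy_equal_outcomes(n),
                        math.log(math.factorial(n)))
```

For example, with n = 4 letters there are 24 arrangements, giving σ = ln 24 ≈ 3.18 nats.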