Bragg's Equation Problem
X-rays of wavelength 0.0960 nm are diffracted by a metallic crystal. The first-order (n = 1) diffraction angle is measured to be 17.8°. What is the distance (in pm) between the layers of atoms responsible for the diffraction?
I do this:
96 pm / (2 sin 17.8°), and I keep getting ~173.9 pm.
The answer is supposed to be 157 pm. What am I doing wrong?
According to Bragg's equation, nλ = 2d sin θ,
where
n = order of diffraction = 1
λ = wavelength = 0.0960 nm = 0.0960 × 10⁻⁹ m
d = distance between the layers of atoms = ?
θ = scattering angle = 17.8°
Plugging in the values, we get
d = nλ / (2 sin θ) = (1 × 0.0960 × 10⁻⁹ m) / (2 × sin 17.8°) = (0.0960 × 10⁻⁹ m) / 0.6114 = 1.57 × 10⁻¹⁰ m
Since 1 pm = 10⁻¹² m,
d = 1.57 × 10⁻¹⁰ m = 157 pm
Note that your setup, 96 pm / (2 sin 17.8°), is already correct; it evaluates to 157 pm when the sine is taken in degrees. Getting ~173.9 pm instead is the signature of a calculator in the wrong angle mode (sin of 17.8 gradians ≈ 0.276, which reproduces that value exactly), so check that your calculator is set to degrees.
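If you want to double-check the arithmetic, here is a minimal Python sketch of the same calculation (the function name bragg_spacing and the variable names are my own, not from the problem; the key point is that math.sin expects radians, so the angle must be converted first):

```python
import math

def bragg_spacing(wavelength_pm: float, theta_deg: float, n: int = 1) -> float:
    """Interplanar spacing d from Bragg's law: n*lambda = 2*d*sin(theta)."""
    # math.sin works in radians; forgetting this conversion (or a calculator
    # left in the wrong angle mode) is the classic source of wrong answers here.
    theta_rad = math.radians(theta_deg)
    return n * wavelength_pm / (2 * math.sin(theta_rad))

# 0.0960 nm = 96.0 pm, first order (n = 1), theta = 17.8 degrees
d = bragg_spacing(96.0, 17.8)
print(f"d = {d:.1f} pm")  # prints: d = 157.0 pm
```

Working in picometers from the start (96.0 pm rather than 0.0960 × 10⁻⁹ m) avoids the unit conversion at the end, since Bragg's law returns d in whatever length unit the wavelength is given in.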