Imagine three light beams being sent toward a lens simultaneously: they start at the same position but travel toward the lens at different angles. The first beam passes through the lens near its edge, the second passes exactly through its centre, and the last passes somewhere in between. Behind the lens there is a screen at the point where the beams come to a focus. Which of the three light beams reaches the screen first?
I assume they all reach it at the same moment, but I can't explain why; it's just my intuition. I hope somebody can help me out here. Thanks in advance for any answers.
//e: Sorry for being inaccurate. This is what the described setup should look like:
If the lens focuses perfectly, so that all rays from the given starting point end up at the same final point, then you're right: they all arrive at the same time. (To be precise, this is true within the limits of applicability of geometrical optics. But if we're not willing to assume that geometric optics applies, then it doesn't make sense to talk about individual rays of light anyway.)
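As a quick sanity check, here is a minimal numerical sketch (my own illustration, not part of the original question or answer). It models an ideal thin lens in the paraxial approximation, where crossing the lens at height h adds an optical path of -h^2/(2f) up to a constant, and checks that the optical path length (travel time times c) from a point source to its image is the same for rays crossing the lens at different heights. The focal length, distances, and heights are arbitrary example values.

```python
# Sketch: equal optical path lengths through an ideal paraxial thin lens.
# The quadratic delay -h**2/(2*f) is the standard thin-lens phase profile;
# all numerical values below are illustrative, not taken from the question.
import numpy as np

f = 0.10                            # focal length [m]
s_o = 0.30                          # source-to-lens distance [m]
s_i = 1.0 / (1.0 / f - 1.0 / s_o)   # image distance from 1/s_o + 1/s_i = 1/f

heights = np.array([0.0, 0.005, 0.010])  # where each ray crosses the lens [m]

# Geometric length source -> lens plane -> image, plus the lens delay at h.
opl = (np.sqrt(s_o**2 + heights**2)
       + np.sqrt(s_i**2 + heights**2)
       - heights**2 / (2.0 * f))

print(opl - opl[0])  # differences are ~1e-7 m: equal within the paraxial model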
The reason is Fermat's principle, otherwise known as the principle of least time. In a situation like this, each ray follows a path of least time joining the given starting and finishing points. If the times weren't the same, then at least one of them wouldn't be a minimum.
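In symbols (a standard formulation added here for reference, not something specific to this setup): for light following a path through a medium with refractive index n, the travel time is

$$T = \frac{1}{c}\int_{\text{path}} n\,\mathrm{d}s,$$

and Fermat's principle says that the ray actually taken between two fixed endpoints is one along which T is least, or more carefully, stationary under small deformations of the path. The distinction between "least" and "stationary" is exactly the worry addressed next.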
You might worry that there's a swindle here. Fermat's principle really only says that the path has to be a critical point, not that it has to be a minimum. But that doesn't affect the conclusion. Take two of the rays, and imagine a family of rays smoothly joining one to the other. Fermat's principle says that the derivative of the travel time is zero all along the family of rays, so there's no difference between the time at the beginning and the time at the end.
That last paragraph is a bit awkwardly phrased, so let me restate it precisely. Let T(r) be the travel time corresponding to ray r. Let r(0) be one of the rays under consideration, r(1) be another, and r(s) for 0<s<1 a smooth family joining the two together. Each r(s) is a ray striking the lens at a different point, but all of them obey the correct laws of optics. Fermat's principle says that δT=0 for all small variations about any one of these rays, so dT(r(s))/ds=0 for every s, and hence T(r(0))=T(r(1)).
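Spelling out the final step (an elementary addition): since the derivative vanishes for every s,

$$T(r(1)) - T(r(0)) = \int_0^1 \frac{\mathrm{d}T(r(s))}{\mathrm{d}s}\,\mathrm{d}s = 0,$$

so the two travel times are equal, and repeating the argument for each pair covers all three rays.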