The ocean heats during the day by absorbing sunlight and then cools at night by emitting infrared radiation. Over the 12 hours of daylight, you can assume the average energy input to the ocean is 700 W/m². How hot must the ocean surface be to re-radiate that energy every day?
Note: although there is some dispute based on technical considerations we're not going to discuss here, the hottest surface temperature ever recorded on Earth was 330 K (56.7 °C), measured at Furnace Creek Ranch, Death Valley, California, in 1913. If your answer exceeds that, think about the problem again and redo the calculation.
The key point is that the ocean receives heat for only 12 hours a day, but it has the full 24 hours to reject that heat.
Therefore, the net heat flux received, averaged over the whole day, is

\[ q_{net} = q_{in} \cdot \frac{12\ \text{h}}{24\ \text{h}} = 700 \times \frac{1}{2} = 350\ \text{W/m}^2 \]

where \( q_{in} = 700\ \text{W/m}^2 \) is the average energy input during the 12 hours of daylight.
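As a quick numerical check of this averaging step (a minimal sketch; the variable names are my own, not from the original solution):

```python
# The ocean absorbs for 12 h but radiates for 24 h, so the
# daily-average absorbed flux is half the daytime input.
q_in = 700.0                 # W/m^2, average input during daylight
q_net = q_in * 12.0 / 24.0   # W/m^2, averaged over the full day
print(q_net)                 # -> 350.0
```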
The ocean surface then has the entire day to radiate this heat away.
Therefore, for radiation, we have:

\[ q_{net} = \varepsilon \sigma \left( T_s^4 - T_{surr}^4 \right) \]

where \( \varepsilon \approx 1 \) is the emissivity of the ocean surface (water is very nearly a blackbody in the infrared), \( \sigma = 5.67 \times 10^{-8}\ \text{W/m}^2\text{K}^4 \) is the Stefan–Boltzmann constant, \( T_s \) is the ocean surface temperature, and \( T_{surr} \) is the effective temperature of the surroundings to which the ocean radiates.
Now, the surrounding temperature \( T_{surr} \) will not be the near-surface air temperature of 27 °C (= 300 K). That value matters for convection currents, not for radiation, because the ocean radiates to the upper atmosphere.
Any body on the Earth's surface radiates heat to the upper atmosphere. [Figure: variation of atmospheric mass density and temperature with altitude]

Looking at such a plot, the atmospheric mass density falls off sharply and becomes negligible above about 100 km. It does not matter that the temperature rises again beyond this altitude, because there are too few particles left to absorb the radiation coming from the ocean surface. We therefore take 100 km as the upper limit for our reference.

Over this 0–100 km range, the average temperature is roughly 240 K.
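To see where a figure like 240 K comes from, here is a rough sketch using the layer-boundary temperatures of the US Standard Atmosphere (1976); the crude unweighted average below is my own simplification, not part of the original solution:

```python
# Temperatures (K) at the layer boundaries of the US Standard
# Atmosphere (1976), from sea level up to ~86 km.
altitudes_km = [0, 11, 20, 32, 47, 51, 71, 86]
temps_K = [288.15, 216.65, 216.65, 228.65, 270.65, 270.65, 214.65, 186.95]

# A crude unweighted average over these boundary values.
t_avg = sum(temps_K) / len(temps_K)
print(f"average ~ {t_avg:.0f} K")  # -> ~237 K, close to the 240 K used
```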
Substituting \( T_{surr} = 240\ \text{K} \), \( q_{net} = 350\ \text{W/m}^2 \), and \( \varepsilon = 1 \) into the radiation equation:

\[ 350 = 5.67 \times 10^{-8} \left( T_s^4 - 240^4 \right) \]

\[ T_s^4 = \frac{350}{5.67 \times 10^{-8}} + 240^4 = 6.17 \times 10^9 + 3.32 \times 10^9 = 9.49 \times 10^9\ \text{K}^4 \]

\[ T_s \approx 312\ \text{K} \approx 39\ ^\circ\text{C} \]

which is comfortably below the 330 K record, as the problem's sanity check requires.
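The same substitution can be checked numerically (a minimal sketch under the assumptions above: \( \varepsilon = 1 \), \( T_{surr} = 240\ \text{K} \)):

```python
# Solve sigma * (Ts^4 - Tsurr^4) = q_net for the surface temperature Ts.
SIGMA = 5.67e-8    # W/(m^2 K^4), Stefan-Boltzmann constant
q_net = 350.0      # W/m^2, daily-average flux from the first step
T_surr = 240.0     # K, mean atmospheric temperature up to ~100 km

T_s = (q_net / SIGMA + T_surr**4) ** 0.25
print(f"T_s = {T_s:.1f} K = {T_s - 273.15:.1f} C")  # -> ~312 K (~39 C)
```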