<br><div><span class="gmail_quote">On 10/18/06, <b class="gmail_sendername">John B</b> <<a href="mailto:discwuzit@yahoo.com">discwuzit@yahoo.com</a>> wrote:</span><br><blockquote class="gmail_quote" style="border-left: 1px solid rgb(204, 204, 204); margin: 0pt 0pt 0pt 0.8ex; padding-left: 1ex;">
How hard is it to see a 300 Kelvin object against a 2<br>Kelvin background? And that's assuming only<br>shirtsleeve conditions - no motors or other<br>'hotspots'...</blockquote><div><br>It would be more like a 20-40K background within the solar system, and it's not hard at all. The detectors in any IR telescope could manage it. Our mid-to-far IR technology isn't great (it isn't at the level of a 10 megapixel visible light camera, for example), but that is because the market for such sensors isn't as large, not because we don't know how to build them.
<br><br>The real question isn't what temperature it is at so much as how much power it is radiating on an areal basis. If it's only radiating 0.00001 W/m^2 [1], it's going to be hard to detect no matter *what* wavelengths the photons have.
<br><br>Robert<br><br></div>1. That's an arbitrary number. The better way to look at it is whether its radiator size and our detector size are sufficiently well matched that we can count photons at specific frequencies at levels significantly greater than the equipment or background noise.
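<br><br>For anyone who wants to run the numbers, here's a quick back-of-the-envelope sketch. The 300 K and 20-40 K temperatures are from the thread above; everything else (the ideal-blackbody assumption, the constants) is just my own illustration using the Stefan-Boltzmann law and Wien's displacement law:<br>

```python
# Back-of-envelope contrast estimate: power radiated per unit area by an
# ideal blackbody, P = sigma * T^4, for a 300 K hull versus a ~40 K
# solar-system background. Also the Wien peak wavelength for each.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4
WIEN_B = 2.898e-3       # Wien displacement constant, m*K

def radiated_power(temp_k: float) -> float:
    """W/m^2 radiated by an ideal blackbody at temp_k."""
    return SIGMA * temp_k ** 4

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength in microns where the blackbody spectrum peaks."""
    return WIEN_B / temp_k * 1e6

hull, background = 300.0, 40.0
print(f"300 K hull:      {radiated_power(hull):8.1f} W/m^2, peak ~{peak_wavelength_um(hull):.0f} um")
print(f"40 K background: {radiated_power(background):8.3f} W/m^2, peak ~{peak_wavelength_um(background):.0f} um")
print(f"contrast ratio:  {radiated_power(hull) / radiated_power(background):.0f}x")
```

A 300 K surface puts out roughly 460 W/m^2 peaking near 10 microns, while a 40 K background manages only about 0.15 W/m^2 peaking around 70 microns - a contrast of over 3000x in areal power, in a different part of the spectrum. Which is why the real question is the total radiating area versus the detector, not the temperature alone.<br>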
<br></div><br>