Infrared, The Latest Myth In Home Inspection

In 2010 an international television celebrity declared himself the guru of the home inspection industry in North America and, because of his popularity, launched a television series promoting the same. There is an old business strategy: “Make yourself different from the others so you stand out.” In this case, he proved the strategy correct when he chose to promote infrared cameras as a major home inspection tool. Along with sci-fi-style support from movies and forensic-science television programs that focus on entertainment with little concern for facts, he was well on his way to bringing something different to the viewing public. Overall, it was an extremely effective and successful business strategy, which, in this case, resulted in a successful TV series for him and a dramatic increase in sales for the manufacturers of infrared imagers. Yes, very successful for Mr. TV Celebrity, his sponsor and the infrared industry in general; however, not so good for the trusting public or the home inspection industry.

Yes, I.R. imagers are an excellent diagnostic tool with many applications. However, they cannot see through walls, they can’t see through anything, they can’t see water, and they can’t see mold; all they measure is the infrared energy radiating from a surface, from which an apparent surface temperature is inferred. Further, in order to provide any useful information they require a very controlled set of conditions and operating environment, which typically are not available at the time of a home inspection.
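
For readers curious why conditions matter so much, here is a rough sketch in Python of a simplified graybody model (not any manufacturer’s actual calibration, and the emissivity values are purely hypothetical): the radiation reaching the detector mixes what the surface actually emits with what it merely reflects from its surroundings, so a low-emissivity surface such as bare metal largely mirrors everything around it.

    # Simplified graybody model: radiant power per square metre seen by the detector.
    # SIGMA is the Stefan-Boltzmann constant; temperatures are in kelvin.
    SIGMA = 5.670e-8

    def detected_power(surface_temp_k, reflected_temp_k, emissivity):
        emitted = emissivity * SIGMA * surface_temp_k ** 4
        reflected = (1.0 - emissivity) * SIGMA * reflected_temp_k ** 4
        return emitted + reflected

    # A painted wall (emissivity ~0.95) versus a bare metal cover plate
    # (emissivity ~0.10 assumed), both at 20 C in a 40 C sunlit surround:
    print(detected_power(293.15, 313.15, 0.95))  # mostly the wall itself
    print(detected_power(293.15, 313.15, 0.10))  # mostly reflected surroundings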

So let’s back up here and share a little history. The knowledge of infrared has been around for well over a hundred years. Although not a hundred years ago, I can remember the 1960s, when Kodak tried to commercialize it by selling infrared film that could be used in my Brownie camera. At the time, real I.R. imagers cost more than half a million dollars and required a small van to transport the related equipment. At the end of the Cold War, and with the advance of technology, these cameras eventually came down to a price level that some high-profile companies with specific applications could afford. Then, after the turn of the 21st century, a decent imager could be purchased for a mere $20,000. Like everything else driven by technology and marketing, it became a race: how can these be made cheaper, and how can the manufacturers increase sales volume? To date, they’ve succeeded to the point that imagers are available for under $1,000, with rumors that eventually we’ll see them in cell phones. But don’t get too excited. Like any low-end tool, the questions are what’s missing and whether it really works. And, as with any other tool, there is a bigger issue: does the operator have the required knowledge and skill to use it?

So what is infrared photography? First, it’s not photography and it’s not a camera. It’s called imaging, and the tool is called an imager. What follows is a very simplified explanation, not 100% accurate, that most people should be able to relate to. We all know what pixels are in our televisions and digital cameras. Although this is not technically how it works, for simplicity, imagine the I.R. imager reading thousands of pixels (the number depending on the quality of the imager) projected onto a surface, with each pixel acting like a thermometer recording an individual reflective energy/temperature. Got it? Now let’s complicate things a bit and change the term reflective energy/temperature to “the infrared energy being emitted from the surface or object.” The imager then converts this data to colors so that the human eye and brain can readily relate to it. Very simplified, that’s what a thermal image is, and that is where the discipline of thermography starts to get interesting.
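
To make the pixel analogy concrete, here is a minimal sketch in Python (using made-up numbers, not data from any real imager) of the last step described above: a grid of per-pixel temperature readings is pushed through a color palette so that small differences become visible to the eye.

    import numpy as np
    import matplotlib.pyplot as plt

    # Hypothetical 240 x 320 grid of apparent surface temperatures in deg C,
    # standing in for the radiometric data recorded at each detector pixel.
    temps = 20.0 + 2.0 * np.random.randn(240, 320)
    temps[100:140, 150:200] -= 5.0  # a cooler patch, e.g. a damp, evaporating spot

    # Map the readings through a color palette so the eye can pick out small
    # temperature differences; this false-color picture is the "thermal image".
    plt.imshow(temps, cmap="inferno")
    plt.colorbar(label="Apparent surface temperature (deg C)")
    plt.title("False-color rendering of per-pixel temperature data")
    plt.show()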