IR Thermometer - Any good for checking electrical terminations?

I was thinking of using an IR thermometer to check the temperature of (heavy) electrical terminations on the equipment in our factory.

(Rather than pay a lot of money for a contractor's survey!)

Has anyone tried this before, & if so, did it prove to be a good idea?

woodglass...

Reply to
woodglass

After our local hospital paid a very high price for a complete IR analysis by a contractor using an infrared photographic method, I was hired by another contractor to verify the hot spots and fix them, and to re-examine all the terminations at panelboards and transformers. I used an IR thermometer, and it worked very well. The photo technique only gave relative temperature indications through color, whereas my thermometer gave precise temperatures.

I tested every panelboard and transformer and found a fairly direct correlation between temperature and amperes. I was impressed. As I recall, the IR thermometer was made by Fluke. The contractor I worked for was Power Comm of Fairbanks, Alaska; the photo contractor was Aurora Electric of Anchorage, Alaska.
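For what it's worth, that fairly direct correlation is what Joule heating predicts: the power dissipated at a joint is P = I²R, so for a fixed contact resistance and thermal path the temperature rise scales roughly with the square of the load current. Here is a minimal Python sketch with purely illustrative contact- and thermal-resistance values (nothing below is measured data from the survey):

```python
# Rough sanity check of the temperature-vs-amperes correlation described
# above: Joule heating at a termination is P = I^2 * R_contact, so for a
# fixed contact resistance and thermal path the steady-state temperature
# rise scales roughly with the square of the load current. The resistance
# values below are illustrative guesses, not measurements.

def termination_temp_rise(current_a, contact_res_ohm=0.0002,
                          thermal_res_c_per_w=1.5):
    """Estimated steady-state temperature rise (degC) of a termination."""
    power_w = current_a ** 2 * contact_res_ohm  # Joule heating, P = I^2 * R
    return power_w * thermal_res_c_per_w        # dT = P * R_thermal

for amps in (50, 100, 200, 400):
    rise = termination_temp_rise(amps)
    print(f"{amps:4d} A -> ~{rise:5.1f} degC above ambient")
```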

Reply to
electrician

On 30 Sep 2006 10:35:40 -0700, snipped-for-privacy@electrician2.com Gave us:

I used to make an IR thermometer with a four-inch-diameter tube, a gold mirror, and a rifle stock and scope on it. Electrical power engineers used it to check insulator temperatures on HV towers from the ground, as well as pole-mounted transformer temperatures.

Nowadays, IR imagery is almost assuredly what gets used, as imagers are all over the industrial market. Back when I made these instruments, our IR imager required liquid-nitrogen (LN) cooling and was a $90,000 product. We didn't have too many takers.

Imaging works fine as long as one has a good quality instrument and knows how to use it properly.

Reply to
JoeBloe

Ah, perhaps you can answer a question which has been bugging me ever since I bought an IR thermometer...

How do these things measure the temperature of an object which is colder than the thermometer itself? I can see how you do it if the sensor is at LN temperatures, but mine happily measures the freezer temperature (-25°C) and cloud temperatures (-55°C) when the sensor is at room temperature. So why doesn't the sensor "see" its own temperature drowning out that of the object? Is it done by ensuring the sensor and anything in its view have a very low IR emissivity? That's the only thing I could think of.

Reply to
Andrew Gabriel

On 01 Oct 2006 19:35:15 GMT, snipped-for-privacy@cucumber.demon.co.uk (Andrew Gabriel) Gave us:

The "sensor" is likely a single element transducer known as a "resistor bolometer".

Sure, it has an ambient reference point, but all it "sees" is what the instrument designer "shows" it by way of the optical system that provides it with a target.

So a less energetic target will show a lower value. Since ALL matter above absolute zero emits IR, what seems cold to you is merely a different level of output to such a transducer. The detector responds to the net radiative exchange between its own surface and the target: a target colder than the sensor drains heat from it, and the electronics map that below-ambient net flux to a below-ambient reading.
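That net-exchange picture can be put into rough numbers. Here is a minimal sketch under an idealised total-radiation (Stefan-Boltzmann) model; real instruments respond over a limited spectral band, so this is illustrative only:

```python
# Why an uncooled sensor can read targets colder than itself: the detector
# responds to the NET radiative exchange between its own surface and the
# target. A target colder than the sensor drains heat from it, which the
# electronics map to a below-ambient reading. Idealised blackbody
# (Stefan-Boltzmann) model over all wavelengths; illustrative only.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def net_flux_w_per_m2(target_c, sensor_c=25.0):
    """Net radiant flux density, target to sensor (negative = sensor loses heat)."""
    t_target = target_c + 273.15
    t_sensor = sensor_c + 273.15
    return SIGMA * (t_target ** 4 - t_sensor ** 4)

# Includes the freezer and cloud temperatures from the question above
for target_c in (100.0, 25.0, -25.0, -55.0):
    print(f"target {target_c:6.1f} C -> net flux {net_flux_w_per_m2(target_c):8.1f} W/m^2")
```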

Sure, the "baseline noise" is much lower for a cooled device, but it also must exist in a vacuum as condensation must be kept at nil as well. A bit harder to make mechanically speaking (thermally).

But sure, if the electronics have been designed such that the instrument's window of operation is calibrated to those temperatures, then you will get accurate readings at those temperatures, as long as you are certain of the emissivity of the target and have adjusted your instrument (or the read values) accordingly.
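That emissivity adjustment can be sketched under the same idealised broadband model. The correction formula and the default settings below are assumptions for illustration, not any particular instrument's algorithm:

```python
# Hypothetical emissivity correction under a broadband Stefan-Boltzmann
# model. The instrument infers temperature from S = eps_set * sigma * T^4,
# while the radiance it actually collected was
#   S = eps_true * sigma * T_true^4 + (1 - eps_true) * sigma * T_bg^4
# (target emission plus reflected background). Solving for T_true gives the
# corrected temperature. Illustrative only; not any vendor's algorithm.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def corrected_temp_c(reading_c, eps_true, eps_set=0.95, background_c=25.0):
    """True target temperature (degC) implied by an IR thermometer reading."""
    t_read = reading_c + 273.15
    t_bg = background_c + 273.15
    signal = eps_set * SIGMA * t_read ** 4                 # radiance the instrument inferred
    emitted = signal - (1 - eps_true) * SIGMA * t_bg ** 4  # remove the reflected component
    return (emitted / (eps_true * SIGMA)) ** 0.25 - 273.15

# Example: a shiny (low-emissivity) lug reading 60 C on the default 0.95
# setting is actually much hotter than the display suggests.
print(f"corrected: {corrected_temp_c(60.0, eps_true=0.30):.1f} C")
```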

Reply to
JoeBloe

On Sun, 01 Oct 2006 13:29:28 -0700, JoeBloe Gave us:

snip

Forgot to mention that the active area of the single-element resistor bolometer transducer is typically about a two-millimeter spot.

Reply to
JoeBloe
