Calibration of Light Measuring Instruments
In an attempt to bring understanding to the calibration of light measuring instruments, I decided to put this article in the form of questions and answers.
What is the standard used today for calibrating instruments in the 220 nm to near-infrared region?
The best standard to date is a chopped-thermopile system, which is traceable to NIST (the current name for the National Bureau of Standards) to within +/- 0.5%.
If there is a +/- 0.5% standard, why are calibrations so different from company to company and from one calibration to the next?
They should not be different. In using the thermal standard, great care must be taken in setup and data taking. I have found that creating secondary silicon-detector standards is most helpful, since they are not affected by the thermal environment. Once the time has been spent making the secondary standard agree with the thermal standard, other calibrations become simpler. The +/- 3.0% traceability specification on calibration sheets allows for the transfer from the thermal standard to the secondary standard and then to the probe being calibrated.
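The transfer chain can be sketched numerically. The readings below are hypothetical, chosen only to show how a correction factor is carried from the thermal standard to a secondary silicon standard and then on to the probe being calibrated:

```python
# Hypothetical readings, in mW/cm^2, of the same source, illustrating the
# transfer chain: thermal (NIST-traceable) standard -> secondary silicon
# standard -> probe under calibration.
thermal_reading = 10.00   # chopped-thermopile standard, +/- 0.5% to NIST
silicon_reading = 10.20   # secondary silicon detector, same source

# Correction factor that makes the silicon standard agree with the thermal one
silicon_factor = thermal_reading / silicon_reading

probe_reading = 9.70      # probe being calibrated, viewing the same source
probe_factor = (silicon_reading * silicon_factor) / probe_reading
# ...which reduces to thermal_reading / probe_reading

print(round(silicon_factor, 4))  # 0.9804
print(round(probe_factor, 4))    # 1.0309
```

Each hand-off multiplies in a small uncertainty, which is why the stated traceability loosens from +/- 0.5% at the thermal standard to +/- 3.0% at the probe.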
Then what could cause my instrument to move since the last calibration?
There are several reasons for instruments to move or change. I am talking about changes of more than 1.0%.
- Broken glass – this is rare.
- Loose or shifted filter glass – caused by dropping a probe or rough handling. The change can be from 3.0% to 6.0%.
- If the probe uses a calibration pot, it can settle during the first calibration cycle – again, a 3.0% change.
- Dirt or a foreign substance on the probe glass – small-aperture probes are more subject to this.
- A non-silicon detector going bad – GaAsP detectors, when stressed (too much light), go into a failure mode (a drop in signal).
- Deep-UV probes can age over time and are more sensitive to environmental conditions.
- Someone made a mistake last time or this time – it happens.
Let me say that, in general, most probes stay within +/- 1.0% as they are cycled over six-month periods.
Do I have to be traceable to NIST?
No, but you do need a reference point in case anything happens. I have many customers who like a certain number (e.g. 100), and I can set their instruments so they get that number. All I have to do is know where they are in reference to NIST so I can keep them on their number. For example, Nikon has a set of numbers that work for their machines, so I created a Nikon standard that ties directly to NIST. It works.
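A minimal sketch of keeping a customer "on their number": the values below are invented for illustration; the only thing that matters is knowing the ratio between the customer's preferred reading and the NIST-traceable value.

```python
# Hypothetical illustration: the customer's process was qualified around a
# reading of 100, while the NIST-traceable value for the same source is 97.0.
nist_value = 97.0
customer_value = 100.0

# Where the customer sits relative to NIST -- this ratio is the reference point
offset_ratio = customer_value / nist_value

# At the next calibration the same source measures 96.5 (the lamp has aged).
# Applying the stored ratio keeps the customer on their number.
new_nist_value = 96.5
new_customer_setting = new_nist_value * offset_ratio
print(round(new_customer_setting, 2))  # 99.48
```

As long as the offset to NIST is recorded, the customer's numbers remain internally consistent and can still be traced back to the national standard.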
How close can I set probes to each other?
You can set probes to read within 1.0% of each other in the near-UV range (310 to 436 nm). Many of my customers are held to this window.
The biggest problems are with deep-UV probes, since their (dielectric) filters do not have exactly the same spectral curve; two probes can therefore agree on one light source and disagree on another. Sometimes you see this as the lamp ages.
Another problem arises when you are measuring a non-collimated source. Almost all near-UV and visible probes are built with opal glass to compensate for non-collimated light.
Deep-UV (dielectric) filters are more drastically affected by off-axis light and will give incorrect readings. You can compensate for this to a degree by using diffused quartz as the input glass. Quartz is not as good as opal glass, but it helps.
When measuring an extended source (a long tube) at close distance, the problem is magnified, and it is difficult to get two probes to read exactly the same even though they were set to agree on a collimated source.
How often do I need to calibrate my Radiometer?
It depends on use. If an instrument is used every day, I suggest calibrating once every six months. If it is used once a week, then perhaps once a year. If the instrument is off, you should ask why.
How long should an instrument last?
We need to look at this in terms of sensors and electronics, simple and complicated.
First, a silicon-sensor probe with a glass-filter design (310 nm up to 540 nm) should last twenty to thirty years or more, provided you do not break the glass (it can be replaced) or destroy the housing. The simple electronics measuring mW/cm² should last an equal amount of time if it is a quality instrument.
More complex electronics can have component failures, and some ICs become unavailable after a period of time. In many situations you can get new electronics to do the same function with your old probe at a lower cost than buying a new unit.
Deep-UV filters do not last as long; how long they last depends on how they are used and cared for.
If I am measuring at high levels of mW/cm² or W/cm² and the calibration is done at a lower level, is there an error?
You are really asking how linear the instrument and probe are. A silicon sensor with a good set of electronics will have an error of less than 1% over seven decades. So if the calibration is done at 20 mW/cm² and you are reading 20 W/cm², that is only three decades and there is no problem.
However, I have seen some non-silicon sensor systems with a 1 to 2% error over a single decade. The other problem is heat when you get to higher-wattage sources: the detectors will drift when they get hot.
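The decade arithmetic above can be checked in a few lines. The per-decade error figures come from the text; treating the error as scaling linearly with the number of decades is my own simplifying assumption, used only to show the contrast between sensor types:

```python
import math

# Number of decades between the calibration level and the measurement level,
# both expressed in mW/cm^2 (20 W/cm^2 = 20,000 mW/cm^2).
cal_level = 20.0
meas_level = 20_000.0

decades = math.log10(meas_level / cal_level)
print(round(decades, 1))  # 3.0

# Worst-case linearity error in %, assuming (as a simplification) that the
# error grows linearly with the number of decades spanned:
silicon_err = decades * (1.0 / 7.0)  # text: <1% over seven decades
gaasp_err = decades * 1.5            # text: 1-2% over one decade (midpoint)
print(round(silicon_err, 2))  # 0.43
print(round(gaasp_err, 1))    # 4.5
```

The point survives the rough arithmetic: a good silicon system barely notices three decades, while a poorly linear sensor can be several percent off over the same span.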
If you have questions, please contact us.