How to Reduce Errors in Spectroscopy Measurements

When gathering a measurement using a spectroscopy instrument, the user wants to be confident in the result. Whether the technique is OES, XRF, or LIBS, and whether thickness or composition is being measured, the result needs to be as accurate as possible. The first step toward trusting measurement results is understanding what accuracy means.

Accuracy Defined

When discussing spectroscopy, accuracy is a measure of how close the measured value is to the expected value. (It is hoped that the expected value is the real value, but as the real value cannot be known, the term ‘expected value’ is used.) The measure of accuracy depends on two factors:

  1. Trueness
    If multiple measurements are gathered, does the mean value match the expected value? This is also known as the ‘accuracy of the mean’.
  2. Precision
    The repeatability of measured values. If the same sample is measured multiple times, with the same equipment running the same procedure at the same point, how repeatable are the results?

The diagram below shows how trueness and precision combine to influence accuracy.

[Diagram showing combinations of trueness and precision. Image: Institut für Informatik, Humboldt-Universität zu Berlin]

It is easy to see that it is possible to have good precision but poor trueness. It is equally possible to have poor precision but high trueness (accuracy of the mean). Both high precision and high trueness are needed for a truly accurate result.
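The two components can be quantified from a set of repeated measurements. The following Python sketch is illustrative only: the function name and readings are invented, and the expected value is assumed to come from a certified reference sample.

```python
import statistics

def trueness_and_precision(measurements, expected):
    """Trueness: how far the mean of repeated measurements sits from
    the expected (certified) value. Precision: the spread (sample
    standard deviation) of those measurements."""
    mean = statistics.mean(measurements)
    trueness_offset = mean - expected           # accuracy of the mean
    precision = statistics.stdev(measurements)  # repeatability
    return mean, trueness_offset, precision

# Five repeat readings of chromium content (%), invented for illustration
readings = [19.8, 20.1, 19.9, 20.2, 20.0]
mean, offset, spread = trueness_and_precision(readings, expected=20.0)
```

A small offset with a small spread corresponds to the accurate case in the diagram; a small offset with a large spread is high trueness but poor precision.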

Different Types of Error

Before looking at the errors that can only ever be reduced, it is worth starting with the errors that can be eliminated entirely.

Gross Errors

The first step is to identify and remove gross errors from the measurements. In the diagram above, a gross error would produce a measurement lying well outside the target region, and it would likely be identified as an anomaly. Process errors, such as sample contamination during preparation, can lead to gross errors.

Defective samples, for example those with cavities in the measurement area, can also lead to gross errors, as can running the incorrect measurement routine. Gross errors can be avoided through correct procedure and training. There are two types of error the user must assume will always exist within a measurement system:

Random Errors

Random errors relate to precision. The higher the random variation, the larger the error margin and the less precise the measurement. Unlike systematic errors, they are unpredictable and can only be estimated with statistical techniques.

These measurement fluctuations can result from tiny changes in the measurement environment, inhomogeneity of the sample, and the measurement uncertainty of the reference samples used for calibration. The goal is to maximize precision through well-maintained equipment and good procedures.
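One statistical property worth noting is that purely random error averages out: the standard error of the mean shrinks roughly as the noise level divided by the square root of the number of repeats. A minimal simulation sketch, with invented function names, noise level, and seed:

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

def simulate_readings(true_value, noise_sd, n):
    """Simulate n readings whose only error is random Gaussian noise."""
    return [random.gauss(true_value, noise_sd) for _ in range(n)]

# Averaging more repeats gives a tighter estimate of the true value:
# the standard error of the mean falls as noise_sd / sqrt(n).
few = simulate_readings(20.0, noise_sd=0.2, n=5)
many = simulate_readings(20.0, noise_sd=0.2, n=500)

sem_few = statistics.stdev(few) / len(few) ** 0.5
sem_many = statistics.stdev(many) / len(many) ** 0.5
```

This is why measurement procedures often specify a number of repeat burns or spots: repeats narrow the random error margin, though they do nothing for a systematic offset.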

Systematic Errors

Systematic errors usually relate to trueness and produce a consistent offset between the expected result and the mean of the measured values. They result from faults in the equipment, such as worn parts, lack of maintenance, or poor calibration.

Because the offset is consistent for every measurement within a defined area of interest, it is possible to measure the offset and then incorporate a correction factor into subsequent sample measurements. Systematic errors can be reduced through regular maintenance and calibration.
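As a sketch of that correction, assuming a certified reference sample of known composition is available (the function names and readings below are invented):

```python
import statistics

def calibration_offset(reference_readings, certified_value):
    """Estimate the systematic offset by measuring a certified
    reference sample and comparing the mean reading to its
    certified value."""
    return statistics.mean(reference_readings) - certified_value

def apply_correction(reading, offset):
    """Subtract the consistent offset from a sample reading."""
    return reading - offset

# Certified reference sample: 20.0 % Cr; the instrument reads
# consistently high (invented readings).
reference = [20.31, 20.29, 20.30, 20.32, 20.28]
offset = calibration_offset(reference, certified_value=20.0)

corrected = apply_correction(20.55, offset)  # a later sample reading
```

The correction is only valid within the area of interest over which the offset was characterized, and should be re-checked after maintenance or recalibration.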

How to Gather Results You Can Trust

In reality, the only way to trust a result completely is to know the error margin of the readings. Every measurement carries a margin of error arising from the limitations of the measurement system and the random fluctuations within it.

To get the most accurate readings possible, eliminate gross errors, reduce systematic and random errors as far as practical, then accept and calculate the remaining error margin at an agreed confidence level.

In essence, the statement "chromium composition is 20% +/- 0.2% at a 95% confidence level" is a trustworthy result, while the statement "chromium composition is 20%" cannot be relied upon because it is incomplete.
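A sketch of how such a statement could be produced from repeated readings, assuming the remaining error is purely random and using a Student's t critical value from standard tables (the readings are invented for illustration):

```python
import statistics

# Critical value of Student's t for a 95% confidence level with
# 4 degrees of freedom (5 readings), taken from standard t-tables.
T_95_DF4 = 2.776

def report(measurements, t_crit):
    """Return (mean, margin): margin is the half-width of the
    confidence interval, t * s / sqrt(n)."""
    n = len(measurements)
    mean = statistics.mean(measurements)
    margin = t_crit * statistics.stdev(measurements) / n ** 0.5
    return mean, margin

# Invented repeat readings of chromium content (%)
readings = [19.8, 20.1, 19.9, 20.2, 20.0]
mean, margin = report(readings, T_95_DF4)
result = f"Chromium composition is {mean:.1f}% +/- {margin:.1f}% at a 95% confidence level"
```

Quoting the margin alongside the confidence level is what turns a bare number into a complete, trustworthy result.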

This information has been sourced, reviewed and adapted from materials provided by Hitachi High-Tech Analytical Science.

For more information on this source, please visit Hitachi High-Tech Analytical Science.

Citations

Hitachi High-Tech Analytical Science. (2019, September 19). How to Reduce Errors in Spectroscopy Measurements. AZoM. Retrieved on December 09, 2019 from https://www.azom.com/article.aspx?ArticleID=18467.
