Improved Nano-Catalyst DLS Particle Sizing with Less Time and Effort

Although DLS can measure particles smaller than 1 nm, it is strongly biased towards larger particles: for particles in the sub-micron regime, scattering intensity scales with the sixth power of particle diameter.
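To make that sixth-power relationship concrete, the short sketch below works through the arithmetic for a hypothetical mixture. It is a generic illustration only; the particle sizes and number fractions are arbitrary assumptions chosen to show how a trace of large particles can dominate the scattered intensity.

```python
# Illustration of the d^6 scattering relationship for small (Rayleigh-regime) particles.
# Sizes and number fractions are arbitrary assumptions for illustration only.

def relative_intensity(diameter_nm: float, number_fraction: float) -> float:
    """Relative scattered intensity contribution ~ N * d^6 (Rayleigh regime)."""
    return number_fraction * diameter_nm ** 6

monomer = relative_intensity(diameter_nm=4.0, number_fraction=0.999)     # e.g. a small protein
aggregate = relative_intensity(diameter_nm=100.0, number_fraction=0.001) # rare aggregate or dust

total = monomer + aggregate
print(f"Small-particle share of scattered intensity: {monomer / total:.4%}")
print(f"Aggregate share of scattered intensity:      {aggregate / total:.4%}")
# Each 100 nm particle scatters (100/4)^6, i.e. roughly 2.4e8 times more than a
# 4 nm particle, so even a 0.1% number fraction of aggregates dominates the signal.
```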

A DLS measurement is therefore impaired by contaminants or aggregates, which are generally larger than the particles of interest. In a nano-catalyst suspension, for example, aggregates or agglomerates can cause the sample to be sized incorrectly.

Nanoparticle size distribution has been shown to be strongly indicative of, and directly related to, nano-catalyst activity.

Sample preparation should therefore be meticulous, and the impact of this effect is usually mitigated by filtering. However, the available sample volume and the fragility of the sample can make filtering impractical or even impossible.

Filtering samples can also be expensive once the cost of consumables and the extra sample preparation time are taken into account, and in some instances filtering can itself induce aggregation.

Adaptive Correlation is a newly developed data capture process that uses a statistically driven approach to produce the best possible correlation data and, in turn, the most reliable DLS size data. It reduces the need for filtering while making DLS results more dependable.

The algorithm is compatible with any sample that can be measured by DLS. This article illustrates the value of Adaptive Correlation using measurements of hen's egg lysozyme, a deliberately challenging sample: it is small, scatters weakly and has a tendency to aggregate, and so shows the benefits of the method clearly.

What is Adaptive Correlation?

Adaptive Correlation (AC), a novel method for capturing and processing DLS data, aims to deliver the most dependable correlation function for precise determination of the diffusion coefficient and particle size.
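For reference, the standard relations below link the measured correlation function to a hydrodynamic size; Adaptive Correlation changes how the correlation data are captured and combined, not this underlying physics. Here β is the coherence factor, Γ the decay rate, q the scattering vector, n the dispersant refractive index, λ the laser wavelength, θ the scattering angle and η the dispersant viscosity, with a single exponential decay (monodisperse sample) assumed for simplicity.

```latex
% Standard DLS relations (textbook results, not specific to Adaptive Correlation)
% Siegert relation: measured intensity correlation g_2 vs field correlation g_1
g_2(\tau) = 1 + \beta\,\lvert g_1(\tau)\rvert^2, \qquad g_1(\tau) = e^{-\Gamma\tau}
% The decay rate gives the diffusion coefficient D at scattering vector q
\Gamma = D q^2, \qquad q = \frac{4\pi n}{\lambda}\sin\!\left(\frac{\theta}{2}\right)
% Stokes-Einstein relation converts D to a hydrodynamic diameter d_H
d_H = \frac{k_B T}{3\pi\eta D}
```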

AC uses statistics to gather the optimal amount of data in the shortest possible time, and statistical models to identify any captured data that is not characteristic of the sample, such as that produced by a rare dust particle transiting the detection volume during the measurement.

AC does not filter by size and will not discriminate against samples that are genuinely multimodal. It allows consistent, steady-state size components to be characterized without the data being contaminated by intermittent or transient scatterers.
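The details of the published classification step are not reproduced here, but the general idea can be sketched as a statistical outlier test on the short sub-runs: sub-runs whose scattering is inconsistent with the bulk of the data are treated as transient events (for example a dust transit) and excluded from the averaged result. The sketch below is a generic illustration of that idea, not the vendor's actual algorithm; the median/MAD test and the threshold are assumptions.

```python
import numpy as np

def split_subruns(intensities, transient_sigma=3.0):
    """Classify short sub-runs as steady-state or transient.

    intensities: mean photon count rate of each sub-run (one value per sub-run).
    Returns (steady_idx, transient_idx).

    A robust median/MAD outlier test is used purely for illustration; the real
    instrument software may apply a different statistical model.
    """
    intensities = np.asarray(intensities, dtype=float)
    median = np.median(intensities)
    mad = np.median(np.abs(intensities - median))
    scale = 1.4826 * mad if mad > 0 else (np.std(intensities) or 1.0)
    z = np.abs(intensities - median) / scale
    steady = np.where(z <= transient_sigma)[0]
    transient = np.where(z > transient_sigma)[0]
    return steady, transient

# Example: 20 quiet sub-runs plus two intensity spikes from a dust transit.
rng = np.random.default_rng(0)
rates = rng.normal(250.0, 5.0, size=20)   # kcps, steady scattering
rates[[7, 15]] = [900.0, 1400.0]          # dust events
steady, transient = split_subruns(rates)
print("steady sub-runs:", len(steady), "| rejected as transient:", transient)
```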

The method not only improves tolerance to dust; measurements are also faster and more repeatable, even for clean and stable samples.

Faster Measurements Without Compromise

Traditionally, correlation data were recorded for as long as possible in order to average out any sources of noise and perturbation. AC instead averages correlation functions from many short sub-runs, which ultimately improves measurement repeatability compared with measurements of the same total duration that use longer runs.

With this new method, data quality is evaluated during the measurement, and further data is recorded only if it will improve confidence in the final reported size. As a result, measurement times can be as short as one third of those of traditional correlation function measurements.
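As an illustration of what "record further data only if it improves confidence" might look like, the sketch below keeps acquiring short sub-runs until the standard error of the reported size stabilizes below a target, or a maximum run count is reached. The acquisition callable, thresholds and stopping rule are all hypothetical stand-ins for whatever criteria the instrument firmware actually applies.

```python
import numpy as np

def measure_until_confident(acquire_subrun, target_rel_sem=0.01,
                            min_runs=5, max_runs=100):
    """Keep collecting sub-runs until the size estimate is stable.

    acquire_subrun: callable returning one size estimate (nm) from a short sub-run.
    target_rel_sem: stop when the standard error of the mean falls below this
                    fraction of the mean. Both defaults are illustrative only.
    """
    sizes = []
    for _ in range(max_runs):
        sizes.append(acquire_subrun())
        if len(sizes) >= min_runs:
            mean = np.mean(sizes)
            sem = np.std(sizes, ddof=1) / np.sqrt(len(sizes))
            if sem / mean < target_rel_sem:
                break
    return float(np.mean(sizes)), len(sizes)

# Hypothetical usage with a simulated sub-run measurement:
rng = np.random.default_rng(1)
size, n_runs = measure_until_confident(lambda: rng.normal(3.8, 0.2))
print(f"Reported size: {size:.2f} nm after {n_runs} sub-runs")
```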

Figure 1 shows particle size distributions for repeat measurements of a single sample of this challenging lysozyme sample, measured with and without AC. Without AC, the main particle size peak is poorly defined and shifts position between repeat measurements, whereas the AC result is repeatable and free of apparent large-sized material.

Figure 1. Intensity weighted particle size distributions for a 1 mg/ml dispersion of lysozyme. The top and bottom figures show repeat measurements of the same aliquot of sample performed without and with AC respectively.

Estimation of the monomer size would be challenging without AC, and it is likely that repeat sample preparation and measurement would be required.

Reducing Sample Preparation Time and Effort

Because of DLS's sensitivity to larger particles, filtering of dispersants is advisable, and in certain cases the sample itself may also need to be filtered using smaller pore sizes. This complicates sample preparation and can be prohibitively expensive: 20 nm syringe filters cost around $10 each.

Figure 2 shows particle size distributions for aliquots passed through filters with 100 nm and 20 nm pore sizes, measured both with the new AC algorithm and with an alternative 'dust rejection' algorithm.

Figure 2. Intensity-weighted particle size distributions for samples of 1 mg/ml of lysozyme dispersed in a pH 4.0 acetate buffer, filtered after dispersion using syringe filters of different pore size and captured using different measurement processes.

This data indicates that although monomodal peaks with a steady mean position can be obtained using a 20 nm filter and the alternative algorithm, Adaptive Correlation yields better resolved and more repeatable results from the same sample filtered with only a 100 nm filter.

The aliquot filtered to 100 nm and measured with the alternative dust rejection algorithm shows size peaks appearing at a variety of positions, and every measurement made with the traditional algorithm showed minor components at sizes above 1 µm, which are probably noise artifacts.

How Thorough Does Filtering Need to Be?

Although it is not representative of a typical sample preparation method, the improved tolerance to aggregated material can also be demonstrated by mixing filtered and unfiltered samples, creating a varied proportion of aggregates in the sample.

Figure 3 shows data obtained by first measuring an unfiltered dispersion of lysozyme that had been aggregated by vigorous mixing during dispersion, and then preparing aliquots partially filtered with a 20 nm filter. Five measurements of each aliquot were then made with the Zetasizer Nano ZSP, which uses a different 'dust rejection' algorithm, and with the Zetasizer Ultra using AC.

Figure 3. ZAve particle size reported for samples of lysozyme, filtered by different proportions measured using an alternative dust rejection algorithm and AC. The proportion of 1 shown on the x-axis represents an aggregated and unfiltered sample. The data points represent the mean value from 5 repeat measurements, with standard deviations shown as error bars.

For the unfiltered case, a high ZAve is reported in every set of measurements, indicating a substantial presence of aggregated material; the results collected with AC, however, show greater repeatability.

AC can deliver a repeatable and reliable ZAve with minimal filtering, whereas the proportion of aggregates had to be reduced by a factor of 16 before the alternative algorithm reported a comparable particle size.
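For context, the ZAve values plotted in Figure 3 come from a cumulant-style analysis of the correlation function: the decay of the correlogram is fitted to obtain a mean decay rate, which gives an intensity-weighted mean diffusion coefficient and, via the Stokes-Einstein relation, the Z-average diameter. The sketch below is a minimal illustration of that calculation; the synthetic correlogram, optical constants and first-order fit are assumptions made for the example, not the instrument's analysis code.

```python
import numpy as np

# Assumed optical/solvent constants: water at 25 degC, 633 nm laser, 173 deg backscatter
KB, T = 1.380649e-23, 298.15                 # Boltzmann constant (J/K), temperature (K)
ETA = 0.8872e-3                              # water viscosity (Pa.s)
N_REF, WAVELENGTH, THETA = 1.330, 633e-9, np.deg2rad(173.0)
Q = 4 * np.pi * N_REF / WAVELENGTH * np.sin(THETA / 2)   # scattering vector (1/m)

def z_average_nm(tau, g2, beta=1.0):
    """First-order cumulant fit: ln|g1| = -Gamma*tau -> D = Gamma/q^2 -> d_H."""
    g1_sq = np.clip((g2 - 1.0) / beta, 1e-12, None)
    gamma = -np.polyfit(tau, 0.5 * np.log(g1_sq), 1)[0]   # decay rate (1/s)
    d_coeff = gamma / Q**2                                 # diffusion coefficient (m^2/s)
    return KB * T / (3 * np.pi * ETA * d_coeff) * 1e9      # hydrodynamic diameter (nm)

# Synthetic correlogram for a ~4 nm particle, used only to exercise the fit.
d_true = 4e-9
D_true = KB * T / (3 * np.pi * ETA * d_true)
tau = np.linspace(1e-7, 2e-5, 200)
g2 = 1.0 + np.exp(-2 * D_true * Q**2 * tau)
print(f"Z-average ~ {z_average_nm(tau, g2):.2f} nm")
```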

Conclusion

Through statistical analysis and optimized data collection, Adaptive Correlation improves the repeatability of DLS particle size measurements and makes it possible to measure primary particle sizes separately from the characterization of small amounts of aggregated material.

These advances enable faster, more precise measurements with a reduced need to filter samples and dispersants, streamlining sample preparation and potentially lowering spending on lab consumables.

This information has been sourced, reviewed and adapted from materials provided by Particulate Systems.

For more information on this source, please visit Particulate Systems.
