Dynamic Light Scattering (DLS) is a sensitive and robust method for characterizing macromolecules or particles in dispersion, owing to its ability to resolve molecular or particle sizes ranging from sub-nanometer to several microns.
This sensitivity also makes DLS a useful method for characterizing aggregated material, which may occur in far smaller quantities but can be of immense importance in many applications.
However, the presence of dust (including column shedding, filter spoil, tracer aggregates, or material from dirty labware) can have detrimental effects on the measurement of smaller particles, and algorithms are available to suppress these effects.
An innovative approach is presented here for handling DLS data which avoids the skewing of data for small particles while retaining insight into the presence of aggregates that may otherwise be lost, whereby it is possible to deduce the relative size and abundance of aggregates.
Figure 1. The appearance of an aggregate, t > 8 s, in the live data, which can degrade the accuracy of the time-averaged measurement of the primary peak at 7.6 nm.
Materials and Methods
The results presented in this article were produced by measuring a sample of hen's egg lysozyme (Sigma-Aldrich) in a pH 4.0 acetate buffer, with measurements carried out on a Zetasizer Ultra. Results are also shown for a mixed dispersion of NIST-traceable polystyrene latex particles dispersed in 10 mM NaCl. All dispersions were prepared using DI water filtered to 200 nm.
Detecting Aggregates that Aren’t Always Present
The detection volume in a DLS measurement is considerably smaller than the total volume of the sample presented to the instrument. Although DLS measures a more statistically significant number of particles than nanoparticle tracking analysis (NTA) or scanning electron microscopy (SEM), individual particles can still diffuse in and out of the detection volume during a measurement.
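Because the detection volume is so small, whether a rare aggregate is even present during a given sub-measurement is a matter of chance. As a minimal sketch (assuming ideal Poisson statistics, and with the concentration and detection volume values chosen purely for illustration), the probability of finding at least one aggregate in the detection volume at any instant can be estimated as:

```python
import math

def occupancy_probability(concentration_per_ml, detection_volume_nl):
    """Probability that at least one particle occupies the detection
    volume at a given instant, assuming ideal Poisson statistics."""
    # Mean number of particles in the detection volume (nl -> ml)
    mean_n = concentration_per_ml * detection_volume_nl * 1e-6
    return 1.0 - math.exp(-mean_n)

# Illustrative (assumed) numbers: a dilute aggregate population of
# 1e4 particles/ml observed in a ~1 nl scattering volume
p = occupancy_probability(1e4, 1.0)
print(f"P(aggregate in detection volume) = {p:.2%}")
```

With these assumed numbers the aggregate is present only about 1% of the time, which is why it appears as a transient event rather than a steady feature of the correlation function.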
In a previous article, it was discussed how an Adaptive Correlation DLS measurement groups the data from a series of sub-measurements into steady state and transient data sets, distinguishing particles that are consistently present in the detection volume of the sample from non-representative particles that diffuse in and out of it.
Analyzing the transient data enables better characterization of the transient particles, while excluding transient events from the steady state data improves the precision of the primary particle size.
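The classification idea can be sketched with a simple robust-outlier test on per-sub-run mean count rates. This is a hypothetical illustration only, not the instrument's actual algorithm; the function name, the use of count rate as the classification statistic, and the threshold are all assumptions.

```python
import statistics

def classify_subruns(count_rates, threshold=3.0):
    """Split sub-run mean count rates into steady-state and transient
    sets using a median/MAD outlier test (illustrative only)."""
    med = statistics.median(count_rates)
    mad = statistics.median(abs(r - med) for r in count_rates)
    # Guard against a zero MAD for perfectly uniform data
    scale = mad if mad > 0 else 1e-12
    steady, transient = [], []
    for r in count_rates:
        (transient if abs(r - med) / scale > threshold else steady).append(r)
    return steady, transient

# A burst in count rate (e.g. an aggregate crossing the beam) is flagged
# as transient, while the uniform baseline is kept as steady state:
rates = [100, 102, 99, 101, 100, 350, 98, 101]
steady, transient = classify_subruns(rates)
print(transient)
```

A median/MAD test is used here rather than a mean/standard-deviation test because a single large burst would inflate the standard deviation and could mask itself; the real instrument may use a different statistic entirely.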
Figure 2 shows the steady state and transient correlation functions, together with the result that would be obtained if no classification had been applied and all the sub-run measurements were averaged together. In this measurement, the transient data represents only a small portion of the data collected for the sample, and when all the data is averaged, the pronounced second decay seen in the transient correlation function is largely suppressed.
Figure 2. Autocorrelation functions for a sample of lysozyme, showing results for the steady state data, transient data, and unclassified data, i.e. all of the data.
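The suppression of the second decay by naive averaging can be illustrated numerically. Below, the intensity correlation function is modeled via the Siegert relation with the field correlation as a sum of exponentials; all decay rates, amplitudes, and the 10% transient fraction are invented for illustration and do not correspond to the measured lysozyme data.

```python
import math

def g2(tau, modes):
    """Normalized intensity correlation g2(tau) - 1 via the Siegert
    relation, for a field correlation that is a sum of exponentials.
    modes: list of (decay_rate_per_s, amplitude) pairs."""
    g1 = sum(a * math.exp(-gamma * tau) for gamma, a in modes)
    return g1 ** 2

# Illustrative decay rates (1/s): fast mode = small monomer,
# slow mode = large aggregate
steady_modes = [(5e4, 1.0)]                   # monomer only
transient_modes = [(5e4, 0.7), (5e2, 0.3)]    # monomer + aggregate

tau = 2e-3  # a lag time where only the slow (aggregate) mode survives
# Unclassified average, assuming 10% of sub-runs were transient:
mixed = 0.9 * g2(tau, steady_modes) + 0.1 * g2(tau, transient_modes)
print(g2(tau, transient_modes), mixed)
```

At this lag time the transient correlation function still carries a clear second decay, but in the unclassified average that signal is diluted by roughly an order of magnitude, which is why the aggregate component can be lost without classification.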
Figure 3 shows the effect of this suppression and the benefit of transient classification for the example data. Both the steady state and unclassified measurements exhibit a lysozyme monomer peak at 3.8 nm and an aggregate peak at around 100 nm; the transient measurement, however, reveals an additional, larger component with a peak at 5 μm.
Figure 3. Steady state, transient and unclassified particle size distributions for a sample of aggregated lysozyme.
This data also shows that Adaptive Correlation is not simply a data-filtering algorithm: aggregates considerably larger than the primary particle component are still reported in the steady state result when they are detected consistently throughout the measurement.
Characterizing These Rare Particles
As with any DLS measurement, sub-optimal concentration and weak sample scattering will limit the reliability of particle size data; however, Figure 4 shows that the transient data can be used to derive reliable properties for rare large particles, as an accurate size is reported for a latex sample doped with particles of a known size.
Figure 4. Intensity-weighted particle size distribution for a sample of 60 nm latex dispersed in 10 mM NaCl, doped with a 1.6 µm latex at a range of different ratios.
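The sizes reported in such a measurement follow from the measured diffusion coefficient via the Stokes-Einstein relation. As a minimal sketch (assuming water at 25 °C with a viscosity of 0.89 mPa·s; the example diffusion coefficient is an assumed value, not a measured one):

```python
import math

def hydrodynamic_diameter_nm(diffusion_m2_per_s, temp_k=298.15,
                             viscosity_pa_s=0.00089):
    """Stokes-Einstein relation: d_h = k_B * T / (3 * pi * eta * D)."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    d_h = k_b * temp_k / (3 * math.pi * viscosity_pa_s * diffusion_m2_per_s)
    return d_h * 1e9  # meters -> nanometers

# An assumed diffusion coefficient of ~8.2e-12 m^2/s corresponds to a
# hydrodynamic diameter of roughly 60 nm in water at 25 degrees C,
# comparable to the primary latex component in Figure 4.
print(f"{hydrodynamic_diameter_nm(8.2e-12):.1f} nm")
```

Because diameter is inversely proportional to the diffusion coefficient, the micron-scale doped particles decay roughly 25 times more slowly than the 60 nm latex, which is what makes their transient signature so distinct in the correlation function.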
How Rare is Rare?
Only sub-runs that show a statistically significant difference from the main characteristics of the sample will be classified as transient, so the amount of data categorized as transient depends on the sample.
The significance of transient particles can therefore be assessed from their detection frequency, which is reflected in the proportion of data retained in the steady state result.
Table 1 shows a series of size measurements for a sample of 1 mg/ml lysozyme that had been thermally stressed. Because only one peak is present in the steady state data, all measurements report similar, monomeric values for the Z-average size. However, the run retention (the percentage of sub-runs included in the analysis for the steady state result) decreases over time, showing that transient scatterers are being detected more frequently and indicating that this dispersion is not entirely stable, while still allowing the protein's monomeric hydrodynamic size to be reported with confidence.
Table 1. Numerical results for a series of measurements for a sample of 1 mg/ml lysozyme
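Run retention itself is a simple quantity: the fraction of sub-runs kept in the steady state average. A short sketch of how a falling retention trend over repeat measurements would be computed (the sub-run counts below are hypothetical, not the values from Table 1):

```python
def run_retention(total_subruns, retained_subruns):
    """Percentage of sub-runs retained in the steady-state result."""
    return 100.0 * retained_subruns / total_subruns

# Hypothetical series of repeat measurements of a stressed sample:
# (total sub-runs, sub-runs retained as steady state) per measurement.
# A downward drift suggests transient aggregates are becoming more
# frequent, even while the steady-state size stays monomeric.
series = [(60, 58), (60, 55), (60, 49), (60, 41)]
trend = [run_retention(total, kept) for total, kept in series]
print([f"{p:.0f}%" for p in trend])
```

In this assumed series, retention falls from about 97% to about 68%, which would flag growing instability long before the aggregates are frequent enough to appear in the steady state size distribution.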
By applying a statistical approach to classify data from a series of sub-measurements, it is possible to reliably distinguish particles that are consistently present in the detection volume, and are thus representative of the sample, from those that are not. With this classification, the steady state data can be reported without the influence of transient scatterers that would otherwise distort the size analysis results, and the transient scatterers themselves can be characterized with a resolution that could not be achieved without data classification.
Measurements of samples doped with particles of a known size have shown that the size results from this transient data are reliable, and the proportion of data categorized as transient can provide a better understanding of sample stability and the presence of rare aggregates, before they become abundant enough to form part of the steady state result.
This information has been sourced, reviewed and adapted from materials provided by Micromeritics Instrument Corporation.