Although Dynamic Light Scattering (DLS) can measure particles smaller than 1 nm, it is preferentially sensitive to larger particles because scattering intensity scales with the sixth power of particle radius.
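The practical consequence of this sixth-power scaling can be sketched in a few lines. The sizes below are illustrative assumptions (a lysozyme-scale 4 nm particle versus a 40 nm contaminant), not data from the article:

```python
# Rayleigh-regime scaling: per-particle scattering intensity I ~ r^6,
# so a small number of large contaminants can dominate the signal.
def relative_intensity(radius_nm: float) -> float:
    """Scattering intensity per particle, arbitrary units (I ~ r^6)."""
    return radius_nm ** 6

protein = relative_intensity(4.0)   # lysozyme-sized particle (assumed)
dust = relative_intensity(40.0)     # 10x larger contaminant (assumed)
ratio = dust / protein              # 10**6: one dust particle scatters
                                    # as much as a million monomers
```

This is why trace dust or aggregates can obscure the monomer signal even when they are a vanishingly small fraction of the particle population.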
This means that sample preparation must be rigorous, particularly for weakly scattering samples such as proteins and other biological molecules. The contribution of contaminants such as dust and aggregates can be reduced by filtration; however, this is not always practical or feasible, depending on the fragility and volume of the sample. Filtration also imposes a financial burden, both in extra sample preparation time and in consumables costs.
A new DLS data-capture process called Adaptive Correlation has been developed. It uses a statistically driven approach to acquire the best possible correlation data, which in turn yields more reliable size results. This reduces the need for filtration and gives greater confidence in DLS results.
The algorithm applies to all samples suitable for measurement by DLS; to demonstrate the method, this article presents measurements of hen egg-white lysozyme, a challenging case because the sample is weakly scattering, small, and prone to aggregation.
What is Adaptive Correlation?
Adaptive Correlation (AC) is a new method for recording and processing DLS data that aims to produce the most reliable correlation function for accurate determination of the diffusion coefficient and particle size.
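The link between the fitted diffusion coefficient and the reported size is the Stokes-Einstein relation. The sketch below converts an assumed lysozyme-like diffusion coefficient to a hydrodynamic diameter; the temperature, viscosity, and D values are illustrative, not instrument output:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(diff_m2_s: float, temp_k: float, visc_pa_s: float) -> float:
    """Stokes-Einstein: d_H = k_B * T / (3 * pi * eta * D), in metres."""
    return K_B * temp_k / (3 * math.pi * visc_pa_s * diff_m2_s)

# Assumed values: water at 20 C (eta ~ 1.0016 mPa.s) and a
# lysozyme-like diffusion coefficient of 1.13e-10 m^2/s.
d_h = hydrodynamic_diameter(1.13e-10, 293.15, 1.0016e-3)
print(f"{d_h * 1e9:.1f} nm")  # ~3.8 nm, consistent with the lysozyme monomer
```

Because size enters only through D, any distortion of the correlation function by transient scatterers propagates directly into the reported hydrodynamic size, which is what AC is designed to guard against.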
In addition to using statistics to record the ideal amount of data in as short a time as possible, AC employs statistical models to identify any captured data that is not representative of the sample, such as a rare dust particle transiting the detection volume during the measurement.
AC is not, however, a size filter, and it will not discriminate against genuinely multimodal samples; rather, it permits the characterization of steady-state size components without the data being skewed by sporadic or transient scatterers. This means not only improved tolerance to dust, but also faster measurements and better repeatability, even for stable, clean samples.
Faster Measurements Without Compromise
Besides the statistical approach to detecting transient scattering events, the AC measurement process itself has changed. Conventionally, correlation data would be captured for as long as possible to suppress and average out any sources of noise and perturbation. With AC, many short sub-runs are performed and the resulting correlation functions averaged to achieve the same goal. This in fact improves measurement repeatability compared with measurements of the same total duration using longer sub-runs.
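The sub-run idea can be sketched as follows: summarize each short sub-run, flag the ones a transient scatterer distorted, and average only the consistent ones. The median/MAD outlier test used here is an illustrative choice of statistical model, not Malvern's published algorithm, and the data are invented:

```python
import statistics

def average_subruns(subruns: list[list[float]], k: float = 3.0) -> list[float]:
    """Average sub-run correlation functions, dropping statistical outliers."""
    # Summarize each sub-run by its mean correlation amplitude.
    scores = [statistics.fmean(run) for run in subruns]
    med = statistics.median(scores)
    # Median absolute deviation; guard against a zero MAD.
    mad = statistics.median(abs(s - med) for s in scores) or 1e-12
    kept = [run for run, s in zip(subruns, scores) if abs(s - med) <= k * mad]
    # Point-wise average of the retained correlation functions.
    return [statistics.fmean(vals) for vals in zip(*kept)]

clean = [[1.8, 1.4, 1.1], [1.9, 1.5, 1.0], [1.8, 1.5, 1.1]]
dusty = [[9.0, 8.5, 7.0]]  # a transient event inflates the whole trace
g2 = average_subruns(clean + dusty)  # dusty sub-run is rejected
```

Averaging retained sub-runs, rather than one long acquisition, is what lets a rare dust transit be discarded instead of contaminating the entire correlogram.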
Within the new process, data quality is assessed during the measurement, and additional data is recorded only if it will improve confidence in the final reported size. As a result, measurements can be up to three times faster than with alternative correlation approaches.
Figure 1 shows particle size distributions for repeat measurements of the same sample of lysozyme, calculated with and without AC. The measurements performed without AC show a core particle size distribution that is poorly defined and shifts in position between repeat measurements, whereas the Adaptive Correlation result is repeatable and free of spurious large-sized material. Without AC, it would be difficult to estimate the monomer size, and repeat sample preparation and measurement would likely be required.
Figure 1. Intensity-weighted particle size distributions for a 1 mg/ml dispersion of lysozyme. Top and bottom figures show repeat measurements of the same aliquot of sample performed with and without AC, respectively.
Reducing Sample Preparation Time and Effort
Due to the sensitivity of DLS to large material, it is advisable to filter dispersants, and in certain cases the sample itself, with smaller pore sizes clearly providing the most effective cleanup. This, however, can complicate sample preparation and adds a financial burden, with 20 nm syringe filters costing about $10 each.
For a protein sample, filtration may also be needed after dispersing the dry protein in order to measure a monomodal dispersion, as the lysozyme may not disperse completely, while too much dispersal energy can cause aggregation. With an expected hydrodynamic size of 3.8 nm, a 20 nm filter would appear to be the ideal choice. Figure 2 shows particle size distributions for aliquots filtered using 100 nm and 20 nm pore sizes, with data captured using the new AC algorithm and an alternative "dust rejection" algorithm.
Figure 2. Intensity-weighted particle size distributions for samples of 1 mg/ml lysozyme dispersed in a pH 4.0 acetate buffer, filtered after dispersion using syringe filters of different pore sizes and captured using different measurement processes.
This data shows that while monomodal peaks with a consistent mean position can be produced using a 20 nm filter and the alternative algorithm, Adaptive Correlation produces better-resolved and more repeatable results from the same sample filtered with only a 100 nm filter. The aliquot filtered at 100 nm and measured with the alternative dust-rejection algorithm shows size peaks appearing over a range of positions, and both measurements using the alternative algorithm show small components at sizes beyond 1 µm, which are likely to be noise artifacts in this case.
How Thorough Does Filtering Need to Be?
Although it does not represent a typical sample preparation method, the improved tolerance to aggregated material can be demonstrated by blending filtered and unfiltered samples, thereby varying the proportion of aggregates present in the sample.
The data in Figure 3 was generated by first measuring an unfiltered dispersion of lysozyme, deliberately aggregated by vigorous mixing during dispersion, and then preparing aliquots in which different proportions of the sample had been filtered using a 20 nm filter.
Figure 3. ZAve particle size reported for samples of lysozyme filtered in different proportions, measured using an alternative dust-rejection algorithm and AC. A proportion of 1 on the x-axis represents the fully aggregated, unfiltered sample. Data points show the mean of 5 repeat measurements, with standard deviations as error bars.
Each aliquot was then measured five times with the Zetasizer Nano ZSP, which uses an alternative "dust rejection" algorithm, and with the Zetasizer Ultra, which uses AC.
Both sets of measurements for the unfiltered case report a high ZAve, indicative of a substantial presence of aggregated material; however, the results collected using AC show better repeatability.
AC reports a reliable and repeatable ZAve with minimal filtering, whereas a 16-fold reduction in the proportion of aggregates was required before the alternative algorithm reported a consistent particle size.
By employing statistical analysis and optimized data capture, Adaptive Correlation improves the repeatability of DLS particle size measurements and enables the primary particle size to be characterized even in the presence of small amounts of aggregated material. These enhancements mean that faster, higher-precision measurements can be achieved with minimal filtration of dispersants and samples, allowing streamlined sample preparation processes and a potential reduction in lab consumables costs.
This information has been sourced, reviewed and adapted from materials provided by Micromeritics Instrument Corporation.
For more information on this source, please visit Micromeritics Instrument Corporation.