Underestimation of risk associations due to regression dilution in long-term follow-up of prospective studies.
Clarke R., Shipley M., Lewington S., Youngman L., Collins R., Marmot M., Peto R.
In prospective studies, disease rates during follow-up are typically analyzed with respect to the values of factors measured during an initial baseline survey. However, because of "regression dilution," this generally tends to underestimate the real associations of disease rates with the "usual" levels of such risk factors during some particular exposure period. The "regression dilution ratio" describes the ratio of the steepness of the uncorrected association to that of the real association. To assess the relevance of the usual value of a risk factor during particular exposure periods (e.g., first, second, and third decades) to disease risks, regression dilution ratios can be derived by relating baseline measurements of the risk factor to replicate measurements from a reasonably representative sample of study participants after an interval equivalent to about the midpoint of each exposure period (e.g., at 5, 15, and 25 years, respectively). This report illustrates the impact of this time interval on the magnitude of the regression dilution ratios for blood pressure and blood cholesterol. The analyses were based on biennial remeasurements over 30 years for participants in the Framingham Study (Framingham, Massachusetts) and a 26-year resurvey for a sample of men in the Whitehall Study (London, England). These analyses show that uncorrected associations of disease risk with baseline measurements underestimate the strength of the real associations with usual levels of these risk factors by about one-third during the first decade of exposure, by about one-half during the second decade, and by about two-thirds during the third decade. Hence, to correct appropriately for regression dilution, replicate measurements of such risk factors may be required at varying intervals after baseline for at least a sample of participants.
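The following is a minimal illustrative sketch (in Python, using simulated numbers that are not taken from the paper) of the general idea: a regression dilution ratio is estimated by regressing replicate resurvey measurements on baseline measurements, and an uncorrected log relative risk is then rescaled by dividing by that ratio. The simple ordinary-least-squares approach here is an assumption for illustration, not necessarily the authors' exact procedure.

```python
import numpy as np

def regression_dilution_ratio(baseline, replicate):
    """Estimate the regression dilution ratio (RDR) as the slope from an
    ordinary least-squares regression of the replicate (resurvey)
    measurement on the baseline measurement.  An RDR of 0.5, for example,
    means the uncorrected association is only about half as steep as the
    real association with the usual level of the risk factor."""
    slope, _intercept = np.polyfit(baseline, replicate, 1)
    return slope

def correct_log_relative_risk(uncorrected_log_rr, rdr):
    """Rescale an uncorrected log relative risk (per unit of the baseline
    measurement) by the RDR to approximate the real association."""
    return uncorrected_log_rr / rdr

# Hypothetical simulation: each person has a stable "usual" level plus
# independent within-person fluctuation at each survey.
rng = np.random.default_rng(0)
usual = rng.normal(6.0, 1.0, size=5000)             # usual cholesterol, mmol/L
baseline = usual + rng.normal(0.0, 0.8, size=5000)  # baseline survey value
resurvey = usual + rng.normal(0.0, 0.8, size=5000)  # replicate value years later

rdr = regression_dilution_ratio(baseline, resurvey)
real_rr = np.exp(correct_log_relative_risk(np.log(1.30), rdr))
print(f"estimated RDR = {rdr:.2f}; an uncorrected RR of 1.30 per unit "
      f"corrects to about {real_rr:.2f}")
```

In this hypothetical setup, greater within-person fluctuation (or a longer interval between baseline and resurvey) lowers the estimated ratio, so the same uncorrected relative risk corrects to a steeper real association.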