HealthDay News — There is a longitudinal association of neighborhood-level disadvantage with cortical thinning and cognitive decline, according to a study published online April 14 in Neurology.
Jack F. V. Hunt, Ph.D., from the University of Wisconsin School of Medicine and Public Health in Madison, and colleagues collected longitudinal magnetic resonance imaging and cognitive testing data from 601 cognitively unimpaired individuals (mean baseline age, 59 years) to examine whether neighborhood-level disadvantage is associated with neurodegeneration and cognitive decline.
The researchers found that living in the 20 percent most disadvantaged neighborhoods relative to state of residence was associated with cortical thinning in Alzheimer signature regions and with decline on the Preclinical Alzheimer's Cognitive Composite, especially the Trail Making Test Part B, but not on the Rey Auditory Verbal Learning Test or Story Memory Delayed Recall subtests. After the researchers controlled for racial and demographic differences between neighborhood-level disadvantage groups, the associations were attenuated but remained significant. The association between neighborhood-level disadvantage and cognitive decline was partially mediated by cortical thinning.
“The longitudinal structural degeneration and cognitive decline observed in individuals from the most disadvantaged neighborhoods suggests that increased clinical vigilance for early signs of dementia may be particularly important in this vulnerable population,” the authors write. “Further elucidation of the social and biological pathways linking neighborhood-level disadvantage, neurodegeneration and cognitive decline may aid clinicians, researchers, and policymakers in identifying effective avenues for prevention and intervention in Alzheimer disease and related dementia.”
Abstract/Full Text (subscription or payment may be required)