For the past few decades, nonmetropolitan areas have been facing a phenomenon called the "rural mortality penalty." This penalty was once associated with cities, where crowding and poor sanitation helped contagious diseases spread quickly. But over the last 40 years the burden has shifted to rural areas. The gap between urban and rural excess deaths, the number of deaths exceeding the predicted amount each year, grew almost tenfold from the 1980s to the early 2000s. Further, although the US death rate in 2014 was the lowest in the country's history, rural death rates from cancer and heart disease declined more slowly than urban rates between 1999 and 2014.
A new report in the American Journal of Public Health highlights changes in death rates in the United States between 1970 and 2016. The researchers consistently found poverty, education, race, and income to be associated with deaths by all causes over the 47-year study period.
The breakdown of mortality by poverty level in urban and rural areas is depicted in the figure above. High-poverty rural areas had the highest mortality rate in 2016, at 900 deaths per 100,000 people, while low-poverty urban areas had the lowest, at 700 deaths per 100,000. That 2016 gap between the two is the widest since 1970. Within rural areas alone, the gap between high- and low-poverty areas is also the widest it has been since 1970, at a difference of around 150 deaths per 100,000 people.
The rural mortality penalty that first became evident in the 1980s persists, and it appears to weigh most heavily on high-poverty rural areas. Rural America may be at particular risk because access to healthcare is more limited there than in urban America.
Databyte via Arthur G. Cosby, M. Maya McDoom-Echebiri, Wesley James, Hasna Khandekar, Willie Brown, and Heather L. Hanna, "Growth and Persistence of Place-Based Mortality in the United States: The Rural Mortality Penalty," American Journal of Public Health (AJPH).