Much of the need for financial wellness programs is tied to policy changes stemming from the Great Depression. Over the next few weeks, I will explore different public policy initiatives, how those policies affected American workers, and how they contributed to the need for employer-sponsored financial wellness programs. In my last post, I focused on retirement policy changes, including the Social Security Act and how it affected the financial habits of American workers. This week, I shift the focus to agricultural policies developed during the 1930s.
The Agricultural Adjustment Act
One focus for the federal government during the Great Depression was supporting the struggling farm sector by stabilizing and maintaining crop prices. The Agricultural Adjustment Act of 1933 created a subsidy program that initially paid farmers to either not plant crops, or to plant only part of their land, in order to hold down supply and drive up crop prices. Crops, and more specifically corn, were the backbone of the American economy. These subsidies transformed the agricultural sector in the United States from a largely open market into one directly subsidized and supported by the federal government. Today, these transfers and subsidies add up to between $15 and $35 billion annually.
While providing needed economic support to farmers and the economy during the Great Depression, this policy’s long-term legacy has been to incentivize farmers to grow a small number of subsidized crops like corn, soybeans, and cattle feed, at the expense of more nutritious crops, like fruits and vegetables. Over time, these incentives distorted the market for crops, raised the price of more nutritious foods, and artificially held down the price of soybeans, grain, and corn.
The Long-Term Effect on Americans
Among other effects, this flooded the market with new, innovative products developed with the help of these subsidies, such as high-fructose corn syrup, along with hydrogenated fats from soybeans and corn-fed beef and pork products. This supported the growth of fast food, soft drinks, and other heavily processed, pre-made food in the American diet. For instance, compared to 1950, the average person in 2000 consumed nearly 200 percent more corn, and 675 percent more corn sweeteners. In total, 85 pounds of corn sweeteners are now consumed per capita in the U.S. every year, which makes up over half of the total caloric sweetener market in the U.S.
The Effect on Employers
As workers consumed more corn and sweeteners, they became unhealthier and more expensive to insure. As one sign of that, the average U.S. worker gained about 30 pounds between 1950 and 2000. Similarly, the obesity rate nearly tripled among adults, and obesity and overweight now affect more than two-thirds of all adults in the United States. In fact, data indicates that population-adjusted deaths attributable to heart disease tripled after passage of the Agricultural Adjustment Act in the 1930s. While this trend has abated in recent years through medical advances and societal changes, current rates remain some of the highest on record. To be sure, there are plenty of additional reasons why U.S. workers became unhealthier during this period, including changes in activity levels, increased reliance on personal cars, and shifting work norms, among others. But changing diets are undoubtedly a key driver.
Employers, in turn, have faced sharp inflation in healthcare premiums as more of their workers incurred health-related expenses. Although historical data here is less readily available, the Henry J. Kaiser Family Foundation has surveyed employers every year since 1999 about their healthcare costs. It found that between 1999 and 2014, for instance, the premium employers paid for single beneficiaries increased by nearly 300 percent. For families during that same period, employers faced rates that increased by more than 200 percent. Importantly, these costs inflated for more reasons than just a progressively less healthy U.S. workforce. One recent literature review found, for instance, that between 38 and 62 percent of cost inflation between 1940 and 1990 was attributable to technological innovations in medical devices and drugs.
The Need for Financial Wellness
As these costs grew alongside the constant pressure on companies to maximize profits, they constrained the ability of employers to increase real wages, since more of the dollars allocated for human capital were going to benefits instead of compensation. This created a new incentive for financial wellness programs, as employers strove to help their employees make the most of their income and better understand the economic value of their benefit investments.
Rising costs also led to healthcare plan innovation that further supported the emergence of financial wellness programs. In particular, a growing number of employers have adopted consumer-driven healthcare plans (CDHPs) to reduce the moral hazard in their insurance plans: the incentive employees have to over-consume healthcare and underinvest in their own health, since others largely bear those costs. Although these new plans often cost less for both employers and employees, only 30 percent of employees willingly adopt them, citing concerns over managing a new financial responsibility. Worse, those forced into them through a full replacement often do not save enough for health expenses, largely because they feel they cannot afford to save more. For these reasons, employers have sought out financial wellness programs to help workers realistically manage this new financial responsibility while balancing all of the other routine financial obligations that take precedence for workers. These programs can provide personalized guidance to help workers figure out how much to save for healthcare, and they can help workers find the money to afford these new health savings responsibilities.