Full House: A History of American Multigenerational Living

While multigenerational living is often discussed as a new trend driven by recent economic factors, its roots actually extend much farther back. In fact, for most of American history, multigenerational living has been the norm, not the exception.

Throughout the 19th century, most Americans lived in a multigenerational household, with a majority of elderly Americans living with an adult child. The main driver of this living arrangement was the country’s agrarian economy. For farmers, there was an incentive to have many children, as this meant more help around the farm. It was common for one child to remain on the farm after reaching adulthood and continue working it, with the expectation of eventually inheriting it. If more than one child stayed, the land was sometimes divided among the children, forming smaller farms. This arrangement served as a natural aging plan for the parents who, if they lived long enough, stayed on the farm after they retired and were cared for by their children. Overall, the multigenerational phase was a normal stage of the pre-industrial family lifecycle.

Yet, as the country developed and the population grew, land became much more expensive in the eastern United States, and there were only so many times a plot of land could be subdivided. This forced subsequent generations and newly arriving immigrants to look for opportunities elsewhere. Some moved west in search of cheaper land, but the journey was often treacherous, forcing them to leave elders at home. Others moved to cities, where new jobs were being created in factories and other industrial settings. But urban living was expensive, so it was not always feasible to bring along elders who could not work and contribute financially. By the end of the 19th century, multigenerational households had begun to decline.

At the turn of the 20th century, the first institutional buildings were constructed to house the elderly who were living alone. Residents were mostly poor or mentally ill elders who had no children to care for them. These new institutions were mostly state-run “poorhouses,” big, factory-like buildings notorious for their poor living conditions.

Locally, the post-Civil War economic boom attracted an influx of immigrants to New England, causing the population to triple between the mid-19th century and the beginning of the 20th. This naturally strained housing, producing problems for municipalities throughout the region. Boston responded: an estimated 15,000 three-deckers were constructed there between 1880 and 1930, and many other cities and towns in the region followed suit. This type of housing was popular with immigrants because it offered an affordable path to homeownership: a nuclear family could live in one unit and rent out the other two, often to relatives. These buildings thus became a popular and economically viable example of multigenerational housing throughout the region.

But as this type of housing became associated with immigrants, three-deckers became a target of nativist and anti-immigrant sentiment. In Boston, zoning prevented the construction of three-deckers in the affluent Downtown and Back Bay neighborhoods. And between 1910 and 1930, as anti-immigrant policies were enacted throughout the country, cities and towns in New England passed laws and zoning ordinances that limited the building of three-deckers, effectively freezing the area’s stock. Over the years, three-deckers have been demolished and replaced with smaller dwellings, such as single-family houses.

Nationally, the trend of separate living for elderly parents accelerated in 1935 with the introduction of the Social Security Act, which began providing monthly payments to the elderly, allowing them to secure their own housing. This created a new market in which for-profit businesses offered elderly housing and, in some cases, basic medical care. The first nursing homes were spare rooms, often in the homes of nurses, rented out to elderly boarders; soon, buildings were purpose-built or renovated for this use. Additionally, some elders used these payments to remain in their own homes, paying for nursing services when necessary.

After WWII, the popularity of multigenerational living fell to its lowest historical levels, giving rise to some of today’s norms. Several factors drove this decrease: the growing popularity of the automobile, cheaper airfare, and the introduction of Medicare, which provided seniors with health care. Medicare made it more financially viable for the elderly to live on their own, whether at home or in an institution, and better transportation made it easier for families to visit one another. Cultural trends also pushed the average age at which young adults left home consistently downward through the middle of the 20th century. These trends continued through 1980, when only 12% of the US population lived in a multigenerational household, the lowest share in history.

But since 1980, multigenerational living has grown consistently more popular, with one in five Americans living in a multigenerational household in 2016. After decades of Americans living further apart, new factors are shifting housing back toward the historical norm.