Early Cold War Suburbanization, 1945-1970
by James Passarelli
Post-war suburbanization resulted from a complex web of political and economic circumstances that scholars have yet to adequately explore. One of the most important of these factors is also one of the most overlooked: the anxiety-filled onset of the Cold War. Though often cited in passing as an influence on certain aspects of suburbanization, the Cold War rarely receives the sustained, close treatment it deserves. It is understandable that historians and urbanists would shy away from a topic as complex as the war, about which so much has been written outside a suburban context. This essay outlines some of the most important effects of Cold War sentiment on American suburbanization in the first decades after World War II.
The Atomic Threat: Residential and Industrial Dispersion
The most obvious and well-documented way in which the Cold War shaped suburbanization in the 1950s and 1960s was the ever-present atomic threat. Beginning in the aftermath of the 1945 bombings of Hiroshima and Nagasaki, and intensifying with the onset of the Cold War, the American public began to speculate on the possible effects of an atomic attack. The most devastating impact would fall on America’s dense urban centers, making them the most likely targets. Public officials began offering tax exemptions for suburban property to encourage decentralization and a move away from cities, a process known as “dispersion.”[1] As one suburban historian points out, it is hard to isolate the effects of the dispersion campaign from the economic and political factors that predate World War II. Public and corporate fear of atomic attack, however, could be witnessed everywhere.
“There is no known military defense against the atomic bomb itself, except space,” stated a report from the National Security Resources Board. Likewise, architect Eliel Saarinen, designing a master plan for the city of Detroit in 1942, explained, “In the event of future aerial bombardment, such [spread out] planning would provide a ‘dispersion’ factor.”[2] Legislation further facilitated the decentralization, most notably the Federal-Aid Highway Act of 1956, which established a system of “defense highways” intended to permit emergency dispersal. Perhaps the strongest statement of the pervasive atomic anxiety came from the Journal of the American Institute of Architects: “[A]tomic bombs and concentrated cities cannot exist in the same world.”[3] Hyperbolic, perhaps, but the sentiment was not uncommon.
Residential anxiety was high, but corporations had even more to lose in the event of a large-scale military attack. Though most companies were reluctant to express atomic anxiety, a 1952 Fortune report found that, of twenty-two companies privately interviewed, every one expressed the desire to move away from the city.[4] Among the major companies that followed through with the move were General Motors, General Foods, and General Electric, three of the most influential corporations of the twentieth century.
Safety, Privacy, and Access
Another often-overlooked residual effect of the Cold War was the increased concern with privacy and access. Human and architectural costs were not the only (and, one might argue, not even the primary) concerns for businesses in post-war America. Executives also began planning to keep company documents safe, stored where they could not easily be damaged. Concern about access likewise shaped the layout of company buildings and campuses. Corporations like AT&T capitalized on their large, sprawling campuses, using long driveways and abundant guard booths to keep a close watch on everyone who entered the property, all “under the guise of giving friendly directions to large sites.”[5] Paranoia thus manifested itself not just in strictly spatial terms but also in the installation of subtle defense techniques, taking a cue from the United States’ obsession with defense against outsiders during the Cold War.
Cold War Knowledge
A less obvious, and more indirect, result of the Cold War was the way in which it reshaped Americans’ notions of knowledge and educational institutions in the early post-war decades. The most prominent of these shifts was the elevation of the sciences. The 1950s saw both an increase in prestige for scientists and an increased public interest in popular science publications like National Geographic and Science.[6] As competition with the Soviet Union became more cultural than military (although the two were always closely intertwined), scientists’ role in the stand-off became central in two ways: symbolically, as the future of American cultural and educational dominance; and practically, in directly competitive endeavors like the space race. New master’s and PhD programs in the sciences sprang up all over the country, and universities also faced new competition as corporations attempted to lure scientists away from university settings.[7]
Concurrent with the elevation of scientists in the public eye was a shift in the class implications of the war. Throughout the two World Wars, and into the early 1950s, the war effort was mainly associated with blue-collar workers, who were instrumental in the large-scale production of conventional military machinery and weaponry. As this machinery became exponentially more technologically advanced, however, the need for a highly skilled white-collar labor force superseded earlier production needs.[8] The growth in white-collar education extended beyond the hard sciences, even spilling into architecture. World War II, an architectural writer observed in 1946, had not only renewed appreciation for easy-to-use materials like concrete, plastic, and laminated wood; it had also “spurred investigation and education.”[9] Alongside physicists and engineers, architects were considered important progenitors of the new structural symbols of the triumph of American capitalism over competing socialist and communist regimes.[10]
Beyond its educational impact, the Cold War also spurred the development of what one scholar has termed the “new and influential kinds of urban ecosystems” of the 1960s. These clusters of technology firms complicate visions of decentralization: rather than scattering themselves across the suburban landscape, they concentrated in fringe regions, reflecting many corporate executives’ continued belief in close physical proximity as a boon to knowledge exchange and innovation.[11] As startup companies continue to influence lives in and outside the United States, a more thorough investigation of these ecosystems is crucial. They are one of the many important remnants of Cold War ideology’s shaping of corporate structures, both geographical and architectural.
Notes:
[1] Margaret O’Mara, “Uncovering the City Suburb: Cold War Politics, Scientific Elites, and High-Tech Spaces,” in The New Suburban History, ed. Kevin M. Kruse and Thomas J. Sugrue (Chicago: University of Chicago Press, 2006), 57-79.
[2] Both quoted in Louise A. Mozingo, Pastoral Capitalism: A History of Suburban Corporate Landscapes (Cambridge: MIT Press, 2011), 24.
[3] Quoted in Robert A. Beauregard, Voices of Decline: The Postwar Fate of U.S. Cities (Cambridge, MA: Blackwell, 1993), 123.
[4] Ibid., 26.
[5] Ibid., 27.
[6] O’Mara, “Uncovering the City Suburb,” 67.
[7] Ibid., 66.
[8] Ibid., 67.
[9] Creighton, “Pearl Harbor to Nagasaki,” Progressive Architecture, 78.
[10] Rebecca S. Lowen, Creating the Cold War University (Berkeley: University of California Press, 1997), 53.
[11] O’Mara, “Uncovering the City Suburb,” 66.