Women and the Car
So where did the two sexes fit into this new level of automobility? Did they want and need the same service from automobiles, or did they have separate aims and objectives? And in families that owned only one car, who was the prime driver, and why? Work locations were slower to relocate to the suburbs than were domestic residences in the 1950s and early 1960s. Males needed to travel to work. As the breadwinners in most suburban families, they either drove themselves to the office or the factory or were driven to and from the local train station to continue their commute to the city. When husbands drove themselves, women often became socially isolated and frustrated, because the demand for public transit was too low to encourage private enterprise to provide service to downtown or to local retail centers. Housewives gained independence as well as mobility if they retained control of the car during the day, or if they persuaded their husbands that their lives would be difficult, lonely, and miserable without personal transport, or that their new upward mobility entitled them to a vehicle of their own, whether new or second-hand. For both groups of suburban women, the car became almost a second home.
Initially such women got behind the wheel to shop. Retail outlets might be grouped a mile or more away from their homes, but these were gradually shifted into more purpose-built shopping centers. Such off-street retail complexes were frequently dedicated to domestic, if not female-oriented, consumption, and they were planned as primarily female meccas because market analysts estimated that women did between 67 and 92% of family shopping and spent considerable time at the stores. Designed to facilitate car access and to offer a variety of services in a one-stop journey, shopping centers increased from 8 in 1945 to 3,840 by 1960. Frequently anchored by department stores, they accommodated a number of different shops and services, thereby drawing in consumers from both the neighborhood and the region. If supermarkets were not located in these shopping centers, they required a separate visit from the motorized suburban housewife who wanted to take the weekly supply of food home in one journey. Indeed the supermarket came of age in the decades of post-war affluence. In 1950 it accounted for 35% of American food sales; a decade later this percentage had doubled. Not only did the low prices appeal to consumers' sense of value, but the supermarket also fitted the new automobile-led middle-class way of life. Supermarkets responded to their growing popularity by becoming bigger and better, by carrying new branded products and more sizes of pre-packaged foods, and by having larger parking lots.
Suburban women also perceived the growing need to transport children by car, initially to school. As more children’s activities became organized into such groups as Girl Scouts or Little League, in addition to the more traditional piano or dancing classes or school bands, mothers became major transporters of their offspring. While the yellow school bus might pick up children from the edge of the suburban complex, it was not privatized to adjust to the individual interests of specific families. Mothers wanted and needed to run their children from pastime to pastime. Indeed, William Chafe has observed that the suburban family was essentially run by children. Even the advent of television, which by the mid-1950s was a feature of 66% of homes, did not prevent this child-centered mobility. Add to the shopping and the children’s activities the trips to doctors’ and dentists’ offices and the visits to friends, family, or clubs, and it is easy to see that many suburban housewives were becoming dependent on the automobile to carry out domestic responsibilities and to pursue their own interests.
Suburban housewives might be synonymous with the growing numbers of white middle-class women in these decades, but other women were also interested in automobility in the post-war years. Increasing numbers of females were gainfully employed as working wives or working mothers. By 1960 over twice as many women were employed as in 1940. The remarkable shift in female labor force participation took place among married women with husbands present, with a 139% increase among women aged 35-44 and a 254% increase among 45-54 year olds. Married women, frequently in older age groups and from the middle class, were leading the surge of employment in the service sector, where many women were already working. Whether they were suburban or metropolitan, they not only carried out their domestic responsibilities but also had to get to work. They were interested in multi-tasked journeys and faced complicated travel patterns that could not be accommodated by public transport. They, too, became commuters who wanted to drive themselves to work.
Impact on Mother Earth
Some pollution crises in the postwar years were harbingers of things to come. In 1948, a temperature inversion kept a dense smoke cloud of sulfur dioxide and particulate matter close to the ground for six days in the steel mill town of Donora, Pennsylvania. On the fifth day, October 30, seventeen people died, followed by two more deaths twenty-four hours later. Almost 43 percent of the townspeople became ill, with more than 10 percent (1,440) "severely affected." The tragedy at Donora made postwar Americans aware of the health hazards of air pollution. Those dangers were reconfirmed by the "killer smog" that hit London in 1952 (4,000 deaths) and the serious smog attack in New York City in 1953 (200 deaths). Congress enacted the National Air Pollution Control Act in 1955 to generate research on air pollution, but how automobile emissions fit into the story took several years to evaluate and even longer to address.
A relatively new source of air contamination, automobile emissions posed different problems than manufacturing discharges such as coal smoke. Before the Industrial Revolution, levels of toxic chemicals in the air were relatively low, but increased fossil-fuel production and use dramatically degraded air quality. The addition of many thousands of cars on the road in the years after World War II intensified the spread of air pollution, added more and newer sources of pollutants, and most immediately threatened many major cities.
In the 1940s, citizens of the car-dominated Los Angeles basin complained about a white or sometimes yellow-brown haze that made their eyes tear. They referred to this irritation as "smog." The word was taken from a combination of "smoke" and "fog," a term purportedly coined in 1905 by Dr. H.A. Des Voeux of London's Coal Smoke Abatement Society. The more recent version of smog, primarily from automobile emissions, is composed of a complex of carbon monoxide, hydrocarbons, sulfur oxides, nitrogen oxides, waste heat, and aerosols (liquid droplets, solid particles, and other various mixtures of liquids and solids suspended in air). Tropospheric ozone, located a few feet above ground, is another significant component of smog. In the late 1980s, at least 60 million people in North America regularly breathed air that failed to achieve federal air quality standards established ten years earlier; during the summer heat wave in 1988, the number rose to 120 million. Ozone is clearly one of the worst offenders, especially in cities such as Houston, Los Angeles, Baltimore, New York, Philadelphia, Washington, D.C., and Toronto.
Individually or together, the various components pose a health hazard to humans. Auto emissions can cause headaches, contribute to lung cancer, emphysema, and various other respiratory and cardiovascular problems, and have been linked to low birth weight in infants. They also modify weather conditions, damage vegetation, and eat away at rubber, textiles, dyes, and other materials.
The use of tetraethyl lead as a gasoline additive in 1923 introduced yet another toxic substance to automobile emissions that threatened human health. Concerns among public health officials about the poisonous nature of the substance did not deter General Motors and others from promoting leaded gasoline. As environmental historian Ted Steinberg noted, “With the burning of huge quantities of gasoline (especially in the three decades after 1950), lead was deposited on the soil and, unknowingly, tracked into houses across the nation. Infants crawling on the floor then picked it up on their fingers and ingested it, interfering with the development of their nervous systems and contributing to hyperactivity and hearing loss, among other effects, although it would be decades . . . before the full scope of the problem became evident.” Unfortunately, lead does not break down once released into the air, and between the 1920s and 1986—when it finally was phased out as a gasoline additive—seven million tons of lead were spewed out by cars across the country.
While air pollution from cars was a growing problem throughout the immediate postwar period, it was not yet recognized as an issue by automobile manufacturers, oil companies, or the public. Los Angeles, the "smog capital of America," was probably the first city to raise major public concern over auto emissions, and it became the living laboratory for studying the causes and effects of massive doses of smog. California also was the first state to establish new-car emission standards.
As early as 1959, eye irritation was reported in Los Angeles County on 187 days; in 1962, on 212 days. A typical car produced in 1963 (without pollution control devices) discharged 520 pounds of hydrocarbons, 1,700 pounds of carbon monoxide, and 90 pounds of nitrogen oxide for every 10,000 miles traveled. In 1966, 86 million of the approximately 146 million tons of pollutants discharged into the air in the United States were attributable to motor vehicle traffic.
Beginning in 1947 Los Angeles had reduced sulfur dioxide emissions by banning the use of coal and fuel oils for industrial purposes, but the smog problem continued to increase. In the 1950s suspicions were being raised about the contribution of motor vehicles to the air pollution problem of the area. Dr. A.J. Haagen-Smit and other scientists conducting pioneering chemistry research at the California Institute of Technology discovered that nitrogen oxides and hydrocarbons exposed to sunlight produced secondary pollutants (photochemical smog or PCS) that caused eye and throat irritations and reduced visibility in the Los Angeles area. Further studies indicated that the complex and various pollutants existing in automobile emissions came from four sources: engine exhaust, crankcase blowby (through the engine ventilation system), the carburetor, and the fuel tank. These investigations were central to the development of various emissions-control technologies.
Multiplied by thousands of cars, the smog problem in Los Angeles was critical. California became the logical testing ground for several emissions-control devices and some pioneering legislation. Initially, neither the automobile industry nor the petroleum industry was a willing participant in addressing the problem. For its part, the auto industry was not interested in committing time or money to redesigning its cars, and it retrofitted cars with emission-control devices only reluctantly, largely because new legislation forced it to. (Interestingly, little serious consideration was given to encouraging or requiring motorists to alter their driving habits.)
As early as 1953, Los Angeles County Supervisor Kenneth Hahn asked Detroit automobile makers whether research was being conducted to eliminate emissions. The response was vague. Under the threat of mandatory federal regulations, the auto industry began to install crankcase blowby devices (which returned unburned gases to the combustion chambers) on its cars. This was a significant advance because crankcase blowby produced 25 percent of the engine's hydrocarbon emissions. The equipment became mandatory on all cars sold in California beginning with the 1963 models.
This was only a start, since no effort was made to control exhaust emissions that were responsible for 55 percent of the hydrocarbons, most of the waste heat, and all of the carbon monoxide, nitrogen oxides, and lead emissions. Once again the industry balked, but in 1966 California required exhaust-control devices on all new cars. However, the 12 percent drop in hydrocarbon emissions and reduction in carbon monoxide experienced in Los Angeles between 1965 and 1968 was accompanied by a 28 percent rise in nitrogen oxides. By 1968, nitrogen dioxide, which is highly poisonous, exceeded the "adverse" level on 132 days. The serious increases in nitrogen oxides were due to the inability of available antiemissions technology to act on them, as well as to the increase in automobiles and rising gasoline consumption. A new technical fix was sought from the automobile industry and, in response, catalytic exhaust devices were developed to convert nitrogen oxides into harmless by-products. Catalytic converters were required on all 1975 cars sold in California. Leaded gasoline, however, played havoc with the catalysts. One solution was to use lead-free or unleaded gasoline. (Another was the unauthorized removal of the devices by motorists.) While non-leaded gas became available, the complete phase-out of leaded gasoline, as stated earlier, did not commence until 1986.
The “Footprint” of the Automobile on the American City
Modern American cities bear a powerful physical imprint of automobiles and other motorized vehicles. It is estimated that as much as one half of a modern American city’s land area is dedicated to streets and roads, parking lots, service stations, driveways, signals and traffic signs, automobile-oriented businesses, car dealerships, and more. Equally significant, space allocated for other forms of transportation ultimately shrank or disappeared. For example, sidewalks—normally considered essential to separate pedestrians from various transportation modes—were less often constructed along many urban roads and streets in the automobile era. Walking seemed increasingly incidental in moving people from place to place. Bicycle lanes, quite common in several European cities, were late-comers or non-existent in American cities as competitive forms of transportation were squeezed out by an increasing dependence on cars.
Nothing better illustrated the growing dominance of motorized vehicles than its imprint on the land-use patterns of cities. A parking study conducted in California stated that about 59 percent of the ground area in Los Angeles’ central business district (CBD) in 1960 was devoted to streets and parking, with about 35 percent for roads, streets, alleys, and sidewalks, and 24 percent for parking lots and garages not included in buildings with other purposes. During roughly the same period, acreage devoted to streets and parking in other urban cores was similar in scale or slightly less. In Detroit (1953), streets and parking made up 49.5 percent of the central city; in Chicago (1956), 40.7 percent; in Minneapolis (1958), 48.3 percent; Nashville (1959), 39 percent; and in Dallas (1961), 41.4 percent.
Ironically, motor traffic in the central cities tended to require less street space than other forms of transportation had needed before the rise of the automobile. Urban freeways, for example, require less than 3 percent of the land in the areas they serve. On the other hand, as automobiles and trucks ventured into areas not served by public transit, the need for more streets necessitated more construction. Moreover, street and parking data do not include businesses or services devoted wholly or in part to the automobile, and thus do not convey the full extent to which automobiles have remade the urban landscape beyond anything wrought by their eighteenth- and nineteenth-century counterparts.
In the long run, core cities were clearly affected by the automobile, the major physical changes it brought, and the flight of the middle class to the suburbs. Accommodating the automobile most often required adapting cores to the needs of the car, be it changing the road system or adding gas stations, repair shops, auto parts stores, car washes, and automobile dealerships. However, adaptation did not mean remaking. In most cases, an automobile infrastructure was superimposed on cities that had undergone a variety of changes through time. Nevertheless, building new roads and highways within cities or adding automobile-related services did its share of changing—and in some cases destroying—human and animal habitats. Neighborhoods were cleaved, disrupted, or even eliminated. Plants and wildlife were threatened or dislocated.
From “Walking Cities” to “Automobile Cities”
A look at the chronology of urban growth in America—with transportation as a key variable—shows how automobiles have transformed cities. Historians have mapped out a three- or four-stage transportation chronology for the American city: walking city (pre-1880), streetcar city (1880-1920), and automobile city (post-1920). One historian has divided the latter period into a “recreational vehicle” period (1920-1945) and a “freeway” period (post-1945).
The first stage—“the walking city”—was marked by highly compact cities and towns; an intermingling of residences and workplaces; a short journey to work for those employed in a variety of tasks; mixed patterns of land use; and the location of elite residences at the city centers. In this era, many cities and towns had large central squares that served as meeting places, open markets for buying and selling goods, and parade grounds for special occasions. For the most part, streets were narrow, meandering, and unpaved. Until the nineteenth century, there were few means other than walking or riding horseback to traverse American cities. In 1674 there was only one carriage in New York City, and as late as 1761 only 18 in Philadelphia. In the 1830s, steam locomotives provided opportunity for commuting in major cities, but they were too large, too noisy, and too expensive to incorporate effectively into the urban landscape as a form of inner-city transportation. Late in this period, omnibuses, cable cars, and horsecars (horse-drawn streetcars on fixed track) appeared as the forerunners of the widely adopted electric streetcars that revolutionized mass transit in the United States.
During “the streetcar era,” many cities were feeling the impact of industrialization in terms of factory migration to cities and the influx of thousands of European immigrants and rural American migrants. The industrial cities were focused at the core, especially for various business ventures, commerce and trade, retailing, hotel accommodations, and cultural activities. The separation between work and residence for the middle and upper classes was much more pronounced than in the walking city, as these groups increasingly fled the central cities for the suburbs; the working classes, by contrast, remained near the core and close to industrial workplaces since they had little or no access to public transportation and had to live by the clock or lose their jobs. By the 1880s, low-fare electric streetcars replaced the slower and less reliable horsecars, offering public transportation to a growing ridership. Streetcars, however, still relied on light-rail tracks radiating out from the central business districts into the surrounding areas and newly forming suburbs. Urban development thus congregated close to the streetcar lines.
The intimacy and homogeneity of the walking city declined as the extension of efficient transit allowed the more affluent urbanites to move from the central cities to the more spacious suburbs, escaping inner-city congestion, pollution, and a rising tide of various newcomers. This tendency produced more well-defined residential subdivisions divided primarily along income lines. The smallish walking cities (probably less than a mile in diameter) made way for more expansive urban areas featuring clearer areas of residential segregation and more specialized land uses. Downtowns still had a hold on the big department stores, theaters, and office buildings, but the growing urban periphery saw dynamic population growth and its own demand for public and private services. By the 1920s, the United States could boast of metropolises with suburbs extending more than 20 miles from the core.
The most recent stage—“the automobile (and truck) city”—arose after World War I and began to erode the patterns of the streetcar city through deconcentration of business functions; the weakening of the core as a magnet for social and cultural life; and the dispersal of population into the suburbs. In some respects, the advent of the automobile continued the process of metropolitan growth promoted by the electric streetcars—hastening the decentralization of the population and pushing the suburbs further into the hinterland. Motorized vehicles alone did not produce the new urban patterns. Long distance communications (such as the telegraph and telephone), the extension of city services, and real estate development also contributed mightily to the changes taking place in the late-nineteenth and early-twentieth centuries. Yet the emergence of the automobile gave individuals (and road builders), not fixed tracks, a major role in what became a less clearly defined and less rigidly geometric pattern of urban growth.
Many critics viewed the changes in the automobile era as detrimental to well-planned, rationally organized communities, but others saw motorized vehicles as particularly well-suited to cities. Like the horse and buggy, the car could be privately owned and operated and not restricted to rigid transportation lines. In some places, such as in Los Angeles, Denver, Phoenix, Houston, Jacksonville, and Atlanta, the automobile played a central role in creating low density, expansive, and some would say muddled and fragmented, urban development. In others, modifications in the “urban fabric” occurred at a snail’s pace or not at all.
Adopting the automobile, however, did not guarantee that an appropriate infrastructure would follow. Cars and trucks often had to adjust to urban environments not planned for them. In some cities, such as Paris, automobiles often parked on sidewalks along particularly narrow streets. Conversely, in cities that successfully adapted their infrastructure to motorized vehicles, other urban resources were lost and lifestyles were altered. In cities like Los Angeles or Houston, pedestrians have become an endangered species.
Faster Food
As more Americans began driving cars, entirely new categories of businesses came into being to let them enjoy products and services without leaving their cars. These included the drive-in restaurant and, later, the drive-through window. Still today, the Sonic Drive-In restaurant chain provides primarily drive-in service by carhop in 3,561 restaurants within 43 U.S. states, serving approximately 3 million customers per day.[42][43] Known for its use of carhops on roller skates, the company annually hosts a competition to determine the top skating carhop in its system.[44]
A number of other successful "drive-up" businesses have their roots in the late 1940s and 1950s, including McDonald's. In 1948, the McDonald brothers fired their carhops, installed larger grills, reduced their menu, and radically changed the industry by introducing assembly-line methods of food production similar to the auto industry's, dubbing it the "Speedee Service System".[45] The restaurant had no dine-in facilities; customers parked, walked up to the window, and took their orders "to go". Automation and the lack of dining facilities allowed McDonald's to sell burgers for 15 cents each, instead of the typical 35 cents, and people bought them by the bagful. The brothers redesigned their sign specifically to make it easier to see from the road, creating the now familiar double arch.[46] Businessman Ray Kroc joined McDonald's as a franchise agent in 1955. He subsequently purchased the chain from the McDonald brothers and oversaw its worldwide growth.[47]
Other chains were created to serve the increasingly mobile patron. Carl Karcher opened his first Carl's Jr. in 1956, and rapidly expanded, locating his restaurants near California's new freeway off-ramps.[48] These restaurant models initially relied on the common ownership of automobiles, and the willingness of patrons to dine in their automobiles, and today drive-through service accounts for 65 percent of their profits.[49]
Drive-in movies
The drive-in theater is a form of cinema structure consisting of a large outdoor movie screen, a projection booth, a concession stand and a large parking area for automobiles, where patrons view the movie from the comfort of their cars and listen via a speaker placed at each parking spot.
Although drive-in movies first appeared in 1933,[50] it was not until after World War II that they became popular, enjoying their greatest success in the 1950s and reaching a peak of more than 4,000 theaters in the United States alone.[51][52] Drive-in theaters have been romanticized in popular culture by the movie American Graffiti and the television show Happy Days. They developed a reputation for showing B movies, typically monster or horror films, and as "passion pits", places for teenagers to kiss.[52] While drive-in theaters are rarer today, with only 366 remaining,[51] and are no longer unique to America, they are still associated with the American car culture of the 1950s.[53] Drive-in movies have seen something of a resurgence in popularity in the 21st century, due in part to baby boomer nostalgia.[52]
Robert Schuller started the nation's first drive-in church in 1955 in Garden Grove, California. After his regular 9:30 am service in the chapel four miles away, he would travel to the drive-in for a second Sunday service.[54] Worshipers listened to his sermon from the comfort of their cars, using the movie theater's speaker boxes.[55]
Malls
The first modern shopping malls were built in the 1950s,[56] such as Bergen Mall, which was the first to use the term "mall" to describe the business model. Such early malls moved retailing away from the dense, commercial downtowns into the largely residential suburbs. Northgate in Seattle is credited as the first modern mall design, with two rows of businesses facing each other and a walkway separating them. It opened in 1950. Shopper's World in Framingham, Massachusetts, was the first two-story mall, and opened a year later.[56] The design was modified again in 1954, when Northland Center in Detroit, Michigan, used a centralized design with an anchor store in the middle of the mall, ringed by other stores. This was the first mall to have an ample parking lot completely surrounding the shopping center, making it easy and pleasant to get there, and to provide central heat and air-conditioning.[56]
In 1956, Southdale Center opened in Edina, Minnesota, just outside of Minneapolis. It was the first to combine all these modern elements, being enclosed with a two-story design, central heat and air-conditioning plus a comfortable common area. It also featured two large department stores as anchors. Most industry professionals consider Southdale Center to be the first modern regional mall.[56]
This formula (enclosed space with stores attached, away from downtown and accessible only by automobile) became a popular way to build retail across the world. Victor Gruen, one of the pioneers in mall design,[57] came to abhor this effect of his new design; he decried the creation of enormous "land wasting seas of parking" and the spread of suburban sprawl.[58]
The Road Trip
The road trip is an American phenomenon. It was most popular in the 1950s, when family vacations were at their height as well. The allure of the road trip today translates into movies and songs directed at teens and adults in their twenties. It is an adventure of one's own creation. This distinctly American trend came into being because of three major factors in American history: the automobile, the vacation, and the interstate highway system. These three parts of American history created change for the entire country in many different ways. One of those ways was the road trip, and all the excitement it still entails.
Transportation has gone through many evolutions: from walking to flying, from horses to horsepower. One of the most innovative and world-changing of these was the creation of the automobile. It distorted space and time. It changed how we express ourselves and how we build cities, towns, and homes; it gave us almost unlimited mobility. A car is often viewed as more than an object; in some instances we have a relationship with it, consider it part of the family, take care of it, and worship it. A car is an expression of freedom to a sixteen-year-old with a brand-new license. It is a world of possibility on four wheels. Before the car, we were limited to walking, bicycling, riding horses, traveling by horse-drawn carriage, or taking trains.
Cars were practical in many situations: traveling to and from work or church, going to the store or to school events, taking family vacations. Before one could take a flight from Seattle to Orange County, one could drive. Alongside the trains and planes taking vacationers to their destinations, there were long car trips in the family vehicle. Whether the vacation was to Disneyland, to Washington, D.C., or just to extended family in other cities or states, the whole family could pile into a car and take a drive.
Leisure time was created for the middle and lower classes by the labor movement. Before mandated shorter work days, no one had time for vacations except the wealthy.15 The wealthy had summer homes and country cottages. They took vacations to the beach and went camping. They could choose to go to resorts or to quaint inns in rural villages. During the first half of the twentieth century, the progressive reformers who championed better conditions for the working class also pushed for vacations for working-class people.16 This new vacationing population would bring change to all aspects of life for the nation as a whole. It was a nation-wide phenomenon that affected the food and lodging services, civic identity, and economic livelihoods of people and towns across the nation.
The new ability of the middle classes and below to take vacations meant quite a few things for society. It meant that they would make use of many kinds of transportation, traveling more by car than by train after the 1920s. It meant that they would need places to stay more affordable than those the rich stayed in. They began by "autocamping". Autocamping, also called gypsying, meant that families and groups would simply park their cars along the side of the road and set up tents. They did not have to pay, and this often became a problem because campers took little responsibility for the sites. The autocampers would leave litter strewn about where they had slept the night before, and they drank dirty water, which caused illness, since they had little experience living in nature. To combat the problem, towns started charging fees for use of the land, which opened the field to private entrepreneurs who built cabins. These cabins quickly evolved into motels.17
Hotels had served the wealthier traveler for some time; however, the case against hotels for the middle-class vacation was strong. Hotels had dress codes. The food was more expensive and more formal than cooking one's own. Hotel staff were often "disagreeable." And the cost of a hotel for a group or a family was often beyond the financial reach of the middle-class traveler.18 Motels were a new phenomenon.
Motels ("motor-hotels") began as roadside cabins, which had started as companions to other roadside vendors. They added to the draw of family-owned roadside grocery stores, gas stations, and food stands.19 One chain of motels that is still popular today is the Holiday Inn, which started in 1952. Kemmons Wilson's inspiration for opening the first Holiday Inn was the cost of taking children on vacation. He felt that if it cost an extra ten dollars a night to bring the children along, then children would be excluded from vacations.20 Wilson was a father who had taken family vacations as a child and wanted to continue the tradition with his own family.21
After World War II, family vacations became a national tradition.22 This surge in popularity grew out of a culture that prized family ideals. The most common vacation was one of “mobile citizenship,” meant to “cultivate a sense of civic identity and attachment to American history.”23 For the most part, these vacations were road trips. Riding the wave of postwar consumerism, Americans bought roomy family cars, which meant they could go anywhere, and do it in comfort. They traveled to the West for its Wild West-style adventure. They traveled to theme parks for safe thrills. They traveled to national parks for natural wonders and affordable lodgings. They traveled to national monuments and places rich in American history in order to claim citizenship in the national geography.
The automobile became the family car (or minivan), allowing families to vacation wherever they wanted, on their own schedule. The higher wages and leisure time won by the progressive social movements of the early twentieth century put vacations within reach of the majority, and the roads built to cross the nation in every direction gave everyone somewhere to go.
All of these intertwined events and changes in American society wrought one more phenomenon: the road trip. There is a certain mystique about the road trip. It is a combination of freedom, adventure, ownership, and discovery. Travelers bond over miles of paved highway, getting to know, or better understand, their fellow countrymen.
Transportation has undergone many changes over the centuries. For a long period, horses were the main means of transportation: they pulled carriages and were ridden wherever one needed to go. Long-range transport was often by water, in some form of ship; ships gave way to riverboats and then to boats that traveled canals, while trains replaced horses for long-range overland movement. With the advent of the car came the need for a new kind of track. Cities had streets, even if only muddy paths, for carriages and foot traffic, but beyond city limits there was little more than dirt trails. Bicyclists had lobbied for improved roads as early as 1880, but with so little support behind them their requests carried little weight with the federal government.33 In fact, in 1893 the Office of Road Inquiry determined that of the United States' 2,151,570 miles of highway, only 141 miles were considered acceptable for automobile traffic.34 For the automobile to get anywhere, the roads would need improvement.
Thomas Harris MacDonald was the mind behind the interstate system; he began planning for it as early as 1904. In 1919, under Woodrow Wilson, he was appointed chief of the federal Bureau of Public Roads.35 MacDonald's two guiding principles were cooperation and technical expertise. Cooperation was often spearheaded by the lobbying group the American Association of State Highway Officials,36 which MacDonald used to foster cooperation between the state and federal governments. In 1916 Congress had allotted $75 million over five years through the Federal-Aid Road Act; by 1921, the Federal-Aid Highway Act was allotting $75 million per year.37
Two kinds of highways were laid: in the east, where traffic was much heavier, highways were concrete or asphalt; in the west they were a mixture of gravel and sand-clay.38 By 1936, MacDonald had supported the building of nearly 225,000 miles of automobile-safe roads, adding 12,000 miles each year.39 The Pennsylvania Turnpike, opened in 1940, was the first of what can be termed “modern highways.” It was sixty miles long, built of reinforced concrete, four lanes wide, with better sight distance and wide curves so that the need to slow down was minimized. It had bridges and underpasses, and it cut five hours from the drive between Pittsburgh and Harrisburg. It was a “highway built to truly master nature.”40 The turnpike was funded privately, and the profits it made from its tolls opened a new door for the federal government: to fund its highways from 1956 onward, the government levied taxes on gas, diesel, lubricating oils, tires, and heavy trucks.41 Construction proceeded on a “pay-as-you-go” basis, leaving the federal government all but free of the responsibility of funding the highway system.
Railroads were impacted negatively; indeed, they were the only transportation system that suffered. MacDonald dismissed the railroad companies' concerns, at first by ignoring the drop in passenger revenues, and later by insisting that freight traffic was sure to increase enough to cover the gap.42 Because railroads were already well established, Depression-era funding flowed increasingly to road building, which promised new jobs for the many unemployed; the railroads and their affiliated companies, by contrast, were shortening track and laying people off left and right to make up for their losses. There was progress to be had in roads, not rails. The railroads never truly recovered, especially once large trucks could carry freight over more direct routes via the highways.
Road building affected everything about American life. The highways changed space and time: every destination seemed closer than it had been before, and more accessible no matter where the trip began. Roadside tourism became an economic blessing to any town near a highway. From giant balls of twine to restaurant chains, curiosity and good advertising brought in paying customers, even when they weren't sure what they would be paying for: attractions like the House on the Rock in Spring Green, Wisconsin; Rock City on Lookout Mountain in Georgia; the Ave Maria Grotto in Alabama; and Frontierland in Florida. Attractions followed the roads, giving travelers a place to stop and stretch their legs, and something to see while they did.