Author: Barack Obama

Chapter 5 - Opportunity
ONE THING ABOUT being a U.S. senator—you fly a lot. There are the flights back and forth from Washington at least once a week. There are the trips to other states to deliver a speech, raise money, or campaign for your colleagues. If you represent a big state like Illinois, there are flights upstate or downstate, to attend town meetings or ribbon cuttings and to make sure that the folks don’t think you’ve forgotten them.
Most of the time I fly commercial and sit in coach, hoping for an aisle or window seat and crossing my fingers that the guy in front of me doesn’t want to recline.
But there are times when—because I’m making multiple stops on a West Coast swing, say, or need to get to another city after the last commercial flight has left—I fly on a private jet. I hadn’t been aware of this option at first, assuming the cost would be prohibitive. But during the campaign, my staff explained that under Senate rules, a senator or candidate could travel on someone else’s jet and just pay the equivalent of a first-class airfare. After looking at my campaign schedule and thinking about all the time I would save, I decided to give private jets a try.
It turns out that the flying experience is a good deal different on a private jet. Private jets depart from privately owned and managed terminals, with lounges that feature big soft couches and big-screen TVs and old aviation photographs on the walls. The restrooms are generally empty and spotless, and have those mechanical shoe-shine machines and mouthwash and mints in a bowl. There’s no sense of hurriedness at these terminals; the plane is waiting for you if you’re late, ready for you if you’re early. A lot of times you can bypass the lounge altogether and drive your car straight onto the tarmac. Otherwise the pilots will greet you in the terminal, take your bags, and walk you out to the plane.
And the planes, well, they’re nice. The first time I took such a flight, I was on a Citation X, a sleek, compact, shiny machine with wood paneling and leather seats that you could pull together to make a bed anytime you decided you wanted a nap. A shrimp salad and cheese plate occupied the seat behind me; up front, the minibar was fully stocked. The pilots hung up my coat, offered me my choice of newspapers, and asked me if I was comfortable. I was.
Then the plane took off, its Rolls-Royce engines gripping the air the way a well-made sports car grips the road. Shooting through the clouds, I turned on the small TV monitor in front of my seat. A map of the United States appeared, with the image of our plane tracking west, along with our speed, our altitude, our time to destination, and the temperature outside. At forty thousand feet, the plane leveled off, and I looked down at the curving horizon and the scattered clouds, the geography of the earth laid out before me—first the flat, checkerboard fields of western Illinois, then the python curves of the Mississippi, then more farmland and ranch land and eventually the jagged Rockies, still snow-peaked, until the sun went down and the orange sky narrowed to a thin red line that was finally consumed by night and stars and moon.
I could see how people might get used to this.
The purpose of that particular trip was fund-raising, mostly—in preparation for my general election campaign, several friends and supporters had organized events for me in L.A., San Diego, and San Francisco. But the most memorable part of the trip was a visit that I paid to the town of Mountain View, California, a few miles south of Stanford University and Palo Alto, in the heart of Silicon Valley, where the search engine company Google maintains its corporate headquarters.
Google had already achieved iconic status by mid-2004, a symbol not just of the growing power of the Internet but of the global economy’s rapid transformation. On the drive down from San Francisco, I reviewed the company’s history: how two Stanford Ph.D. candidates in computer science, Larry Page and Sergey Brin, had collaborated in a dorm room to develop a better way to search the web; how in 1998, with a million dollars raised from various contacts, they had formed Google, with three employees operating out of a garage; how Google figured out an advertising model—based on text ads that were nonintrusive and relevant to the user’s search—that made the company profitable even as the dot-com boom went bust; and how, six years after the company’s founding, Google was about to go public at stock prices that would make Mr. Page and Mr. Brin two of the richest people on earth.
Mountain View looked like a typical suburban California community—quiet streets, sparkling new office parks, unassuming homes that, because of the unique purchasing power of Silicon Valley residents, probably ran a cool million or more. We pulled in front of a set of modern, modular buildings and were met by Google’s general counsel, David Drummond, an African American around my age who’d made the arrangements for my visit.
“When Larry and Sergey came to me looking to incorporate, I figured they were just a couple of really smart guys with another start-up idea,” David said. “I can’t say I expected all this.”
He took me on a tour of the main building, which felt more like a college student center than an office—a café on the ground floor, where the former chef of the Grateful Dead supervised the preparation of gourmet meals for the entire staff; video games and a Ping-Pong table and a fully equipped gym. (“People spend a lot of time here, so we want to keep them happy.”) On the second floor, we passed clusters of men and women in jeans and T-shirts, all of them in their twenties, working intently in front of their computer screens, or sprawled on couches and big rubber exercise balls, engaged in animated conversation.
Eventually we found Larry Page, talking to an engineer about a software problem. He was dressed like his employees and, except for a few traces of early gray in his hair, didn’t look any older. We spoke about Google’s mission—to organize all of the world’s information into a universally accessible, unfiltered, and usable form—and the Google site index, which already included more than six billion web pages. Recently the company had launched a new web-based email system with a built-in search function; they were working on technology that would allow you to initiate a voice search over the telephone, and had already started the Book Project, the goal of which was to scan every book ever published into a web-accessible format, creating a virtual library that would store the entirety of human knowledge.
Toward the end of the tour, Larry led me to a room where a three-dimensional image of the earth rotated on a large flat-panel monitor. Larry asked the young Indian American engineer who was working nearby to explain what we were looking at.
“These lights represent all the searches that are going on right now,” the engineer said. “Each color is a different language. If you move the toggle this way”—he caused the screen to alter—“you can see the traffic patterns of the entire Internet system.”
The image was mesmerizing, more organic than mechanical, as if I were glimpsing the early stages of some accelerating evolutionary process, in which all the boundaries between men—nationality, race, religion, wealth—were rendered invisible and irrelevant, so that the physicist in Cambridge, the bond trader in Tokyo, the student in a remote Indian village, and the manager of a Mexico City department store were drawn into a single, constant, thrumming conversation, time and space giving way to a world spun entirely of light. Then I noticed the broad swaths of darkness as the globe spun on its axis—most of Africa, chunks of South Asia, even some portions of the United States, where the thick cords of light dissolved into a few discrete strands.
My reverie was broken by the appearance of Sergey, a compact man perhaps a few years younger than Larry. He suggested that I go with them to their TGIF assembly, a tradition that they had maintained since the beginning of the company, when all of Google’s employees got together over beer and food and discussed whatever they had on their minds. As we entered a large hall, throngs of young people were already seated, some drinking and laughing, others still typing into PDAs or laptops, a buzz of excitement in the air. A group of fifty or so seemed more attentive than the rest, and David explained that these were the new hires, fresh from graduate school; today was their induction into the Google team. One by one, the new employees were introduced, their faces flashing on a big screen alongside information about their degrees, hobbies, and interests. At least half of the group looked Asian; a large percentage of the whites had Eastern European names. As far as I could tell, not one was black or Latino. Later, walking back to my car, I mentioned this to David and he nodded.
“We know it’s a problem,” he said, and mentioned efforts Google was making to provide scholarships to expand the pool of minority and female math and science students. In the meantime, Google needed to stay competitive, which meant hiring the top graduates of the top math, engineering, and computer science programs in the country—MIT, Caltech, Stanford, Berkeley. You could count on two hands, David told me, the number of black and Latino kids in those programs.
In fact, according to David, just finding American-born engineers, whatever their race, was getting harder—which was why every company in Silicon Valley had come to rely heavily on foreign students. Lately, high-tech employers had a new set of worries: Since 9/11 a lot of foreign students were having second thoughts about studying in the States due to the difficulties in obtaining visas. Top-notch engineers or software designers didn’t need to come to Silicon Valley anymore to find work or get financing for a start-up. High-tech firms were setting up operations in India and China at a rapid pace, and venture funds were now global; they would just as readily invest in Mumbai or Shanghai as in California. And over the long term, David explained, that could spell trouble for the U.S. economy.
“We’ll be able to keep attracting talent,” he said, “because we’re so well branded. But for the start-ups, some of the less established companies, the next Google, who knows? I just hope somebody in Washington understands how competitive things have become. Our dominance isn’t inevitable.”
AROUND THE SAME time that I visited Google, I took another trip that made me think about what was happening with the economy. This one was by car, not jet, along miles of empty highway, to a town called Galesburg, forty-five minutes or so from the Iowa border in western Illinois.
Founded in 1836, Galesburg had begun as a college town when a group of Presbyterian and Congregational ministers in New York decided to bring their blend of social reform and practical education to the Western frontier. The resulting school, Knox College, became a hotbed of abolitionist activity before the Civil War—a branch of the Underground Railroad had run through Galesburg, and Hiram Revels, the nation’s first black U.S. senator, attended the college’s prep school before moving back to Mississippi. In 1854, the Chicago, Burlington & Quincy railroad line was completed through Galesburg, causing a boom in the region’s commerce. And four years later, some ten thousand people gathered to hear the fifth of the Lincoln-Douglas debates, during which Lincoln first framed his opposition to slavery as a moral issue.
It wasn’t this rich history, though, that had taken me to Galesburg. Instead, I’d gone to meet with a group of union leaders from the Maytag plant, for the company had announced plans to lay off 1,600 employees and shift operations to Mexico. Like towns all across central and western Illinois, Galesburg had been pounded by the shift of manufacturing overseas. In the previous few years, the town had lost industrial parts makers and a rubber-hose manufacturer; it was now in the process of seeing Butler Manufacturing, a steelmaker recently bought by Australians, shutter its doors. Already, Galesburg’s unemployment rate hovered near 8 percent. With the Maytag plant’s closing, the town stood to lose another 5 to 10 percent of its entire employment base.
Inside the machinists’ union hall, seven or eight men and two or three women had gathered on metal folding chairs, talking in muted voices, a few smoking cigarettes, most of them in their late forties or early fifties, all of them dressed in jeans or khakis, T-shirts or plaid work shirts. The union president, Dave Bevard, was a big, barrel-chested man in his mid-fifties, with a dark beard, tinted glasses, and a fedora that made him look like a member of the band ZZ Top. He explained that the union had tried every possible tactic to get Maytag to change its mind—talking to the press, contacting shareholders, soliciting support from local and state officials. The Maytag management had been unmoved.
“It ain’t like these guys aren’t making a profit,” Dave told me. “And if you ask ’em, they’ll tell you we’re one of the most productive plants in the company. Quality workmanship. Low error rates. We’ve taken cuts in pay, cuts in benefits, layoffs. The state and the city have given Maytag at least $10 million in tax breaks over the last eight years, based on their promise to stay. But it’s never enough. Some CEO who’s already making millions of dollars decides he needs to boost the company stock price so he can cash in his options, and the easiest way to do that is to send the work to Mexico and pay the workers there a sixth of what we make.”
I asked them what steps state or federal agencies had taken to retrain workers, and almost in unison the room laughed derisively. “Retraining is a joke,” the union vice president, Doug Dennison, said. “What are you going to retrain for when there aren’t any jobs out there?” He talked about how an employment counselor had suggested that he try becoming a nursing aide, with wages not much higher than what Wal-Mart paid their floor clerks. One of the younger men in the group told me a particularly cruel story: He had made up his mind to retrain as a computer technician, but a week into his courses, Maytag called him back. The Maytag work was temporary, but according to the rules, if this man refused to accept Maytag’s offer, he’d no longer be eligible for retraining money. If, on the other hand, he did go back to Maytag and dropped out of the courses he was already taking, then the federal agency would consider him to have used up his one-time training opportunity and wouldn’t pay for any retraining in the future.
I told the group that I’d tell their story during the campaign and offered a few proposals that my staff had developed—amending the tax code to eliminate tax breaks for companies who shifted operations offshore; revamping and better funding federal retraining programs. As I was getting ready to go, a big, sturdy man in a baseball cap spoke up. He said his name was Tim Wheeler, and he’d been the head of the union at the nearby Butler steel plant. Workers had already received their pink slips there, and Tim was collecting unemployment insurance, trying to figure out what to do next. His big worry now was health-care coverage.
“My son Mark needs a liver transplant,” he said grimly. “We’re on the waiting list for a donor, but with my health-care benefits used up, we’re trying to figure out if Medicaid will cover the costs. Nobody can give me a clear answer, and you know, I’ll sell everything I got for Mark, go into debt, but I still…” Tim’s voice cracked; his wife, sitting beside him, buried her head in her hands. I tried to assure them that we would find out exactly what Medicaid would cover. Tim nodded, putting his arm around his wife’s shoulder.
On the drive back to Chicago, I tried to imagine Tim’s desperation: no job, an ailing son, his savings running out.
Those were the stories you missed on a private jet at forty thousand feet.
YOU’LL GET LITTLE argument these days, from either the left or the right, with the notion that we’re going through a fundamental economic transformation. Advances in digital technology, fiber optics, the Internet, satellites, and transportation have effectively leveled the economic barriers between countries and continents. Pools of capital scour the earth in search of the best returns, with trillions of dollars moving across borders with only a few keystrokes. The collapse of the Soviet Union, the institution of market-based reforms in India and China, the lowering of trade barriers, and the advent of big-box retailers like Wal-Mart have brought several billion people into direct competition with American companies and American workers. Whether or not the world is already flat, as columnist and author Thomas Friedman says, it is certainly getting flatter every day.
There’s no doubt that globalization has brought significant benefits to American consumers. It’s lowered prices on goods once considered luxuries, from big-screen TVs to peaches in winter, and increased the purchasing power of low-income Americans. It’s helped keep inflation in check, boosted returns for the millions of Americans now invested in the stock market, provided new markets for U.S. goods and services, and allowed countries like China and India to dramatically reduce poverty, which over the long term makes for a more stable world.
But there’s also no denying that globalization has greatly increased economic instability for millions of ordinary Americans. To stay competitive and keep investors happy in the global marketplace, U.S.-based companies have automated, downsized, outsourced, and offshored. They’ve held the line on wage increases, and replaced defined-benefit health and retirement plans with 401(k)s and Health Savings Accounts that shift more cost and risk onto workers.
The result has been the emergence of what some call a “winner-take-all” economy, in which a rising tide doesn’t necessarily lift all boats. Over the past decade, we’ve seen strong economic growth but anemic job growth; big leaps in productivity but flatlining wages; hefty corporate profits, but a shrinking share of those profits going to workers. For those like Larry Page and Sergey Brin, for those with unique skills and talents and for the knowledge workers—the engineers, lawyers, consultants, and marketers—who facilitate their work, the potential rewards of a global marketplace have never been greater. But for those like the workers at Maytag, whose skills can be automated or digitized or shifted to countries with cheaper wages, the effects can be dire—a future in the ever-growing pool of low-wage service work, with few benefits, the risk of financial ruin in the event of an illness, and the inability to save for either retirement or a child’s college education.
The question is what we should do about all this. Since the early nineties, when these trends first began to appear, one wing of the Democratic Party—led by Bill Clinton—has embraced the new economy, promoting free trade, fiscal discipline, and reforms in education and training that will help workers to compete for the high-value, high-wage jobs of the future. But a sizable chunk of the Democratic base—particularly blue-collar union workers like Dave Bevard—has resisted this agenda. As far as they’re concerned, free trade has served the interests of Wall Street but has done little to stop the hemorrhaging of good-paying American jobs.
The Republican Party isn’t immune from these tensions. With the recent uproar around illegal immigration, for example, Pat Buchanan’s brand of “America first” conservatism may see a resurgence within the GOP, and present a challenge to the Bush Administration’s free trade policies. And in his 2000 campaign and early in his first term, George W. Bush suggested a legitimate role for government, a “compassionate conservatism” that, the White House argues, has expressed itself in the Medicare prescription drug plan and the educational reform effort known as No Child Left Behind—and that has given small-government conservatives heartburn.
For the most part, though, the Republican economic agenda under President Bush has been devoted to tax cuts, reduced regulation, the privatization of government services—and more tax cuts. Administration officials call this the Ownership Society, but most of its central tenets have been staples of laissez-faire economics since at least the 1930s: a belief that a sharp reduction—or in some cases, elimination—of taxes on incomes, large estates, capital gains, and dividends will encourage capital formation, higher savings rates, more business investment, and greater economic growth; a belief that government regulation inhibits and distorts the efficient working of the market; and a belief that government entitlement programs are inherently inefficient, breed dependency, and reduce individual responsibility, initiative, and choice.
Or, as Ronald Reagan succinctly put it: “Government is not the solution to our problem; government is the problem.”
So far, the Bush Administration has only achieved one-half of its equation; the Republican-controlled Congress has pushed through successive rounds of tax cuts, but has refused to make tough choices to control spending—special interest appropriations, also known as earmarks, are up 64 percent since Bush took office. Meanwhile, Democratic lawmakers (and the public) have resisted drastic cuts in vital investments—and outright rejected the Administration’s proposal to privatize Social Security. Whether the Administration actually believes that the resulting federal budget deficits and ballooning national debt don’t matter is unclear. What is clear is that the sea of red ink has made it more difficult for future administrations to initiate any new investments to address the economic challenges of globalization or to strengthen America’s social safety net.
I don’t want to exaggerate the consequences of this stalemate. A strategy of doing nothing and letting globalization run its course won’t result in the imminent collapse of the U.S. economy. America’s GDP remains larger than China’s and India’s combined. For now, at least, U.S.-based companies continue to hold an edge in such knowledge-based sectors as software design and pharmaceutical research, and our network of universities and colleges remains the envy of the world.
But over the long term, doing nothing probably means an America very different from the one most of us grew up in. It will mean a nation even more stratified economically and socially than it currently is: one in which an increasingly prosperous knowledge class, living in exclusive enclaves, will be able to purchase whatever they want on the marketplace—private schools, private health care, private security, and private jets—while a growing number of their fellow citizens are consigned to low-paying service jobs, vulnerable to dislocation, pressed to work longer hours, dependent on an underfunded, overburdened, and underperforming public sector for their health care, their retirement, and their children’s educations.
It will mean an America in which we continue to mortgage our assets to foreign lenders and expose ourselves to the whims of oil producers; an America in which we underinvest in the basic scientific research and workforce training that will determine our long-term economic prospects and neglect potential environmental crises. It will mean an America that’s more politically polarized and more politically unstable, as economic frustration boils over and leads people to turn on each other.
Worst of all, it will mean fewer opportunities for younger Americans, a decline in the upward mobility that’s been at the heart of this country’s promise since its founding.
That’s not the America we want for ourselves or our children. And I’m confident that we have the talent and the resources to create a better future, a future in which the economy grows and prosperity is shared. What’s preventing us from shaping that future isn’t the absence of good ideas. It’s the absence of a national commitment to take the tough steps necessary to make America more competitive—and the absence of a new consensus around the appropriate role of government in the marketplace.
TO BUILD THAT consensus, we need to take a look at how our market system has evolved over time. Calvin Coolidge once said that “the chief business of the American people is business,” and indeed, it would be hard to find a country on earth that’s been more consistently hospitable to the logic of the marketplace. Our Constitution places the ownership of private property at the very heart of our system of liberty. Our religious traditions celebrate the value of hard work and express the conviction that a virtuous life will result in material reward. Rather than vilify the rich, we hold them up as role models, and our mythology is steeped in stories of men on the make—the immigrant who comes to this country with nothing and strikes it big, the young man who heads West in search of his fortune. As Ted Turner famously said, in America money is how we keep score.
The result of this business culture has been a prosperity that’s unmatched in human history. It takes a trip overseas to fully appreciate just how good Americans have it; even our poor take for granted goods and services—electricity, clean water, indoor plumbing, telephones, televisions, and household appliances—that are still unattainable for most of the world. America may have been blessed with some of the planet’s best real estate, but clearly it’s not just our natural resources that account for our economic success. Our greatest asset has been our system of social organization, a system that for generations has encouraged constant innovation, individual initiative, and the efficient allocation of resources.
It should come as no surprise, then, that we have a tendency to take our free-market system as a given, to assume that it flows naturally from the laws of supply and demand and Adam Smith’s invisible hand. And from this assumption, it’s not much of a leap to assume that any government intrusion into the magical workings of the market—whether through taxation, regulation, lawsuits, tariffs, labor protections, or spending on entitlements—necessarily undermines private enterprise and inhibits economic growth. The bankruptcy of communism and socialism as alternative means of economic organization has only reinforced this assumption. In our standard economics textbooks and in our modern political debates, laissez-faire is the default rule; anyone who would challenge it swims against the prevailing tide.
It’s useful to remind ourselves, then, that our free-market system is the result neither of natural law nor of divine providence. Rather, it emerged through a painful process of trial and error, a series of difficult choices between efficiency and fairness, stability and change. And although the benefits of our free-market system have mostly derived from the individual efforts of generations of men and women pursuing their own vision of happiness, in each and every period of great economic upheaval and transition we’ve depended on government action to open up opportunity, encourage competition, and make the market work better.
In broad outline, government action has taken three forms. First, government has been called upon throughout our history to build the infrastructure, train the workforce, and otherwise lay the foundations necessary for economic growth. All the Founding Fathers recognized the connection between private property and liberty, but it was Alexander Hamilton who also recognized the vast potential of a national economy—one based not on America’s agrarian past but on a commercial and industrial future. To realize this potential, Hamilton argued, America needed a strong and active national government, and as America’s first Treasury secretary he set about putting his ideas to work. He nationalized the Revolutionary War debt, which not only stitched together the economies of the individual states but helped spur a national system of credit and fluid capital markets. He promoted policies—from strong patent laws to high tariffs—to encourage American manufacturing, and proposed investment in roads and bridges needed to move products to market.
Hamilton encountered fierce resistance from Thomas Jefferson, who feared that a strong national government tied to wealthy commercial interests would undermine his vision of an egalitarian democracy tied to the land. But Hamilton understood that only through the liberation of capital from local landed interests could America tap into its most powerful resource—namely the energy and enterprise of the American people. This idea of social mobility constituted one of the great early bargains of American capitalism; industrial and commercial capitalism might lead to greater instability, but it would be a dynamic system in which anyone with enough energy and talent could rise to the top. And on this point, at least, Jefferson agreed—it was based on his belief in a meritocracy, rather than a hereditary aristocracy, that Jefferson would champion the creation of a national, government-financed university that could educate and train talent across the new nation, and that he considered the founding of the University of Virginia to be one of his greatest achievements.
This tradition, of government investment in America’s physical infrastructure and in its people, was thoroughly embraced by Abraham Lincoln and the early Republican Party. For Lincoln, the essence of America was opportunity, the ability of “free labor” to advance in life. Lincoln considered capitalism the best means of creating such opportunity, but he also saw how the transition from an agricultural to an industrial society was disrupting lives and destroying communities.
So in the midst of civil war, Lincoln embarked on a series of policies that not only laid the groundwork for a fully integrated national economy but extended the ladders of opportunity downward to reach more and more people. He pushed for the construction of the first transcontinental railroad. He incorporated the National Academy of Sciences, to spur basic research and scientific discovery that could lead to new technology and commercial applications. He passed the landmark Homestead Act of 1862, which turned over vast amounts of public land across the western United States to settlers from the East and immigrants from around the world, so that they, too, could claim a stake in the nation’s growing economy. And then, rather than leave these homesteaders to fend for themselves, he created a system of land grant colleges to instruct farmers on the latest agricultural techniques, and to provide them the liberal education that would allow them to dream beyond the confines of life on the farm.
Hamilton’s and Lincoln’s basic insight—that the resources and power of the national government can facilitate, rather than supplant, a vibrant free market—has continued to be one of the cornerstones of both Republican and Democratic policies at every stage of America’s development. The Hoover Dam, the Tennessee Valley Authority, the interstate highway system, the Internet, the Human Genome Project—time and again, government investment has helped pave the way for an explosion of private economic activity. And through the creation of a system of public schools and institutions of higher education, as well as programs like the GI Bill that made a college education available to millions, government has helped provide individuals the tools to adapt and innovate in a climate of constant technological change.
Aside from making needed investments that private enterprise can’t or won’t make on its own, an active national government has also been indispensable in dealing with market failures—those recurring snags in any capitalist system that either inhibit the efficient workings of the market or result in harm to the public. Teddy Roosevelt recognized that monopoly power could restrict competition, and made “trust busting” a centerpiece of his administration. Woodrow Wilson instituted the Federal Reserve Bank, to manage the money supply and curb periodic panics in the financial markets. Federal and state governments established the first consumer laws—the Pure Food and Drug Act, the Meat Inspection Act—to protect Americans from harmful products.
But it was during the stock market crash of 1929 and the subsequent Depression that the government’s vital role in regulating the marketplace became fully apparent. With investor confidence shattered, bank runs threatening the collapse of the financial system, and a downward spiral in consumer demand and business investment, FDR engineered a series of government interventions that arrested further economic contraction. For the next eight years, the New Deal administration experimented with policies to restart the economy, and although not all of these interventions produced their intended results, they did leave behind a regulatory structure that helps limit the risk of economic crisis: a Securities and Exchange Commission to ensure transparency in the financial markets and protect smaller investors from fraud and insider manipulation; FDIC insurance to provide confidence to bank depositors; and countercyclical fiscal and monetary policies, whether in the form of tax cuts, increased liquidity, or direct government spending, to stimulate demand when business and consumers have pulled back from the market.
Finally—and most controversially—government has helped structure the social compact between business and the American worker. During America’s first 150 years, as capital became more concentrated in trusts and limited liability corporations, workers were prevented by law and by violence from forming unions that would increase their own leverage. Workers had almost no protections from unsafe or inhumane working conditions, whether in sweatshops or meatpacking plants. Nor did American culture have much sympathy for workers left impoverished by capitalism’s periodic gales of “creative destruction”—the recipe for individual success was greater toil, not pampering from the state. What safety net did exist came from the uneven and meager resources of private charity.
Again, it took the shock of the Great Depression, with a third of all people finding themselves out of work, ill housed, ill clothed, and ill fed, for government to correct this imbalance. Two years into office, FDR was able to push through Congress the Social Security Act of 1935, the centerpiece of the new welfare state, a safety net that would lift almost half of all senior citizens out of poverty, provide unemployment insurance for those who had lost their jobs, and provide modest welfare payments to the disabled and the elderly poor. FDR also initiated laws that fundamentally changed the relationship between capital and labor: the forty-hour workweek, child labor laws, and minimum wage laws; and the National Labor Relations Act, which made it possible to organize broad-based industrial unions and forced employers to bargain in good faith.
Part of FDR’s rationale in passing these laws came straight out of Keynesian economics: One cure for economic depression was putting more disposable income in the pockets of American workers. But FDR also understood that capitalism in a democracy required the consent of the people, and that by giving workers a larger share of the economic pie, his reforms would undercut the potential appeal of government-managed, command-and-control systems—whether fascist, socialist, or communist—that were gaining support all across Europe. As he would explain in 1944, “People who are hungry, people who are out of a job are the stuff of which dictatorships are made.”
For a while this seemed to be where the story would end—with FDR saving capitalism from itself through an activist federal government that invests in its people and infrastructure, regulates the marketplace, and protects labor from chronic deprivation. And in fact, for the next twenty-five years, through Republican and Democratic administrations, this model of the American welfare state enjoyed a broad consensus. There were those on the right who complained of creeping socialism, and those on the left who believed FDR had not gone far enough. But the enormous growth of America’s mass production economy, and the enormous gap in productive capacity between the United States and the war-torn economies of Europe and Asia, muted most ideological battles. Without any serious rivals, U.S. companies could routinely pass on higher labor and regulatory costs to their customers. Full employment allowed unionized factory workers to move into the middle class, support a family on a single income, and enjoy the stability of health and retirement security. And in such an environment of steady corporate profits and rising wages, policy makers found only modest political resistance to higher taxes and more regulation to tackle pressing social problems—hence the creation of the Great Society programs, including Medicare, Medicaid, and welfare, under Johnson; and the creation of the Environmental Protection Agency and Occupational Safety and Health Administration under Nixon.
There was only one problem with this liberal triumph—capitalism would not stand still. By the seventies, U.S. productivity growth, the engine of the postwar economy, began to lag. The increased assertiveness of OPEC allowed foreign oil producers to lop off a much bigger share of the global economy, exposing America’s vulnerability to disruptions in energy supplies. U.S. companies began to experience competition from low-cost producers in Asia, and by the eighties a flood of cheap imports—in textiles, shoes, electronics, and even automobiles—had started grabbing big chunks of the domestic market. Meanwhile, U.S.-based multinational corporations began locating some of their production facilities overseas—partly to access these foreign markets, but also to take advantage of cheap labor.
In this more competitive global environment, the old corporate formula of steady profits and stodgy management no longer worked. With less ability to pass on higher costs or shoddy products to consumers, corporate profits and market share shrank, and corporate shareholders began demanding more value. Some corporations found ways to improve productivity through innovation and automation. Others relied primarily on brutal layoffs, resistance to unionization, and a further shift of production overseas. Those corporate managers who didn’t adapt were vulnerable to corporate raiders and leveraged buyout artists, who would make the changes for them, without any regard for the employees whose lives might be upended or the communities that might be torn apart. One way or another, American companies became leaner and meaner—with old-line manufacturing workers and towns like Galesburg bearing the brunt of this transformation.
It wasn’t just the private sector that had to adapt to this new environment. As Ronald Reagan’s election made clear, the people wanted the government to change as well.
In his rhetoric, Reagan tended to exaggerate the degree to which the welfare state had grown over the previous twenty-five years. At its peak, the federal budget as a total share of the U.S. economy remained far below the comparable figures in Western Europe, even when you factored in the enormous U.S. defense budget. Still, the conservative revolution that Reagan helped usher in gained traction because Reagan’s central insight—that the liberal welfare state had grown complacent and overly bureaucratic, with Democratic policy makers more obsessed with slicing the economic pie than with growing the pie—contained a good deal of truth. Just as too many corporate managers, shielded from competition, had stopped delivering value, too many government bureaucracies had stopped asking whether their shareholders (the American taxpayer) and their consumers (the users of government services) were getting their money’s worth.
Not every government program worked the way it was advertised. Some functions could be better carried out by the private sector, just as in some cases market-based incentives could achieve the same results as command-and-control-style regulations, at a lower cost and with greater flexibility. The high marginal tax rates that existed when Reagan took office may not have curbed incentives to work or invest, but they did distort investment decisions—and did lead to a wasteful industry of setting up tax shelters. And while welfare certainly provided relief for many impoverished Americans, it did create some perverse incentives when it came to the work ethic and family stability.
Forced to compromise with a Democrat-controlled Congress, Reagan would never achieve many of his most ambitious plans for reducing government. But he fundamentally changed the terms of the political debate. The middle-class tax revolt became a permanent fixture in national politics and placed a ceiling on how much government could expand. For many Republicans, noninterference with the marketplace became an article of faith.
Of course, many voters continued to look to the government during economic downturns, and Bill Clinton’s call for more aggressive government action on the economy helped lift him to the White House. After the politically disastrous defeat of his health-care plan and the election of a Republican Congress in 1994, Clinton had to trim his ambitions but was able to put a progressive slant on some of Reagan’s goals. Declaring the era of big government over, Clinton signed welfare reform into law, pushed tax cuts for the middle class and working poor, and worked to reduce bureaucracy and red tape. And it was Clinton who would accomplish what Reagan never did, putting the nation’s fiscal house in order even while lessening poverty and making modest new investments in education and job training. By the time Clinton left office, it appeared as if some equilibrium had been achieved—a smaller government, but one that retained the social safety net FDR had first put into place.
Except capitalism is still not standing still. The policies of Reagan and Clinton may have trimmed some of the fat of the liberal welfare state, but they couldn’t change the underlying realities of global competition and technological revolution. Jobs are still moving overseas—not just manufacturing work, but increasingly work in the service sector that can be digitally transmitted, like basic computer programming. Businesses continue to struggle with high health-care costs. America continues to import far more than it exports, to borrow far more than it lends.
Without any clear governing philosophy, the Bush Administration and its congressional allies have responded by pushing the conservative revolution to its logical conclusion—even lower taxes, even fewer regulations, and an even smaller safety net. But in taking this approach, Republicans are fighting the last war, the war they waged and won in the eighties, while Democrats are forced to fight a rearguard action, defending the New Deal programs of the thirties.
Neither strategy will work anymore. America can’t compete with China and India simply by cutting costs and shrinking government—unless we’re willing to tolerate a drastic decline in American living standards, with smog-choked cities and beggars lining the streets. Nor can America compete simply by erecting trade barriers and raising the minimum wage—unless we’re willing to confiscate all the world’s computers.
But our history should give us confidence that we don’t have to choose between an oppressive, government-run economy and a chaotic and unforgiving capitalism. It tells us that we can emerge from great economic upheavals stronger, not weaker. Like those who came before us, we should be asking ourselves what mix of policies will lead to a dynamic free market and widespread economic security, entrepreneurial innovation and upward mobility. And we can be guided throughout by Lincoln’s simple maxim: that we will do collectively, through our government, only those things that we cannot do as well or at all individually and privately.
In other words, we should be guided by what works.
WHAT MIGHT SUCH a new economic consensus look like? I won’t pretend to have all the answers, and a detailed discussion of U.S. economic policy would fill up several volumes. But I can offer a few examples of where we can break free of our current political stalemate; places where, in the tradition of Hamilton and Lincoln, we can invest in our infrastructure and our people; ways that we can begin to modernize and rebuild the social contract that FDR first stitched together in the middle of the last century.
Let’s start with those investments that can make America more competitive in the global economy: investments in education, science and technology, and energy independence.
Throughout our history, education has been at the heart of a bargain this nation makes with its citizens: If you work hard and take responsibility, you’ll have a chance for a better life. And in a world where knowledge determines value in the job market, where a child in Los Angeles has to compete not just with a child in Boston but also with millions of children in Bangalore and Beijing, too many of America’s schools are not holding up their end of the bargain.
In 2005 I paid a visit to Thornton Township High School, a predominantly black high school in Chicago’s southern suburbs. My staff had worked with teachers there to organize a youth town hall meeting—representatives of each class spent weeks conducting surveys to find out what issues their fellow students were concerned about and then presented the results in a series of questions to me. At the meeting they talked about violence in the neighborhoods and a shortage of computers in their classrooms. But their number one issue was this: Because the school district couldn’t afford to keep teachers for a full school day, Thornton let out every day at 1:30 in the afternoon. With the abbreviated schedule, there was no time for students to take science lab or foreign language classes.
How come we’re getting shortchanged? they asked me. Seems like nobody even expects us to go to college, they said.
They wanted more school.
We’ve become accustomed to such stories, of poor black and Latino children languishing in schools that can’t prepare them for the old industrial economy, much less the information age. But the problems with our educational system aren’t restricted to the inner city. America now has one of the highest high school dropout rates in the industrialized world. By their senior year, American high school students score lower on math and science tests than most of their foreign peers. Half of all teenagers can’t understand basic fractions, half of all nine-year-olds can’t perform basic multiplication or division, and although more American students than ever are taking college entrance exams, only 22 percent are prepared to take college-level classes in English, math, and science.
I don’t believe government alone can turn these statistics around. Parents have the primary responsibility for instilling an ethic of hard work and educational achievement in their children. But parents rightly expect their government, through the public schools, to serve as full partners in the educational process—just as it has for earlier generations of Americans.
Unfortunately, instead of innovation and bold reform of our schools—the reforms that would allow the kids at Thornton to compete for the jobs at Google—what we’ve seen from government for close to two decades has been tinkering around the edges and a tolerance for mediocrity. Partly this is a result of ideological battles that are as outdated as they are predictable. Many conservatives argue that money doesn’t matter in raising educational achievement; that the problems in public schools are caused by hapless bureaucracies and intransigent teachers’ unions; and that the only solution is to break up the government’s education monopoly by handing out vouchers. Meanwhile, those on the left often find themselves defending an indefensible status quo, insisting that more spending alone will improve educational outcomes.
Both assumptions are wrong. Money does matter in education—otherwise why would parents pay so much to live in well-funded suburban school districts?—and many urban and rural schools still suffer from overcrowded classrooms, outdated books, inadequate equipment, and teachers who are forced to pay out of pocket for basic supplies. But there’s no denying that the way many public schools are managed poses at least as big a problem as how well they’re funded.
Our task, then, is to identify those reforms that have the highest impact on student achievement, fund them adequately, and eliminate those programs that don’t produce results. And in fact we already have hard evidence of reforms that work: a more challenging and rigorous curriculum with emphasis on math, science, and literacy skills; longer hours and more days to give children the time and sustained attention they need to learn; early childhood education for every child, so they’re not already behind on their first day of school; meaningful, performance-based assessments that can provide a fuller picture of how a student is doing; and the recruitment and training of transformative principals and more effective teachers.
This last point—the need for good teachers—deserves emphasis. Recent studies show that the single most important factor in determining a student’s achievement isn’t the color of his skin or where he comes from, but who the child’s teacher is. Unfortunately, too many of our schools depend on inexperienced teachers with little training in the subjects they’re teaching, and too often those teachers are concentrated in already struggling schools. Moreover, the situation is getting worse, not better: Each year, school districts are hemorrhaging experienced teachers as the Baby Boomers reach retirement, and two million teachers must be recruited in the next decade just to meet the needs of rising enrollment.
The problem isn’t that there’s no interest in teaching; I constantly meet young people who’ve graduated from top colleges and have signed up, through programs like Teach for America, for two-year stints in some of the country’s toughest public schools. They find the work extraordinarily rewarding; the kids they teach benefit from their creativity and enthusiasm. But by the end of two years, most have either changed careers or moved to suburban schools—a consequence of low pay, a lack of support from the educational bureaucracy, and a pervasive feeling of isolation.
If we’re serious about building a twenty-first-century school system, we’re going to have to take the teaching profession seriously. This means changing the certification process to allow a chemistry major who wants to teach to avoid expensive additional course work; pairing up new recruits with master teachers to break their isolation; and giving proven teachers more control over what goes on in their classrooms.
It also means paying teachers what they’re worth. There’s no reason why an experienced, highly qualified, and effective teacher shouldn’t earn $100,000 annually at the peak of his or her career. Highly skilled teachers in such critical fields as math and science—as well as those willing to teach in the toughest urban schools—should be paid even more.
There’s just one catch. In exchange for more money, teachers need to become more accountable for their performance—and school districts need to have greater ability to get rid of ineffective teachers.
So far, teachers’ unions have resisted the idea of pay for performance, in part because it could be disbursed at the whim of a principal. The unions also argue—rightly, I think—that most school districts rely solely on test scores to measure teacher performance, and that test scores may be highly dependent on factors beyond any teacher’s control, like the number of low-income or special-needs students in their classroom.
But these aren’t insoluble problems. Working with teachers’ unions, states and school districts can develop better measures of performance, ones that combine test data with a system of peer review (most teachers can tell you with amazing consistency which teachers in their schools are really good, and which are really bad). And we can make sure that nonperforming teachers no longer handicap children who want to learn.
Indeed, if we’re to make the investments required to revamp our schools, then we will need to rediscover our faith that every child can learn. Recently, I had the chance to visit Dodge Elementary School, on the West Side of Chicago, a school that had once been near the bottom on every measure but that is in the midst of a turnaround. While I was talking to some of the teachers about the challenges they faced, one young teacher mentioned what she called the “These Kids Syndrome”—the willingness of society to find a million excuses for why “these kids” can’t learn; how “these kids come from tough backgrounds” or “these kids are too far behind.”
“When I hear that term, it drives me nuts,” the teacher told me. “They’re not ‘these kids.’ They’re our kids.”
How America’s economy performs in the years to come may depend largely on how well we take such wisdom to heart.
OUR INVESTMENT IN education can’t end with an improved elementary and secondary school system. In a knowledge-based economy where eight of the nine fastest-growing occupations this decade require scientific or technological skills, most workers are going to need some form of higher education to fill the jobs of the future. And just as our government instituted free and mandatory public high schools at the dawn of the twentieth century to provide workers the skills needed for the industrial age, our government has to help today’s workforce adjust to twenty-first-century realities.
In many ways, our task should be easier than it was for policy makers a hundred years ago. For one thing, our network of universities and community colleges already exists and is well equipped to take on more students. And Americans certainly don’t need to be convinced of the value of a higher education—the percentage of young adults getting bachelor’s degrees has risen steadily each decade, from around 16 percent in 1980 to almost 33 percent today.
Where Americans do need help, immediately, is in managing the rising cost of college—something with which Michelle and I are all too familiar (for the first ten years of our marriage, our combined monthly payments on our undergraduate and law school debt exceeded our mortgage by a healthy margin). Over the last five years, the average tuition and fees at four-year public colleges, adjusted for inflation, have risen 40 percent. To absorb these costs, students have been taking on ever-increasing debt levels, which discourages many undergraduates from pursuing careers in less lucrative fields like teaching. And an estimated two hundred thousand college-qualified students each year choose to forgo college altogether because they can’t figure out how to pay the bills.
There are a number of steps we can take to control costs and improve access to higher education. States can limit annual tuition increases at public universities. For many nontraditional students, technical schools and online courses may provide a cost-effective option for retooling in a constantly changing economy. And students can insist that their institutions focus their fund-raising efforts more on improving the quality of instruction than on building new football stadiums.
But no matter how well we do in controlling the spiraling cost of education, we will still need to provide many students and parents with more direct help in meeting college expenses, whether through grants, low-interest loans, tax-free educational savings accounts, or full tax deductibility of tuition and fees. So far, Congress has been moving in the opposite direction, by raising interest rates on federally guaranteed student loans and failing to increase the size of grants for low-income students to keep pace with inflation. There’s no justification for such policies—not if we want to maintain opportunity and upward mobility as the hallmark of the U.S. economy.
There’s one other aspect of our educational system that merits attention—one that speaks to the heart of America’s competitiveness. Since Lincoln signed the Morrill Act and created the system of land grant colleges, institutions of higher learning have served as the nation’s primary research and development laboratories. It’s through these institutions that we’ve trained the innovators of the future, with the federal government providing critical support for the infrastructure—everything from chemistry labs to particle accelerators—and the dollars for research that may not have an immediate commercial application but that can ultimately lead to major scientific breakthroughs.
Here, too, our policies have been moving in the wrong direction. At the 2006 Northwestern University commencement, I fell into a conversation with Dr. Robert Langer, an Institute Professor of chemical engineering at MIT and one of the nation’s foremost scientists. Langer isn’t just an ivory tower academic—he holds more than five hundred patents, and his research has led to everything from the development of the nicotine patch to brain cancer treatments. As we waited for the procession to begin, I asked him about his current work, and he mentioned his research in tissue engineering, research that promised new, more effective methods of delivering drugs to the body. Remembering the recent controversies surrounding stem cell research, I asked him whether the Bush Administration’s limitation on the number of stem cell lines was the biggest impediment to advances in his field. He shook his head.
“Having more stem cell lines would definitely be useful,” Langer told me, “but the real problem we’re seeing is significant cutbacks in federal grants.” He explained that fifteen years ago, 20 to 30 percent of all research proposals received significant federal support. That level is now closer to 10 percent. For scientists and researchers, this means more time spent raising money and less time spent on research. It also means that each year, more and more promising avenues of research are cut off—especially the high-risk research that may ultimately yield the biggest rewards.
Dr. Langer’s observation isn’t unique. Each month, it seems, scientists and engineers visit my office to discuss the federal government’s diminished commitment to funding basic scientific research. Over the last three decades federal funding for the physical, mathematical, and engineering sciences has declined as a percentage of GDP—just at the time when other countries are substantially increasing their own R & D budgets. And as Dr. Langer points out, our declining support for basic research has a direct impact on the number of young people going into math, science, and engineering—which helps explain why China is graduating eight times as many engineers as the United States every year.
If we want an innovation economy, one that generates more Googles each year, then we have to invest in our future innovators—by doubling federal funding of basic research over the next five years, training one hundred thousand more engineers and scientists over the next four years, or providing new research grants to the most outstanding early-career researchers in the country. The total price tag for maintaining our scientific and technological edge comes out to approximately $42 billion over five years—real money, to be sure, but just 15 percent of the most recent federal highway bill.
In other words, we can afford to do what needs to be done. What’s missing is not money, but a national sense of urgency.
THE LAST CRITICAL investment we need to make America more competitive is in an energy infrastructure that can move us toward energy independence. In the past, war or a direct threat to national security has shaken America out of its complacency and led to bigger investments in education and science, all with an eye toward minimizing our vulnerabilities. That’s what happened at the height of the Cold War, when the launching of the satellite Sputnik led to fears that the Soviets were slipping ahead of us technologically. In response, President Eisenhower doubled federal aid to education and provided an entire generation of scientists and engineers the training they needed to lead revolutionary advances. That same year, the Defense Advanced Research Projects Agency, or DARPA, was formed, providing billions of dollars to basic research that would eventually help create the Internet, bar codes, and computer-aided design. And in 1961, President Kennedy would launch the Apollo space program, further inspiring young people across the country to enter the New Frontier of science.
Our current situation demands that we take the same approach with energy. It’s hard to overstate the degree to which our addiction to oil undermines our future. According to the National Commission on Energy Policy, without any changes to our energy policy U.S. demand for oil will jump 40 percent over the next twenty years. Over the same period, worldwide demand is expected to jump at least 30 percent, as rapidly developing countries like China and India expand industrial capacity and add 140 million cars to their roads.
Our dependence on oil doesn’t just affect our economy. It undermines our national security. A large portion of the $800 million we spend on foreign oil every day goes to some of the world’s most volatile regimes—Saudi Arabia, Nigeria, Venezuela, and, indirectly at least, Iran. It doesn’t matter whether they are despotic regimes with nuclear intentions or havens for madrassas that plant the seeds of terror in young minds—they get our money because we need their oil.
What’s worse, the potential for supply disruption is severe. In the Persian Gulf, Al Qaeda has been attempting attacks on poorly defended oil refineries for years; a successful attack on just one of the Saudis’ major oil complexes could send the U.S. economy into a tailspin. Osama bin Laden himself advises his followers to “focus your operations on [oil], especially in Iraq and the Gulf area, since this will cause them to die off.”
And then there are the environmental consequences of our fossil fuel–based economy. Just about every scientist outside the White House believes climate change is real, is serious, and is accelerated by the continued release of carbon dioxide. If the prospect of melting ice caps, rising sea levels, changing weather patterns, more frequent hurricanes, more violent tornadoes, endless dust storms, decaying forests, dying coral reefs, and increases in respiratory illness and insect-borne diseases—if all that doesn’t constitute a serious threat, I don’t know what does.
So far, the Bush Administration’s energy policy has been focused on subsidies to big oil companies and expanded drilling—coupled with token investments in the development of alternative fuels. This approach might make economic sense if America harbored plentiful and untapped oil supplies that could meet its needs (and if oil companies weren’t experiencing record profits). But such supplies don’t exist. The United States has 3 percent of the world’s oil reserves. We use 25 percent of the world’s oil. We can’t drill our way out of the problem.
What we can do is create renewable, cleaner energy sources for the twenty-first century. Instead of subsidizing the oil industry, we should end every single tax break the industry currently receives and demand that 1 percent of the revenues from oil companies with over $1 billion in quarterly profits go toward financing alternative energy research and the necessary infrastructure. Not only would such a project pay huge economic, foreign policy, and environmental dividends—it could be the vehicle by which we train an entire new generation of American scientists and engineers and a source of new export industries and high-wage jobs.
Countries like Brazil have already done this. Over the last thirty years, Brazil has used a mix of regulation and direct government investment to develop a highly efficient biofuel industry; 70 percent of its new vehicles now run on sugar-based ethanol instead of gasoline. Without the same governmental attention, the U.S. ethanol industry is just now catching up. Free-market proponents argue that the heavy-handed approach of the Brazilian government has no place in the more market-oriented U.S. economy. But regulation, if applied with flexibility and sensitivity to market forces, can actually spur private sector innovation and investment in the energy sector.
Take the issue of fuel-efficiency standards. Had we steadily raised those standards over the past two decades, when gas was cheap, U.S. automakers might have invested in new, fuel-efficient models instead of gas-guzzling SUVs—making them more competitive as gas prices rose. Instead, we’re seeing Japanese competitors run circles around Detroit. Toyota plans to sell one hundred thousand of its popular Priuses in 2006, while GM’s hybrid won’t even hit the market until 2007. And we can expect companies like Toyota to outcompete U.S. automakers in the burgeoning Chinese market, since China already has higher fuel-efficiency standards than we do.
The bottom line is that fuel-efficient cars and alternative fuels like E85, a fuel formulated with 85 percent ethanol, represent the future of the auto industry. It is a future American car companies can attain if we start making some tough choices now. For years U.S. automakers and the UAW have resisted higher fuel-efficiency standards because retooling costs money, and Detroit is already struggling under huge retiree health-care costs and stiff competition. So during my first year in the Senate I proposed legislation I called “Health Care for Hybrids.” The bill makes a deal with U.S. automakers: In exchange for federal financial assistance in meeting the health-care costs of retired autoworkers, the Big Three would reinvest these savings into developing more fuel-efficient vehicles.
Aggressively investing in alternative fuel sources can also lead to the creation of thousands of new jobs. Ten or twenty years down the road, that old Maytag plant in Galesburg could reopen its doors as a cellulosic ethanol refinery. Down the street, scientists might be busy in a research lab working on a new hydrogen cell. And across the way, a new auto company could be busy churning out hybrid cars. The new jobs created could be filled by American workers trained with new skills and a world-class education, from elementary school to college.
But we can’t afford to hesitate much longer. I got a glimpse of what a nation’s dependence on foreign energy can do in the summer of 2005, when Senator Dick Lugar and I visited Ukraine and met with the country’s newly elected president, Viktor Yushchenko. The story of Yushchenko’s election had made headlines around the world: Running against a ruling party that for years had catered to the wishes of neighboring Russia, Yushchenko survived an assassination attempt, a stolen election, and threats from Moscow, before the Ukrainian people finally rose up in an “Orange Revolution”—a series of peaceful mass demonstrations that ultimately led to Yushchenko’s installation as president.
It should have been a heady time in the former Soviet state, and indeed, everywhere we went there was talk of democratic liberalization and economic reform. But in our conversations with Yushchenko and his cabinet, we soon discovered that Ukraine had a major problem—it continued to be entirely dependent on Russia for all its oil and natural gas. Already, Russia had indicated that it would end Ukraine’s ability to purchase this energy at below-world-market prices, a move that would lead to a tripling of home heating oil prices during the winter months leading up to parliamentary elections. Pro-Russian forces inside the country were biding their time, aware that for all the soaring rhetoric, the orange banners, the demonstrations, and Yushchenko’s courage, Ukraine still found itself at the mercy of its former patron.
A nation that can’t control its energy sources can’t control its future. Ukraine may have little choice in the matter, but the wealthiest and most powerful nation on earth surely does.
EDUCATION. SCIENCE AND technology. Energy. Investments in these three key areas would go a long way in making America more competitive. Of course, none of these investments will yield results overnight. All will be subject to controversy. Investment in R & D and education will cost money at a time when our federal budget is already stretched. Increasing the fuel efficiency of American cars or instituting performance pay for public-school teachers will involve overcoming the suspicions of workers who already feel embattled. And arguments over the wisdom of school vouchers or the viability of hydrogen fuel cells won’t go away anytime soon.
But while the means we use to accomplish these ends should be subject to vigorous and open debate, the ends themselves shouldn’t be in dispute. If we fail to act, our competitive position in the world will decline. If we act boldly, then our economy will be less vulnerable to economic disruption, our trade balance will improve, the pace of U.S. technological innovation will accelerate, and the American worker will be in a stronger position to adapt to the global economy.
Still, will that be enough? Assuming we’re able to bridge some of our ideological differences and keep the U.S. economy growing, will I be able to look squarely in the eyes of those workers in Galesburg and tell them that globalization can work for them and their children?
That was the question on my mind during the 2005 debate on the Central American Free Trade Agreement, or CAFTA. Viewed in isolation, the agreement posed little threat to American workers—the combined economies of the Central American countries involved were roughly the same as that of New Haven, Connecticut. It opened up new markets for U.S. agricultural producers, and promised much-needed foreign investment in poor countries like Honduras and the Dominican Republic. There were some problems with the agreement, but overall, CAFTA was probably a net plus for the U.S. economy.
When I met with representatives from organized labor, though, they were having none of it. As far as they were concerned, NAFTA had been a disaster for U.S. workers, and CAFTA just promised more of the same. What was needed, they said, was not just free trade but fair trade: stronger labor protections in countries that trade with the United States, including rights to unionize and bans on child labor; improved environmental standards in these same countries; an end to unfair government subsidies to foreign exporters and nontariff barriers on U.S. exports; stronger protections for U.S. intellectual property; and—in the case of China in particular—an end to an artificially devalued currency that put U.S. companies at a perpetual disadvantage.
Like most Democrats, I strongly support all these things. And yet, I felt obliged to say to the union reps that none of these measures would change the underlying realities of globalization. Stronger labor or environmental provisions in a trade bill can help put pressure on countries to keep improving worker conditions, as can efforts to obtain agreements from U.S. retailers to sell goods produced at a fair wage. But they won’t eliminate the enormous gap in hourly wages between U.S. workers and workers in Honduras, Indonesia, Mozambique, or Bangladesh, countries where work in a dirty factory or overheated sweatshop is often considered a step up on the economic ladder.
Likewise, China’s willingness to let its currency rise might modestly raise the price on goods manufactured there, thereby making U.S. goods somewhat more competitive. But when all is said and done, China will still have more surplus labor in its countryside than half the entire population of the United States—which means Wal-Mart will be keeping suppliers there busy for a very, very long time.
We need a new approach to the trade question, I would say, one that acknowledges these realities.
And my union brothers and sisters would nod and say that they were interested in talking to me about my ideas—but in the meantime, could they mark me as a “no” vote on CAFTA?
In fact, the basic debate surrounding free trade has hardly changed since the early 1980s, with labor and its allies generally losing the fight. The conventional wisdom among policy makers, the press, and the business community these days is that free trade makes everyone better off. At any given time, so the argument goes, some U.S. jobs may be lost to trade, causing localized pain and hardship—but for every one thousand manufacturing jobs lost due to a plant closure, the same or an even greater number of jobs will be created in the new and expanding service sectors of the economy.
As the pace of globalization has picked up, though, it’s not just unions that are worrying about the long-term prospects for U.S. workers. Economists have noted that throughout the world—including China and India—it seems to take more economic growth each year to produce the same number of jobs, a consequence of ever-increasing automation and higher productivity. Some analysts question whether a U.S. economy more dominated by services can expect to see the same productivity growth, and hence rising living standards, as we’ve seen in the past. In fact, over the past five years, statistics consistently show that the wages of American jobs being lost are higher than the wages of American jobs being created.
And while upgrading the education levels of American workers will improve their ability to adapt to the global economy, a better education alone won’t necessarily protect them from growing competition. Even if the United States produced twice as many computer programmers per capita as China, India, or any Eastern European country, the sheer number of new entrants into the global marketplace means a lot more programmers overseas than there are in the United States—all of them available at one-fifth the salary to any business with a broadband link.
In other words, free trade may well grow the worldwide economic pie—but there’s no law that says workers in the United States will continue to get a bigger and bigger slice.
Given these realities, it’s easy to understand why some might want to put a stop to globalization—to freeze the status quo and insulate ourselves from economic disruption. On a stop to New York during the CAFTA debate, I mentioned some of the studies I’d been reading to Robert Rubin, the former U.S. Treasury secretary under Clinton whom I had gotten to know during my campaign. It would be hard to find a Democrat more closely identified with globalization than Rubin—not only had he been one of Wall Street’s most influential bankers for decades, but for much of the nineties he had helped chart the course of world finance. He also happens to be one of the more thoughtful and unassuming people I know. So I asked him whether at least some of the fears I’d heard from the Maytag workers in Galesburg were well founded—that there was no way to avoid a long-term decline in U.S. living standards if we opened ourselves up entirely to competition with much cheaper labor around the world.
“That’s a complicated question,” Rubin said. “Most economists will tell you that there’s no inherent limit to the number of good new jobs that the U.S. economy can generate, because there’s no limit to human ingenuity. People invent new industries, new needs and wants. I think the economists are probably right. Historically, it’s been the case. Of course, there’s no guarantee that the pattern holds this time. With the pace of technological change, the size of the countries we’re competing against, and the cost differentials with those countries, we may see a different dynamic emerge. So I suppose it’s possible that even if we do everything right, we could still face some challenges.”
I suggested that the folks in Galesburg might not find his answer reassuring.
“I said it’s possible, not probable,” he said. “I tend to be cautiously optimistic that if we get our fiscal house in order and improve our educational system, their children will do just fine. Anyway, there’s one thing I would tell the people in Galesburg is certain. Any efforts at protectionism will be counterproductive—and will make their children worse off in the bargain.”
I appreciated Rubin’s acknowledgment that American workers might have legitimate cause for concern when it came to globalization; in my experience, most labor leaders have thought deeply about the issue and can’t be dismissed as kneejerk protectionists.
Still, it was hard to deny Rubin’s basic insight: We can try to slow globalization, but we can’t stop it. The U.S. economy is now so integrated with the rest of the world, and digital commerce so widespread, that it’s hard to even imagine, much less enforce, an effective regime of protectionism. A tariff on imported steel may give temporary relief to U.S. steel producers, but it will make every U.S. manufacturer that uses steel in its products less competitive on the world market. It’s tough to “buy American” when a video game sold by a U.S. company has been developed by Japanese software engineers and packaged in Mexico. U.S. Border Patrol agents can’t interdict the services of a call center in India, or stop an electrical engineer in Prague from sending his work via email to a company in Dubuque. When it comes to trade, there are few borders left.
This doesn’t mean, however, that we should just throw up our hands and tell workers to fend for themselves. I would make this point to President Bush toward the end of the CAFTA debate, when I and a group of other senators were invited to the White House for discussions. I told the President that I believed in the benefits of trade, and that I had no doubt the White House could squeeze out the votes for this particular agreement. But I said that resistance to CAFTA had less to do with the specifics of the agreement and more to do with the growing insecurities of the American worker. Unless we found strategies to allay those fears, and sent a strong signal to American workers that the federal government was on their side, protectionist sentiment would only grow.
The President listened politely and said that he’d be interested in hearing my ideas. In the meantime, he said, he hoped he could count on my vote.
He couldn’t. I ended up voting against CAFTA, which passed the Senate by a vote of 55 to 45. My vote gave me no satisfaction, but I felt it was the only way to register a protest against what I considered to be the White House’s inattention to the losers from free trade. Like Bob Rubin, I am optimistic about the long-term prospects for the U.S. economy and the ability of U.S. workers to compete in a free trade environment—but only if we distribute the costs and benefits of globalization more fairly across the population.
THE LAST TIME we faced an economic transformation as disruptive as the one we face today, FDR led the nation to a new social compact—a bargain between government, business, and workers that resulted in widespread prosperity and economic security for more than fifty years. For the average American worker, that security rested on three pillars: the ability to find a job that paid enough to support a family and save for emergencies; a package of health and retirement benefits from his employer; and a government safety net—Social Security, Medicaid and Medicare, unemployment insurance, and to a lesser extent federal bankruptcy and pension protections—that could cushion the fall of those who suffered setbacks in their lives.
Certainly the impulse behind this New Deal compact involved a sense of social solidarity: the idea that employers should do right by their workers, and that if fate or miscalculation caused any one of us to stumble, the larger American community would be there to lift us up.
But this compact also rested on an understanding that a system of sharing risks and rewards can actually improve the workings of the market. FDR understood that decent wages and benefits for workers could create the middle-class base of consumers that would stabilize the U.S. economy and drive its expansion. And FDR recognized that we would all be more likely to take risks in our lives—to change jobs or start new businesses or welcome competition from other countries—if we knew that we would have some measure of protection should we fail.
That’s what Social Security, the centerpiece of New Deal legislation, has provided—a form of social insurance that protects us from risk. We buy private insurance for ourselves in the marketplace all the time, because as self-reliant as we may be, we recognize that things don’t always work out as planned—a child gets sick, the company we work for shuts its doors, a parent contracts Alzheimer’s, the stock market portfolio turns south. The bigger the pool of insured, the more risk is spread, the more coverage provided, and the lower the cost. Sometimes, though, we can’t buy insurance for certain risks on the marketplace—usually because companies find it unprofitable. Sometimes the insurance we get through our job isn’t enough, and we can’t afford to buy more on our own. Sometimes an unexpected tragedy strikes and it turns out we didn’t have enough insurance. For all these reasons, we ask the government to step in and create an insurance pool for us—a pool that includes all of the American people.
Today the social compact FDR helped construct is beginning to crumble. In response to increased foreign competition and pressure from a stock market that insists on quarterly boosts in profitability, employers are automating, downsizing, and offshoring, all of which makes workers more vulnerable to job loss and gives them less leverage to demand increased pay or benefits. Although the federal government offers a generous tax break for companies that provide health insurance, companies have shifted the skyrocketing costs onto employees in the form of higher premiums, copayments, and deductibles; meanwhile, half of small businesses, where millions of Americans work, can’t afford to offer their employees any insurance at all. In similar fashion, companies are shifting from the traditional defined-benefit pension plan to 401(k)s, and in some cases using bankruptcy court to shed existing pension obligations.
The cumulative impact on families is severe. The wages of the average American worker have barely kept pace with inflation over the past two decades. Since 1988, the average family’s health insurance costs have quadrupled. Personal savings rates have never been lower. And levels of personal debt have never been higher.
Rather than use the government to lessen the impact of these trends, the Bush Administration’s response has been to encourage them. That’s the basic idea behind the Ownership Society: If we free employers of any obligations to their workers and dismantle what’s left of New Deal, government-run social insurance programs, then the magic of the marketplace will take care of the rest. If the guiding philosophy behind the traditional system of social insurance could be described as “We’re all in it together,” the philosophy behind the Ownership Society seems to be “You’re on your own.”
It’s a tempting idea, one that’s elegant in its simplicity and that frees us of any obligations we have toward one another. There’s only one problem with it. It won’t work—at least not for those who are already falling behind in the global economy.
Take the Administration’s attempt to privatize Social Security. The Administration argues that the stock market can provide individuals a better return on investment, and in the aggregate at least they are right; historically, the market outperforms Social Security’s cost-of-living adjustments. But individual investment decisions will always produce winners and losers—those who bought Microsoft early and those who bought Enron late. What would the Ownership Society do with the losers? Unless we’re willing to see seniors starve on the street, we’re going to have to cover their retirement expenses one way or another—and since we don’t know in advance which of us will be losers, it makes sense for all of us to chip in to a pool that gives us at least some guaranteed income in our golden years. That doesn’t mean we shouldn’t encourage individuals to pursue higher-risk, higher-return investment strategies. They should. It just means that they should do so with savings other than those put into Social Security.
The same principles are at work when it comes to the Administration’s efforts to encourage a shift from employer- or government-based health-care plans to individual Health Savings Accounts. The idea might make sense if the lump sum each individual received were enough to buy a decent health-care plan through his employer, and if that lump sum kept pace with inflation of health-care costs. But what if you work for an employer who doesn’t offer a health-care plan? Or what if the Administration’s theory on health-care inflation turns out to be wrong—if it turns out that health-care costs aren’t due to people’s cavalier attitude toward their health or an irrational desire to purchase more than they need? Then “freedom to choose” will mean that employees bear the brunt of future increases in health care, and the amount of money in their Health Savings Accounts will buy less and less coverage each year.
In other words, the Ownership Society doesn’t even try to spread the risks and rewards of the new economy among all Americans. Instead, it simply magnifies the uneven risks and rewards of today’s winner-take-all economy. If you are healthy or wealthy or just plain lucky, then you will become more so. If you are poor or sick or catch a bad break, you will have nobody to look to for help. That’s not a recipe for sustained economic growth or the maintenance of a strong American middle class. It’s certainly not a recipe for social cohesion. It runs counter to those values that say we have a stake in each other’s success.
It’s not who we are as a people.
FORTUNATELY, THERE’S AN alternative approach, one that recasts FDR’s social compact to meet the needs of a new century. In each area where workers are vulnerable—wages, job loss, retirement, and health care—there are good ideas, some old and some new, that would go a long way toward making Americans more secure.
Let’s start with wages. Americans believe in work—not just as a means of supporting themselves but as a means of giving their lives purpose and direction, order and dignity. The old welfare program, Aid to Families with Dependent Children, too often failed to honor this core value, which helps explain not only its unpopularity with the public but also why it often isolated the very people it was supposed to help.
On the other hand, Americans also believe that if we work full-time, we should be able to support ourselves and our kids. For many people on the bottom rungs of the economy—mainly low-skilled workers in the rapidly growing service sector—this basic promise isn’t being fulfilled.
Government policies can help these workers, with little impact on market efficiency. For starters, we can raise the minimum wage. It may be true—as some economists argue—that any big jumps in the minimum wage discourage employers from hiring more workers. But when the minimum wage hasn’t been changed in nine years and has less purchasing power in real dollars than it did in 1955, so that someone working full-time today in a minimum-wage job doesn’t earn enough to rise out of poverty, such arguments carry less force. The Earned Income Tax Credit, a program championed by Ronald Reagan that provides low-wage workers with supplemental income through the tax code, should also be expanded and streamlined so more families can take advantage of it.
To help all workers adapt to a rapidly changing economy, it’s also time to update the existing system of unemployment insurance and trade adjustment assistance. In fact, there are a slew of good ideas out there on how to create a more comprehensive system of adjustment assistance. We could extend such assistance to service industries, create flexible education accounts that workers could use to retrain, or provide retraining assistance for workers in sectors of the economy vulnerable to dislocation before they lose their jobs. And in an economy where the job you lose often paid more than the new job you gain, we could also try the concept of wage insurance, which provides 50 percent of the difference between a worker’s old wage and his new wage for anywhere from one to two years.
Finally, to help workers gain higher wages and better benefits, we need once again to level the playing field between organized labor and employers. Since the early 1980s, unions have been steadily losing ground, not just because of changes in the economy but also because today’s labor laws—and the make-up of the National Labor Relations Board—have provided workers with very little protection. Each year, more than twenty thousand workers are fired or lose wages simply for trying to organize and join unions. That needs to change. We should have tougher penalties to prevent employers from firing or discriminating against workers involved in organizing efforts. Employers should have to recognize a union if a majority of employees sign authorization cards choosing the union to represent them. And federal mediation should be available to help an employer and a new union reach agreement on a contract within a reasonable amount of time.
Business groups may argue that a more unionized workforce will rob the U.S. economy of flexibility and its competitive edge. But it’s precisely because of a more competitive global environment that we can expect unionized workers to want to cooperate with employers—so long as they are getting their fair share of higher productivity.
Just as government policies can boost workers’ wages without hurting the competitiveness of U.S. firms, so can we strengthen their ability to retire with dignity. We should start with a commitment to preserve Social Security’s essential character and shore up its solvency. The problems with the Social Security trust fund are real but manageable. In 1983, when facing a similar problem, Ronald Reagan and House Speaker Tip O’Neill got together and shaped a bipartisan plan that stabilized the system for the next sixty years. There’s no reason we can’t do the same today.
With respect to the private retirement system, we should acknowledge that defined-benefit pension plans have been declining, but insist that companies fulfill any outstanding promises to their workers and retirees. Bankruptcy laws should be amended to move pension beneficiaries to the front of the creditor line so that companies can’t just file for Chapter 11 to stiff workers. Moreover, new rules should force companies to properly fund their pension funds, in part so taxpayers don’t end up footing the bill.
And if Americans are going to depend on defined-contribution plans like 401(k)s to supplement Social Security, then the government should step in to make them more broadly available to all Americans and more effective in encouraging savings. Former Clinton economic adviser Gene Sperling has suggested the creation of a universal 401(k), in which the government would match contributions made into a new retirement account by low- and moderate-income families. Other experts have suggested the simple (and cost-free) step of having employers automatically enroll their employees in their 401(k) plans at the maximum allowable level; people could still choose to contribute less than the maximum or not participate at all, but evidence shows that by changing the default rule, employee participation rates go up dramatically. As a complement to Social Security, we should take the best and most affordable of these ideas and begin moving toward a beefed-up, universally available pension system that not only promotes savings but gives all Americans a bigger stake in the fruits of globalization.
As vital as it may be to raise the wages of American workers and improve their retirement security, perhaps our most pressing task is to fix our broken health-care system. Unlike Social Security, the two main government-funded health-care programs—Medicare and Medicaid—really are broken; without any changes, by 2050 these two entitlements, along with Social Security, could grow to consume as large a share of our national economy as the entire federal budget does today. The addition of a hugely expensive prescription drug benefit that provides limited coverage and does nothing to control the cost of drugs has only made the problem worse. And the private system has evolved into a patchwork of inefficient bureaucracies, endless paperwork, overburdened providers, and dissatisfied patients.
In 1993, President Clinton took a stab at creating a system of universal coverage, but was stymied. Since then, the public debate has been deadlocked, with some on the right arguing for a strong dose of market discipline through Health Savings Accounts, others on the left arguing for a single-payer national health-care plan similar to those that exist in Europe and Canada, and experts across the political spectrum recommending a series of sensible but incremental reforms to the existing system.
It’s time we broke this impasse by acknowledging a few simple truths.
Given the amount of money we spend on health care (more per capita than any other nation), we should be able to provide basic coverage to every single American. But we can’t sustain current rates of health-care inflation every year; we have to contain costs for the entire system, including Medicare and Medicaid.
With Americans changing jobs more frequently, more likely to go through spells of unemployment, and more likely to work part-time or to be self-employed, health insurance can’t just run through employers anymore. It needs to be portable.
The market alone can’t solve our health-care woes—in part because the market has proven incapable of creating large enough insurance pools to keep costs to individuals affordable, in part because health care is not like other products or services (when your child gets sick, you don’t go shopping for the best bargain).
And finally, whatever reforms we implement should provide strong incentives for improved quality, prevention, and more efficient delivery of care.
With these principles in mind, let me offer just one example of what a serious health-care reform plan might look like. We could start by having a nonpartisan group like the National Academy of Science’s Institute of Medicine (IOM) determine what a basic, high-quality health-care plan should look like and how much it should cost. In designing this model plan, the IOM would examine which existing health-care programs deliver the best care in the most cost-effective manner. In particular, the model plan would emphasize coverage of primary care, prevention, catastrophic care, and the management of chronic conditions like asthma and diabetes. Overall, 20 percent of all patients account for 80 percent of the care, and if we can prevent diseases from occurring or manage their effects through simple interventions like making sure patients control their diets or take their medicines regularly, we can dramatically improve patient outcomes and save the system a great deal of money.
Next, we would allow anyone to purchase this model health-care plan either through an existing insurance pool like the one set up for federal employees, or through a series of new pools set up in every state. Private insurers like Blue Cross Blue Shield and Aetna would compete to provide coverage to participants in these pools, but whatever plan they offered would have to meet the criteria for high quality and cost controls set forth by the IOM.
To further drive down costs, we would require that insurers and providers who participate in Medicare, Medicaid, or the new health plans have electronic claims, electronic records, and up-to-date patient error reporting systems—all of which would dramatically cut down on administrative costs, and the number of medical errors and adverse events (which in turn would reduce costly medical malpractice lawsuits). This simple step alone could cut overall health-care costs by up to 10 percent, with some experts pointing to even greater savings.
With the money we save through increased preventive care and lower administrative and malpractice costs, we would provide a subsidy to low-income families who wanted to purchase the model plan through their state pool, and immediately mandate coverage for all uninsured children. If necessary, we could also help pay for these subsidies by restructuring the tax break that employers use to provide health care to their employees: They would continue to get a tax break for the plans typically offered to workers, but we could eliminate the tax break for fancy, gold-plated executive health-care plans that fail to provide any additional health benefits.
The point of this exercise is not to suggest that there's an easy formula for fixing our health-care system—there isn't. Many details would have to be addressed before we moved forward on a plan like the one outlined above; in particular, we would have to make sure that the creation of a new state pool does not cause employers to drop the health-care plans that they are already providing their employees. And there may be other, more cost-effective and elegant ways to improve the health-care system.
The point is that if we commit ourselves to making sure everybody has decent health care, there are ways to accomplish it without breaking the federal treasury or resorting to rationing.
If we want Americans to accept the rigors of globalization, then we will need to make that commitment.

One night five years ago, Michelle and I were awakened by the sound of our younger daughter, Sasha, crying in her room. Sasha was only three months old at the time, so it wasn't unusual for her to wake up in the middle of the night. But there was something about the way she was crying, and her refusal to be comforted, that concerned us. Eventually we called our pediatrician, who agreed to meet us at his office at the crack of dawn. After examining her, he told us that she might have meningitis and sent us immediately to the emergency room.
It turned out that Sasha did have meningitis, although a form that responded to intravenous antibiotics. Had she not been diagnosed in time, she could have lost her hearing or possibly even died. As it was, Michelle and I spent three days with our baby in the hospital, watching nurses hold her down while a doctor performed a spinal tap, listening to her scream, praying she didn’t take a turn for the worse.
Sasha is fine now, as healthy and happy as a five-year-old should be. But I still shudder when I think of those three days; how my world narrowed to a single point, and how I was not interested in anything or anybody outside the four walls of that hospital room—not my work, not my schedule, not my future. And I am reminded that unlike Tim Wheeler, the steelworker I met in Galesburg whose son needed a liver transplant, unlike millions of Americans who’ve gone through a similar ordeal, I had a job and insurance at the time.
Americans are willing to compete with the world. We work harder than the people of any other wealthy nation. We are willing to tolerate more economic instability and are willing to take more personal risks to get ahead. But we can only compete if our government makes the investments that give us a fighting chance—and if we know that our families have some net beneath which they cannot fall.
That’s a bargain with the American people worth making.
INVESTMENTS TO MAKE America more competitive, and a new American social compact—if pursued in concert, these broad concepts point the way to a better future for our children and grandchildren. But there’s one last piece to the puzzle, a lingering question that presents itself in every single policy debate in Washington.
How do we pay for it?
At the end of Bill Clinton’s presidency, we had an answer. For the first time in almost thirty years, we enjoyed big budget surpluses and a rapidly declining national debt. In fact, Federal Reserve Chairman Alan Greenspan expressed concern that the debt might get paid down too fast, thereby limiting the Reserve System’s ability to manage monetary policy. Even after the dot-com bubble burst and the economy was forced to absorb the shock of 9/11, we had the chance to make a down payment on sustained economic growth and broader opportunity for all Americans.
But that’s not the path we chose. Instead, we were told by our President that we could fight two wars, increase our military budget by 74 percent, protect the homeland, spend more on education, initiate a new prescription drug plan for seniors, and initiate successive rounds of massive tax cuts, all at the same time. We were told by our congressional leaders that they could make up for lost revenue by cutting out government waste and fraud, even as the number of pork barrel projects increased by an astonishing 64 percent.
The result of this collective denial is the most precarious budget situation that we’ve seen in years. We now have an annual budget deficit of almost $300 billion, not counting more than $180 billion we borrow every year from the Social Security Trust Fund, all of which adds directly to our national debt. That debt now stands at $9 trillion—approximately $30,000 for every man, woman, and child in the country.
It’s not the debt itself that’s most troubling. Some debt might have been justified if we had spent the money investing in those things that would make us more competitive—overhauling our schools, or increasing the reach of our broadband system, or installing E85 pumps in gas stations across the country. We might have used the surplus to shore up Social Security or restructure our health-care system. Instead, the bulk of the debt is a direct result of the President’s tax cuts, 47.4 percent of which went to the top 5 percent of the income bracket, 36.7 percent of which went to the top 1 percent, and 15 percent of which went to the top one-tenth of 1 percent, typically people making $1.6 million a year or more.
In other words, we ran up the national credit card so that the biggest beneficiaries of the global economy could keep an even bigger share of the take.
So far we’ve been able to get away with this mountain of debt because foreign central banks—particularly China’s—want us to keep buying their exports. But this easy credit won’t continue forever. At some point, foreigners will stop lending us money, interest rates will go up, and we will spend most of our nation’s output paying them back.
If we’re serious about avoiding such a future, then we’ll have to start digging ourselves out of this hole. On paper, at least, we know what to do. We can cut and consolidate nonessential programs. We can rein in spending on health-care costs. We can eliminate tax credits that have outlived their usefulness and close loopholes that let corporations get away without paying taxes. And we can restore a law that was in place during the Clinton presidency—called Paygo—that prohibits money from leaving the federal treasury, either in the form of new spending or tax cuts, without some way of compensating for the lost revenue.
If we take all of these steps, emerging from this fiscal situation will still be difficult. We will probably have to postpone some investments that we know are needed to improve our competitive position in the world, and we will have to prioritize the help that we give to struggling American families.
But even as we make these difficult choices, we should ponder the lesson of the past six years and ask ourselves whether our budgets and our tax policy really reflect the values that we profess to hold.
“IF THERE’S CLASS warfare going on in America, then my class is winning.”
I was sitting in the office of Warren Buffett, chairman of Berkshire Hathaway and the second richest man in the world. I had heard about the famous simplicity of Buffett’s tastes—how he still lived in the same modest home that he’d bought in 1967, and how he had sent all his children to the Omaha public schools.
Still, I had been a little surprised when I walked into a nondescript office building in Omaha and entered what looked like an insurance agent’s office, with mock wood paneling, a few decorative pictures on the wall, and no one in sight. “Come on back,” a woman’s voice had called out, and I’d turned the corner to find the Oracle of Omaha himself, chuckling about something with his daughter, Susie, and his assistant, Debbie, his suit a bit rumpled, his bushy eyebrows sticking out high over his glasses.
Buffett had invited me to Omaha to discuss tax policy. More specifically, he wanted to know why Washington continued to cut taxes for people in his income bracket when the country was broke.
“I did a calculation the other day,” he said as we sat down in his office. “Though I’ve never used tax shelters or had a tax planner, after including the payroll taxes we each pay, I’ll pay a lower effective tax rate this year than my receptionist. In fact, I’m pretty sure I pay a lower rate than the average American. And if the President has his way, I’ll be paying even less.”
Buffett’s low rates were a consequence of the fact that, like most wealthy Americans, almost all his income came from dividends and capital gains, investment income that since 2003 has been taxed at only 15 percent. The receptionist’s salary, on the other hand, was taxed at almost twice that rate once FICA was included. From Buffett’s perspective, the discrepancy was unconscionable.
“The free market’s the best mechanism ever devised to put resources to their most efficient and productive use,” he told me. “The government isn’t particularly good at that. But the market isn’t so good at making sure that the wealth that’s produced is being distributed fairly or wisely. Some of that wealth has to be plowed back into education, so that the next generation has a fair chance, and to maintain our infrastructure, and provide some sort of safety net for those who lose out in a market economy. And it just makes sense that those of us who’ve benefited most from the market should pay a bigger share.”
We spent the next hour talking about globalization, executive compensation, the worsening trade deficit, and the national debt. He was especially exercised over Bush’s proposed elimination of the estate tax, a step he believed would encourage an aristocracy of wealth rather than merit.
“When you get rid of the estate tax,” he said, “you’re basically handing over command of the country’s resources to people who didn’t earn it. It’s like choosing the 2020 Olympic team by picking the children of all the winners at the 2000 Games.”
Before I left, I asked Buffett how many of his fellow billionaires shared his views. He laughed.
“I’ll tell you, not very many,” he said. “They have this idea that it’s ‘their money’ and they deserve to keep every penny of it. What they don’t factor in is all the public investment that lets us live the way we do. Take me as an example. I happen to have a talent for allocating capital. But my ability to use that talent is completely dependent on the society I was born into. If I’d been born into a tribe of hunters, this talent of mine would be pretty worthless. I can’t run very fast. I’m not particularly strong. I’d probably end up as some wild animal’s dinner.
“But I was lucky enough to be born in a time and place where society values my talent, and gave me a good education to develop that talent, and set up the laws and the financial system to let me do what I love doing—and make a lot of money doing it. The least I can do is help pay for all that.”
It may be surprising to some to hear the world’s foremost capitalist talk in this way, but Buffett’s views aren’t necessarily a sign of a soft heart. Rather, they reflect an understanding that how well we respond to globalization won’t be just a matter of identifying the right policies. It will also have to do with a change in spirit, a willingness to put our common interests and the interests of future generations ahead of short-term expediency.
More particularly, we will have to stop pretending that all cuts in spending are equivalent, or that all tax increases are the same. Ending corporate subsidies that serve no discernible economic purpose is one thing; reducing health-care benefits to poor children is something else entirely. At a time when ordinary families are feeling hit from all sides, the impulse to keep their taxes as low as possible is honorable and right. What’s less honorable has been the willingness of the rich and the powerful to ride this antitax sentiment for their own purposes, or the way the President, Congress, lobbyists, and conservative commentators have been able to successfully conflate in the mind of voters the very real tax burdens of the middle class and the very manageable tax burdens of the wealthy.
Nowhere has this confusion been more evident than in the debate surrounding the proposed repeal of the estate tax. As currently structured, a husband and wife can pass on $4 million without paying any estate tax; in 2009, under current law, that figure goes up to $7 million. For this reason, the tax currently affects only the wealthiest one-half of 1 percent of the population, and will affect only one-third of 1 percent in 2009. And since completely repealing the estate tax would cost the U.S. Treasury around $1 trillion, it would be hard to find a tax cut that was less responsive to the needs of ordinary Americans or the long-term interests of the country.
Nevertheless, after some shrewd marketing by the President and his allies, 70 percent of the country now opposes the “death tax.” Farm groups come to visit my office, insisting that the estate tax will mean the end of the family farm, despite the Farm Bureau’s inability to point to a single farm in the country lost as a result of the “death tax.” Meanwhile, I’ve had corporate CEOs explain to me that it’s easy for Warren Buffett to favor an estate tax—even if his estate is taxed at 90 percent, he could still have a few billion to pass on to his kids—but that the tax is grossly unfair to those with estates worth “only” $10 or $15 million.
So let’s be clear. The rich in America have little to complain about. Between 1971 and 2001, while the median wage and salary income of the average worker showed literally no gain, the income of the top hundredth of a percent went up almost 500 percent. The distribution of wealth is even more skewed, and levels of inequality are now higher than at any time since the Gilded Age. These trends were already at work throughout the nineties. Clinton’s tax policies simply slowed them down a bit. Bush’s tax cuts made them worse.
I point out these facts not—as Republican talking points would have it—to stir up class envy. I admire many Americans of great wealth and don’t begrudge their success in the least. I know that many if not most have earned it through hard work, building businesses and creating jobs and providing value to their customers. I simply believe that those of us who have benefited most from this new economy can best afford to shoulder the obligation of ensuring every American child has a chance for that same success. And perhaps I possess a certain Midwestern sensibility that I inherited from my mother and her parents, a sensibility that Warren Buffett seems to share: that at a certain point one has enough, that you can derive as much pleasure from a Picasso hanging in a museum as from one that’s hanging in your den, that you can get an awfully good meal in a restaurant for less than twenty dollars, and that once your drapes cost more than the average American’s yearly salary, then you can afford to pay a bit more in taxes.
More than anything, it is that sense—that despite great differences in wealth, we rise and fall together—that we can’t afford to lose. As the pace of change accelerates, with some rising and many falling, that sense of common kinship becomes harder to maintain. Jefferson was not entirely wrong to fear Hamilton’s vision for the country, for we have always been in a constant balancing act between self-interest and community, markets and democracy, the concentration of wealth and power and the opening up of opportunity. We’ve lost that balance in Washington, I think. With all of us scrambling to raise money for campaigns, with unions weakened and the press distracted and lobbyists for the powerful pressing their full advantage, there are few countervailing voices to remind us of who we are and where we’ve come from, and to affirm our bonds with one another.
That was the subtext of a debate in early 2006, when a bribery scandal triggered new efforts to curb the influence of lobbyists in Washington. One of the proposals would have ended the practice of letting senators fly on private jets at the cheaper first-class commercial rate. The provision had little chance of passage. Still, my staff suggested that as the designated Democratic spokesperson on ethics reform, I should initiate a self-imposed ban on the practice.
It was the right thing to do, but I won’t lie; the first time I was scheduled for a four-city swing in two days flying commercial, I felt some pangs of regret. The traffic to O’Hare was terrible. When I got there, the flight to Memphis had been delayed. A kid spilled orange juice on my shoe.
Then, while I was waiting in line, a man came up to me, maybe in his mid-thirties, dressed in chinos and a golf shirt, and told me that he hoped Congress would do something about stem cell research this year. I have early-stage Parkinson's disease, he said, and a son who's three years old. I probably won't ever get to play catch with him. I know it may be too late for me, but there's no reason somebody else has to go through what I'm going through.
These are the stories you miss, I thought to myself, when you fly on a private jet.