President Obama recently unveiled his jobs proposal before Congress. Known as the American Jobs Act, the proposal would cut payroll taxes paid by both American businesses and American workers, invest in education and infrastructure to create jobs and prevent public sector layoffs, and offer tax credits to employers for hiring long-term unemployed workers, among other provisions. The question is: will it work?
According to the Bureau of Labor Statistics, unemployment remains high at 9.1%. That’s 14 million Americans looking for work. The real unemployment rate, including discouraged workers, may be much higher. Some groups suffer even greater rates of unemployment: Hispanics at 11.3%, blacks at 16.7%, and teenagers at 25.4%. Will the American Jobs Act reduce these numbers?
Tax relief is a major part of the President’s proposal, and undoubtedly calculated to appeal to Republican legislators in Congress. In addition to cutting payroll taxes on employers and employees, the President proposes a $4,000 tax credit to employers for hiring the long-term unemployed. The idea behind a tax credit is to reduce the cost of labor for employers who would like to hire but can’t afford to. The credit indirectly subsidizes those hires and helps create jobs. If successful, the cost of the credit is defrayed by the taxes new workers pay and the money they spend into the economy.
There is a growing body of research indicating that tax credits don’t create jobs, and that when they do, they are economically inefficient, resulting in a loss of public money for infrastructure and education without a corresponding benefit in job creation. A Federal Reserve Bank of Boston report reviewing available studies on tax credits concluded that the evidence on the effectiveness and efficiency of state tax credits was mixed. Academic analysis of job creation tax credit programs suggests that between 70% and 80% of the credits granted to employers are awarded for jobs that would have been created without the credit. In other words, only 20-30% of tax credits can be directly linked to new job creation. In some cases, virtually none of the tax credit money could be traced to job creation.
The reasons for this vary, but the research suggests that many employers, especially smaller businesses, are unaware of tax credits until they meet with their tax preparer at the end of the year. Only then are new hires claimed for tax benefits. Similarly, there is evidence that larger firms are relatively indifferent to job tax credits. In either case, the tax credit does not directly affect hiring decisions. Instead, allowing employers to claim credits for every new hire is a windfall that may inadvertently subsidize natural turnover in the labor market. Even during recessions, businesses hire workers to replace retiring or otherwise departing workers.
Research also suggests that tax credits don’t generate sustainable employment – employers who factor tax credits into hiring decisions are committed to the subsidy, not the employee. Employers may terminate newly hired workers after the subsidy lapses, only to replace them with other subsidized workers. Although many tax credit programs feature clawbacks or accountability measures designed to keep employers from churning through subsidized workers, such measures are costly to monitor and enforce.
The more targeted the tax credit subsidy, the more likely it is to influence firm behavior. However, research also suggests that the more complicated the tax credit is to administer, the less likely the credit is to produce beneficial effects. Employers may not only be unaware of the tax credit, but they may be reluctant to take on additional paperwork responsibilities to administer it. The more conditions on the credit, such as targeting the long-term unemployed, the less attractive the credit may be to potential employers.
Aside from the flaws in job creation tax credits, there is a larger problem: American businesses aren’t hiring American workers. This has less to do with economic uncertainty than with a new economic reality. American firms generated $1.68 trillion in profit in the last quarter of 2010, and these profits grew by another $50 billion in the first quarter of 2011. However, rather than hire new employees or expand production in the United States, they are hoarding cash, increasing dividends, buying back their own shares, or investing in R&D and production in other countries. American companies, especially our major corporations, are making “plenty of money; they just don’t spend it on workers here” when those jobs can be easily outsourced to cheaper labor markets or invested in plant and production overseas. This was true even before the recession.
From 1990 to 2008, American companies that were largely confined to the US market or were immune to global competition (such as retailers and hotels) were the primary sources of job growth. In contrast, companies that conducted business in global markets, particularly manufacturers, banks, exporters, and financial services and energy firms did not meaningfully contribute to job growth. Only the “knowledge-based work of design and marketing” and similar high value-added endeavors are actually “done by the company that owns the brand.” The consequences have been profound.
America’s largest corporations employ, proportionally, half as many employees as they did in 1970, the era of the vertically integrated corporation. In fact, throughout the 1990s, an era of economic growth, Fortune 500 companies erased more jobs than they created. Recently, Apple briefly surpassed Exxon as America’s most valuable company. And yet the grocery store chain Kroger employs ten times as many workers in the United States as Apple employs worldwide. Companies that are enormous in terms of revenue and market capitalization may hire few American workers. To illustrate the point: according to recent reports, Apple has more cash on hand than the U.S. Government.
The provisions of the American Jobs Act pertaining to infrastructure investment and education may hold out the best hope not simply of creating jobs, but of preventing further job cuts. Nearly every state in the union is in the midst of a fiscal crisis. Governors and legislatures across the United States are planning for a future in which 44 states and the District of Columbia confront a nearly $140 billion combined budget deficit for FY 2012. Ironically, the states’ fiscal situation is worse now, even though absolute budget deficits are falling, because stimulus money has largely been spent. FY 2010 saw a $200 billion combined state budget deficit, but the Recovery Act and other federal aid helped offset nearly $123 billion of it – over 60% of the gap. Now that the Recovery Act has wound down, states are being forced to find greater cost savings on their own. Cuts to education, Medicaid, health care, and food stamps are expected at a time when such services face growing demand. The American Jobs Act may help plug some of these state deficits.
Ultimately, the American Jobs Act is necessary not because it is well tailored to stimulate job growth, but because it is better than doing nothing. Economists believe federal tax credits are a far more efficient tool than state-level tax credits, since states not only compete against each other but operate in zero-sum fiscal environments, where any loss of tax revenue must be offset by spending cuts elsewhere. It falls to the federal government to take bold action and help lead us out of the recession. The American Jobs Act is not the solution, but it is a step in the right direction.