The title question sets out the puzzle of the day for your consideration. Some of our old assumptions about capitalism, competition and the economy don’t seem to be operating the way they did in the past, or in keeping with the playbook fiscal conservatives tend to follow. The real question is: why?
Traditionally, we expect that when the economy collapses, as it did in 2007, there are massive layoffs and the unemployment rate rises. Fewer jobs are available, so the available labor pool is swollen. Employers feel no pressure to offer better compensation packages because there are so many qualified people out there looking for work. That’s precisely what happened throughout the Great Recession.
But when the economy bounces back, as it now has, unemployment drops. The available pool of workers shrinks. In theory, employers should be competing with each other for the best talent and offering increasing compensation to attract it. But wage growth remains stuck at the same dismal pace we’ve been seeing since the end of the tech boom in the nineties. For the year ending this past April, overall hourly wage growth was 0.4%, and for workers in the middle and lower middle class it was only 0.1%. What’s gone wrong? Robert VerBruggen has a lengthy think piece at National Review which examines this question and offers a few possible answers.
Writing in 2014, Barry P. Bosworth of the Brookings Institution offered a handy breakdown of the “primary determinants” of wage growth: “(1) gains in labor productivity, (2) the division of earned income between labor and capital (profits), and (3) the allocation of labor compensation among wages and nonwage benefits.”
Productivity growth is the ultimate root of wage growth: If workers aren’t producing more output for each hour they work, employers cannot raise hourly wages without also raising prices, which eats away the wage gain through inflation. And productivity growth has been abysmal for the decade since the end of the tech boom. Even more troubling, the tech boom aside, it’s been flagging for much of the past 40 years.
Explanations for the falloff abound. Some say it’s a statistical artifact of one kind or another. Maybe the numbers are failing to capture the benefits of major technological gains that don’t cost a thing to the user (and thus produce no measurable “output”), such as Internet search engines. Maybe the data are thrown off by the increasing tendency of companies to stash profits overseas, removing them from the measured output of American workers. Maybe people are spending more time goofing off in the workplace, and their productivity during the time they actually work has gone up.
There’s plenty more to read, but it does center on those three factors and how they are being interpreted differently today by both analysts and the employers themselves. I can’t argue with most of it. If productivity has essentially peaked and automation is augmenting that productivity with non-human labor, justifying paying more may be difficult. The entire question of division of revenue between wages and profits speaks to a corporate culture which has been evolving since at least the 70s.
One of the biggest factors on the list would seem to be the rising cost of the benefits workers expect, which eats away at the amount of cash that can be offered as actual wages. Health insurance is the biggest culprit, but government mandates for other forms of benefits share some of the blame as well.
But I would offer another, less tangible factor, while realizing that it’s hard to plug into a formula for analytical purposes. It seems to me that the general mental attitude of jobseekers has changed significantly in my lifetime. There was a time when workers looked for a good job with a reliable company that would be around for the long run, and they had a reasonable expectation that they might stay at that job until retirement. But the loyalty on both sides of that equation has faded almost entirely. Companies are more likely to come and go, and they are quicker to lay people off during slow times and replace the workforce later as needed. Workers, in turn, see jumping from job to job, and even moving to different parts of the country for better compensation, as a viable route to advancement. In short, if there are jobs out there, workers are far less loyal to the company and more likely to leave for greener pastures.
Why does that matter? Because when you suddenly find yourself unemployed, even at a time such as this when unemployment is low, it’s not zero. There are other people out there interviewing for pretty much any job you apply for. Let’s say you were earning around $25 per hour, or roughly $50K per year, for a fairly high-skill job (a bit above the national average). The maximum unemployment benefit you can receive varies from state to state, but it’s generally not anywhere near that much. The highest I see is New Jersey’s ($677 per week), but many states, like Alabama, are down in the $250-per-week range. You’re probably not going to be getting by for very long on that.
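To put rough numbers on that gap, here’s a back-of-the-envelope sketch using only the figures mentioned above (the $25/hour wage, New Jersey’s $677 weekly cap, and Alabama’s roughly $250). It simply assumes a standard 40-hour week; the benefit figures are illustrative, not a complete state-by-state table.

```python
# Back-of-the-envelope comparison: a full-time wage vs. weekly
# unemployment benefit caps, using the figures cited in the text.

hourly_wage = 25.00
weekly_wage = hourly_wage * 40  # $1,000/week, roughly $50K/year

# Illustrative weekly benefit caps from the text (not exhaustive)
benefit_caps = {
    "New Jersey": 677,  # highest cap cited
    "Alabama": 250,     # roughly, per the text
}

for state, cap in benefit_caps.items():
    replacement = cap / weekly_wage  # fraction of the old paycheck
    print(f"{state}: ${cap}/week replaces {replacement:.0%} "
          f"of a ${weekly_wage:,.0f}/week wage")
```

Even in the most generous state cited, the check covers only about two-thirds of that paycheck, and in low-cap states it’s closer to a quarter, which is the squeeze that drives the bargaining dynamic below.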
So if you go in for the interview and the prospective employer offers you $16 per hour instead of the $25 you were expecting, there’s a fairly good chance you’ll take it, because it’s still a lot better than unemployment checks. And if you try to push too hard for more money, there are others out there who will likely take the job. Part of that ties back into the shifting corporate attitude about the division of labor and profits, but it seems there’s also been a drop in jobseekers’ confidence to push for those higher wages. If employers know that, they’re obviously going to offer the least they can in order to maximize profits.
Anyway, those are just my musings on the subject. Read the rest of VerBruggen’s article for a more detailed analysis. This isn’t good news by any means (at least for workers), but if we can understand it better, perhaps a solution can be found.