One of the greatest mysteries about the American economy right now is why workers don’t seem to be getting all that much better at their jobs over time. From 2007 to 2016, productivity in the U.S. grew at about 1 percent a year, a historically low rate. In other recent periods, it’s been much higher: 2.6 percent from 2000 to 2007 and 2.2 percent in the 1990s.
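Those differences look small, but they compound. A quick back-of-the-envelope calculation, using the growth rates above and an illustrative 10-year horizon, shows how far apart the trajectories end up:

```python
# Cumulative productivity gain after a decade of compounding at the
# annual growth rates cited above: 1.0%, 2.2%, and 2.6% per year.
def cumulative_gain(annual_rate_pct, years=10):
    """Total percent gain after compounding for `years` years."""
    return (1 + annual_rate_pct / 100) ** years * 100 - 100

for rate in (1.0, 2.2, 2.6):
    print(f"{rate:.1f}%/yr for 10 years -> {cumulative_gain(rate):.1f}% total gain")
# 1.0%/yr for 10 years -> 10.5% total gain
# 2.2%/yr for 10 years -> 24.3% total gain
# 2.6%/yr for 10 years -> 29.3% total gain
```

At the post-2007 pace, a decade of growth delivers less than half the cumulative gain of the 1990s pace.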

The slowdown in productivity is worrying for the U.S. economy in the long run, and scholars and economists have proposed a host of explanations for it, from a lack of innovation to weakening investment to the possibility that it’s a statistical mirage.

A new report by the left-leaning Economic Policy Institute (EPI) puts forward another theory. Josh Bivens, the director of research at the EPI and the author of the paper, argues that the shortfall in spending (or demand) by households, governments, and businesses has held back the kinds of big-idea investments by American companies that drive increased productivity. These investments can include a company improving its workforce, such as by implementing training programs that help employees become more productive or hiring more experienced workers; investing in equipment so workers can do their jobs better; or researching technological advancements.

“It’s really hard to get a handle on what the underlying trend in the growth rate of productivity is by looking at just the last couple of years,” says Bivens. “If you start to think about what has been strange about the last five or six years in the economy that has coincided with this deceleration of productivity, it’s really obvious: We’re still very scarred from the fallout from the Great Recession.” The report found a correlation between the two: When demand goes up, so does investment. Because the recovery has been so slow, spending has never surged to the levels that would push companies to make productivity-improving investments in order to catch up with demand.

Bivens also considered a related theory: that the lack of demand in recent years has slowed wage growth for American workers, and that relatively cheap labor means businesses haven’t had an incentive to invest in technology or equipment that would make their workers more productive. It’s when labor becomes more expensive, Bivens believes, that companies will invest: “There’s a strong statistical relationship between real wage growth starting to pick up and businesses starting to get serious about investing in productivity-enhancing technology.”

In some ways, this is part of the argument made by business owners who oppose minimum-wage measures and warn that higher wages would mean job losses: When labor gets expensive, employers might opt to invest in robots instead. But according to Bivens, worrying about both slow productivity and workers being displaced by automation doesn’t make much sense. “I find it ironic that on the one hand, you read the economic statistics about how slow productivity growth is and how weak capital investment is. Then at the same time, you’ll read a lot of stories about how we should be worried about robots putting a lot of pressure on jobs. Those are diametrically opposed. You really have to pick one or the other to worry about,” says Bivens.

The productivity argument, and whether the U.S. economy still has so-called “room to run,” is particularly relevant this week, as the Federal Reserve’s policymakers are expected to raise interest rates at their March meeting. Rates went up a quarter of a percentage point in December, and the financial world will be watching closely for another hike, since the Fed has kept interest rates historically low since 2007, when it began lowering them in response to the recession.

The impending increase has brought up the perennial question of whether the Fed could be making such a move too early or too late. Monetary policy is the central bank’s means of changing the cost and availability of money, and for the country’s central bankers, tasked with timing these decisions, the stakes are high: Raising rates too early can hurt a fragile economy, slowing it and dampening recovery efforts. Raising rates too late means money stays too cheap for too long, which can overheat an economy, leading to asset-price bubbles and inflation.

Productivity is part of the Fed’s calculus; Fed Chair Janet Yellen touched on it briefly in recent remarks explaining why the Fed has raised interest rates more slowly over the last few years than had been anticipated. “These reassessments reflected, in part, the persistence of surprisingly sluggish productivity growth—both in the United States and abroad—and suggested that fewer federal-funds rate increases would be necessary than previously thought to scale back accommodation.” The Fed has been worried about productivity growth, which factors into its thinking about how long to keep interest rates low in order to push employment and wage growth up and to keep the economy from getting stuck in stagnation.

Bivens believes that, given sluggish productivity growth, the risk of waiting longer to raise rates may be worth taking. Fed policymakers currently project three rate hikes in 2017. He acknowledges that he could be wrong (forecasting macroeconomic indicators is notoriously hard), but he believes the upside of being right, the possibility of spurring productivity growth, outweighs the downside of being wrong, which is rising inflation.

“The key thing is that the asymmetry of risks of being wrong on this issue weighs in really strongly on the policy point of trying to boost growth more. If I’m wrong, and we’re really locked in an era of low productivity ... we’ll get a couple years of inflation,” he says. “But we’ve just had six years of below-target inflation, so it’s really hard to see a downside in aggressively going for growth in the next couple of years.”