Arguably the biggest concern that employers and managers have with remote work is how to make sure their employees are really putting in the full 40, 50, or 60 hours per week. If they can’t see their employees, how do they know they’re actually at their desks?
But what if this is the wrong way to look at it? What if what matters most is output?
The Obsession With Hours Worked
For years, employers have rewarded the employees who worked the most hours.
For example, say you work in a widget factory and can produce 10 widgets per hour. Work 40 hours and you produce 400 widgets; work 70 and you produce 700. For the company, 700 widgets means more revenue and more profit, so the manager rewards the employee who produces 700 widgets with a raise and a promotion to keep them around. And the employee who works only 30 hours and produces 300 widgets is shown the door.
This makes sense in the widget factory, but in real life many employers stopped counting the actual widgets produced, because output isn’t always so black and white (how do you count widgets for a manager, or for any role that’s hard to measure?). Instead, they treated hours worked as a proxy for productivity: if you worked 30 hours, you must have produced only 300 widgets.
Or, in the real world, did Melissa stay later than Bob each day this week? She must be more productive and deserving of a promotion. Right?
Output Moves the Needle
Yes and no. If an employee works 5 hours per week, they probably need to be shown the door.
But what if a hypothetical employee in the previous example works 30 hours per week yet produces 800 widgets? Hours on their own don’t do anything for your company.
If someone is sitting at their desk twiddling their thumbs for 70 hours per week, they are contributing nothing and are far less valuable than someone who works 30 incredibly focused, productive hours.
More Hours = Less Productive?
In addition, even if an employee is productive every hour, more hours don’t necessarily translate into more output. That sounds confusing, so let me explain. The Economist recently conducted a study and found that, even for productive employees, each hour beyond 50 per week yields marginally less output. By their numbers, an employee working 70 hours per week produces no more than one working 56 hours per week. At higher levels, more hours are not the answer to more productivity.
To find more productivity, you have to look to other sources.
How Does This Apply To Remote Work?
So when managers look at remote employees and ask how they can ensure a full 40 hours of work per week, they’re asking the wrong question. They should be asking: how can I track an employee’s output to make sure they keep contributing to the company’s progress? And once managers reframe how they measure employee productivity, they should embrace remote work.
Why, you ask? Well, remote workers are more productive than their on-site counterparts (i.e., they produce more widgets), which moves the needle for your company faster.
And if an employee already works 56 hours per week at “maximum productivity,” then rather than asking them to work more, encouraging them to work remotely and become more productive per hour might be the only way to increase their output further.
What Do You Think?
What do you think? Has your thinking shifted? Or am I wrong and hours are still important? Have you tried remote work in your company?
We’d love to hear from you in the comments section!
Article by Will Zimmerman. Image via Pexels.com