The free flow of employees and ideas encourages innovation. Just ask Silicon Valley.
Noncompete agreements, once restricted to tech workers and high-level executives, have become commonplace among ordinary workers, including security guards, home health aides and hair stylists. If history is any guide, the spread of these contracts could have far-reaching negative implications for the U.S. economy.
Critics oppose requiring ordinary, low-skill workers to comply with these restrictions, arguing that such covenants should be reserved for well-compensated employees who benefit from specialized training and investment. But the history of these covenants suggests that there’s a strong economic case for banning them entirely.
The agreements, known as NCAs, forbid workers from taking valuable skills acquired from one employer to a competing firm. They first appeared in the Middle Ages, when master artisans required them of apprentices because they didn’t want to face direct competition once their protégés set up shop on their own. Courts eventually sanctioned these restraints, provided they didn’t harm the public interest, establish a monopoly or unduly restrain an employee’s right to work.
But the trend toward wider use of the contracts, which gathered steam from the late 18th century onward, conveniently ignored that they originally applied to skilled laborers operating in a pre-capitalist society. Employers increasingly used noncompete clauses to limit the mobility not just of skilled workers but of unskilled wage laborers as well.
In Great Britain, courts generally endorsed NCAs so long as they remained “reasonable” — a quality that was very much in the eye of the beholder. In the U.S., courts approached NCAs in much the same way through the 19th and 20th centuries, often upholding them, but occasionally voiding them if they seemed too, well, unreasonable.