The First Contract Was a Promise to Shut Up
In 2100 BCE, a Sumerian scribe named Ur-Nanshe made a deal that would sound familiar to anyone who's ever worked at a tech startup. In exchange for access to the royal granary records, he swore an oath: if he spoke of what he learned, his tongue would be cut out and fed to the palace dogs. No lawyers, no arbitration clauses — just a simple understanding that some information was worth more than the person who knew it.
The cuneiform tablet recording this arrangement sits in the British Museum today, a 4,000-year-old reminder that the urge to control information is as old as information itself. What's remarkable isn't that ancient kings wanted to keep secrets — it's that they figured out the exact same psychological pressure points that modern corporations use to make employees sign NDAs.
Ur-Nanshe swore that oath for the same reason a Google engineer signs a non-disclosure agreement today: because the alternative was worse than the risk. The psychology hasn't changed. Only the consequences have become more civilized.
Fear Works Better Than Money
Ancient Egyptian temple records show that priests weren't just bound by religious vows — they signed detailed contracts specifying exactly what they couldn't discuss about temple finances, political alliances, and trade relationships. The penalties ranged from exile to having their names chiseled off every monument in the kingdom, effectively erasing them from history.
These weren't desperate people with no options. Temple priests were the educated elite of their society, the equivalent of today's tech workers or financial analysts. They had leverage, connections, and alternative career paths. Yet they consistently agreed to terms that modern employment lawyers would call unconscionable.
The reason? The same cognitive bias that makes contemporary employees sign NDAs that restrict their ability to work elsewhere: loss aversion. The human brain is wired to fear losing what we have more than we value gaining something equivalent. Ancient temple administrators understood this. They made the cost of speaking up higher than the cost of staying quiet, then let human psychology do the rest.
Roman merchant guilds perfected this approach. Guild members signed contracts that bound them to secrecy about trade routes, supplier relationships, and pricing strategies. The penalty for violation wasn't just expulsion from the guild — it was blacklisting from every guild in the empire. One broken promise could end a career across three continents.
Sound familiar? It should. Modern NDAs work the same way, just with less dramatic language.
The Insiders Always Cave
What's fascinating about ancient secrecy contracts is how consistently people honored them, even when the information they were hiding was clearly in the public interest. Babylonian tax collectors knew their assessment methods were corrupt. Roman engineers knew which bridges were built with substandard materials. Medieval guild masters knew which merchants were adulterating their products.
They stayed quiet anyway.
Not because they were cowards, but because the same psychological mechanisms that keep modern employees from speaking up about workplace problems were already fully operational 3,000 years ago. The bystander effect, authority bias, and social proof all conspired to make silence feel like the rational choice.
Archaeological evidence suggests that information did leak, but almost always through unofficial channels — gossip, rumors, and anonymous accusations rather than direct testimony. The formal secrecy systems worked not because they prevented all leaks, but because they channeled dissent into forms that were easy to dismiss or ignore.
This is why modern NDAs are so effective even when they're legally questionable. They don't need to win in court. They just need to make speaking up feel riskier than staying quiet.
The More Things Change
The language has evolved, but the basic transaction remains identical: access to valuable information in exchange for a promise not to share it, backed by consequences designed to make the promise feel binding. Whether it's a Mesopotamian grain inspector or a Facebook data scientist, the psychological calculation is the same.
What's changed is our ability to pretend this is about protecting legitimate business interests rather than controlling people. Ancient kings were honest about what they were doing — buying silence through fear. Modern corporations dress it up in legal language about trade secrets and competitive advantage, but the underlying mechanism is unchanged.
The most successful secrecy systems in history weren't the ones with the harshest penalties. They were the ones that made signing feel like joining an exclusive club rather than surrendering a right. Ancient Persian court officials didn't just promise secrecy — they were inducted into a brotherhood of trusted insiders. The oath was a privilege, not a burden.
Every tech company that talks about being a "family" or having a "mission" is running the same playbook. Make the NDA feel like membership rather than muzzling, and people will sign it willingly.
Why We Keep Falling for It
Five thousand years of evidence suggests that humans will always choose the illusion of insider status over the reality of free speech. We're social creatures who crave belonging more than we value abstract principles. Ancient rulers understood this. Modern executives understand this. The people signing the agreements, then and now, understand this too — they just hope the trade-off will be worth it.
It almost never is. But that doesn't stop us from making it, again and again, because the psychology that drives these decisions is older than writing itself. We've been trading our voices for access since before we had words for either concept.
The cuneiform tablets prove it. The NDAs in your email prove it. The pattern is unbroken, and unless human nature fundamentally changes, it always will be.