Commercial software often comes saddled with undesirable features. Microsoft Office, for example, has an annoying pop-up screen that prevents me from using the software unless I first purchase a “product key” to “activate” it.
Microsoft’s copy protection is a self-enforcing contract: In order to gain access to the software, I must pay $149.99 for a license. Licensing is administered by the product key, which activates the software.
A contract is only as good as its enforceability, and commercial copy-protection has long been terrible at self-enforcement.
In the early days, distributors copy-protected their disks by introducing intentional bad sectors. The floppy disk’s boot loader would look for errors at startup, and only load the software if the intentional errors occurred. Copied disks didn’t have bad sectors, so they couldn’t generate the expected errors.
This stopped piracy for about five minutes, or however long it took to delete three lines from the boot loader.
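The bad-sector check can be sketched in a few lines. This is a hypothetical reconstruction in Python; the real check lived in a few lines of boot-loader assembly, and all names here are invented for illustration.

```python
# Hypothetical sketch of a bad-sector copy-protection check.
# On original media, sector 7 was deliberately damaged; on a copy,
# the sector reads cleanly, which is exactly what gives it away.

def read_sector(disk, sector):
    """Return the sector's data, or raise IOError on a bad sector."""
    if sector in disk["bad_sectors"]:
        raise IOError("CRC error")
    return disk["data"].get(sector, b"")

def boot(disk):
    # The loader *expects* sector 7 to be unreadable on genuine media.
    try:
        read_sector(disk, 7)
    except IOError:
        return "loading software"    # error occurred: genuine disk
    return "refusing to load"        # sector read cleanly: a copy

original = {"bad_sectors": {7}, "data": {}}
copied = {"bad_sectors": set(), "data": {7: b"..."}}
```

Deleting the `try`/`except` check (three lines, just as in the boot loader) makes the copy load fine, which is the whole joke.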
In the 80s, game publishers brought copy protection into the physical world: At runtime, the program would ask a question whose answer could only be found in the game manual. If answered incorrectly, the software would terminate.
Boxed versions of Tetris came with a book of information about the fifteen Soviet Republics (Tetris was developed at the Academy of Sciences of the USSR). I didn’t own this manual because my dog ate it. As a result, I spent hours at the library memorizing Soviet trivia in order to play my stupid Tetris game.
There were better workarounds.
Software-liberators armed with debuggers could view machine instructions as the program was running. When the copy-protection popped up, the user would identify the offending lines and edit them out of the program. This is called a crack.
CMP  WORD PTR [A720], 1C20  ; compare location A720 (user's response) with hardcoded passphrase
JNZ  kill_function          ; if Z flag not set, CMP failed. Quit.
CALL continue_playing       ; otherwise, continue
Replace the second line with a NOP and it’s fixed.
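At the byte level, the crack is a two-byte patch: a short JNZ encodes as opcode 0x75 plus an 8-bit displacement, and paving both bytes over with NOP (0x90) lets execution fall through to the CALL. A sketch, with an invented byte buffer standing in for the protected program:

```python
NOP = 0x90
JNZ = 0x75  # short conditional jump: opcode + 8-bit displacement

# Invented stand-in for the protected program's machine code:
code = bytearray([0x81, 0x3E, 0x20, 0xA7, 0x20, 0x1C,  # CMP WORD PTR [A720], 1C20
                  JNZ,  0x1A,                           # JNZ kill_function
                  0xE8, 0x10, 0x00])                    # CALL continue_playing

def crack(code):
    patched = bytearray(code)
    i = patched.index(JNZ)         # find the offending jump
    patched[i:i + 2] = [NOP, NOP]  # pave it over; execution falls through
    return bytes(patched)
```

Two bytes changed, and the "self-enforcing contract" never fires.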
There were many variations on this scheme: Free trials that expire after 30 days, mail-order activation codes, serial numbers based on device IDs. These protections all relied on a checkpoint, followed by a conditional jump. The checkpoints were trivial in the face of a debugger.
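All of these variations reduce to the same shape: compute something, compare, branch. A hypothetical serial-number check keyed to a device ID (the derivation scheme here is invented; real vendors used their own):

```python
import hashlib

def expected_serial(device_id: str) -> str:
    # Vendor derives the serial from the machine's ID
    # (hypothetical scheme for illustration).
    return hashlib.sha256(device_id.encode()).hexdigest()[:8].upper()

def checkpoint(device_id: str, entered_serial: str) -> str:
    # The entire protection is this one comparison and branch;
    # in the compiled binary it is a single conditional jump.
    if entered_serial != expected_serial(device_id):
        return "invalid serial, quitting"
    return "activated"
```

However elaborate the serial math, the binary still funnels through one `if`, and that is the instruction the debugger finds.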
A lock does no more than keep an honest man honest.
Protectionists realized that it was impossible to guarantee digital rights security for software running on an untrusted machine. Still, they hoped to at least increase the cost and inconvenience of breaking copy protection.
They wrote software that could detect if it was running in a debugger; they added code checksums to detect alterations; they planted copy-protection subroutines in multiple locations.
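Each countermeasure, though, boils down to yet another check inside the same binary. A toy checksum self-check (invented names; real schemes checksummed the machine code in memory, not a byte string):

```python
import zlib

# Stand-in for the program's protected instructions.
PROTECTED_CODE = b"CMP ...; JNZ kill_function; CALL continue_playing"
GOOD_CRC = zlib.crc32(PROTECTED_CODE)

def verify(code: bytes) -> str:
    # Tamper check: if a cracker NOPs out the jump, the CRC changes...
    if zlib.crc32(code) != GOOD_CRC:
        return "tampering detected"
    return "ok"
    # ...but this check is itself just another compare-and-branch,
    # patchable with the same debugger as the original checkpoint.
```

It regresses forever: every guard is more conditional-jump code guarding the conditional-jump code.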
Unfortunately, they overestimated the value of time to Eastern Europeans making fifty kopeks a day. The net result was that proprietary software became buggier and more bloated for legitimate users, while only nominally slowing the freedom-fighters.
You know how doctors warn against the unnecessary use of antibiotics because it leads to an increase in drug-resistant bacteria? Decades of broken software security trained reverse-engineers to identify flaws in the toughest protection schemes. Many then applied these skillz to creating malware, the malicious code that resulted in multiple breaches at the US Federal Reserve and disappeared $81 million from Bangladesh’s central bank.
As for large software vendors, they found that the most effective copy protection is still best-enforced in the physical world: Lawsuits.
There are some transactions software can’t enforce. For everything else, there’s the blockchain.
1. Pournelle, Jerry. “Zenith Z-100, Epson QX-10, Software Licensing, and the Software Piracy Problem.” BYTE, June 1983.
2. Fravia+. +HCU (High Cracking University) Academy of Reverse Engineering (the definitive cracking guide of the ’90s).