Innovation can occur in dramatic bursts, like when the telegraph replaced the Pony Express. The iconic mail carrier cut previous delivery times in half and reigned for 18 months as the fastest way to move information across the United States. Introduced on April 3, 1860, the Pony Express carried mail between St. Joseph, Missouri, and Sacramento, California. The 2,000-mile route took approximately 10 days, with riders covering 75 to 100 miles each and switching horses every 10 to 15 miles.
Western Union erected the first telegraph poles on July 4, 1861, and 112 days later, on October 24, 1861, completed the first transcontinental electronic communication system. Two days later the Pony Express was discontinued.
The telegraph reduced the time it took to deliver a message by 99.93 percent, from 10 days to 10 minutes. Sending a message got 143,900 percent faster.
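For readers who like to check the arithmetic, here is a back-of-the-envelope sketch in Python, assuming a 10-day delivery becomes a roughly 10-minute transmission:

```python
# Back-of-the-envelope check of the telegraph speedup figures.
pony_express_minutes = 10 * 24 * 60   # 10 days expressed in minutes = 14,400
telegraph_minutes = 10                # roughly 10 minutes for a coast-to-coast message

time_saved = 1 - telegraph_minutes / pony_express_minutes
speedup = pony_express_minutes / telegraph_minutes

print(f"time reduction: {time_saved:.2%}")        # ~99.93%
print(f"speedup factor: {speedup:,.0f}x")         # 1,440x
print(f"percent faster: {speedup - 1:,.0%}")      # ~143,900%
```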
Are we witnessing another ponies-to-electrons innovation between OpenAI and DeepSeek? Maybe.
Peter Diamandis has noted that DeepSeek was founded less than two years ago. With only 200 employees and $5 million, it has developed a new AI system. By comparison, OpenAI was founded 10 years ago, has around 4,500 employees, and has raised $6.6 billion in capital. AI tech giants like OpenAI and Anthropic have been spending $100 million or more to train their models. DeepSeek has matched their systems for about 5 percent of the cost, a 95 percent reduction. A 95 percent drop in cost means you now get 20 for the price of one, indicating a 1,900 percent increase in abundance. DeepSeek has done this with three innovations.
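The abundance arithmetic works like this (a minimal sketch using the $100 million and $5 million figures cited above):

```python
# How a 95 percent cost reduction translates into an "abundance multiplier."
old_cost = 100_000_000   # ~$100M training runs cited for the large labs
new_cost = 5_000_000     # ~$5M cited for DeepSeek

cost_drop = 1 - new_cost / old_cost          # 0.95, i.e. a 95% reduction
abundance_multiplier = old_cost / new_cost   # how many runs the same budget now buys

print(f"cost reduction: {cost_drop:.0%}")                              # 95%
print(f"you now get {abundance_multiplier:.0f} for the price of one")  # 20
print(f"increase in abundance: {abundance_multiplier - 1:,.0%}")       # 1,900%
```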
1. Precision Reimagined. Instead of using computational overkill (32-bit numbers), they showed that 8-bit precision is enough. The result? 75% less memory needed. Sometimes the most powerful innovations come from questioning the most basic assumptions.
2. The Speed Revolution. Traditional AI reads like a first-grader: "The... cat... sat..." But DeepSeek's multi-token system processes whole phrases at once: 2x faster with 90% accuracy. When you're processing billions of words, this is transformative.
3. The Expert System. Instead of one massive AI trying to know everything (imagine one person being a doctor, lawyer, AND an engineer), they built a system of specialists. Traditional models rely on 1.8 trillion parameters being active ALL THE TIME. DeepSeek, by contrast, has 671 billion parameters in total, but only 37 billion are active at once, about 97.9 percent fewer than the 1.8 trillion (see the sketch after this list).
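To make item 3 concrete, here is a minimal, purely illustrative sketch of sparse "mixture of experts" routing in Python. The expert counts, sizes, and router here are made up for the example; this is not DeepSeek's actual architecture, just the general idea that only a few specialists run for each token:

```python
# Toy illustration of sparse expert routing: only the top-scoring experts
# run for each input, so most parameters stay idle on any given token.
import numpy as np

rng = np.random.default_rng(0)

num_experts = 16      # hypothetical expert count, for illustration only
active_experts = 2    # only this many experts run per token
d_model = 32          # toy hidden size

# Each "expert" is just a small weight matrix in this sketch.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(num_experts)]
router = rng.standard_normal((d_model, num_experts))

def moe_layer(x):
    """Route a single token vector x to its top-k experts and mix their outputs."""
    scores = x @ router                           # router assigns a score per expert
    top_k = np.argsort(scores)[-active_experts:]  # pick the k best-scoring experts
    weights = np.exp(scores[top_k])
    weights /= weights.sum()                      # softmax over the chosen experts only
    # Only the chosen experts do any work; the other 14 are skipped entirely.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top_k))

token = rng.standard_normal(d_model)
out = moe_layer(token)
print(out.shape)                                                       # (32,)
print(f"share of experts touched per token: {active_experts / num_experts:.0%}")
```

The point of the design is that the router scores every expert, but only the selected few do real work, which is why so small a fraction of the model's total parameters is active for any given token.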
Peter goes on to note more staggering results from DeepSeek’s innovations:
Training costs slashed from $100M to $5M
GPU requirements cut from 100,000 GPUs to 2,000 GPUs
95% reduction in API costs
Runs on gaming GPUs instead of specialized hardware
They did this with a team of fewer than 200 people, not thousands
The DeepSeek system is open source, which means anyone can verify, build upon, and implement these innovations. You can download the new app on your iPhone.
Bonus: AI advocates now have a counterpoint to environmentalists who say AI uses too much electricity. DeepSeek just brought down the cost of inference by 97 percent.
Former Intel CEO Pat Gelsinger notes that “DeepSeek is an incredible piece of engineering that will usher in greater adoption of AI. It will help reset the industry in its view of Open innovation. It took a highly constrained team from China to remind us all of these fundamental lessons of computing history.”
Vitaliy Katsenelson notes that
While everyone assumed that AI’s future lay in faster, better chips—where the only real choice is Nvidia or Nvidia—this previously unknown company has achieved near parity with its American counterparts swimming in cash and datacenters full of the latest Nvidia chips. DeepSeek (allegedly) had huge compute constraints and thus had to use different logic, becoming more efficient with subpar hardware to achieve a similar result. In other words, this scrappy startup, in its quest to create a better AI “brain,” used brains where everyone else was focusing on brawn—it literally taught AI how to reason.
Moore’s law suggests that computer transistor abundance doubles every two years. That would indicate a compound rate of around 41.4 percent a year. The cost to train an AI system to recognize images fell 99.59 percent from $1,112.64 in 2017 to $4.59 in 2021. This would indicate a compound rate of 295 percent a year. AI is growing over seven times faster than Moore’s law.
Nvidia is leading the development of these systems and their CEO Jensen Huang has claimed that AI processing performance has increased by “no less than one million in the last 10 years.” This is a compound annual rate of 298 percent. He expects this rate to continue for the next 10 years. That would mean we go from one to one trillion in twenty years. We’ll be 976 million times ahead of Moore’s law. Quite astonishing.
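For readers who want to verify the compound-rate comparisons in the last two paragraphs, here is a short sketch using the figures quoted above:

```python
# Compound annual growth rates behind the Moore's law comparisons.
moore = 2 ** (1 / 2) - 1                      # doubling every 2 years -> ~41.4% per year
print(f"Moore's law: {moore:.1%} per year")

# Image-recognition training cost: $1,112.64 (2017) -> $4.59 (2021)
cost_ratio = 1112.64 / 4.59                   # ~242x more training per dollar
ai_rate = cost_ratio ** (1 / 4) - 1           # over 4 years -> ~295% per year
print(f"AI training abundance: {ai_rate:.0%} per year")
print(f"ratio to Moore's law: {ai_rate / moore:.1f}x")   # ~7x

# Jensen Huang's claim: a millionfold performance gain in 10 years
huang_rate = 1_000_000 ** (1 / 10) - 1        # ~298% per year
print(f"Huang's claim: {huang_rate:.0%} per year")

# Carried forward another decade: 10^6 * 10^6 = 10^12, a trillionfold in 20 years
twenty_year_gain = 1_000_000 ** 2
moore_twenty_years = 2 ** 10                  # Moore's law over 20 years is ~1,024x
print(f"lead over Moore's law after 20 years: "
      f"{twenty_year_gain / moore_twenty_years:,.0f}x")   # ~976 million
```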
The Pony Express needed lots of fast horses and skinny riders. The telegraph was a whole new platform that used wires, batteries, and poles instead. DeepSeek may be the Western Union to OpenAI's Pony Express.
So what about Stargate, the $500 billion US AI infrastructure initiative led by OpenAI's Sam Altman, Oracle's Larry Ellison, and SoftBank's Masayoshi Son? They want to spend 100,000 times more than DeepSeek has spent so far. Will their product be 100,000 times more valuable?
Microsoft’s CEO Satya Nadella brought up Jevons paradox in regard to DeepSeek. His January 26, 2025, tweet had this to say: “Jevons paradox strikes again! As AI gets more efficient and accessible, we will see its use skyrocket, turning it into a commodity we just can't get enough of.”
On January 6 Nvidia announced its new Nano line of AI development hardware, starting at $259, and Project DIGITS, billed as the world’s smallest AI supercomputer, capable of running 200B-parameter models and expected to be priced at around $3,000. Between DeepSeek’s open-source software and Nvidia’s hardware, the world could experience a brilliant efflorescence of superabundance in learning.
We expect to see AI make dramatic advances in our ability to discover valuable new knowledge, but we’ll also come to realize that the only intelligence in artificial intelligence is human intelligence. If human beings have the freedom to innovate, the potential to create resources is infinite.
Please consider taking our new course on the Economics of Human Flourishing at the Peterson Academy. Students have logged over 18,000 hours watching and learning from this course. Join for a free 7-day trial.
In our book Superabundance, available on Amazon, we explain, with hundreds of examples, why more people with more freedom means much more resource abundance for everyone.
Gale Pooley is a Senior Fellow at the Discovery Institute, an Adjunct Scholar at the Cato Institute, and a board member at Human Progress.