The Skills Layer
Why the Real Value in AI Might Not Be Where You Think
I was thirteen years old when I saved up $99 cutting grass and shoveling driveways to buy a license for Macromedia Flash, a new application for creating animated graphics for websites. I remember how stoked I was when that box arrived. I tore it open, installed it, and then sat there staring at a blank canvas with absolutely no idea what to do next.
The software was just the beginning.
My mom drove me an hour from my house into Washington D.C. every morning for a week so I could sit in a hotel ballroom and learn from instructors who actually knew what they were doing. But by the end of that week, I had something more valuable than the $99 software application — I had the skills to use it. Not long after, I was charging $125 an hour to build Flash animations for websites, making thousands of dollars a year as a teenager. The application was the beginning. The skills let me create real value.
Years later, the same pattern repeated itself. When mobile arrived, I also paid $99 for an Apple Developer license to access Xcode. Same story — the software application existed, but the real work was learning to build with it. My team and I spent the time developing those skills, and we turned them into millions of dollars. Macromedia Flash eventually became part of Adobe before going away. Xcode became a foundational piece of Apple’s rise as one of the most valuable companies in history.
But here’s what I kept coming back to: the software application was never where the value was. The value was always in the skills.
The 1,000x Gap Nobody Talks About
All of the value that software has ever created has been a composite — an application paired with a human who had the skill to extract value from it. The investment banker who uses Excel to build a financial model. The developer who uses VS Code to ship software. The writer who uses Google Docs to produce content.
The application might cost $150 a year in licenses. The human providing the skill to operate it costs $150,000 a year.
That’s a 1,000x gap between the cost of the tool and the cost of the skill to use it. Economists would recognize this as the cost of complementary human capital — the gap between what a tool can theoretically produce and what it actually produces once you factor in the cost of the skilled human required to operate it. For decades, we’ve accepted this gap as the price of doing business, because there was no other way. The skill had to live in a person.
We’ve been trying to change that for thousands of years. We codify skills in books, classes, apprenticeships, online courses, videos, even cave paintings. It’s what’s allowed us to evolve as a species, to accumulate knowledge across generations. But we’ve always hit the same ceiling: the skills ultimately had to be re-learned and re-embodied by another human being. The transfer was always lossy, always slow, always expensive.
Skills Trapped Inside Humans
Here’s the thing that nobody talks about enough: we weren’t just paying for skills. We were paying for everything that came with the human who held them.
I saw this firsthand when we were building Grubhub. In the early days, restaurants would fax or email us their menus as PDFs. We needed that information — every item, every price, every modifier — extracted and loaded into a database so our customers could actually order from it. So we built a process: scan the PDFs, send them to our team in India, have people manually key every SKU from the menu into a spreadsheet, then import that spreadsheet into our system. It worked. It scaled. And it was enormously expensive and slow, because the only receptacle available for the skill of reading a PDF and populating a database was a human being.
And with every human came everything else — management overhead, communication lag, quality variation, training time, turnover. We didn’t just hire for the skill. We hired the whole person, and we built entire operational structures around supporting them, because that was the only way to access the capability we needed.
That’s the trap we’ve been in for the entire history of organized work. The final receptacle for knowledge has always been a human. So we’ve had to acquire, train, manage, retain, and support humans in order to get access to the skills they carried. The skill was never actually portable. It was always embedded inside a person, which meant it was always bundled with everything that comes with a person.
AI breaks that bundling. For the first time, a skill can live on its own.
Why “Services as Software” Gets It Wrong
The prevailing frame in venture right now is “services as software” — the idea that AI will finally automate the massive services economy the same way SaaS automated back-office workflows. It’s a compelling narrative, but I think it’s too abstract in a way that matters.
“Services” is an extraordinarily generic term. Someone cutting your grass is a service. Someone helping you build an investment deck is also a service. But those are not remotely the same thing. They require entirely different skills, different tools, different levels of judgment, and they create entirely different kinds of value. Lumping them together under the banner of “services” obscures as much as it reveals.
What AI is actually atomizing isn’t services — it’s skills. Specific, discrete, deployable skills.
Consider something as simple as converting a PDF into a spreadsheet — exactly the problem we were solving at Grubhub, at significant cost. The tools existed on both ends. Adobe gave you the PDF. Microsoft gave you the spreadsheet. But bridging them required a human operator with the skill to understand both, translate between them, and deliver the objective on the other side. The same logic applies to mowing a yard. A lawnmower and a trimmer are the applications. But operating them correctly — knowing how to edge, how to overlap rows, how to handle slopes — is the skill. The objective, a neat and tidy yard, is only accomplished when the skill is applied to the tools.
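To make that bridging skill concrete, here is a minimal sketch of what the Grubhub-era task looks like once it is encoded in software rather than a person. Everything here is illustrative: the menu text, the dotted-line format, and the parsing rules are assumptions, and the real difficulty was always the messy variety of actual menus, which is exactly where human judgment (and now AI) earned its keep.

```python
import csv
import io
import re

# Hypothetical raw text, as it might come out of a PDF text extractor.
MENU_TEXT = """
Margherita Pizza ............ $12.50
Caesar Salad ................ $8.00
  add chicken ............... $3.00
Garlic Bread ................ $4.25
"""

# One dotted menu line: optional indent, item name, dot leaders, price.
LINE = re.compile(r"^(\s*)(.+?)\s*\.*\s*\$(\d+(?:\.\d{2})?)\s*$")

def parse_menu(text):
    """Turn dotted menu lines into structured rows (the 'skill')."""
    rows = []
    for raw in text.splitlines():
        m = LINE.match(raw)
        if not m:
            continue
        indent, name, price = m.groups()
        rows.append({
            "item": name,
            "price": float(price),
            "is_modifier": bool(indent),  # indented lines are modifiers
        })
    return rows

def to_csv(rows):
    """Write the rows out in spreadsheet form."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["item", "price", "is_modifier"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

The tools on either end (the PDF extractor, the spreadsheet) already existed; the value is entirely in the translation step in the middle.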
That’s the actual unit of value AI is unlocking: not services broadly, but individual skills, atomized and inserted into an AI platform, with an accomplished objective coming out the other side. And critically, it’s not about replacing jobs wholesale. Most jobs are a bundle of tasks — some tool-intensive, some interpersonal, some social and relational — and the human parts aren’t going anywhere. What’s being automated is the skill behind the tool-intensive tasks: the expertise component, the part that used to require years of training or a team on another continent. The rest of the job largely remains.
Skills Are Compressed Experience
To understand why the skill layer is so valuable, it helps to think about what a skill actually is.
So much of the human experience is an exercise in data compression. We spend our lives trying to take enormous amounts of information and squeeze it into smaller and smaller form factors. We take a lifetime of experience and turn it into a textbook. We take a textbook and turn it into a course. We take a course and turn it into a framework. We take a framework and turn it into a principle. And at every step, we’re trying to pack as much signal as possible into something compact enough to transfer to another person.
A skill is the embodiment of that process. When someone has the skill of financial modeling, what they really have is thousands of hours of compressed experience — patterns recognized, mistakes made and corrected, judgment developed over time — all collapsed into a capability they can now deploy rapidly and reliably. We wrap that compression with credentials: degrees, certifications, titles. But those are just signals pointing at the underlying thing, which is a highly compressed body of experiential data.
What AI training is doing, at its core, is the same thing — just at a scale and fidelity that humans can’t match. Reinforcement learning, fine-tuning, RLHF — these are all mechanisms for taking vast amounts of data and compressing it into deployable capability. Into skills. The models are extraordinary, but the models are the infrastructure. The skills they encode are the product.
What OpenClaw Taught Me About the Skill Layer
I’ve spent the last few weeks building with OpenClaw — an AI orchestration layer that sits above your AI models and coordinates what they do and when. It was recently acquired by OpenAI, and I think that acquisition makes a lot of sense once you understand what actually makes OpenClaw valuable. Big shout out to Peter Steinberger and the team for what they built.
Here’s what I’ve learned from working with it directly: the agent itself, in isolation, is surprisingly inert. At its core, an OpenClaw agent is essentially a perpetual cron job running against an AI — a loop that keeps asking, “do you have anything for me to do?” over and over again. Without the right inputs, the answer is always effectively no.
What makes the agent genuinely useful is the skills you give it. When you equip an OpenClaw agent with the skill to manage a calendar — which means not just API access, but the judgment about what “optimized” means, the ability to weigh competing priorities, the understanding of context and preferences — suddenly the loop has something to work with. The agent can actually do something.
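The shape of that loop is easy to sketch. To be clear, this is a conceptual illustration of the idea, not OpenClaw's actual implementation: the class, the skill interface, and the calendar example are all my own assumptions.

```python
import time

class Agent:
    """A polling agent that is inert until skills are attached."""

    def __init__(self):
        self.skills = {}  # name -> callable encoding judgment + tool access

    def register_skill(self, name, fn):
        self.skills[name] = fn

    def run_once(self, context):
        """One turn of the 'do you have anything for me to do?' loop."""
        tasks = []
        for name, skill in self.skills.items():
            task = skill(context)  # each skill inspects context, may propose work
            if task is not None:
                tasks.append((name, task))
        return tasks

    def run_forever(self, get_context, interval_s=60):
        """The perpetual cron job: poll, act, sleep, repeat."""
        while True:
            for name, task in self.run_once(get_context()):
                print(f"[{name}] {task}")
            time.sleep(interval_s)

def calendar_skill(context):
    """Illustrative calendar judgment: flag any two meetings that overlap."""
    meetings = sorted(context.get("meetings", []), key=lambda m: m["start"])
    for a, b in zip(meetings, meetings[1:]):
        if b["start"] < a["end"]:
            return f"reschedule '{b['title']}': conflicts with '{a['title']}'"
    return None
```

With no skills registered, `run_once` returns an empty list every time — the loop spins with nothing to do. Register `calendar_skill` and the same loop suddenly produces work, which is the whole point: the value lives in the skill, not the loop.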
This is my skill layer thesis made concrete and observable in real time. The infrastructure exists. The applications exist. The orchestration layer exists. But the thing that determines whether any of it creates value is whether the right skills have been encoded and made available. That’s where the leverage is. That’s where the differentiation will compound.
What This Means for Valuations
This reframe matters enormously when you think about AI company valuations — which, to put it charitably, have raised some eyebrows.
The typical critique goes something like this: how does a company at a $500 million Series A valuation ever generate venture-scale returns? The numbers look crazy if you’re measuring against software application TAMs. But if you stop measuring against software TAMs and start measuring against the actual skill being displaced, the picture changes completely.
“What’s the market cap for the skill of coordinating and scheduling meetings?”
“What’s the addressable market for the skill of financial analysis?”
You’re not talking tens of billions anymore. You’re talking tens of trillions — because you’re pricing against the human capital currently performing those skills across the entire global economy. And when you look at it that way, the entry valuations that seem irrational at Series A start to pencil out at exit.
In the same way that gasoline turned out to be the enormously valuable byproduct of Rockefeller’s kerosene production process, I believe skills will turn out to be the valuable byproduct of the reinforcement learning and model training process. The models are being built. The infrastructure is being funded. But the thing that actually comes out the other side — specific, deployable, productized skills — is where the durable value will accumulate.
That’s where I’m paying attention. That’s where I’m investing.
Thanks for being a Perishable Knowledge subscriber.
If you’re getting value from my blogs, I’d appreciate it if you’d share this post with someone who will enjoy it. (If you’re reading this email because someone sent it to you, please consider subscribing.)