When media platforms start designing for agents
My thinking on Clawcast, AI-generated podcasts, and what media products may look like when creators are no longer only people, but autonomous agents producing and publishing content on their own.
What does a media platform look like when the creators are not humans, but AI agents?
That question has been sitting with me while I experiment with a concept called Clawcast. The premise is simple: a platform where AI agents generate and publish podcasts. Agents research topics, generate scripts, synthesize voice, and release episodes on their own. In the current prototype, my openClaw agent is creating a demo podcast using GPT-5.1-nano and ElevenLabs, while I explore the product design questions around voice agents, autonomous publishing, and AI-native media formats.
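The loop that paragraph describes can be sketched as a simple pipeline. This is a hypothetical sketch, not the actual Clawcast implementation: every function here is a stand-in for a real call (a language model for research and scripting, a voice-synthesis API for audio, a distribution endpoint for publishing).

```python
# Hypothetical sketch of an autonomous podcast loop:
# research -> script -> synthesize -> publish.
# All stage functions are placeholders, not real APIs.

def research(topic: str) -> list[str]:
    # Stand-in: a real agent would query sources and return notes.
    return [f"note about {topic}"]

def write_script(notes: list[str]) -> str:
    # Stand-in: a real agent would prompt a language model with the notes.
    return "Welcome to the show. " + " ".join(notes)

def synthesize(script: str) -> bytes:
    # Stand-in: a real agent would call a voice-synthesis API here.
    return script.encode("utf-8")

def publish(audio: bytes, topic: str) -> dict:
    # Stand-in: a real agent would push to a feed or distribution API
    # and get back a receipt it can act on.
    return {"topic": topic, "size_bytes": len(audio), "status": "published"}

def produce_episode(topic: str) -> dict:
    # The full loop: once this runs unattended on a schedule,
    # the agent is effectively an autonomous media unit.
    notes = research(topic)
    audio = synthesize(write_script(notes))
    return publish(audio, topic)
```

The interesting property is not any single stage but that the output of `publish` can feed the next iteration, which is what turns a creator tool into a loop.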
The interesting part is not just that an agent can generate audio.
It is that media platforms may soon need to be designed not only for audiences and human creators, but for agents as active participants in the content ecosystem.
That shift feels less speculative than it did a year ago. Spotify has already normalized AI-generated spoken commentary through features like AI DJ, where the system does not just recommend tracks but introduces them with synthetic narration. YouTube Music has also been testing AI-generated hosts that add commentary and contextual trivia between songs. These are still controlled product experiences, but they point in the same direction: platforms are becoming more comfortable with machine-generated audio as part of the listening experience.
Clawcast takes that trajectory one step further.
Instead of AI assisting a human creator, the agent becomes the producer. It researches a topic, structures an episode, generates the script, synthesizes the voice, and pushes the content outward. Once that loop starts to stabilize, you are no longer dealing with a typical creator tool. You are dealing with an autonomous media unit.
That has real product implications.
In the emerging agentic economy, we may start seeing products and services designed not for people first, but for agents first. A content platform for human podcasters emphasizes dashboards, editing tools, audience insights, branding controls, and publishing workflows. A platform for agent podcasters may need structured topic feeds, generation constraints, style controls, verification layers, distribution APIs, and clear rules for identity, attribution, and moderation.
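One concrete difference shows up in the submission payload itself. For a human creator it is a file upload plus form fields; for an agent it needs to be a structured, machine-checkable record carrying identity, attribution, and provenance. A minimal sketch of what such a record might look like, with entirely hypothetical field names rather than an existing Clawcast schema:

```python
from dataclasses import dataclass, field

@dataclass
class EpisodeSubmission:
    # Identity and attribution: which agent produced this,
    # and which human or organization is accountable for it.
    agent_id: str
    operator: str
    # Provenance, for disclosure and moderation: what wrote the
    # script and what synthesized the voice.
    text_model: str
    voice_model: str
    # Generation constraints the platform can verify against.
    topic: str
    style_profile: str
    # Disclosure flag that downstream players can surface to listeners.
    ai_generated: bool = True
    sources: list[str] = field(default_factory=list)

    def is_disclosable(self) -> bool:
        # A plausible platform rule: agent content is only publishable
        # when it names an accountable operator and its generation stack.
        return bool(self.operator and self.text_model and self.voice_model)
```

A dashboard-first platform can leave most of this implicit; an agent-first platform has to make it explicit, because the checks run without a human in the loop.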
That is where this gets more interesting to me as a design problem. If agents become content producers, the product surface changes. You are no longer only asking how humans use the tool. You are asking how agents are provisioned, how they are supervised, how quality is measured, how abuse is prevented, and how machine-generated output becomes legible to listeners.
The hard part is not generating an audio file. The hard part is building a trustworthy system around it.
Autonomous media introduces questions that traditional creator platforms can mostly postpone. Who is accountable for the output? How do listeners know when something is agent-generated? How do platforms prevent low-quality synthetic spam from overwhelming useful content? How do you preserve differentiation when generation becomes cheap?
That is why I think projects like Clawcast matter even at the experiment stage. They make the product questions visible before the market fully settles. The point is not that AI agents will replace human media tomorrow. The point is that agent-generated content is becoming a real design surface.
As the agentic economy expands, the next generation of media products may not just help humans create faster.
They may have to support creators that were never human to begin with.
Takeaways
- AI-generated spoken commentary is already showing up in mainstream platforms like Spotify and YouTube Music.
- The next shift is not just AI-assisted creation, but agent-driven publishing loops.
- Media platforms in the agentic economy will need to design for orchestration, supervision, trust, and machine-readable publishing workflows.