It feels like we’re all strapped into the cockpit of a rocket ship with Sam Altman’s hand on the throttle, and he’s flooring it. The G-force is pinning us to our seats. Every week, another headline, another multi-billion-dollar deal, another jaw-dropping demo like Sora 2 that seems to bend reality itself. The sheer velocity is dizzying, and for many, it’s terrifying.
You can feel the anxiety in the air. Tech CEOs marvel at, and worry about, Sam Altman's dizzying race to dominate AI. I read the columns painting Altman as a Bond villain, a smiling psycho who is gleefully dismantling copyright, cozying up to politicians, and moving so fast he’s bound to break the world. One critic even suggested OpenAI’s real motto is, “we’ll do what we want and you’ll let us, bitch.” There’s a palpable fear that we’re repeating the mistakes of the social media era, but on hyperdrive. That we’re lauding a Pied Piper who’s leading us, and our children, off a cliff.
And I get it. I really do. The pace is unsettling. When a technology is moving this quickly, it outpaces our ability to legislate it, to philosophize about it, and sometimes, even to comprehend it. But what if this isn't a race to the bottom, but a race toward a new horizon? What if the speed isn’t the danger, but the very thing that will get us past the dangers faster?
The Beautiful Chaos of Creation
People see the trillion-dollar deals with Nvidia and AMD, the constant stream of product launches, the political maneuvering, and they see chaos. I see the frantic, beautiful, high-stakes process of building the infrastructure for a new world in real time, like trying to assemble a rocket ship while it’s already on the launchpad, moments from ignition. This isn't recklessness; it's necessity. Aaron Levie of Box hit the nail on the head when he said these platform shifts only come around once a decade or two. This is the big one. This is the paradigm shift. You don't tiptoe into a revolution.
When I first saw the Sora 2 demos—not the silly memes of Sam Altman shoplifting, but the truly artistic ones showing fantastical creatures moving through impossible landscapes—I honestly just sat back in my chair, speechless. It felt like watching the first moving pictures, a flicker of a new reality. This is the kind of breakthrough that reminds me why I got into this field in the first place. The immediate reaction from many was to focus on copyright, on what was being "stolen." And that’s a conversation we absolutely must have. But it’s the wrong first question.
The better question is: what is being created?

These models learn from a vast corpus of human-generated data—in simpler terms, they've studied a library larger than any human ever could, and they're not “copying” a single book; they're learning the principles of storytelling, physics, and emotion from all of them at once. It’s a form of conceptual distillation, not plagiarism. Every time a technology has democratized creation—from the printing press making scribes obsolete to the camera threatening portrait painters—the existing gatekeepers have cried “theft” and “devaluation.” Is this truly any different, or are we just watching the painful, messy, and ultimately brilliant birth of a new artistic medium?
A New Partnership Awaits
The fear is that we’re losing control, that we’re handing the keys to human culture over to a machine. But I see the opposite. I see the potential for the most profound partnership in human history. We are not being replaced; we are being augmented.
Think about it. A single indie filmmaker can now dream up a scene that would have cost millions and render it in an afternoon. A teacher can generate a custom visual explanation for a complex scientific concept in seconds. A novelist can bring a character from their pages to life to see how they walk and talk. This isn't the end of human creativity; it's the removal of the physical and financial barriers that have constrained it for centuries.
Of course, this comes with immense responsibility. The concerns raised by people like Kate Doerksen about safety aren’t just valid; they are essential. We need the guardrails. We need the ethical frameworks. But building those frameworks can't happen in a theoretical vacuum. It has to happen in tandem with the technology's development, moving at the same breakneck speed. Slowing down doesn't make us safer; it just gives the less scrupulous actors a chance to catch up and define the terms. Altman is in a unique bind, as Jill Popelka of Darktrace noted: he has to be both the hero and the villain. He has to push the boundaries while also trying to build the fence.
I see the chatter on forums like Reddit, and while the cynics get the headlines, I'm drawn to the threads where digital artists are already experimenting, asking, "How can I use this to tell a story I could never afford to film before?" or "What new visual language does this unlock?" That’s where the real story is. Are we losing something in this transition, or are we on the cusp of gaining a new form of creative partnership we can't even fully imagine yet?
We're Not Breaking Things, We're Building a New World
Let's be clear. The road ahead will be bumpy. There will be lawsuits, ethical dilemmas, and moments of profound societal vertigo. But the narrative that Sam Altman is a reckless villain chasing revenue is a fundamental misreading of the moment we are in. This isn't about breaking things. This is about building the next chapter of the human story at the speed of thought. The alternative to moving this fast isn't a safer, more deliberate world; it's a world where this transformative power remains in the hands of a few, or worse, is defined by those who don't share the optimistic vision for what humanity and AI can achieve together. We can either be terrified by the speed, or we can grab the controls and learn to fly. I know which one I’m choosing.
