I’ve Seen What It Takes To Build Hard Things
I didn’t join Niobium because of AI. I joined because of what comes after it. But to explain that, I should start with how I got here.
Over the last decade, I’ve been part of teams trying to do things that weren’t supposed to be easy.
At Astra, the challenge wasn’t just building a product. It was launching rockets—repeatedly, reliably, and at scale. There’s no shortcut to getting that right. At Siprocal, we set out to build a cross-device growth and monetization platform across LATAM, navigating fragmented markets, inconsistent infrastructure, and proving a model where nothing was standardized. More recently, at Celestial AI, the challenge was breaking the memory wall. Unlocking the next level of scale in accelerated computing meant pushing the limits of performance, power, and manufacturability while demand was moving faster than the underlying technology.
Across all of those experiences, a few things have become clear. Bringing new capabilities to market—especially ones that require shifts in infrastructure or behavior—is messy. It takes longer than expected. It requires iteration. And most attempts don’t make it.
When an attempt does work, it comes down to a few things: Is the problem real? Is the technology fundamentally sound? And is the team capable of navigating the inevitable setbacks to actually get it to market?
Where I Started To See The Problem
That pattern recognition is what shaped how I looked at Niobium.
At the same time, I was seeing something else play out in real time. Like everyone else, I leaned into AI, both professionally and personally. I started using it across my own workflow, including writing, analysis, modeling, and even day-to-day decision making.
At first, it feels like a step-function improvement. You move faster. You think differently. You start to rely on it. But then you hit a wall. And you hit it quickly.
The moment the work actually matters, the data you need is sensitive. Financials, internal metrics, customer information. The data that actually drives decisions. And that is where things break.
You hesitate. You sanitize inputs. You hold back. Or you stop using the tool altogether for the most important work.
I have seen the same pattern play out across teams. Companies talk about unlocking data, but when it comes time to use sensitive datasets, everything slows down. Legal gets involved. Security pushes back. Projects stall. Not because the use case is not valuable, but because the risk is not acceptable.
So the most valuable data just sits there.
That disconnect stuck with me. It is not a tooling problem. It is not a performance problem. It is a trust problem. If the data cannot be used safely, nothing built on top of it reaches its full potential.
The more I looked into what Niobium was building, the more it started to click. Not because it was positioned as the perfect solution, but because it was grounded in the same problem I was already seeing firsthand: a way to move forward without forcing a trade-off between what is useful and what is safe.
The Right Team
One of the biggest lessons I’ve taken from past teams is this: having the smartest people in the room is just the starting point.
What matters is how the team operates when things get difficult, which they always do. The teams that make progress are aligned on the problem, even when the solution isn’t clear. They challenge each other without ego. They focus on getting to the best answer, not being the one who had it first. And they trust each other enough to move quickly.
I’ve seen the opposite too: smart people working in silos, conversations turning into debates, and decisions slowing down because alignment never fully forms. That’s where things fall apart.
When I spent time with Kevin Yoder and John Barrus, the difference was immediately noticeable. There’s real depth: people who understand the problem and have built at scale. But more importantly, they work through problems together. They iterate. They pressure-test ideas without it becoming political.
That only became more evident as I spent time with Dave Archer and the broader leadership team. It’s not about individual brilliance. It’s about collective clarity.
In my experience, that’s one of the strongest indicators of whether a team can take something complex and actually get it into the market.
From Concept to Reality
Niobium is already guiding its first private beta users through the Fog. They’re using the mistic™ Core FPGA platform to build real applications. Not experiments. They have what they need to get started: a compiler, an SDK, templates, and direct access to the engineering team. And importantly, this isn’t a dead-end path. The same architecture carries forward to the mistic ASIC. That means early users aren’t starting over. They’re building ahead. By the time others are still figuring out how to safely use sensitive data, these teams will already have production-ready applications.
The Right Time
Enterprises are still early in figuring out how to safely use sensitive data. But that’s changing quickly. As AI matures, the value shifts toward proprietary, high-quality data. At that point, privacy isn’t a feature. It’s foundational.
No company is going to accept the risk of exposing its most valuable data just to get compute. The ones that solve this early will have a meaningful advantage. The ones that don’t will be constrained by it.

I’ve seen what it takes to bring hard things to market. I’ve seen how often it doesn’t work. And I’ve seen what it looks like when it does.
This feels like one of those moments. Not because it’s easy. But because the problem is real, the technology is real, and the team is built to take a real shot at it. There’s a window right now to define how private computation works at scale. That window won’t stay open forever.
That’s why I joined.
And that’s why I’m all in.