**RJ Scaringe** (0:00)
By 2030, it will be inconceivable to buy a car and not expect it to drive itself. Every single one of our cars, we want to have the ability for it to operate at very high levels of autonomy. Radars are extremely cheap, lidars are very cheap, but the really expensive part of the system is actually the onboard inference. It's an order of magnitude more expensive than any part of the perception stack. My view is EV adoption in the United States is a reflection of the lack of choice. As consumers, we need lots of choices. We need to have variety; we self-identify with the thing we drive. The world doesn't need another Model Y, the world needs another choice.
**Sarah Guo** (0:37)
Hi, listeners, welcome back to No Priors. Today, I'm here with RJ Scaringe, the founder and CEO of Rivian. We're here to talk about their autonomy strategy, proprietary chips, their coming R2 model, whether Americans want EVs, and what our relationship to cars is going to be in the age of AI. Let's get into it. RJ, thanks so much for doing this.
**RJ Scaringe** (0:57)
Thank you for having me.
**Sarah Guo** (0:58)
So Rivian's already an incredibly cool company. How'd you decide it was going to become an autonomy company? When did that happen?
**RJ Scaringe** (1:04)
I mean, from the beginning, we thought of it as a transportation and mobility company. And in fact, even before Rivian became Rivian, when I was thinking about what the first products would be, it was unclear what kind of car it would be, but it was clear Rivian would build a car. And it was always clear we wanted to be at the front edge of helping to redefine what it means to have access to personal transportation. And so autonomy has always been part of the strategy, but it's now fully coming to life with the technology that we're building.
**Sarah Guo** (1:29)
And when you think about the function of Rivian, there's transportation, but there's also the experience. How long ago did you guys start investing in the autonomy strategy here?
**RJ Scaringe** (1:40)
Yes, we launched R1 at the very end of 2021, and we used what I'll broadly characterize as a Gen 1 approach to autonomy. So we had a perception platform. We used a third-party front-facing camera that was essentially a third-party solution that then plugged into an overall framework that we built, but it was all rules-based. So the camera fed a rules-based planner, and the planner would then make a bunch of decisions around the feeds from the perception. And, you know, the moment we launched, we knew it was the wrong approach, but it was the thing we'd started working on well before the launch. And so at the end of 2021, beginning of 2022, we made the decision to completely reset the platform.
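To make the "Gen 1" architecture RJ describes concrete, here is a deliberately toy sketch of a rules-based planner: perception outputs flow into decision logic that humans wrote by hand. Everything here (the `Detection` type, the `plan` function, the thresholds) is hypothetical, for illustration only, and is not Rivian's code.

```python
# Toy sketch of a rules-based ("Gen 1"-style) planner: hand-written rules,
# not a learned model, map perception outputs to a driving action.
# All names and thresholds are hypothetical.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str          # e.g. "vehicle", "lane_line", "pedestrian"
    distance_m: float  # distance ahead of the ego vehicle, in meters

def plan(detections: list[Detection], ego_speed_mps: float) -> str:
    """Return a driving action from human-codified rules."""
    for d in detections:
        # Rule 1: brake for any vehicle or pedestrian close ahead.
        if d.kind in ("vehicle", "pedestrian") and d.distance_m < 30:
            return "brake"
    # Rule 2: otherwise hold speed up to a fixed cap (~65 mph).
    return "maintain" if ego_speed_mps <= 29 else "coast"

print(plan([Detection("vehicle", 25.0)], 27.0))  # brake
print(plan([Detection("vehicle", 80.0)], 27.0))  # maintain
```

The limitation he points to is visible even in a sketch this small: every behavior the car can exhibit must be anticipated and encoded by an engineer as another rule.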
**Sarah Guo** (2:20)
Was that a hard decision?
**RJ Scaringe** (2:22)
No, because it was so clear. When you're building something like this, you recognize you're going to spend many, many billions of dollars creating it. And we knew this: at the core of transportation is driving, and at the core of that is a shift to having the vehicle be capable of driving itself. So we made the decision to redo it, clean sheet, you know, no legacy of what we had built in Gen 1. That first launched, from a hardware point of view, in the middle of 2024 with our Gen 2 vehicles: not a single line of shared code, not a single piece of common hardware on the perception or on the compute side.
And then we had to build the actual data flywheel. So we had to grow the car park to build enough of the data flywheel to then start to train the model. And what we showed with our autonomy data late last year, late in 2025, was the beginning of a series of really super exciting steps in how this is going to grow and expand. I say this all the time, and I think it's true not just for Rivian but for the auto industry in general: the last three years compared to the next few years are going to look very different. The rate of progress that we saw in autonomy between, let's say, 2021 and 2025, and what we're going to see between today and, let's say, 2029 or 2030, are completely different slopes. And that really comes back to entirely new architectures now being used to develop self-driving, actually truly AI architectures. Whereas before, these were not AI architectures in the true sense: they used machine vision, but within rules-based environments that we as humans defined and codified, which is very different from how they're built today.
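The architectural shift RJ describes can be sketched with the same interface as a rules-based planner, but with the hand-written rules replaced by learned parameters. The weights below are made up for illustration; in a real system they would be learned from fleet data (the "data flywheel" he mentions), not typed in by an engineer.

```python
# Hedged sketch of the learned-policy contrast: instead of hand-coded rules,
# a (tiny, linear) model scores each action from input features. The weights
# here are invented for illustration; a real system learns them from data.
ACTIONS = ["brake", "maintain", "coast"]

# Hypothetical learned parameters: one weight per action per feature
# (nearest-obstacle distance in meters, ego speed in m/s), plus a bias.
WEIGHTS = {
    "brake":    [-0.20, 0.00],
    "maintain": [0.05, -0.01],
    "coast":    [0.02, 0.03],
}
BIASES = {"brake": 5.0, "maintain": 0.0, "coast": -1.5}

def learned_plan(nearest_obstacle_m: float, ego_speed_mps: float) -> str:
    """Pick the highest-scoring action under the learned linear model."""
    feats = [nearest_obstacle_m, ego_speed_mps]
    scores = {
        a: sum(w * x for w, x in zip(WEIGHTS[a], feats)) + BIASES[a]
        for a in ACTIONS
    }
    return max(scores, key=scores.get)

print(learned_plan(10.0, 27.0))   # brake
print(learned_plan(100.0, 27.0))  # maintain
```

The key difference is where the behavior comes from: in the rules-based version, engineers codify every threshold; here the same decision boundary emerges from parameters that training adjusts, which is why more fleet data directly improves the driving policy.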