**Jaden Schaffer** (0:00)
Welcome to the podcast. I'm your host, Jaden Schaffer. Today we're talking about OpenAI, which just closed the largest private funding round in tech history: $121 billion at an $852 billion valuation. We need to talk about what that means, who's writing the checks, and where all this money is actually going. And then something else I thought was pretty rough: the week, and really the month, Anthropic has been having. On the one hand, they're having a generational run.
I am using Claude and Anthropic more than ever before for basically everything, with Claude Cowork and Claude Code. But at the same time, they've had a bunch of bad PR, and the most recent is that their Claude Code source code just got published to a public NPM registry. They accidentally shipped about 500,000 lines of it. In addition, I want to talk about where hardware is at: Huawei's new 950PR chip is picking up real orders from ByteDance and Alibaba, which I think means a lot in the context of the US-China chip wars. So let's get into all of it.

Before we do, if you're someone who uses AI tools regularly, which I'm guessing is most of you listening, you should absolutely check out AI Box at aibox.ai. It's my own startup, and it gives you access to over 80 AI models in one place. So instead of paying for separate subscriptions to ChatGPT, Claude, Gemini, and everything else, you have one platform. The thing I think is actually most useful is that you can build automations just by describing what you want in plain language; no coding required. I'm not a developer, and I built it for people like myself. It starts at $8.99 a month, which is way less than stacking three or four different subscriptions on top of each other. The link is in the description. I think it's super worth trying out, because it will save you money and keep all of your files and logins in one place. So if you want to try it out: aibox.ai, link in the description.

Let's talk about Huawei first. Their 950PR chip is getting a lot of traction, and it's a serious competitor to NVIDIA's AI chips. They've been working on this for a while. We know China has been working on this because they don't want to get left behind, and the United States has a whole bunch of chip export controls.
Reuters reported last week that both ByteDance and Alibaba are planning to place orders, and customer testing has apparently gone really well. Huawei's previous flagship, the Ascend 910C, was struggling to get adoption from the big private-sector tech companies; NVIDIA was still basically dominating everywhere.
I think the main complaint the 910C got was software compatibility. If your whole stack is built around NVIDIA's CUDA ecosystem, then switching is super, super painful. The 950PR apparently goes straight after that: it's a lot more compatible with CUDA workflows, and the response time is a lot better. By integrating with the software ecosystem NVIDIA's customers already use, Huawei can get into that same ecosystem without people having to completely rebuild everything from scratch. The pricing is really interesting, too. The standard version with DDR memory comes in at around $6,900 per card, and the premium HBM version is about $9,600.
For comparison, NVIDIA's high-end AI chips sell for significantly more than that, so there's a big cost advantage if performance is competitive enough. Huawei is planning to ship around 750,000 units this year, with mass production starting next month and full shipments in the second half of the year. And given that NVIDIA's chips are banned from sale in China, this is basically the market stepping in to fill the gap. The real question is whether this accelerates China's AI development in exactly the way the export controls were trying to prevent. ByteDance and Alibaba are placing real orders, and I think that's a really big signal. These are companies running some of the largest AI workloads in the world, and a lot of their tech gets used here in America as well. So it's going to be interesting to see how that plays out.

Anthropic, meanwhile, has accidentally leaked Claude Code's source code, which of course is funny on so many levels, but also, I mean, sucks for them. A couple days ago, Anthropic accidentally published the entire source code of Claude Code to a public NPM registry: about 500,000 lines of code across roughly 1,900 files. A lot of people were impressed by just how big and robust it is, which is no shocker, because this is basically their flagship product, the one that has kept the whole company alive. A debug file that was meant for internal use got bundled into an update and somehow pushed out to the public package registry. Anthropic says no customer data or credentials were exposed; what happened was basically a packaging error. They said it was caused by human error, not a security breach. And I think when they say human error, they're saying, don't worry, Claude Code didn't accidentally publish itself.
It was a human. Maybe it was, maybe it wasn't. I don't know. But that's a distinction they're trying to draw: look, something happened on our end, and when they say human error, that could also mean a human using AI. The point is, this wasn't a hack; nobody compromised the pipeline to make this happen. What's interesting is that the leaked code contained a bunch of feature flags for capabilities that haven't even shipped yet. People were dissecting the code, and there were a lot of really good threads on X saying, look, if you're using Claude Code or any of the Claude tools, you could be using them better, because all these features are in here and you can understand the tools better by reading the code. So that was genuinely helpful; I saw tons of useful threads based on it. But we're also seeing a bunch of features that haven't been announced or shipped yet. Apparently there's a system for Claude to review its own past sessions and transfer learnings across conversations. There's a persistent assistant mode that lets it keep working in the background when you're not actively doing anything. There's remote control capability, so you can manage Claude from your phone or another browser. We've seen some of those features before, but not in the form they're putting them out, especially the persistent assistant. Oh my gosh, I would love that, because sometimes I'm talking to Cowork and then I go to Code, and those two do not talk to each other. So I'm telling them about different issues, and sometimes they cause issues for each other.
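Anthropic hasn't published the details of how the internal debug file ended up in the package, but the standard npm guardrails against exactly this kind of packaging error look roughly like the sketch below. The file and package names here are made up for illustration, not Anthropic's actual layout.

```shell
# Hypothetical sketch of the usual npm safeguards against shipping internal
# files. File names ("scripts/debug-internal.js") are invented examples.

# 1) In package.json, an explicit "files" allowlist means anything not
#    listed is excluded from the published tarball:
#    {
#      "name": "my-cli",
#      "files": ["dist/", "README.md"]
#    }

# 2) Before publishing, list exactly what would go into the package
#    without actually publishing anything:
npm pack --dry-run

# 3) A CI gate can fail the release if an internal file sneaks into
#    the dry-run file listing (npm prints the listing to stderr):
npm pack --dry-run 2>&1 | grep -q "debug-internal" && {
  echo "internal file would be published; aborting" >&2
  exit 1
}
```

The key design point is the allowlist: with a `files` field, new files are excluded by default, so a stray debug file has to be explicitly added before it can ship.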
I was having a problem where I would have Claude Cowork update something in my GitHub repo, and then I'd have Claude Code make an update, but it wasn't pulling fresh. It was just updating locally, from the files it already had on my computer, and pushing. So it was overwriting what Cowork had just done, and the bugs kept coming back. Anyways, it was a whole nightmare. If these two things could talk to each other, that would solve a ton of problems.

But if you thought that was a big deal, OpenAI found a way to top it by closing a $121 billion round at an $852 billion valuation. This is massive. These valuations just keep rising, and this is all pre-IPO for OpenAI; the $852 billion is a post-money valuation. To put that into perspective, this is basically the largest private financing round the tech industry has ever seen, and OpenAI is now valued higher than most public companies on the planet. As far as who's actually putting money in, that's maybe the bigger, more interesting story. It's the circular money of Silicon Valley AI that's been going on for a long time. Amazon has committed about $50 billion, which is the single largest chunk. NVIDIA and SoftBank are each putting in $30 billion. The round was co-led by SoftBank along with Andreessen Horowitz, D.E. Shaw Ventures, MGX, TPG, and T. Rowe Price. Microsoft is also participating. And I think for the first time, OpenAI extended participation to retail investors through some bank channels, raising about $3 billion from individual investors, which is interesting, because it's kind of a preview of the future IPO; they're getting some retail in pre-IPO. I wish they had told me about it, but I guess I wasn't on the list of retail.
When they say retail, and say they raised $3 billion from individual investors, I feel like those retail investors were probably very rich individuals and not random people. The really big detail in all of this, though, is Amazon's commitment: $35 billion of their $50 billion is contingent. It only goes through if OpenAI either goes public or reaches certain AGI milestones. So there's a really interesting clause here that tells you Amazon is making a bet, but a structured bet, with real conditions attached. They're not just handing over $50 billion unconditionally. It's fascinating to see in these funding rounds; we've seen NVIDIA do some similar things, where it's like, hey, we'll give you this money, but it's contingent on you doing X, Y, Z. Companies like Anthropic and OpenAI are raising so much money, and doing so much, that investors really want to make sure big things are happening and they're going to get their returns back, because Amazon is a huge company, but $50 billion would definitely hurt to lose.

The revenue numbers are really interesting, too. OpenAI says they're now generating about $2 billion a month, up from $13.1 billion for all of last year. That could put them on track to make about $24 billion over the next 12 months, significantly higher than last year's $13 billion. That's pretty steep growth. The company is still not profitable and is burning cash at a massive rate, so the funding is going directly into chips, data centers, and talent. A few other things stand out. First, the valuation: $852 billion is absolutely massive for a private company. That would make OpenAI roughly the 7th or 8th most valuable company in the world if it were public. The IPO is coming soon.
I think there's a lot of buzz around that, and it's becoming very clear that OpenAI is going to go public. The retail investor participation feels like a step in that direction, even though, of the $121 billion they raised, only $3 billion was from retail. But that's the direction they're going. You can also see it in the contrast with companies like Stripe, where everyone says, oh, Stripe's going to IPO soon, and then they just never do, and it keeps going and going, and Stripe never IPOs. With OpenAI, they basically have to. Amazon doesn't want that Stripe situation to happen, so they're saying, the last $35 billion of this $50 billion is contingent; you have to go public. So OpenAI is definitely getting pushed in that direction, and I think they want to go.
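The run-rate math from the revenue discussion is simple enough to sanity-check. The figures below are the ones quoted in the episode (~$2 billion a month now, ~$13.1 billion for all of last year):

```shell
# Sanity-check the run-rate math quoted in the episode.
monthly_rev_b=2                          # ~$2B/month current revenue
annualized_b=$((monthly_rev_b * 12))     # annualize by multiplying by 12
echo "annualized run rate: \$${annualized_b}B"   # prints: annualized run rate: $24B

# growth multiple vs last year's ~$13.1B (awk handles the decimal division)
awk 'BEGIN { printf "growth multiple: %.2fx\n", 24 / 13.1 }'   # prints: growth multiple: 1.83x
```

So "about $24 billion" is just the straight-line annualization of the $2 billion monthly figure, a bit under double last year's total.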