Tech Brew // Morning Brew // Update
Plus, OpenAI picks a new main quest.

Do you want to build a snowman? Disney just did. Its new robotic Olaf will debut at Disneyland Paris on March 29, and he's surprisingly lifelike. At 35 inches tall, Olaf moves exactly like the animated character. He isn't AI (his lines are prerecorded by voice actor Josh Gad, and a cast member steers him via Steam Deck), but when he waddles around, he crosses the uncanny valley completely. (Watch Olaf waddle here.)

The technical feat? Getting a top-heavy snowman head to balance on a small neck and literal snowball feet. Disney ran 100,000 virtual Olafs through an Nvidia simulation—rewarding screen-accurate moves that kept that neck joint from overheating—all in just two days. Disney calls this its most advanced robot yet, with plans to eventually have “believable autonomy” so it can create “an entire world populated with characters you know and love.”
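Simulation training of this kind usually boils down to a reward function scored at every step. A minimal sketch of what "reward screen-accurate moves while protecting the neck joint" might look like — purely illustrative, not Disney's actual code; the weights and inputs here are invented:

```python
def step_reward(pose_error: float, joint_torque: float, max_torque: float) -> float:
    """Score one simulation step: match the animated reference pose,
    but penalize torque that would overwork (and overheat) the neck joint.

    pose_error   -- distance from the screen-accurate reference pose (0 is perfect)
    joint_torque -- torque currently applied at the fragile joint
    max_torque   -- torque budget the joint can sustain
    """
    accuracy = -pose_error  # closer to the reference animation is better
    heat_penalty = 0.0
    if joint_torque > max_torque:
        # Steep penalty for exceeding the torque budget, so the learned
        # gait favors moves that keep the joint within limits.
        heat_penalty = -10.0 * (joint_torque - max_torque)
    return accuracy + heat_penalty
```

Run across 100,000 virtual Olafs in parallel, a reward like this steers the learned gait toward motions that both look right and stay mechanically safe.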

Also in today's newsletter:

  • Nvidia wants more than the GPU crown.
  • A courtroom smart-glasses play backfired fast.
  • OpenAI is shifting its focus—and it’ll affect a lot of its products.

—Carlin Maine, Whizy Kim, and Saira Mueller

THE DOWNLOAD


TL;DR: During its GPU Technology Conference keynote yesterday, Nvidia announced the Vera CPU (yes, you read that right). It’s a new chip designed not for the training work that made it a $4.5 trillion company, but for the kind of inference work that AI is moving toward. This marks Nvidia’s most direct challenge to Intel and AMD yet, and it’s a sign that the AI chip race may no longer be winnable on GPUs alone.

What happened: Clad in his trademark leather jacket, Jensen Huang walked onstage at GTC yesterday to declare: The AI “inference inflection” is here. As the industry shifts away from training and towards practical use—ChatGPT alone runs about 2.5 billion prompts a day—the GPU titan is answering with a new 88-core CPU that’s built to excel at inference.

By data center standards, 88 cores is modest—compare it to the 288 in Intel’s forthcoming Clearwater Forest and the 256 in AMD’s EPYC Venice. But Vera promises to eke out more AI performance per core, and it can communicate with Nvidia’s GPUs through a proprietary, high-speed link that other chips can’t tap into, accelerating the overall workload. Vera is also tailored specifically for agentic AI, a highly compute-intensive slice of inference. The CPU is part of a broader seven-chip platform called Vera Rubin, which also includes next-gen GPUs, networking chips, and the Groq 3 LPU, designed for extremely fast inference. Altogether, Nvidia is signaling that it’s diversifying its offerings beyond the beefiest AI chips that reign supreme for training.

The CPU pivot: GPUs can handle a lot of tasks simultaneously, which is ideal for training on massive datasets. Agentic AI workloads, though, benefit from a processor that can order tasks logically—putting on shoes after pants, essentially—and that’s where CPUs shine. Agentic workloads are also far more token-heavy, and that high demand is outpacing what current CPUs can handle.
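The "shoes after pants" point is that agent steps form a dependency chain: each step consumes the previous step's output, so the chain must run in order rather than all at once like a training batch. A toy illustration (the step names are invented):

```python
# Toy illustration: an agentic workload is a dependency chain.
# Each step needs the previous step's output, so the chain is strictly
# sequential -- the kind of ordered work CPUs handle well, unlike the
# independent, all-at-once batches GPUs chew through during training.

def run_agent(steps, state):
    """Run each step in order, feeding each one the prior step's output."""
    for step in steps:
        state = step(state)
    return state

plan    = lambda log: log + ["searched flights"]
book    = lambda log: log + ["booked cheapest"]
confirm = lambda log: log + ["emailed confirmation"]

itinerary = run_agent([plan, book, confirm], [])
# The order matters: you can't confirm a booking before making it.
```

No step here can start until the one before it finishes — that serial structure, multiplied by heavy token traffic, is the workload Vera is pitched at.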

The year of the AI agent: AI is entering its errand-running era, shifting from chatbots that answer questions like “what’s the cheapest flight to London?” to agents that can book the trip themselves. Over the past year, OpenAI has folded its Operator agent into ChatGPT; Anthropic shipped Claude Code and Cowork; and Microsoft launched Copilot Cowork across Microsoft 365. Smartphone makers are getting in on it, too: Gemini’s agentic features have been rolling out to new Samsung phones, and Apple’s long-awaited Siri overhaul is expected later this year.

Nvidia’s competition heats up: Nvidia’s GPUs still dominate training, but inference is a far more crowded fight. Intel holds about 60% of the data center CPU market, AMD roughly 24%, and Nvidia just 6%. Google, Amazon, and Meta are all developing custom chips to cut their Nvidia dependence—Meta’s latest, a family of four MTIA processors, is aimed squarely at inference. Nvidia has taken notice: It spent $20 billion last December to license inference tech from specialized AI chipmaker Groq (one of Groq’s founders helped design Google’s TPUs).

Bottom line: Nvidia’s CPU expansion is a notable about-face from a company that has long insisted GPUs could handle all of AI's needs—though it's still betting big on them, projecting $1 trillion in orders across its GPU-led Blackwell and Vera Rubin platforms through 2027. The play for Nvidia, it seems, is to become the one-stop AI shop: CPUs, GPUs, networking, and software—no other chipmakers required. —WK


BUG REPORT

Hyphenated and frustrated

A few weeks ago, we shared a reader’s hot take about online forms that have a dropdown list of states, and how they can make filling in your address a user experience nightmare. Tech Brew reader Joseph from O’Fallon, Missouri, followed up with a similar issue he frequently comes across.

As a person whose parents gave me a hyphenated last name, I find it really frustrating when online forms only allow alphabetical characters for name fields. It is 2026, and we have the technology to handle other characters! I am a software engineer, so it's doubly annoying because I know the silly reasons these systems do this and how easy it is to handle it properly. I generally have to put in my name with a space instead of a hyphen, or with both names illegibly smooshed together when the field doesn't allow spaces either. The worst is when I need to search in these systems by my last name, and I have to guess which combo of my names and spacing characters (or lack thereof) this particular system is using. They never tell you, so you just get to guess. On top of all of that, my wife took my hyphenated last name when we married, so she blames me for these frustrations when she encounters them as well.

Another day, another online form field giving users headaches—and for Joseph, it doesn’t end there. “To add insult to injury, I've managed to live in towns with apostrophes in their names since 2018,” he adds. “Yet another character that is often not allowed in the form fields for city name.” For now, we’re impatiently waiting for websites to heed his advice to “free the hyphen” (and apparently the apostrophe, too). —CM
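Joseph is right that the fix is small. A hedged sketch of what a more forgiving name field could look like — one reasonable pattern and one normalization scheme, invented for illustration rather than taken from any real site's rules:

```python
import re
import unicodedata

# Illustrative pattern only: accept letters (including accented ones),
# separated by single spaces, hyphens, or apostrophes -- the characters
# real names actually contain. [^\W\d_] matches any Unicode letter.
NAME_OK = re.compile(r"^[^\W\d_]+(?:[ '\u2019-][^\W\d_]+)*$")

def is_valid_name(name: str) -> bool:
    """Accept O'Fallon, Smith-Jones, José... but reject digits and symbols."""
    return bool(NAME_OK.match(name.strip()))

def search_key(name: str) -> str:
    """Normalize so 'Smith-Jones', 'Smith Jones', and 'SmithJones' all
    match the same record, however the original form stored it."""
    name = unicodedata.normalize("NFKD", name)
    return "".join(ch for ch in name.lower() if ch.isalpha())
```

With a search key like this, Joseph would never have to guess which spacing convention a given system used — every variant collapses to the same lookup value.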


THE ZEITBYTE


In a scene too ludicrous even for reality court TV, a man in a London courtroom was caught being coached through smart glasses on the stand—and then claimed the voice was ChatGPT.

In January, Laimonas Jakštys showed up to court trying to regain control of his insolvent company—and instead torpedoed his own case, according to the judge. Jakštys’s ploy unraveled fast when people in the courtroom heard audio near him. The judge told him to take his glasses off—and after a few more questions, his phone started playing his coach's voice on speaker for the entire courtroom. His call log showed repeated calls mid-testimony from a contact named “abra kadabra.” (He said this was a taxi driver.) What unfolded next was the equivalent of a magician getting stuck in the water tank. The voice? Just ChatGPT accidentally going off, Jakštys claimed. The judge confiscated the glasses and phone, handing them to his solicitor, but Jakštys showed up the next day wearing his favorite talking eyewear again. The judge ended up throwing out his entire testimony as unreliable; Jakštys lost the case and was ordered to pay the other side's legal costs.

If you're keeping score, smart glasses are now being used in job interviews, on the street, and in courtrooms on both sides of the Atlantic. AI coaching tools like Cluely whisper real-time answers during Zoom interviews. During Meta's social media addiction trial, which began in February, the judge banned recording with smart glasses after Zuckerberg's entourage strolled in wearing Meta Ray-Bans—warning that anyone who used facial recognition on jurors would be held in contempt. The future of wearable computing is here, but maybe leave the glasses at home when you're about to testify under oath. —WK

Chaos Brewing Meter: /5

OPEN TABS


Readers’ most-clicked story was about Apple’s nine new emojis, including one titled “hairy creature.”


Copyright © 2026 Morning Brew Inc. All rights reserved.
22 W 19th St, 4th Floor, New York, NY 10011