A synthesis of 5 perspectives on AI, machine learning, model releases, model benchmarks, and trending AI products
AI-Generated Episode
On this episode of The NeuralNoise Podcast, we unpack how Google’s Gemini just reshaped the AI race, why Apple is betting big on its rival’s tech, and what these shifts mean for everything from your iPhone to global chip geopolitics.
In AI right now, winning isn’t just about having a good model. It’s about owning the stack: chips, cloud, products, and user data. By that measure, Google looks closer than anyone to a true AI endgame.
Gemini 3, released in November and widely regarded as the best overall large language model on the market, sits at the center of this strategy. It tops many benchmarks and is consistently rated at or near the top for most tasks. More importantly, Google trained it on its own custom TPUs — specialized chips it has been building for years — which frees the company from Nvidia’s bottlenecked supply chain and lets it tune hardware and software together.
That gives Google something few others have: full-stack control over AI performance and cost. With that in place, the next step is distribution — getting Gemini in front of billions of people and feeding it the data that makes modern AI so powerful.
Two moves this week show how aggressive that push has become: a blockbuster deal with Apple, and a new “Personal Intelligence” layer that hooks Gemini into almost everything Google already knows about you.
Apple has long prided itself on vertical integration, building its own chips, OS, and services. Yet for AI, it’s now turning outward. Apple and Google have entered a multi‑year collaboration under which the next generation of Apple’s foundation models will be based on Google’s Gemini models and cloud technology, powering future Apple Intelligence features and a more personalized Siri coming this year (TechCrunch, MacRumors).
Previous reporting suggests Apple could be paying around $1 billion annually for access to Gemini, after testing alternatives from OpenAI and Anthropic. That’s a striking acknowledgment that, at least right now, Google’s models are the most capable foundation for Apple’s ecosystem.
For Apple, the upside is obvious: Siri has lagged behind ChatGPT and Gemini, and Apple Intelligence’s subtle, mostly invisible features haven’t delivered the “wow” moment users expected. A next‑gen Siri — expected with iOS 26.4 in March or April — promises better personal context, on‑screen awareness, and deeper app control, such as combining information from Mail and Messages to answer questions about travel or reservations, while Apple keeps privacy guarantees via on‑device processing and its Private Cloud Compute.
For Google, the deal is even more strategically valuable. Siri handles roughly 1.5 billion requests a day; if a meaningful share of those run through Gemini, it instantly becomes one of the most popular ways people interact with Google’s AI — a direct distribution rocket that helps close the usage gap with ChatGPT.
Running the best models at scale is only part of the picture; the other is personalization. Google’s new opt‑in feature, “Personal Intelligence,” connects Gemini to the vast trove of data it already holds about you: search history, YouTube viewing, Gmail, Photos, Drive files, and more.
Instead of asking users for detailed prompts or forcing them to set up elaborate instructions, Gemini can now infer context from your digital life. That makes responses more useful but also significantly raises the stakes on privacy and data control. For now, Personal Intelligence is in beta for a subset of paying AI customers, with plans to roll it out broadly — and, crucially, to integrate it into Google Search’s emerging AI mode.
Combined with the Apple deal, this creates a powerful flywheel: more users, more activity, richer personal data, and therefore better models and products. In 2022, ChatGPT caught Google flat‑footed. By early 2026, Google has the leading models, in‑house compute, massive distribution through Android, Search, and now iPhones, plus an increasingly intimate view into user behavior.
The AI race isn’t just corporate — it’s geopolitical. A US‑led initiative called Pax Silica is building a “coalition of capabilities” to secure semiconductor and AI supply chains, spanning critical minerals, advanced manufacturing, and compute power. Qatar and the UAE are the latest to join a group that already includes Australia, Britain, Israel, Japan, Singapore, and South Korea (Semafor).
This is the macro backdrop for everything else in the episode: as models like Gemini become core infrastructure for phones, enterprises, and governments, control over chips, data centers, and critical materials turns into a strategic lever. Pax Silica is effectively the political mirror image of what Google is building technically — a vertically integrated, AI‑first stack, but at the level of nation‑states.
While Google and Apple battle for the consumer interface, enterprises are racing to embed AI agents where work already happens. Salesforce’s revamped Slackbot is a good example. Now marketed as a “super agent,” it can find information across tools, draft emails, schedule meetings, and even reach into other platforms like Microsoft Teams and Google Drive, all from within Slack (TechCrunch).
Internally at Salesforce, the new Slackbot has become the company’s most adopted tool, suggesting real product‑market fit for agentic workflows rather than standalone chatbots. Voice capabilities and web browsing are on the roadmap, pointing toward AI that is less a single app and more an always‑on colleague woven into your digital workspace.
Across consumer, enterprise, and geopolitics, the story converges: AI is no longer a discrete feature but the organizing principle of platforms and alliances. Google’s Gemini sits at the center of that shift — powering Apple’s flagship devices, personalizing itself with your data, and riding a broader realignment of chips, clouds, and countries. Whether you’re an iPhone user, a Slack‑bound knowledge worker, or a policymaker, the question is no longer if AI will be embedded in your world, but who gets to own the stack that runs it.