Welcome back. For baseball fans (everyone else, skip ahead): the Dodgers became the first team in 25 years to repeat as World Series champions, beating the Blue Jays in Game 7, where tickets started at $1,031, making it the most expensive sporting event ever held in Canada.

IN TODAY’S NEWSLETTER
1. AI firms grapple with emotional chatbots
2. Apple may eye M&A to play AI catch-up
3. Open source struggles to keep up

CULTURE
AI firms grapple with emotional chatbots

More AI firms are cracking down on younger users.

Character.AI announced last week that it would remove the ability for underage users to have “open-ended chat” on its platform by November 25. The company will start by limiting under-18 users to two hours per day and ramp that limit down in the coming weeks. It will also roll out “age assurance” functionality and open a nonprofit AI safety lab dedicated to safety alignment for future AI features.

Character.AI is the latest company seeking to limit how young users engage with its models.

These controls come amid increased legislation on AI companionship, such as in California and New York.

And as more young users turn to AI for emotional support and companionship – nearly one fifth of teens have either engaged in, or know someone who has engaged in, a romantic relationship with AI – these controls are more important than ever, Brenda Leong, director of the AI division at law firm ZwillGen, told The Deep View.

“(Young people) have less life experience, less internal defenses, less ability to make those distinctions for themselves,” said Leong. “There's clear risk here that would justify trying to focus protections and controls specifically for minors.”

But minors aren’t the only ones susceptible to emotional attachment to AI models. As these models learn to placate users, no one is immune to the “emotional overlay” they provide, said Leong. And that emotional connection can come with significant risks.

Data published by OpenAI last week found that 0.15% of users active in a given week have conversations that include explicit indicators of suicidal planning or intent. With more than 800 million weekly active users, that works out to roughly 1.2 million people per week.

“We're slaves to our own psychology. We learn to make all kinds of judgments – split second, long term, contextual – based on anthropomorphizing everything in our world,” said Leong. “There is danger, because we don't have good defenses. We don't have good barriers.”

While protections for underage users are relatively easy to enforce, applying them to adults gets murkier. Interacting with these models in emotional ways can provide “dopamine hits,” Leong noted, which can become almost addictive. And while it’s easy (in theory) to regulate the substances young people shouldn’t partake in – alcohol, cigarettes and gambling, for example – adults generally have free rein. AI use is much the same, even when it works to a user’s own detriment.
TOGETHER WITH BLAND
Your best rep – without the attitude.

Voice AI can handle calls as well as your best reps (minus the attitude).

Yet most people haven’t tried it, because companies demand seven-figure contracts before even letting you test it.

At Bland, we think that’s stupid.

If you want, you can try a POC first and see the ROI for yourself.

Why can we do this? Because 97% of our pilots convert to long-term deals.

Set up a pilot today and give it a try.
BIG TECH
Apple may eye M&A to play AI catch-up

Apple might be eyeing acquisitions to catch up in the AI race.

CEO Tim Cook noted this week during the company’s earnings call that Apple is still open to acquisitions and partnerships as it navigates its place in the AI picture. Cook also told CNBC that the company expects to announce more partnerships in the coming months, noting that the “intention is to integrate with more people over time.”

Cook said that Apple continues to “surveil the market on M&A and are open to pursuing M&A if we think that it will advance our road map.”

Cook’s remarks aren’t the first time we’ve heard rumblings of acquisitions and partnerships from Apple.

The CEO noted that Apple is making “good progress” on AI-powered Siri, which is on track to launch in 2026, and said he’s “bullish” on Apple Intelligence becoming a major deciding factor in consumers’ decisions to purchase Apple products.

Despite plans to spend $500 billion on AI development over the next four years, the company has struggled to make a true name for itself in the AI space, losing talent to more aggressive rivals like Meta and OpenAI.

Keeping an open mind about AI M&A could signal that Apple is shifting away from its longstanding strategy of waiting out tech trends before developing its own, Apple-branded versions of them.
TOGETHER WITH WIZ
The Hidden Risks Behind MCP — and How to Secure Them

The Model Context Protocol (MCP) is quickly emerging as the go-to standard for connecting LLMs to external tools and data. But as adoption picks up, many teams are implementing MCP without a clear security playbook – and that’s a recipe for disaster.

That’s exactly why Wiz created its newest whitepaper, The Hidden Risks Behind the Magic: Securing the Model Context Protocol (MCP). It shares early research and practical guidance to help security teams evaluate and secure MCP in real-world environments, covering:
- Key risks with local and remote MCP servers
- Real-world threats like prompt injection and supply chain compromise
- Actionable steps for safely using MCP tools

Download your copy for free right here – you won’t regret it.
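If you haven’t touched MCP yet, here is a minimal sketch of what a local MCP server looks like, using the FastMCP helper from the official Python SDK (the server name and the toy "add" tool are illustrative assumptions, not anything from the whitepaper). It also shows why the security angle matters: tool names, descriptions and outputs all flow into the model’s context, which is exactly where prompt injection can creep in.

# A minimal local MCP server sketch (Python package: "mcp").
# The server name and the "add" tool are hypothetical examples.
from mcp.server.fastmcp import FastMCP

# Create a named server; MCP clients (LLM apps) discover its tools at runtime.
mcp = FastMCP("demo-calculator")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers.

    Note: this docstring becomes the tool description the model reads,
    so tool metadata (and tool output) is part of your injection surface.
    """
    return a + b

if __name__ == "__main__":
    # Runs over stdio by default, i.e. a "local" MCP server.
    mcp.run()

Remote MCP servers expose the same kind of tools over the network, which is where the supply chain and access control questions the whitepaper covers come into play.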
RESEARCH
Open source struggles to keep up

Open source models may be struggling to keep up with their closed-off counterparts.

According to data released Thursday by the research institute Epoch AI, open-weight models tend to lag around three months behind closed source models in capability.

Using the institute’s Epoch Capabilities Index, which measures model capabilities across companies like Google, OpenAI, Anthropic, Meta and xAI, open-weight models score an average of seven points lower than closed source ones. Epoch notes that this is roughly the capability gap between OpenAI’s o3, released in mid-April of this year, and GPT-5, released in early August.

Epoch AI (@EpochAIResearch) on X, Oct 30, 2025: “We used our new capabilities index, the ECI, to measure the gap between open- and closed-weight models. The result? This gap is smaller than previously estimated. On average, it takes 3.5 months for an open-weight model to catch up with closed-source SOTA.”
Open source AI offers several benefits, such as increased accessibility and open collaboration that speeds up innovation. But open source tech projects generally struggle to stay financially solvent, one analyst previously told The Deep View.

Though open source AI has seen some wins in recent months, such as Reflection AI’s $2 billion funding round in October and DeepSeek’s explosive debut at the beginning of this year, privately developed models are receiving far more attention. Amid this privatization, AI research and development may be slowly slipping out of academia’s hands.

An article published Thursday by researchers from Stanford University’s Human-Centered AI Institute, including Fei-Fei Li and Christopher Manning, claims that “The tide of openness in AI is receding.” As AI labs turn inward, the mantle of “open science” and AI for the “public good” has fallen to universities.

“When AI knowledge becomes privatized, we lose more than transparency: we lose the cross-pollination of ideas that drives genuine scientific progress,” the researchers note. “Universities and public institutions are uniquely positioned to sustain this public-good role because they are not structured primarily around shareholder return or product rollout.”
LINKS

- Perplexity Flight Tracker: Track status changes of commercial flights within Perplexity’s platform.
- LoveartAI Hailuo 2.3: Dance video generation tool with realistic physics and motion.
- Fal Minimax Music 2.0: An AI audio generation tool for lifelike vocals across genres.
- Usage4Claude: A macOS menu bar app for monitoring how much you rely on Anthropic’s Claude.
- FormAI: An AI-powered gym companion that offers real-time corrections and recommends exercises.
- OpenAI: Deployed Researcher, Strategic Deployment
- Stripe: Machine Learning Engineer, Foundation Model
- Salesforce: Research Scientist - Salesforce AI Research
- Snap: Research Scientist, Generative AI
A QUICK POLL BEFORE YOU GO
AI companionship apps should be regulated most like...
The Deep View is written by Nat Rubio-Licht, Faris Kojok, and The Deep View crew. Please reply with any feedback.
Thanks for reading today’s edition of The Deep View! We’ll see you in the next one.

“The base of the bark didn’t seem real and consistent in the AI image”

“Biome characteristics more consistent with the flora species shown.”

“The dragon blood trees only grow on Socotra (Yemen), which does not have high, rugged mountains like depicted in the background on the fake image.”
“Wut?! Neither of those look real. What planet are we on?”

“Not an easy one. There were no tell-tale indicators in either image, as far as I could see.”

“The focus seemed a bit off in the real one. Almost tilt-shift, with the trees looking like miniatures.”
Take The Deep View with you on the go! We’ve got exclusive, in-depth interviews for you on The Deep View: Conversations podcast every Tuesday morning.

If you want to get in front of an audience of 450,000+ developers, business leaders and tech enthusiasts, get in touch with us here.