What’s Lost When We Work with AI, According to Neuroscience
Recorded: Dec. 2, 2025, 3:02 a.m.
Original:

What’s Lost When We Work with AI, According to Neuroscience
by David Rock, Harvard Business Review, December 1, 2025

Earlier this year, I attended Davos in the Swiss Alps alongside influential CEOs, political leaders, academics, and economists. After the conference, I joined a series of follow-on virtual sessions and noticed a strange trend: not all of the attendees were human. In fact, a surprising number of invitees sent AI agents in their place—bots that joined the conversation, took notes, and later emailed summaries to their human counterparts. In one instance, a group of 12 was expected, but we ended up with six humans and six AI agents.

David Rock is cofounder and CEO of the NeuroLeadership Institute and author of Your Brain at Work.
Summarized:

The implications of growing reliance on artificial intelligence in professional settings, as David Rock observed in a series of post-Davos virtual sessions, extend beyond efficiency gains. His reflections point to a concerning shift: a discernible erosion of the human element in complex collaborative settings, particularly the loss of nuanced understanding and genuine interpersonal connection. The core observation is the substitution of AI agents for human attendees. In one meeting where twelve people were expected, only six humans appeared, each of the others sending an AI agent in their place. The arrangement exposed a fundamental tension: the AI agents efficiently handled tasks such as note-taking and summary generation, yet their presence diminished the richness and depth of the discussion.

Rock's analysis suggests that the value of these meetings stems not solely from the information transmitted, but from the cognitive and emotional processes that unfold during human interaction. These processes, rooted in neuroscience, involve subtle cues (nonverbal communication, micro-expressions, spontaneous shifts in thought) that deepen comprehension and build trust. AI, despite its advances, cannot replicate this multifaceted communication. It operates on algorithms and data patterns, processing information rather than engaging in the intuitive, associative thinking that characterizes human cognition.

The neurological underpinnings of effective collaboration are inseparable from emotional intelligence, which AI fundamentally struggles to grasp. Human brains detect and respond to the emotional states of others through mirror neurons, a system that lets us simulate and understand another person's feelings. This creates a feedback loop of shared understanding, fostering empathy and a sense of psychological safety, both crucial for productive discussion and innovative problem-solving. Lacking this mechanism, the AI agents produced interactions devoid of genuine emotional resonance.

The act of being present at a meeting also triggers distinct neurochemical responses: the release of dopamine during moments of insight, the activation of reward pathways after a successful negotiation, and the social bonding that comes from shared experience. An AI agent, having no subjective experience, cannot take part in these events, so those benefits are lost. The summaries the agents generate, while technically accurate records of the conversation, are distilled and critically incomplete; they lack the contextual richness and inferred meaning that emerge from the subtle interplay of human minds.

The trend of prioritizing efficiency and data aggregation over the authentic engagement intrinsic to human collaboration is therefore a serious one. As AI becomes more deeply integrated into professional life, it is crucial to recognize that the value of human interaction extends far beyond the exchange of information. It is rooted in the neurobiological processes that underpin our capacity for empathy, intuition, and creative thinking, which remain beyond the reach of even the most sophisticated artificial intelligence. The ongoing shift demands a conscious effort to preserve and leverage the uniquely human qualities essential for effective leadership, innovation, and, ultimately, meaningful progress.