LmCast :: Stay tuned in

Ring can verify videos now, but that might not help you with most AI fakes

Recorded: Jan. 23, 2026, 3:03 a.m.

Original

By Jay Peters, Senior Reporter, The Verge | Jan 23, 2026, 12:57 AM UTC

The company's new tool will only tell you if a video hasn't been altered in any way, and if a video fails the test, Ring can't tell you how it was edited.

Ring has launched a new Ring Verify tool that the company says can "verify that Ring videos you receive haven't been edited or changed." But since Ring won't verify videos that have been altered in any way, it probably won't be able to verify those videos you see on TikTok that look like they're from security camera footage but are actually made with AI.

All videos downloaded from Ring's cloud now include a "digital security seal," Ring says. To check whether a video is authentic, go to the Ring Verify website and select a video from your device to upload it. When Ring Verify says a video is "verified," that means "the video hasn't been changed in any way since it was downloaded from Ring." (Ring Verify is built on C2PA standards, according to spokesperson Kaleigh Bueckert-Orme.)

Any change to the video, including something small like tweaking the brightness, will make a video fail the test. Ring cannot verify videos that "were downloaded before this feature launched in December 2025, or videos that have been edited, cropped, filtered, or altered in any way after download (even trimming a second, adjusting brightness, or cropping)" or "videos uploaded to video sharing sites which compress the video." Videos recorded with end-to-end encryption turned on can't be verified, either.

If Ring can't verify the video as authentic, it also can't tell you exactly what was changed about it. "Ring's verification only confirms that a video has not been modified at all since download," Ring says. If you want an original version of a video, Ring suggests asking the person who shared it with you to share a link from the Ring app.

Summarized

Ring's new "Verify" tool, launched in December 2025, aims to give users confidence that videos received through the Ring system haven't been manipulated. However, the tool's limitations significantly diminish its utility, particularly given the rapid advance of AI-generated video. The system attaches a "digital security seal" to every video downloaded from Ring's cloud. Users can then upload a video to the Ring Verify website to check its authenticity. A "verified" result means the video has not been altered in any way since it was downloaded from Ring. According to spokesperson Kaleigh Bueckert-Orme, the verification is built on the C2PA (Coalition for Content Provenance and Authenticity) standards.
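Ring has not published Verify's internals, but the behavior described (a seal computed over the downloaded file, then checked when the file is re-uploaded) resembles standard cryptographic integrity verification. The sketch below is purely illustrative: the function names and the HMAC construction are assumptions, and real C2PA seals are signed manifests backed by certificate chains, not bare MACs.

```python
# Illustrative sketch only -- not Ring's actual implementation.
# Real C2PA "seals" are signed manifests backed by X.509 certificates;
# here an HMAC over the file's bytes stands in for that machinery.
import hashlib
import hmac

SIGNING_KEY = b"demo-signing-key"  # hypothetical key, for the sketch only

def seal(video_bytes: bytes) -> str:
    """Bind a seal to the exact bytes of the downloaded video."""
    return hmac.new(SIGNING_KEY, video_bytes, hashlib.sha256).hexdigest()

def verify(video_bytes: bytes, claimed_seal: str) -> bool:
    """Pass only if the bytes are byte-for-byte what was sealed."""
    return hmac.compare_digest(seal(video_bytes), claimed_seal)

download = b"\x00\x01ring-clip-bytes"
s = seal(download)
assert verify(download, s)              # untouched file: verified
assert not verify(download + b"x", s)   # any change at all: fails
```

Note the use of `hmac.compare_digest` rather than `==`: constant-time comparison is the idiomatic way to check authentication tags without leaking timing information.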

Crucially, the validation is exceedingly narrow. Any modification to a video, however minor (adjusting brightness, cropping, or trimming even a second) will immediately trigger a failure. That strictness reliably catches tampering in genuine Ring downloads, but it does little against the burgeoning problem of AI-generated "security camera" fakes: a synthetic clip never passed through Ring's cloud and so never carried a seal in the first place, which means the tool can only report that it cannot verify the video, not that the video is fake. And because every benign edit fails the check in exactly the same way as a malicious one, a "not verified" result by itself proves nothing about whether a clip is a deepfake or simply a re-encoded copy.
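The three possible outcomes implied above can be made concrete. This is a hypothetical sketch, assuming a bare SHA-256 digest stands in for the seal (a simplification of a signed C2PA manifest); the `check` function and its verdict strings are invented for illustration, not Ring's API.

```python
# Hypothetical three-outcome check: verified / not verified / cannot verify.
import hashlib

def check(video: bytes, sealed_digest):
    if sealed_digest is None:
        return "cannot verify"       # e.g. an AI clip that never had a seal
    if hashlib.sha256(video).hexdigest() == sealed_digest:
        return "verified"
    return "not verified"            # any byte changed, however slightly

clip = b"ring-footage"
digest = hashlib.sha256(clip).hexdigest()
assert check(clip, digest) == "verified"
assert check(clip + b"\x01", digest) == "not verified"   # tiny tweak fails
assert check(b"ai-generated-clip", None) == "cannot verify"
```

The third case is the one that matters for AI fakes: absence of a seal yields "cannot verify," which is a much weaker statement than "fake."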

Furthermore, Ring's verification offers no insight into *how* a video was altered. The system simply confirms whether any change has taken place since download; a failed check is a single pass/fail verdict. This lack of granular detail makes the tool of little use for investigating potential deepfakes, since it provides no clues about the methods used to create or manipulate the video, the sophistication of the manipulation, or the source of the false content.
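The pass/fail nature of such a check is easy to demonstrate: two very different edits fail identically, so the verdict alone cannot distinguish them. A toy illustration (the byte strings are made up; they stand in for video frames):

```python
import hashlib

original = b"frame0|frame1|frame2"
sealed = hashlib.sha256(original).hexdigest()

trimmed  = b"frame0|frame1"            # a harmless trim
deepfake = b"frame0|FAKED!|frame2"     # a substituted frame

# Both edits fail in exactly the same way: the check yields one bit of
# information ("changed"), with no hint of where or how.
fails_trim = hashlib.sha256(trimmed).hexdigest() != sealed
fails_fake = hashlib.sha256(deepfake).hexdigest() != sealed
assert fails_trim and fails_fake
```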

The limitations extend to videos re-shared through external platforms such as TikTok, where AI-generated content is prevalent. Because these sites compress uploaded videos, changing their bytes, Ring's verification cannot assess them. This is a significant gap, as many of the suspect clips people encounter circulate precisely through such platforms, with no way to confirm their authenticity.
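The compression problem comes down to bytes: any re-encoding produces a different byte stream, so a digest over the sealed file no longer matches. The toy demo below uses lossless `zlib` as a stand-in for a video codec (an assumption for illustration); real sharing sites use lossy codecs, so the original sealed bytes are not recoverable at all.

```python
import hashlib
import zlib

original = b"raw-video-bytes " * 64
sealed = hashlib.sha256(original).hexdigest()

# A sharing site re-encodes the upload; the bytes on the wire differ,
# so the digest no longer matches the seal.
reencoded = zlib.compress(original, level=9)
assert hashlib.sha256(reencoded).hexdigest() != sealed

# zlib happens to be lossless, so decompressing restores the original --
# but lossy video codecs discard data, and the sealed bytes are gone.
assert hashlib.sha256(zlib.decompress(reencoded)).hexdigest() == sealed
```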

Ultimately, the Ring Verify tool, while a proactive step by the company, is fundamentally constrained by its design. It can attest that an unmodified Ring download is genuine, but it cannot detect or characterize manipulation, and it says nothing about content that never carried a seal. Given the rapid advancement of AI technology, particularly in the creation of synthetic media, these capabilities are likely to prove insufficient against the evolving threat of AI-generated fakes. To truly address the concern, Ring would need a verification system capable of identifying and flagging deeper manipulations, a challenge that current technology does not fully solve.