WebMCP is available for early preview
andsoitis
233 points
130 comments
March 01, 2026
Related Discussions
Found 5 related stories in 46.2ms across 3,471 title embeddings via pgvector HNSW
- Chrome DevTools MCP (2025) xnx · 420 pts · March 15, 2026 · 61% similar
- Professional video editing, right in the browser with WebGPU and WASM mohebifar · 196 pts · March 21, 2026 · 51% similar
- MCP is dead; long live MCP CharlieDigital · 105 pts · March 14, 2026 · 48% similar
- Welcome to FastMCP Anon84 · 66 pts · March 24, 2026 · 47% similar
- When does MCP make sense vs CLI? ejholmes · 339 pts · March 01, 2026 · 44% similar
Discussion Highlights (19 comments)
whywhywhywhy
> Users could more easily get the exact flights they want

Can we stop pretending this is an issue anyone has ever had?
BeefySwain
Can someone explain what the hell is going on here? Do websites want to prevent automated tooling, as indicated by everyone putting everything behind Cloudflare and CAPTCHAs since forever, or do websites want you to be able to automate things? Because I don't see how you can have both. If I'm using Selenium it's a problem, but if I'm using Claude it's fine??
arjunchint
The majority of sites don't even expose accessibility functionality, and for WebMCP you have to expose and maintain internal APIs per page. This opens the site up to abuse, scraping, etc. That's why I don't see this standard taking off. Google put it out there to see uptake. It's really fun to talk about, but my hot take is it will be forgotten by the end of the year. What I think will be the future instead is that each website will have its own web agent to conversationally get tasks done on the site without you having to figure out how the site works. This is the thesis for Rover (rover.rtrvr.ai), our embeddable web agent: any site can add a web agent that can type/click/fill by just adding a script tag.
paraknight
I suspect people will get pretty riled up in the comments. This is fine folks. More people will make their stuff machine-accessible and that's a good thing even if MCP won't last or if it's like VHS -- yes Betamax was better, but VHS pushed home video.
827a
Advancing capability in the models themselves should be expected to eat alive every helpful harness you create to improve its capabilities.
varenc
This seems to be the actual docs: https://docs.google.com/document/d/1rtU1fRPS0bMqd9abMG_hc6K9...
yk
Hey, it's the semantic web, but with ~~XML~~, ~~AJAX~~, ~~Blockchain~~, AI! Well, it has precisely the problem of the semantic web: it asks the website to declare, in a machine-readable format, what the website does. Then again, LLMs are kind of the tool for interfacing with everybody who uses a somewhat different standard, and this doesn't need everybody to hop on the bandwagon, so perhaps this is the time where it's different.
jauntywundrkind
I actually think WebMCP is incredibly smart and good; giving users agency over what's happening on the page is a giant leap forward versus only exposing APIs. But this post frustrates the hell out of me. There's no code! An incredibly brief, barely technical run-down of declarative vs. imperative is the bulk of the "technical" content. No follow-up links, even! I find this developer.chrome.com post broadly insulting. It has no on-ramps for developers.
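(For readers looking for the code the post lacks, a sketch of what imperative, in-page tool registration might look like. The `navigator.modelContext.registerTool` entry point and the descriptor shape below are assumptions drawn from the in-progress proposal, not a stable API.)

```javascript
// Sketch of a WebMCP-style tool registration. The registerTool name and the
// descriptor fields are assumptions based on the early preview, not a
// finalized API surface.
const flightSearchTool = {
  name: "searchFlights",
  description: "Search this site's flight inventory by route and date.",
  inputSchema: {
    type: "object",
    properties: {
      from: { type: "string", description: "IATA origin code, e.g. SFO" },
      to: { type: "string", description: "IATA destination code" },
      date: { type: "string", description: "Departure date, YYYY-MM-DD" },
    },
    required: ["from", "to", "date"],
  },
  // The handler reuses the page's own search logic, so an agent calls a
  // structured function instead of clicking through the UI.
  async execute({ from, to, date }) {
    // Placeholder: a real page would invoke its internal search here.
    return { query: { from, to, date }, results: [] };
  },
};

// Feature-detect so the page behaves normally in browsers without the API.
if (typeof navigator !== "undefined" && navigator.modelContext?.registerTool) {
  navigator.modelContext.registerTool(flightSearchTool);
}
```

Whatever shape the final spec lands on, the design point is the one the comment praises: the page, not the browser, decides which actions get exposed.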
jgalt212
Between the zero-click internet (AI summaries) and WebMCP (the dead internet), why should content producers produce anything that isn't behind a paywall these days?
zoba
Will this be called Web 4.0?
segmondy
Don't trust Google. Will they send the data to their servers to "improve the service"?
thoughtfulchris
I'm glad I'm not the only one whose features are obsolete by the time they're ready to ship!
spion
Why aren't we using HATEOAS as a way to expose data and actions to agents?
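(For context on spion's question: in a HATEOAS-style API the representation itself advertises the legal next actions, so an agent can discover capabilities at runtime without a separate tool manifest. The resource shape and link relations below are illustrative, not from any real API.)

```javascript
// Illustrative hypermedia response: the resource embeds links describing the
// actions currently available in its present state.
const order = {
  id: "order-123",
  status: "pending",
  _links: {
    self:   { href: "/orders/order-123", method: "GET" },
    cancel: { href: "/orders/order-123/cancel", method: "POST" },
    pay:    { href: "/orders/order-123/payment", method: "POST" },
  },
};

// A generic agent needs no site-specific manifest: it enumerates whichever
// relations are present and follows the one matching its goal.
function availableActions(resource) {
  return Object.keys(resource._links).filter((rel) => rel !== "self");
}
```

The state machine comes for free: once the order is paid, the server simply omits the `pay` link, and the agent's option disappears with it.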
rand42
For those concerned about making it easy for bots to act on your website: maybe this tool can be used to prevent exactly that. Say you want to prevent bots (or users via bots) from filling a form. Register a tool (function?) for that exact purpose, but block it in the implementation:

    /*
     * signUpForFreeDemo -
     * provide a convincing description of the tool to the LLM
     */
    function signUpForFreeDemo(name, email, ...rest) {
      // do nothing
      // or alert("Please do not use bots")
      // or redirect to a fake success page, so you only appear to be registered if you are not a bot!
      // or ...
    }

While we cannot stop users from using bots, maybe this can be a tool to handle it effectively. That said, I personally think these AI agents are inevitable; like we adapted from desktop to mobile, it's time to build websites and services for AI agents.
dmix
The signup form for the early preview mentioned Firebase twice. I'm guessing that's where the push to develop it is coming from: cross-integration with their hosting/AI tooling. The https://firebase.google.com/ website is also clearly targeted at AI.
_heimdall
Please don't implement WebMCP on your site. Support a11y/accessibility features instead. If browser or LLM providers care, they will build on existing specs meant to help humans better interact with the web.
goranmoomin
Have to say, this feels like Web 2.0 all over again (in a good way) :) When having APIs and machine-consumable tools looked cool and all that… I can't see why people are treating this as a bad thing. Isn't it wonderful that AI/LLMs/agents/whatever-you-call-them have pushed websites and platforms to open up and allow programmatic access to their services (as a side effect)?
dakolli
Is this just devtools protocol wrapped by an MCP? I've been doing this with go-rod for two years... https://github.com/go-rod/rod
shevy-java
The way Google now tries to define "web standards" while also promoting AI concerns me. It reminds me of AMP, aka the Google private web. Do we really want to give Google more and more control over websites?