maxwellg · 3 months ago
I can't wait for first-party remote MCP servers to become more common. Right now we're taking a strange detour of everyone trying to proxy everyone else's APIs and do manual API Key juggling because platforms aren't running their own MCP servers and clients don't support the latest OAuth changes.

In a year from now, GitHub will run a single public GitHub MCP server that you will connect to via OAuth - you won't need to install it locally or faff around with tokens or environment variables at all.

niel · 3 months ago
> In a year from now

You can get a taste of this already.

While they still call it a prototype/beta, Sentry's MCP server [0] is a model for others to follow when it comes to convenience and usefulness.

Remote-first with OAuth. The biggest hurdle to using it as-is at the moment is that most clients don't natively support OAuth yet, so you'll often rely on a local proxy, like mcp-remote [1], to handle auth. Clients will catch up.

[0] https://mcp.sentry.dev/

[1] https://github.com/geelen/mcp-remote
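For clients without native OAuth support, the usual pattern is to register mcp-remote as a local stdio command in the client's MCP config and point it at the remote URL. A minimal sketch (the config file location varies by client, and the exact Sentry endpoint path here is an assumption):

```json
{
  "mcpServers": {
    "sentry": {
      "command": "npx",
      "args": ["mcp-remote", "https://mcp.sentry.dev/mcp"]
    }
  }
}
```

On first use, mcp-remote opens a browser to complete the OAuth flow and caches the resulting credentials locally, so the client itself never has to handle tokens.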

cyberge99 · 3 months ago
The step after that is paid access to any APIs.
joshstrange · 3 months ago
I agree that we will probably move to first-party remote MCP servers in the near future, which puts a lot of registries etc. in limbo.

That said, I think there might be a market for MCP servers that do more than the first-party one; it will really depend on what first-party support looks like. Did they implement all of their existing API in MCP, or just a few parts?

However, my experience with MCP servers so far (and it's super early days, I know) has taught me that in a lot of cases it's better/easier to write your own MCP server/tools. A lot of MCP servers out there are sloppy and/or hard to run and debug. Since most tools are a thin layer over existing API/SDK calls, it's not hard to write (or LLM-generate) the needed code, which has the added bonus of giving you full control.

Even when an MCP server works 100% and is easy to run, it doesn’t always map 1:1 with the API and so I’ve run into “Yes, you can retrieve data object X but you can’t filter by Y because they didn’t implement that filter in the tool call”.
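That "write your own thin tool" approach can be sketched in plain Python. Everything here is invented for illustration - the issue-tracker API, URL, and field names - and a real server would register `list_issues` as a tool through whatever MCP SDK you use. The point is that owning the wrapper lets you expose the exact filter a canned server left out:

```python
import json
from typing import Callable, Optional

def make_list_issues_tool(fetch: Callable[[str], str]):
    """Build a `list_issues` tool as a thin wrapper over an existing HTTP API.

    Owning the wrapper means we can expose the API's `label` filter directly,
    instead of waiting for a third-party MCP server to implement it.
    """
    def list_issues(repo: str, label: Optional[str] = None) -> list:
        url = f"https://api.example.invalid/repos/{repo}/issues"
        if label is not None:
            url += f"?label={label}"  # the filter a canned server might omit
        return json.loads(fetch(url))
    return list_issues

def fake_fetch(url: str) -> str:
    # Stubbed transport so the sketch runs without network access.
    issues = [{"id": 1, "label": "bug"}, {"id": 2, "label": "feature"}]
    if "label=" in url:
        wanted = url.split("label=")[1]
        issues = [i for i in issues if i["label"] == wanted]
    return json.dumps(issues)

list_issues = make_list_issues_tool(fake_fetch)
print(list_issues("acme/app", label="bug"))  # only the matching issue survives
```

Swapping `fake_fetch` for a real HTTP call (plus auth headers) is the only change needed to point this at a live API.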

meander_water · 3 months ago
This is kind of what smithery does already. You can choose to install a local server, or connect to a remotely hosted server on smithery after authenticating through your GitHub OAuth.
rvz · 3 months ago
> I can't wait for first-party remote MCP servers to become more common

> In a year from now, GitHub will run a single public GitHub MCP server that you will connect to via OAuth - you won't need to install it locally or faff around with tokens or environment variables at all.

That sounds horrific. GitHub is known for its unreliability, and centralizing everything on GitHub isn't a good idea.

Combining two bad standards (MCP and OAuth) doesn't make remote MCP servers secure either.

jjfoooo4 · 3 months ago
I’ve been seeing MCP compared to extensions in web browsers. Which I find telling, since I wouldn’t exactly say web extensions have been a great success: it’s a pretty niche dev market, and the security posture remains pretty anxiety-inducing.
owebmaster · 3 months ago
Extensions were a huge success: they’re what let Firefox dethrone IE and then Chrome take the lead. But then the smartphone era came, most people now access the internet through their phones, and extensions aren’t first-class citizens on mobile.
reustle · 3 months ago
Here are a few more:

- https://smithery.ai/

- https://github.com/wong2/awesome-mcp-servers

- http://mcp.so/servers

- https://cursor.directory/mcp

But as mentioned above, there is an ongoing discussion for the Anthropic registry https://github.com/modelcontextprotocol/registry

tkellogg · 3 months ago
FYI, https://mcp.so/ is the exact same thing as what was posted. Not sure why they linked to the GitHub repo instead of the actual site.
Maxious · 3 months ago
There's some movement on https://github.com/modelcontextprotocol/registry

> The MCP Registry service provides a centralized repository for MCP server entries. It allows discovery and management of various MCP implementations with their associated metadata, configurations, and capabilities.

connor4312 · 3 months ago
At VS Code, we've been collaborating on this and plan to ship initial support for registries in our next release.
SafeDusk · 3 months ago
Instead of connecting to a server with 1000s of tools, I'm going the opposite direction and claiming that you only need <10 sharp tools/small functions for most use cases.

As an example, today I re-implemented Google's AlphaEvolve with <7 tools (https://toolkami.com/alphaevolve-toolkami-style/).

joshstrange · 3 months ago
100% this.

Next steps are auto-generate or auto-mashup tools (a couple of projects are doing this) and small, reusable agents that only have access to the handful of tools they need.

“Auto-mashup” is a term I just made up for chaining existing tools with a bit of logic, so that instead of round-tripping to the LLM for common cases you can collapse “get the load, and the last N log lines, and procstat the top 10 procs, …” into a single “check_server_status” tool. It’s similar to systems that let the LLM write and reuse tools, just leveraging other/existing MCP tools. Maybe “auto-composition” is a better name.
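A minimal sketch of that composition idea in Python (the tool names and return values are invented stand-ins; in practice each step would be a call through an MCP client session):

```python
from typing import Any, Callable, Dict

ToolFn = Callable[..., Any]

def compose(name: str, steps: Dict[str, ToolFn]) -> ToolFn:
    """Mash up several fine-grained tools into one composite tool, so a common
    multi-step check costs one tool call instead of several LLM round trips."""
    def composite(**kwargs: Any) -> Dict[str, Any]:
        return {step: fn(**kwargs) for step, fn in steps.items()}
    composite.__name__ = name
    return composite

# Invented stand-ins for existing MCP tools.
def get_load(host: str) -> float:
    return 0.42

def tail_logs(host: str, n: int = 3) -> list:
    return [f"{host} log line {i}" for i in range(n)]

check_server_status = compose("check_server_status", {
    "load": get_load,
    "logs": tail_logs,
})
print(check_server_status(host="web-1"))
```

The same stitching could plausibly be generated automatically by inspecting a server's tool list and bundling calls that commonly co-occur, which is the "auto" part of the idea.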

jappgar · 3 months ago
Is this like 10 years ago when you could find a Directory of GraphQL Servers?

Seems silly in retrospect, no?

Too · 3 months ago
The difference is that GraphQL requires explicit integration with every single API. With MCP you just add the endpoint (close your eyes to the security issues) and voilà: several new capabilities you can talk to using human language.
malablaster · 3 months ago
I agree. There’s no need to centralize this list.
slimslenders · 3 months ago
Community MCP servers available as Docker images are also being listed here https://hub.docker.com/catalogs/mcp
jillesvangurp · 3 months ago
There's a big market opportunity here. Countless SaaS solutions are currently trying to figure out how to deal with this new AI thing. If a product has some kind of API, creating an MCP server for it isn't technically hard; you can probably generate one with an LLM. It's so easy that you wonder why this is a thing at all. Let the LLMs sort it out; this is low-level plumbing that shouldn't require my brain to work hard.

What is hard is integrating across SaaS solutions that haven't done this yet, in a way that is secure and easy. Most MCP offerings so far expose things of very low value. All the high-value stuff is locked up behind APIs, authorization, secure networking (i.e. not publicly accessible), etc.

Bridging that stuff is going to generate a lot of work in the next few years and more importantly, companies are going to spend large amounts of money on this because it can deliver a lot of value to them.

People who believe this is going to be a done deal in six months are dreaming. It's more like ten years. But that just means there is good money to be made by people who can do this stuff and navigate the decades of byzantine digital cruft in the corporate world. You can already see the usual suspects (the big consultancies) sniffing around the topic. There will be lots of such companies doing brisk business by the end of this year.

buremba · 3 months ago
> People that believe that this is going to be a done deal in six months are dreaming. It's more like ten years.

You might be underestimating how fast the current ETL/integration companies can pivot to providing reliable MCP servers, as the lift is pretty small.