Show HN: MCP-Compatible OpenAI Agents SDK

5 points by saqadri 2 months ago
Hey HN,<p>OpenAI released the Agents SDK yesterday, which is great because of its simplicity. I just added MCP support for it, which is currently available as a fork here: <a href="https:&#x2F;&#x2F;github.com&#x2F;lastmile-ai&#x2F;openai-agents-mcp" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;lastmile-ai&#x2F;openai-agents-mcp</a> (and on pypi as the openai-agents-mcp package).<p>You can specify the names of MCP servers to give an Agent access to by setting its `mcp_servers` property.<p>The Agent will then automatically aggregate tools from the MCP servers, as well as any `tools` specified, and create a single extended list of tools. This means you can seamlessly use MCP servers, local tools, OpenAI-hosted tools, and other kinds of Agent SDK tools through a single unified syntax -- and have them interact in the same Agent run loop!<p>Everything else stays exactly the same.<p>```<p>agent = Agent( name=&quot;MCP Assistant&quot;,<p><pre><code> instructions=&quot;You are a helpful assistant with access to MCP tools.&quot;, tools=[your_other_tools], # Regular tool use for Agent SDK mcp_servers=[&quot;fetch&quot;, &quot;filesystem&quot;] # Names of MCP servers from your config file (see below)</code></pre> )<p>```<p>The servers are configured in an `mcp_agent.config.yaml` file, very similar to how they are configured for Claude Desktop:<p>```<p>$schema: &quot;<a href="https:&#x2F;&#x2F;raw.githubusercontent.com&#x2F;lastmile-ai&#x2F;mcp-agent&#x2F;main&#x2F;schema&#x2F;mcp-agent.config.schema.json" rel="nofollow">https:&#x2F;&#x2F;raw.githubusercontent.com&#x2F;lastmile-ai&#x2F;mcp-agent&#x2F;main...</a>&quot;<p>mcp:<p><pre><code> servers: fetch: command: &quot;uvx&quot; args: [&quot;mcp-server-fetch&quot;] filesystem: command: &quot;npx&quot; args: [&quot;-y&quot;, &quot;@modelcontextprotocol&#x2F;server-filesystem&quot;, &quot;.&quot;] slack: command: &quot;npx&quot; args: [&quot;-y&quot;, &quot;@modelcontextprotocol&#x2F;server-slack&quot;] </code></pre> ```<p>I have submitted an issue and PR into the openai-agents-python repo [2], and my plan is instead of a fork, I will create an extension package for MCP support (coming later today).<p>I was able to do this pretty quickly (got it working yesterday) because I&#x27;ve been building the mcp-agent library (<a href="https:&#x2F;&#x2F;github.com&#x2F;lastmile-ai&#x2F;mcp-agent" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;lastmile-ai&#x2F;mcp-agent</a>), which makes MCP server aggregation&#x2F;connection really easy. I did a Show HN about it a few weeks ago [3].<p>Wanted to share here to get community feedback on whether this is useful, which will help me decide if I should dedicate more time to it.<p>[1] - <a href="https:&#x2F;&#x2F;github.com&#x2F;lastmile-ai&#x2F;mcp-agent" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;lastmile-ai&#x2F;mcp-agent</a><p>[2] - <a href="https:&#x2F;&#x2F;github.com&#x2F;openai&#x2F;openai-agents-python&#x2F;issues&#x2F;23" rel="nofollow">https:&#x2F;&#x2F;github.com&#x2F;openai&#x2F;openai-agents-python&#x2F;issues&#x2F;23</a><p>[3] - <a href="https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=42867050">https:&#x2F;&#x2F;news.ycombinator.com&#x2F;item?id=42867050</a>

1 comment

juan-abia 2 months ago
Do you think there is a chance that OpenAI will accept this change? It would be awesome if this became part of the library.