Quick Primer on MCP Using Ollama and LangChain

131 points by bswamina, about 1 month ago

7 comments

minimaxir, about 1 month ago
As a primer on MCPs, this post is indeed quick. But from a coding standpoint, and despite the marketing claim that Agent/MCP development simplifies generative LLM workflows, it's such a long coding mess that it's hard to tell whether it's even worth it. At a low level it's still the ReAct paradigm, and if you couldn't find a use case for tools before, nothing has changed, other than the Agent/MCP hype making things more confusing and handing more ammunition to AI detractors.
Comment #43682244 (not loaded)
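
Stripped of the framework layers, the pattern minimaxir refers to reduces to a short loop: the model proposes tool calls, the caller executes them, and the results are fed back for a final answer. A minimal sketch of that loop, assuming the langchain-ollama and langchain-core packages and a locally pulled llama3.1 model; the tool and model names here are illustrative, not taken from the article:

```python
# Minimal hand-rolled tool-call loop (sketch only; names are illustrative).
from langchain_core.messages import HumanMessage, ToolMessage
from langchain_core.tools import tool
from langchain_ollama import ChatOllama

@tool
def get_weather(city: str) -> str:
    """Return a canned weather report for a city (stand-in for a real API)."""
    return f"18 C and cloudy in {city}"

llm = ChatOllama(model="llama3.1").bind_tools([get_weather])

messages = [HumanMessage("What's the weather in Berlin?")]
ai_msg = llm.invoke(messages)          # model decides whether to call the tool
messages.append(ai_msg)

for call in ai_msg.tool_calls:         # each call has "name", "args", "id"
    result = get_weather.invoke(call["args"])
    messages.append(ToolMessage(content=result, tool_call_id=call["id"]))

print(llm.invoke(messages).content)    # final answer grounded in the tool output
```

Agent frameworks and MCP mostly wrap this loop with registration, transport, and retry plumbing; whether that wrapping pays for itself is the question being raised above.
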
gsibble, about 1 month ago
MCP is great when you're integrating tools locally into IDEs and such. It's a terrible standard for building more robust applications with multi-user support: security and authentication are completely lacking.

99% of people wouldn't be able to find the API keys you need to feed into most MCP servers.
Comment #43677493 (not loaded)
Comment #43677102 (not loaded)
Comment #43677065 (not loaded)
bongodongobob, about 1 month ago
Is anyone really still using LangChain? Has it gotten better? It seemed like a token-burning platform the last time I used it.
Comment #43678297 (not loaded)
Comment #43680313 (not loaded)
WD-42, about 1 month ago
If you need to define and write the functions that calculate interest… what exactly is the LLM bringing to the table here? I feel like I'm missing something.
Comment #43677172 (not loaded)
Comment #43677177 (not loaded)
Comment #43680690 (not loaded)
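
To make WD-42's question concrete: the interest calculation in the article is presumably an ordinary hand-written function exposed as a tool, so the model's contribution is limited to turning a natural-language request into arguments for it. A rough sketch of such a tool, assuming the FastMCP helper from the MCP Python SDK; the server name and formula are illustrative, and the article's actual server may differ:

```python
# Sketch of the kind of tool WD-42 describes (assumed names; not the article's code).
# The interest math is plain Python; the LLM only maps a natural-language request
# onto the three arguments.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("finance-tools")

@mcp.tool()
def simple_interest(principal: float, rate_percent: float, years: float) -> float:
    """Return the simple interest earned on `principal` at `rate_percent` per year."""
    return principal * (rate_percent / 100) * years

if __name__ == "__main__":
    mcp.run()  # serve over stdio so an MCP client (e.g. one wired into LangChain) can call it
```

What the model adds is the translation from "what would $1,000 earn at 5% over three years?" into simple_interest(1000, 5, 3) and back into prose.
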
gatienboquet, about 1 month ago
You know it's going to be a great article when the design is from 1995.
gclawes, about 1 month ago
This website design is blessed. A great return to the past.
Comment #43678479 (not loaded)
Comment #43677323 (not loaded)
trebligdivad, about 1 month ago
The units for the free memory are interestingly wrong: 'Executing shell command: free -m' is followed by 'The total system memory is 64222 bytes, with used (available) 8912 bytes.'

Which, given that there seems to be no way to specify any data structure or typing in this MCP interface, is hardly surprising!
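
One way to avoid the mislabelled units trebligdivad describes is to stop handing the model raw command output and instead return structured values whose field names carry the unit, so nothing ambiguous is left to re-label. A sketch under that assumption, using LangChain's @tool decorator; the field names are invented for illustration and are not the article's interface:

```python
# Sketch: report memory via unit-bearing field names instead of raw `free -m` text.
# (Field names and the use of LangChain's @tool are assumptions, not the article's code.)
import subprocess

from langchain_core.tools import tool

@tool
def memory_info() -> dict:
    """Report total, used, and available system memory, explicitly in mebibytes."""
    out = subprocess.run(["free", "-m"], capture_output=True, text=True, check=True)
    mem = next(line for line in out.stdout.splitlines() if line.startswith("Mem:")).split()
    # `free -m` columns after "Mem:": total, used, free, shared, buff/cache, available
    return {"total_mib": int(mem[1]), "used_mib": int(mem[2]), "available_mib": int(mem[6])}
```

The 64222 that `free -m` reports is then unambiguously 64222 MiB rather than "64222 bytes".
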