
Ask HN: How do you deploy Python-based LangChain or LlamaIndex code on prod?

2 points by pinglin, 9 months ago

1 comment

efriis, 9 months ago
Erick from LangChain here - would recommend either deploying with LangGraph Cloud or anywhere that deploys a FastAPI app easily.

LangGraph Cloud: https://langchain-ai.github.io/langgraph/cloud/
Example (chat-langchain): https://github.com/langchain-ai/chat-langchain?tab=readme-ov-file

FastAPI example with LangServe (old chat-langchain implementation): https://github.com/langchain-ai/chat-langchain/blob/langserve/backend/main.py

Or just use regular FastAPI endpoints instead of langserve.add_routes with a FastAPI server: https://fastapi.tiangolo.com/
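
For the last option (plain FastAPI endpoints rather than langserve.add_routes), a minimal sketch might look like the following. This is an illustration, not the chat-langchain code: the prompt, model name, and /chat route are placeholder assumptions, and it presumes fastapi, uvicorn, and langchain-openai are installed with OPENAI_API_KEY set.

```python
# Minimal sketch: expose a LangChain runnable behind a plain FastAPI endpoint.
# Placeholder prompt/model/route; assumes OPENAI_API_KEY is set in the environment.
from fastapi import FastAPI
from pydantic import BaseModel
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

app = FastAPI()

# Build the chain once at import time so every request reuses the same objects.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("human", "{question}"),
])
chain = prompt | ChatOpenAI(model="gpt-4o-mini")

class ChatRequest(BaseModel):
    question: str

@app.post("/chat")
async def chat(req: ChatRequest) -> dict:
    # ainvoke keeps the endpoint non-blocking under concurrent requests.
    result = await chain.ainvoke({"question": req.question})
    return {"answer": result.content}

# Run locally or in a container with: uvicorn main:app --host 0.0.0.0 --port 8000
```

Anything that can host a FastAPI/uvicorn process (a container platform, a VM, etc.) can then serve this, which is the point Erick is making.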