Erick from LangChain here - I'd recommend either deploying with LangGraph Cloud or anywhere that can easily run a FastAPI app.<p>LangGraph Cloud: <a href="https://langchain-ai.github.io/langgraph/cloud/" rel="nofollow">https://langchain-ai.github.io/langgraph/cloud/</a>
Example (chat-langchain): <a href="https://github.com/langchain-ai/chat-langchain?tab=readme-ov-file">https://github.com/langchain-ai/chat-langchain?tab=readme-ov...</a><p>FastAPI example with LangServe (old chat-langchain implementation): <a href="https://github.com/langchain-ai/chat-langchain/blob/langserve/backend/main.py">https://github.com/langchain-ai/chat-langchain/blob/langserv...</a><p>Or just use regular FastAPI endpoints with a FastAPI server instead of langserve.add_routes: <a href="https://fastapi.tiangolo.com/" rel="nofollow">https://fastapi.tiangolo.com/</a>