
Show HN: AI Chat Adapter

1 point by kirth_gersen, 9 months ago
This Python module lets you get chat responses from multiple LLM API backends more easily by providing one class that acts as the interface for all the supported backends. It can handle both local and remote LLMs. So far it supports OpenAI and Anthropic for remote APIs, and Ollama and LM Studio for local LLMs.

I made this to experiment with having multiple LLMs chat with each other, but realized it might be more generally useful to others as well. For example, it would make it trivial to switch between API providers for a critical service during a provider outage, or to do comparison testing across multiple LLMs.

There is still a lot more I want to do: add support for Groq and other APIs, add support for chat streams, improve tool call support, and more. Your feedback is welcome, and if you can think of more cool use cases I'll add them to the README.
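The core idea described above, one class fronting several interchangeable chat backends, is the classic adapter pattern. Here is a minimal sketch of how such a unified interface might look; all names (`ChatBackend`, `ChatAdapter`, `EchoBackend`) are illustrative assumptions, not the actual module's API, and a stand-in echo backend replaces real network calls:

```python
from abc import ABC, abstractmethod


class ChatBackend(ABC):
    """Common interface every LLM backend adapter implements.

    Real subclasses would wrap OpenAI, Anthropic, Ollama, etc.
    """

    @abstractmethod
    def chat(self, messages: list[dict]) -> str:
        """Take a list of {"role": ..., "content": ...} messages, return a reply."""


class EchoBackend(ChatBackend):
    """Hypothetical stand-in backend: echoes the last user message.

    Useful for tests and for illustrating the interface without API keys.
    """

    def chat(self, messages: list[dict]) -> str:
        return messages[-1]["content"]


class ChatAdapter:
    """Single entry point that routes chat calls to a registered backend by name."""

    _registry: dict[str, type[ChatBackend]] = {}

    @classmethod
    def register(cls, name: str, backend_cls: type[ChatBackend]) -> None:
        cls._registry[name] = backend_cls

    def __init__(self, backend: str, **backend_kwargs):
        # Swapping providers is just a different `backend` string here,
        # which is what makes outage failover or A/B comparison trivial.
        self._backend = self._registry[backend](**backend_kwargs)

    def chat(self, messages: list[dict]) -> str:
        return self._backend.chat(messages)


ChatAdapter.register("echo", EchoBackend)

adapter = ChatAdapter(backend="echo")
reply = adapter.chat([{"role": "user", "content": "hello"}])
```

Under this design, failing over during an outage is a one-line change (`ChatAdapter(backend="anthropic")` instead of `"openai"`), and comparison testing is a loop over backend names with the same message list.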

no comments