Local LLM Assistant in Zed Editor

2 points by sumanmichael over 1 year ago

1 comment

sumanmichael over 1 year ago

GitHub issue #4424 for Zed tracks the lack of support for using local large language models (LLMs). In response, I proposed a workaround that integrates local LLMs into Zed. It addresses the need for a non-proprietary, offline alternative to hosted models like ChatGPT, giving users more privacy and control.

To integrate a custom model, I bypassed Zed's limitation of only supporting OpenAI models: I ran the Mistral model from the Ollama library and cloned it so that it appears as "gpt-4-1106-preview". The steps were to pull and run the Mistral model, clone it with Ollama's commands, and update Zed's settings to point at the local API URL of the cloned model. Restarting Zed applied the changes, enabling the local LLM inside Zed's environment.

For more details, refer to the GitHub issue directly: https://github.com/zed-industries/zed/issues/4424#issuecomment-1958886354
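Concretely, the Ollama side of the workaround looks roughly like this — a sketch using Ollama's standard CLI, assuming the `mistral` tag from the Ollama library and Ollama's default port:

    # Pull the Mistral model from the Ollama library
    ollama pull mistral

    # Optional: confirm the model runs locally
    ollama run mistral "Say hello"

    # Clone it under the OpenAI model name Zed expects, so requests
    # for "gpt-4-1106-preview" are served by the local Mistral weights
    ollama cp mistral gpt-4-1106-preview

    # Ensure the Ollama server is running (it listens on
    # http://localhost:11434 by default; the desktop app starts it for you)
    ollama serve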
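On the Zed side, the settings change is a small sketch like the following. The `openai_api_url` key reflects Zed's assistant settings as of early 2024 and is an assumption here; newer Zed releases moved to a provider-based schema:

    // settings.json (open via Zed's "zed: open settings" action)
    {
      "assistant": {
        // Assumed key: point Zed's OpenAI client at Ollama's
        // OpenAI-compatible endpoint instead of api.openai.com
        "openai_api_url": "http://localhost:11434/v1"
      }
    }

After restarting Zed, selecting "gpt-4-1106-preview" in the assistant panel should route completions to the local Mistral clone rather than to OpenAI.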
The GitHub issue #4424 for Zed relates to the lack of a feature for using local large language models (LLMs). In response to this, I proposed a workaround that enables the integration of local LLMs into Zed. This solution addresses the need for a non-proprietary, offline alternative to mainstream models like ChatGPT, potentially increasing privacy and control for users.<p>To integrate a custom model in Zed, I bypassed the limitation of only using OpenAI models. I did this by running the Mistral model from the Ollama library and cloning it to appear as &quot;gpt-4-1106-preview.&quot; The steps included pulling and running the Mistral model, then using Ollama&#x27;s commands to clone it. I updated Zed&#x27;s settings to point to the local API URL of the cloned model. Restarting Zed applied these changes, enabling the use of the local LLM within Zed&#x27;s environment.<p>For more details, you can refer to the GitHub issue directly: <a href="https:&#x2F;&#x2F;github.com&#x2F;zed-industries&#x2F;zed&#x2F;issues&#x2F;4424#issuecomment-1958886354">https:&#x2F;&#x2F;github.com&#x2F;zed-industries&#x2F;zed&#x2F;issues&#x2F;4424#issuecomme...</a>