
LM Studio 0.3 – Discover, download, and run local LLMs

241 points by fdb, 9 months ago

21 comments

yags, 9 months ago
Hello Hacker News, Yagil here — founder and original creator of LM Studio (now built by a team of 6!). I had the initial idea to build LM Studio after seeing the OG LLaMa weights 'leak' (https://github.com/meta-llama/llama/pull/73/files) and then later trying to run some TheBloke quants during the heady early days of ggerganov/llama.cpp. In my notes LM Studio was first "Napster for LLMs", which evolved later to "GarageBand for LLMs".

What LM Studio is today is an IDE / explorer for local LLMs, with a focus on format universality (e.g. GGUF) and data portability (you can go to the file explorer and edit everything). The main aim is to give you an accessible way to work with LLMs and make them useful for your purposes.

Folks point out that the product is not open source. However, I think we facilitate distribution and usage of openly available AI and empower many people to partake in it, while protecting (in my mind) the business viability of the company. LM Studio is free for personal experimentation, and we ask businesses to get in touch to buy a business license.

At the end of the day, LM Studio is intended to be an easy yet powerful tool for doing things with AI without giving up personal sovereignty over your data. Our computers are super capable machines, and everything that can happen locally w/o the internet, should. The app has no telemetry whatsoever (you're welcome to monitor network connections yourself) and it can operate offline after you download or sideload some models.

0.3.0 is a huge release for us. We added (naïve) RAG, internationalization, UI themes, and set up foundations for major releases to come. Everything underneath the UI layer is now built using our SDK, which is open source (Apache 2.0): https://github.com/lmstudio-ai/lmstudio.js. Check out specifics under packages/.

Cheers!

-Yagil
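For readers unfamiliar with the term, "naïve RAG" as mentioned above usually means retrieving the document chunks most similar to the query and stuffing them into the prompt, with no reranking or query rewriting. The sketch below is illustrative only and is not LM Studio's actual implementation (which isn't public); it uses bag-of-words cosine similarity purely to keep the example self-contained.

```python
import math
from collections import Counter

def bow_vector(text):
    """Lowercased bag-of-words term counts for a piece of text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Return the k document chunks most similar to the query."""
    q = bow_vector(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, bow_vector(c)), reverse=True)
    return ranked[:k]

def build_prompt(query, chunks, k=2):
    """Stuff the top-k retrieved chunks into the context ahead of the question."""
    context = "\n---\n".join(retrieve(query, chunks, k))
    return f"Use the context to answer.\n\nContext:\n{context}\n\nQuestion: {query}"
```

A real pipeline would swap the bag-of-words vectors for embeddings from a model, but the retrieve-then-stuff structure is the same.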
pcf, 9 months ago
In some brief testing, I discovered that the same models (Llama 3 7B and one more I can't remember) are running MUCH slower in LM Studio than in Ollama on my MacBook Air M1 2020.

Has anyone found the same thing, or was that a fluke and I should try LM Studio again?
smcleod, 9 months ago
Nice, it’s a solid product! It’s just a shame it’s not open source and its license doesn’t permit work use.
mythz, 9 months ago
Originally started out with LM Studio, which was pretty nice, but ended up switching to Ollama since I only want to use one app to manage all the large model downloads, and there are many more tools and plugins that integrate with Ollama, e.g. in IDEs and text editors.
xeromal, 9 months ago
I never could get anything local working a few years ago, and someone on Reddit told me about LM Studio and I finally managed to "run an AI" on my machine. Really cool, and now I'm tinkering with it using the built-in HTTP server.
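The built-in server mentioned here speaks the OpenAI chat-completions wire format, so it can be called with nothing but the standard library. A minimal sketch follows; the port (1234 is the app's usual default) and the placeholder model name are assumptions — check the Server tab in the app for your actual values.

```python
import json
import urllib.request

# Assumed default for LM Studio's local server; adjust to your setup.
BASE_URL = "http://localhost:1234/v1"

def chat_payload(prompt, model="local-model", temperature=0.7):
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def chat(prompt, **kwargs):
    """POST the prompt to the local server and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(chat_payload(prompt, **kwargs)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Calling `chat("Say hello")` requires the server to actually be running in the app; the payload builder works standalone.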
pornlover, 9 months ago
LM Studio is great, although I wish recommended prompts were part of the data of each LLM. I probably just don't know enough, but I feel like I get a hunk of magic data and then I'm mostly on my own.

Similarly with images: LLMs and ML in general feel like the DOS, config.sys, autoexec.bat, and QEMM days.
TeMPOraL, 9 months ago
Does anyone know if there's a changelog / release notes available for *all* historical versions of this? This is one of those programs with the annoying habit of surfacing only the list of changes in the most recent version, and their release cadence is such that there are some 3 to 5 updates between the times I run it, and then I have no idea what changed.
swalsh, 9 months ago
I LOVE LM Studio. It's super convenient for testing model capabilities, and the OpenAI server makes it really easy to spin up a server and test. My typical process is to load a model up in LM Studio, test it, and when I'm happy with the settings, move to vLLM.
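The prototype-then-promote workflow described above works because both tools expose OpenAI-compatible endpoints, so moving between them is largely a base-URL swap. A small sketch, with the caveat that the ports below are just the commonly used defaults (1234 for LM Studio's server, 8000 for vLLM's) and may differ on your machine:

```python
# Commonly used default endpoints; both are assumptions to verify locally.
BACKENDS = {
    "lmstudio": "http://localhost:1234/v1",
    "vllm": "http://localhost:8000/v1",
}

def endpoint(backend, path="chat/completions"):
    """Resolve the full URL for a given backend and OpenAI-style API path."""
    return f"{BACKENDS[backend].rstrip('/')}/{path}"
```

Any OpenAI-style client can then be pointed at `endpoint("lmstudio")` during experimentation and `endpoint("vllm")` in serving, with the request bodies unchanged.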
qwertox, 9 months ago
Yesterday I wanted to find a snippet from a ChatGPT conversation I had maybe 1 or 2 weeks ago. Searching for a single keyword would have been enough to find it.

How is it possible that there's still no way to search through your conversations?
mark_l_watson, 9 months ago
Question for everyone: I am using the MLX version of Flux to generate really good images from text on my M2 Mac, but I don't have an easy setup for doing text + base image to a new image. I want to be able to use base images of my family and put them on Mount Everest, etc.

Does anyone have a recommendation?

For context: I have almost ten years of experience with deep learning, but I want something easy to set up on my home M2 Mac, or Google Colab would be OK.
fallinditch, 9 months ago
Does anyone know what advantages LM Studio has over Ollama, and vice versa?
webprofusion, 9 months ago
Cool, but it's a bit weird that the Windows download is 32-bit; it should be 64-bit by default, and there's no need for a 32-bit Windows version at all.
IronWolve, 9 months ago
Been using LM Studio for months on Windows. It's so easy to use: simple install, just search for the LLM from Hugging Face and it downloads and just works. I don't need to set up a Python environment in conda; it's way easier for people to play and enjoy. It's what I recommend to people who want to start enjoying LLMs without the hassle.
dgreensp, 9 months ago
I filed a GitHub issue two weeks ago about a bug that was enough for me to put it down for a bit, and there hasn't even been a response. Their development velocity seems incredible, though. I'm not sure what to make of it.
alok-g, 9 months ago
See also: Msty.app

It allows both local and cloud models.

* Not associated with them in any way. Am a happy user.
2browser, 9 months ago
Running this on Windows on an AMD card. Llama 3.1 Instruct 7B runs really well on this if anyone wants to try.
BaculumMeumEst, 9 months ago
If you're hopping between these products instead of learning and understanding how inference works under the hood, and familiarizing yourself with the leading open source projects (i.e. llama.cpp), you are doing yourself a great disservice.
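For anyone wondering what "under the hood" means at the sampling stage, here is a deliberately minimal sketch of temperature sampling over a token's logits. It is illustrative only, not llama.cpp's code, which adds top-k/top-p filtering, repetition penalties, and much more:

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Turn raw logits into probabilities; lower temperature sharpens the
    distribution toward the highest-logit token."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                        # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(logits, temperature=1.0, rng=random):
    """Draw one token index from the softmax distribution via inverse CDF."""
    probs = softmax(logits, temperature)
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r < cum:
            return i
    return len(probs) - 1                  # guard against float rounding
```

Run the same logits through `softmax` at temperature 0.1 versus 1.5 and the effect of that single slider in every local-LLM UI becomes obvious.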
Tepix, 9 months ago
Neat! Can I use it with Brave browser's local LLM feature?
a1o, 9 months ago
What are the recommended system settings for this?
grigio, 9 months ago
Can somebody share benchmarks on AMD Ryzen AI with and without the NPU?
navaed01, 9 months ago
Congrats! I'm a big fan of the existing product, and these are some great updates to make the app even more accessible and powerful.