科技回声 (Tech Echo)
A tech news platform built with Next.js, providing global tech news and discussion.

© 2025 科技回声. All rights reserved.

Sidekick: Local-first native macOS LLM app

325 points, by volemo, 2 months ago

30 comments

cadamsdotcom, 2 months ago

What a great-looking tool.

Amazingly generous that it's open source. Let's hope the author can keep building it, but if they need to fund their existence there *is* precedent: lots of folks pay for Superwhisper. People pay for quality software.

In a past tech cycle Apple might've hired the author, acquired the IP and lovingly stewarded the work into a bundled OS app. Not something to hope for lately. So I'm just going to hope the app lives for years to come and keeps improving the whole way.
mentalgear, 2 months ago

Looks great, kudos for making it open-source! Yet as with any app that has access to my local file system, what instantly comes to mind is "narrow permissions", i.e. the principle of least privilege.

It'd be great if the app had only read access to my files, not full disk permission.

As an end-user, I'm highly concerned that files might get deleted or data shared via the internet.

So ideally, Sidekick would have only "read" permissions and no internet access. (This really applies to any app with full disk read access.)

Also: why does it say Mac Silicon required? I can run llama.cpp and Ollama on my Intel Mac.
abroadwin, 2 months ago

Neat. It would be nice to have an option to use an API endpoint without downloading an additional local model. I have several models downloaded via Ollama and would prefer to use them without the default model taking up additional space.
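An option like that mostly comes down to speaking an OpenAI-compatible protocol: Ollama already exposes such an endpoint at http://localhost:11434/v1, and llama.cpp's llama-server does the same. A hedged sketch of what a client would send (not Sidekick's actual code; the helper name is made up, and the model name is whatever `ollama list` shows on your machine):

```python
import json
import urllib.request


def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completion request for any OpenAI-compatible server,
    e.g. Ollama's http://localhost:11434/v1 or llama.cpp's llama-server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        base_url.rstrip("/") + "/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("http://localhost:11434/v1", "llama3", "Hello!")
# urllib.request.urlopen(req) would actually send it; skipped here so the
# sketch stays offline.
print(req.full_url)
```

An app that accepts a base URL and model name in this shape can reuse whatever is already downloaded, instead of bundling its own default model.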
pzo, 2 months ago

Some other alternatives (a little more mature / feature-rich):

AnythingLLM: https://github.com/Mintplex-Labs/anything-llm

Open WebUI: https://github.com/open-webui/open-webui

LM Studio: https://lmstudio.ai/
unshavedyak, 2 months ago

Looks super neat!

Somewhat related, one issue I have with projects like these is that everyone seems to bundle the UX/app with the core ... pardon my ignorance, "LLM app interface". E.g. we have a lot of abstractions for the LLMs themselves, such as llama.cpp, but it feels like we lack abstractions for things like what Claude Code does, or perhaps this RAG implementation, or whatever.

I.e. these days it seems like a lot of the magic in a quality implementation is built on top of a good LLM. A secondary layer which is just as important as the LLM itself. The prompt engineering, etc.

Are there any attempts to generalize this? Is it even possible? Feels like I keep seeing a lot of good ideas which get locked behind an app wall with no ability to swap them out. We've got tons of options to abstract the LLMs themselves, but I've not seen anything which tackles this *(but I've also not been looking)*.

Does it exist? Does this area have a name?
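For what it's worth, the separation the comment asks about can at least be sketched: keep the backend behind a small interface and write the "app layer" (prompt assembly, context injection) against that interface only. A minimal illustration, with all names hypothetical and a stub standing in for a real llama.cpp or HTTP backend:

```python
from typing import Protocol


class TextGenerator(Protocol):
    """Anything that can complete a prompt: llama.cpp, Ollama, a cloud API."""

    def generate(self, prompt: str) -> str: ...


def answer_with_context(llm: TextGenerator, question: str, documents: list[str]) -> str:
    """The swappable 'app layer': prompt assembly and context injection,
    written against the protocol rather than any concrete backend."""
    context = "\n---\n".join(documents)
    prompt = (
        "Use only the context below to answer.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return llm.generate(prompt)


class StubBackend:
    """Stand-in backend so the sketch runs offline."""

    def generate(self, prompt: str) -> str:
        return f"stub answer ({len(prompt)} prompt chars)"


print(answer_with_context(StubBackend(), "What is Sidekick?", ["Sidekick is a macOS LLM app."]))
```

Whether this layer can be generalized across apps is exactly the open question in the comment; the sketch only shows that the two layers need not be welded together.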
atonse, 2 months ago

When I bought my new MBP, I wondered whether to upgrade the memory to 48GB, thinking it would become more likely that I'd run local models over this laptop's next 3-4 year cycle. So I took the leap and upgraded the memory.

Hoping that these kinds of tools will run well in these scenarios.
Isn0gud, 2 months ago

This is not local, but uses the Tavily cloud (https://tavily.com/)?!
rubymamis, 2 months ago

Some interesting features. I'm working on a similar native app with Qt, so it will support Linux, macOS and Windows out of the box. I might open-source it as well.

https://www.get-vox.com
AnonC, 2 months ago

Looks nice, and I greatly appreciate the local-only or local-first mode.

The readme says:

> Give the LLM access to your folders, files and websites with just 1 click, allowing them to reply with context.

…

> Context aware. Aware of your files, folders and content on the web.

Am I right in assuming that this works only with local text files and that it cannot integrate with data sources in Apple's apps such as Notes, Reminders, etc.? It could be a great competitor to Apple Intelligence if it could integrate with apps that primarily store textual information (but unfortunately in their own proprietary data formats on disk, with sandboxing adding another barrier).

Can it use and search PDFs, RTF files and other formats as "experts"?
pvo50555, 2 months ago

What differentiates this from Open WebUI? How did you design the RAG pipeline?

I had a project in the past with hundreds of PDF / HTML files of industry safety and fatality reports, which I was hoping to simply "throw in" and use with Open WebUI, but I found it wasn't effective at this even in RAG mode. I wanted to ask questions like "How many fatalities occurred in 2020 that involved heavy machinery?", but it wasn't able to provide such broad aggregate data.
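A likely reason for that failure: top-k retrieval only ever hands the model a few chunks, so corpus-wide aggregates never fit in context. One common workaround is an extraction pass over *every* document, then ordinary aggregation over the structured rows. A toy sketch, with a regex standing in for the per-document LLM call (all names hypothetical):

```python
import re


def extract_record(report_text: str) -> dict:
    """Toy per-document extraction; a real pipeline would prompt the LLM
    once per report to emit structured fields (year, cause, count)."""
    year = re.search(r"\b(?:19|20)\d{2}\b", report_text)
    machinery = bool(re.search(r"heavy machinery", report_text, re.I))
    return {"year": int(year.group()) if year else None, "machinery": machinery}


def count_incidents(reports: list[str], year: int) -> int:
    """Aggregate over every report, not just the retrieved top-k chunks."""
    records = [extract_record(r) for r in reports]
    return sum(1 for rec in records if rec["year"] == year and rec["machinery"])


reports = [
    "2020 incident: worker struck by heavy machinery.",
    "Fall from height, 2020, no machinery involved.",
    "2019 report: heavy machinery rollover.",
]
print(count_incidents(reports, 2020))  # 1
```

The extraction pass is linear in corpus size, which is why chat-style RAG tools rarely do it by default; but it is what makes "how many X in year Y" answerable at all.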
nebulous1, 2 months ago

The name gave me a flashback to Borland Sidekick.
john_alan, 2 months ago

Pretty slick. I've been using Ollama + https://github.com/kevinhermawan/Ollamac; not sure this provides much extra benefit. Still, love to see it.
aa-jv, 2 months ago

Trying to put this through its paces, I first set out to build my own local binary (because why not, and also because code-reading is fun when you've got your own local build)...

But I only get this far:

/Users/aa-jv/Development/InterestingProjects/Sidekick/Sidekick/Logic/View Controllers/Tools/Slide Studio/Resources/bin/marp: No such file or directory

It seems a hand-built binary resource is missing from the repo. Did anyone else do a build yet and get past this step?
typeiierror, 2 months ago

I've been looking for something like this to query / interface with the mountain of home-appliance manuals I've hung onto as PDFs. The use case: instead of having to fish out and read a manual once something breaks, I can just chat with the corpus to quickly find what I need to fix it. Will give it a shot!
angst_ridden, about 2 months ago

Any idea how long Experts should take to import/index data? I pointed an expert at a big directory of source files on an M4 iMac with 32GB RAM, and it pinned a CPU at 100% for 24 hours without finishing.

A single file seems to finish quickly, but folders (even with just a few files) seem to be very slow.
AutoAPI, 2 months ago

An option to use a local LLM on the network without needing to download the 2GB "default model" would be great.
delijati, 2 months ago

Does anyone know if there is something like this, or https://github.com/kevinhermawan/Ollamac, for Linux? Both are built with Swift, and Swift also supports Linux!?
toomuchtodo, 2 months ago

Great work! Please consider a plugin mode to support integrating with Dropbox and S3-compatible targets, where users might be storing large amounts of data off-device (but still device-accessible), as well as email providers via IMAP/JMAP.
sickmartian, 2 months ago

Very cool, trying it out. I'm unable to make it do a search though; on the experts it says it's deactivated in the settings, but I couldn't find a setting for it. Maybe it's model-dependent and the default model can't do it?
webprofusion, 2 months ago

Nice, just needs a computer/browser-use mode and a thinking/agent mode. E.g. "Test this web app for me. Try creating a new account and starting a new order," etc.
gcanyon, about 2 months ago

Does it actually highlight the cites when opening cited docs? Or were the highlights in the screenshot just there by chance?
user99999999, 2 months ago

Looking forward to when there will be a broad LLM API accessible in the browser via JS.
sansieation, 2 months ago

Why no MLX?
thomasfl, 2 months ago

This needs 164 MB of disk space. Not too bad. Thank you to the author for this.
dev213, 2 months ago

Looks like an awesome tool! I just found it funny that in the code-interpreter demo, JavaScript is used to evaluate mathematical problems (especially the float comparison).
oigursh, about 2 months ago

Could this help categorize and prune backups?
whoitsjustme, 2 months ago

Does it support MCP?
MichaelTheGeek, 2 months ago

Very nice.
nottorp, 2 months ago

> Image generation is available on macOS 15.2 or above, and requires Apple Intelligence.

... so image generation is not fully offline?

This tool looks like it could be worth a try for me, but only if I'm sure I can run it in a mode that's fully offline.
0xferruccio, 2 months ago

Really cool! I hope they'll roll out MCP support so that we can add support for it in our MCP app store (https://github.com/fleuristes/fleur).

Right now only code editors and Claude support MCPs, but we'd love to see more clients like Sidekick.