TechEcho

Ask HN: iPad Pro as an LLM backend?

2 points by cal85 about 1 year ago
I’ve seen iOS apps that let you download open-source LLMs like Llama/Mistral and run them locally. But is there any app/solution that would let me use an iPad as an inference backend from another computer on my LAN?

I’m curious to see whether it might be worth getting the new iPad Pro M4, which I’m guessing should be pretty fast at inference, but it’s obviously a very locked-down system, so I’m not sure if it’s viable.
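For context, local-inference servers on other platforms (llama.cpp's server mode, Ollama, LM Studio, etc.) commonly expose an OpenAI-compatible HTTP endpoint, so "using it from another computer on the LAN" just means POSTing to the device's address. Whether any iPad app exposes such a server is exactly the open question here; the IP, port, and model name below are placeholders:

```python
import json
import urllib.request

# Hypothetical LAN address of the iPad, assuming some app exposes
# an OpenAI-compatible chat-completions server on port 8080.
BASE_URL = "http://192.168.1.50:8080/v1/chat/completions"

def build_request(prompt: str, model: str = "mistral-7b") -> urllib.request.Request:
    """Build (but do not send) a chat-completions request for a LAN endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("Hello from another machine on the LAN")
print(req.full_url)
# Actually sending it would be: urllib.request.urlopen(req).read()
```

Nothing here is iPad-specific; the hard part is finding an iPadOS app that keeps a listening server alive, given the platform's background-execution limits.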

1 comment

talldayo about 1 year ago
You would get better price/performance out of almost literally any new hardware in its price range. There are sub-$1000 Nvidia laptops that would run circles around it purely for backend purposes.

It really does not make sense to pay for a screen and form factor you won't use, though. You could build a $500 headless inference server with a few used 3060s and buy a used iPad Pro with the savings.