TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.


Launch Lamini: The LLM Engine for Rapidly Customizing Models as Good as ChatGPT

123 points by sharonzhou about 2 years ago

13 comments

primordialsoup about 2 years ago

Congrats! I went to your demo and asked for words that end in agi. This is what I got:

--

agi, agi, agi, agi, agi, agi, agi

These are some of the words that end in agi. You can also use the word agi in a sentence. For example, "I am going to the grocery store to get some agi."

These are some of words that end in agi.

These are some words that end in agi.

maximize, maximize, maximize, maximize, maximize, maximize, maximize, maximize

These are some words that ends in agi

--

So I think this needs more work to get to "as good as ChatGPT". But having said that, congrats on the landing
[Comment #35746348 not loaded]
[Comment #35746307 not loaded]
[Comment #35746553 not loaded]
sharonzhou about 2 years ago

Hi HN!

I'm super excited to announce Lamini, the LLM engine that gives every developer the superpowers that took the world from GPT-3 to ChatGPT!

I've seen a lot of developers get stuck after prompt-tuning for a couple of days, or after fine-tuning an LLM and it just gets worse: there's no good way to debug it. I have a PhD in AI from Stanford, and don't think anyone should need one to build an LLM as good as ChatGPT. A world full of LLMs as different & diverse as people would be even more creative, productive, and inspiring.

That's why I'm building Lamini, the LLM engine for developers to rapidly customize models, starting from amazing foundation models from a ton of institutions: OpenAI, EleutherAI, Cerebras, Databricks, HuggingFace, Meta, and more.

Here's our blog announcing us and a few special open-source features: https://lamini.ai/blog/introducing-lamini

Here's what Lamini does for you:

- Your LLM outperforms general-purpose models on your specific use case
- You own the model, weights and all, not us (if the foundation model allows it, of course!)
- Your data helps the LLM, and builds you an AI moat
- Any developer can do it today in just a few lines of code
- Commercial-use-friendly with a CC-BY license

We're also releasing several tools on GitHub. Today, you can try out our hosted data generator for training your own LLMs, weights and all, without spinning up any GPUs, in just a few lines of code from the Lamini library: https://github.com/lamini-ai/lamini/

You can play with an open-source LLM, trained on generated data using Lamini: https://huggingface.co/spaces/lamini/instruct-playground

Sign up for early access to the training module that took the generated data and trained it into this LLM, including enterprise features like virtual private cloud (VPC) deployments: https://lamini.ai/contact
[Comment #35747641 not loaded]
[Comment #35747142 not loaded]
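The "hosted data generator" in the announcement bootstraps training data before any fine-tuning happens. As a rough, dependency-free illustration of that kind of instruction-data-generation loop (this is NOT the Lamini API; `ask_llm`, `generate_pairs`, and the seed format are names invented for this sketch):

```python
import random

# A couple of hand-written seed examples; real pipelines start from a
# similarly small set and expand it with a foundation model.
SEEDS = [
    {"instruction": "Summarize this changelog entry.",
     "response": "Adds streaming support and fixes two memory leaks."},
    {"instruction": "Explain what a vector database is.",
     "response": "A store optimized for similarity search over embeddings."},
]

def generate_pairs(seeds, ask_llm, n):
    """Bootstrap n new (instruction, response) pairs from seed examples.

    ask_llm is any callable that sends a prompt to a foundation model and
    returns its text completion; the generated pairs become fine-tuning data.
    """
    pairs = []
    for _ in range(n):
        seed = random.choice(seeds)
        # Ask the model for a fresh instruction in the style of a seed...
        new_instruction = ask_llm(
            "Write one new instruction in the style of: " + seed["instruction"])
        # ...then ask it to answer that instruction, yielding a training pair.
        pairs.append({"instruction": new_instruction,
                      "response": ask_llm(new_instruction)})
    return pairs

# With a stub in place of a real model, the loop produces n well-formed pairs.
fake_llm = lambda prompt: "stub answer for: " + prompt[:20]
data = generate_pairs(SEEDS, fake_llm, n=3)
```

The interesting property is that the model quality only enters through `ask_llm`; the pipeline itself is model-agnostic, which is presumably why it can sit in front of many different foundation models.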
ec109685 about 2 years ago

This headline is totally editorializing. Stick with the source one: "Introducing Lamini, the LLM Engine for Rapidly Customizing Models"

So much clickbait in the LLM space.
[Comment #35750299 not loaded]
[Comment #35750030 not loaded]
iguana about 2 years ago

Trivial examples show that this isn't nearly as good as ChatGPT. The headline should be changed.
[Comment #35753803 not loaded]
furyofantares about 2 years ago

The actual post doesn't say "as Good as ChatGPT", why does the HN title?

I don't really care to click on something I know is obviously lying to me.
batch12 about 2 years ago

I've been playing a bit with stacking transformer adapters to add knowledge to models and so far it has met my needs. It doesn't have the same illusion of intelligence, but so far it's just as good as a multitasking intern, so I am still having fun with it. I wonder if this is basically doing the same thing.
[Comment #35750489 not loaded]
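The adapter stacking batch12 describes can be sketched without any ML framework: each adapter is a small residual bottleneck layered on top of frozen base activations, and "stacking" just composes several of them. A toy, pure-Python illustration (function names, shapes, and weights are made up for this sketch, not any library's API):

```python
def linear(x, w):
    """Multiply vector x (length n) by matrix w (n rows, m columns)."""
    return [sum(xi * wij for xi, wij in zip(x, col)) for col in zip(*w)]

def adapter(x, w_down, w_up):
    """Bottleneck adapter: down-project, ReLU, up-project, add residual.

    With w_up all zeros the adapter is an exact no-op, which is why adapters
    can be attached to a frozen model without disturbing it at initialization.
    """
    h = [max(0.0, v) for v in linear(x, w_down)]
    return [xi + ui for xi, ui in zip(x, linear(h, w_up))]

def stack_adapters(x, adapter_weights):
    """Apply several independently trained adapters in sequence."""
    for w_down, w_up in adapter_weights:
        x = adapter(x, w_down, w_up)
    return x

# Two tiny adapters over a 2-dimensional hidden state: the first is a
# zero-initialized no-op, the second adds relu(x[0]) to the first coordinate.
hidden = [1.0, 2.0]
noop = ([[1.0], [0.0]], [[0.0, 0.0]])   # w_up == 0, so identity
shift = ([[1.0], [0.0]], [[1.0, 0.0]])
out = stack_adapters(hidden, [noop, shift])
```

Because only the small `w_down`/`w_up` matrices are trained per adapter, several can be kept around and mixed per task, which is the appeal of the approach over full fine-tuning.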
eschluntz about 2 years ago

Very exciting! Glad to finally be able to get beyond prompt engineering. What's the pricing model like?
[Comment #35744387 not loaded]
atulika612 about 2 years ago
If I want to export the model and run it myself, can I do that?
mise_en_place about 2 years ago

Why wouldn't we use something like DeepSpeed? It's one-click on Azure. What's the value add?
cultofmetatron about 2 years ago

I hope this turns out to be as good as chatgpt and not "we have chatgpt at home"
[Comment #35749999 not loaded]
digitcatphd about 2 years ago

GPT at this point is more than an LLM; it is a baseline layer of logic built on the underlying transformer technology. This will be challenging to replicate without datasets of the same size.
gdiamos about 2 years ago

Noting that the GitHub repo includes a data pipeline for instruction fine-tuning.

What's the difference between this and other data pipelines like Alpaca?
[Comment #35748188 not loaded]
8thcross about 2 years ago

Looks great... looking forward to trying it out