From no-code to co-code

1 point by paulfitz over 1 year ago

1 comment

paulfitz over 1 year ago
The exact LLM used in the experiment mentioned in this post was upstage-llama-2-70b-instruct-v2.ggmlv3.q2_K. Grist was configured to use it via llama-cpp-python and https://github.com/gristlabs/grist-core#ai-formula-assistant-related-variables-all-optional
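
For readers curious what the llama-cpp-python side of such a setup can look like, here is a minimal sketch of loading a local quantized model and asking it for a spreadsheet formula. The model filename, prompt, and parameters are illustrative, not Grist's actual wiring, which is configured through the environment variables documented at the link above.

    # Hypothetical sketch: load a local quantized model with llama-cpp-python
    # and request a formula suggestion. The ggmlv3 format requires an older
    # llama-cpp-python release (newer versions expect GGUF files).
    from llama_cpp import Llama

    llm = Llama(
        model_path="upstage-llama-2-70b-instruct-v2.ggmlv3.q2_K.bin",  # local model file (illustrative path)
        n_ctx=2048,  # context window size
    )

    response = llm.create_chat_completion(
        messages=[
            {"role": "system", "content": "You write Python formulas for spreadsheet columns."},
            {"role": "user", "content": "Write a formula for a Total column that multiplies Price by Quantity."},
        ],
        max_tokens=128,
    )

    print(response["choices"][0]["message"]["content"])

A setup like this can also expose an OpenAI-compatible HTTP endpoint (llama-cpp-python ships a server module), which is the kind of endpoint the Grist assistant variables in the linked README are designed to point at.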