
Ask HN: What are your tech predictions for 2024?

8 points, by ciccionamente, over 1 year ago

5 comments

tristenharr, over 1 year ago
The year of local-first.

Postgres + SQLite will get married.

Somebody will start a project to rebuild Kafka in Rust, hopefully open-sourced.

People will overdo it putting things on the edge, freak out about the pricing, then scale back, and there will be a backlash because poor implementations put too much data on the edge. This will happen to companies that try to get in on it without understanding why they want/need it.

The idea of pseudo-eliminating loading on the web by using a state machine and pre-caching every state transition within X taps locally will become more popular/widely known.

There will be an AI model trained specifically on propositional logic that will surpass the abilities of any other current model in reasoning and mathematics.

Somebody will come up with a law about the speed of AI's development and correlate it with Moore's law.
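The pre-caching idea above can be sketched as a breadth-first walk over a UI state graph: from the current state, collect every state reachable within X taps and prefetch it ahead of time. A minimal sketch, assuming a hypothetical app whose screens and transitions are hard-coded here; the actual prefetch step (fetching data or assets for each state) is left out:

```typescript
// Map each UI state to the states reachable from it in one tap.
type StateGraph = Record<string, string[]>;

// Collect every state reachable from `start` within `maxTaps` transitions
// (breadth-first), so those states can be pre-cached before the user taps.
function statesWithinTaps(graph: StateGraph, start: string, maxTaps: number): Set<string> {
  const seen = new Set<string>([start]);
  let frontier = [start];
  for (let tap = 0; tap < maxTaps; tap++) {
    const next: string[] = [];
    for (const state of frontier) {
      for (const target of graph[state] ?? []) {
        if (!seen.has(target)) {
          seen.add(target);
          next.push(target);
        }
      }
    }
    frontier = next;
  }
  return seen;
}

// Hypothetical state graph: from the feed, everything within 2 taps
// (story, profile, comments) would be pre-cached; "reply" is 3 taps away.
const graph: StateGraph = {
  feed: ["story", "profile"],
  story: ["comments"],
  comments: ["reply"],
  profile: [],
};
const toPrefetch = statesWithinTaps(graph, "feed", 2);
```

The trade-off the comment warns about shows up directly here: increasing `maxTaps` grows the prefetch set roughly exponentially with the graph's branching factor, which is how "too much data on the edge" happens.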
Flux159, over 1 year ago
Local AI models (LLMs, diffusion models) will take off in popularity, as their performance will be good enough for many use cases without having to use a server. This would also let a lot of startups leverage client-side computation without having to host models on expensive GPU servers.
I_am_tiberius, over 1 year ago
People will finally realize that OpenAI is a privacy nightmare.
Comment #38756056 not loaded

Comment #38755138 not loaded
ExploreColorado, over 1 year ago
My only prediction is that we start encountering privacy issues related to training models, whether it's how user data and PII are used, or accidental data leaks.
spansoa, over 1 year ago
More on-device AI. Shift from off-prem compute to on-prem. Open source LLMs run locally, no more black boxes.