The year of local-first.<p>Postgres + SQLite will get married.<p>Somebody will start a project to rebuild Kafka in Rust, hopefully open-sourced.<p>People will overdo putting things on the edge, freak out about the pricing, then scale back; there will be a backlash driven by poor implementations that push too much data to the edge, mostly from companies chasing the trend without understanding why they want or need it.<p>The idea of pseudo-eliminating loading on the web (modeling the UI as a state machine and pre-caching every state transition within X taps locally) will become more popular and widely known.<p>There will be an AI model trained specifically on propositional logic that will surpass the abilities of any current model in reasoning and mathematics.<p>Somebody will come up with a law about the speed of AI's development and correlate it with Moore's law.
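The state-machine pre-caching idea above can be sketched roughly like this. Everything here is hypothetical: the transition graph, the `StateMachine` shape, and the `statesToPrecache` helper are illustrative names, not an existing library. The sketch does a breadth-first walk of the UI's transition graph to find every state within `maxTaps` taps, which is the set you would prefetch so navigation never hits a loading screen.

```typescript
// Hypothetical sketch: model the UI as a finite state machine and
// compute every state reachable within `maxTaps` transitions, so its
// data/assets can be pre-cached locally.
type StateId = string;

interface StateMachine {
  // Adjacency list: state -> states reachable in one tap.
  transitions: Record<StateId, StateId[]>;
}

// BFS out to `maxTaps` hops from `current`. The returned states are
// the ones to prefetch; `current` itself is excluded (already loaded).
function statesToPrecache(
  fsm: StateMachine,
  current: StateId,
  maxTaps: number
): StateId[] {
  const seen = new Set<StateId>([current]);
  let frontier: StateId[] = [current];
  for (let tap = 0; tap < maxTaps; tap++) {
    const next: StateId[] = [];
    for (const s of frontier) {
      for (const t of fsm.transitions[s] ?? []) {
        if (!seen.has(t)) {
          seen.add(t);
          next.push(t);
        }
      }
    }
    frontier = next;
  }
  seen.delete(current);
  return [...seen];
}

// Example: a tiny app graph (illustrative state names).
const app: StateMachine = {
  transitions: {
    home: ["feed", "profile"],
    feed: ["post"],
    profile: ["settings"],
    post: [],
    settings: [],
  },
};

// With maxTaps = 2 from "home": feed, profile, post, settings.
console.log(statesToPrecache(app, "home", 2));
```

In a real app the prefetch step would hand this list to something like the Cache API or a service worker; the trade-off the comment predicts is exactly here, since a large `maxTaps` or a dense graph means caching most of the app up front.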
Local AI models (LLMs, diffusion models) will take off in popularity as their performance becomes good enough for many use cases without needing a server. This would also let a lot of startups leverage client-side computation instead of hosting models on expensive GPU servers.
My only prediction is that we will start encountering privacy issues related to training models, whether it's how user data and PII are used or accidental data leaks.