Very few people will object if a _site operator_ knows exactly what they did on their site, whether they were a returning visitor, how much time they spent per page, etc.<p>The principal privacy concern with any form of analytics is tracking by _a 3rd party_ across _unrelated sites_. And that concern is fully addressed by simply not using external analytics services and relying on a self-hosted one instead.<p>So I really don't understand the whole exercise of avoiding random techniques that may be abused by 3rd-party analytics services and then claiming a pro-privacy focus, when the actual solution is to NOT make yet another hosted analytics service.<p>The only right way to do "privacy-focused" analytics is to offer a self-hosted option. Whoever makes a proper clone of the original Urchin, before it mutated into GA, will strike pure gold.
My first experience with 'analytics' was awstats[1]. I felt like I'd discovered god mode!<p>I always wanted to experiment with the empty_gif[2] module from nginx and process the logs offline. A quick search shows a bunch of guides offering exactly that.<p>1. <a href="https://www.awstats.org/" rel="nofollow">https://www.awstats.org/</a>
2. <a href="https://nginx.org/en/docs/http/ngx_http_empty_gif_module.html" rel="nofollow">https://nginx.org/en/docs/http/ngx_http_empty_gif_module.htm...</a>
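For what it's worth, the empty_gif experiment takes only a few lines of nginx config. This is a sketch, not a tested deployment: the pixel path, log path, and `pixel` format name are all made up for illustration, and the `log_format`/`access_log`/`empty_gif`/`expires` directives are the stock nginx ones.<p><pre><code>http {
    # Log only what offline processing needs; names here are examples.
    log_format pixel '$remote_addr [$time_local] "$http_referer" "$http_user_agent"';

    server {
        listen 80;

        location = /px.gif {
            empty_gif;                               # built-in 1x1 transparent GIF
            access_log /var/log/nginx/pixel.log pixel;
            expires -1;                              # disable caching so every view is logged
        }
    }
}</code></pre><p>Embed <code>&lt;img src="/px.gif"&gt;</code> in pages and each pageview lands in pixel.log, ready for an awstats-style offline pass.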
I've often wondered why there are so few server-side log parsing libraries (for the likes of Nginx and Apache). Putting that 1x1 pixel image on your front end and serving it from a host that only collects and aggregates anonymous per-URL pageview logs would cover the simple analytics most web site owners need. Basically Netlify Analytics without the $9/month/site price tag.<p>(Yes, it would obviously need a bit more than just a log parser, but that would be the easy part, IMO. Separate the backend from the frontend and let JS folks write their own UI for This Week's New JS Framework. Maybe you could even use PiratePx as the frontend.)
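To illustrate why the parsing really is the easy part, here is a minimal sketch that counts pageviews per URL from nginx's default 'combined' log format. The regex matches the stock format; the sample log lines and the 2xx/GET filtering policy are my own assumptions, not anything a real product would stop at.

```python
import re
from collections import Counter

# nginx 'combined' format:
# $remote_addr - $remote_user [$time_local] "$request" $status $body_bytes_sent
# "$http_referer" "$http_user_agent"
LINE_RE = re.compile(
    r'(?P<addr>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def pageviews(lines):
    """Aggregate successful GETs into a per-URL counter (a policy choice)."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and m.group('method') == 'GET' and m.group('status').startswith('2'):
            counts[m.group('path')] += 1
    return counts

# Hypothetical sample lines in the default combined format.
log = [
    '1.2.3.4 - - [10/Oct/2024:13:55:36 +0000] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '1.2.3.4 - - [10/Oct/2024:13:55:40 +0000] "GET /blog/ HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [10/Oct/2024:13:56:01 +0000] "GET /about HTTP/1.1" 404 80 "-" "curl/8.0"',
]
print(pageviews(log).most_common())  # → [('/blog/', 2)]
```

From there the "bit more" is the genuinely hard part: bot filtering, unique-visitor heuristics, and a storage layer, which is presumably why so few general-purpose libraries exist.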