Hi there,
I have a small app (frontend, backend, database) with fewer than 100 users. I would like to use my log files to analyze basic metrics, for example:
- number of requests
- number of comments created
- number of shared links
per unit of time.

All of this runs on one single server: no big infrastructure, no big data. I am unable to find a SIMPLE tool to collect my logs in some way, parse them, and visualize them, in histograms for example.

I know the ELK stack, Splunk, Graylog, and many others exist, but all of these solutions are much too complex. In particular, I do not want to spend weeks setting them up correctly. Furthermore, most of these solutions need an extra server for aggregating the log data in some time-series DB.

I would be very happy if you know of any open-source tool which can do this job.
Maybe check out LNAV: http://lnav.org/

There's also angle-grinder, which has fewer features but is also pretty useful: https://github.com/rcoh/angle-grinder
Grafana's Loki may be lighter-weight than the other examples you gave above.

For some kinds of logs there are tools for summarization and reports (like AWStats for web servers or pflogsumm for mail servers).

And, of course, for particular queries on existing logs, the standard text tools on a Linux box let you generate a lot of info.
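For instance, assuming your web server writes Common Log Format access logs (the `access.log` sample lines below are made up for illustration), "requests per hour" is a one-liner with awk, sort, and uniq:

```shell
# Made-up sample access log in Common Log Format.
cat > access.log <<'EOF'
127.0.0.1 - - [10/Oct/2023:13:55:36 +0000] "GET / HTTP/1.1" 200 512
127.0.0.1 - - [10/Oct/2023:13:58:01 +0000] "POST /comments HTTP/1.1" 201 64
127.0.0.1 - - [10/Oct/2023:14:02:11 +0000] "GET /links HTTP/1.1" 200 128
EOF

# Split each line on '[' and ':' so that $2 is the date and $3 the hour,
# then count how many requests fall into each date:hour bucket.
awk -F'[:[]' '{print $2 ":" $3}' access.log | sort | uniq -c
```

Each output line is a count followed by the hour bucket (e.g. `2 10/Oct/2023:13`); pipe it into gnuplot or a spreadsheet if you want an actual histogram.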
For web logs, I still use Webalizer. For everything else, as long as we are not talking tens of gigs, I’ll be using some mix of Perl, Python, Shell, Awk, etc.
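Along the same lines, if your backend logs application events with an ISO timestamp and an event name (the `app.log` format below is an assumption, not something from the thread), "comments created per day" is just grep, cut, sort, and uniq:

```shell
# Hypothetical application log: ISO timestamp, level, event name.
cat > app.log <<'EOF'
2023-10-10T13:55:36Z INFO comment_created user=1
2023-10-10T14:02:11Z INFO comment_created user=2
2023-10-11T09:15:00Z INFO link_shared user=1
EOF

# Keep only comment_created events, cut the date off at the 'T',
# then tally how many occurred on each day.
grep comment_created app.log | cut -dT -f1 | sort | uniq -c
```

Swap `comment_created` for `link_shared` (or whatever your app actually logs) to get the other metrics the same way.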