Cool! Handling each event record as structured data (JSON) is very interesting, because it connects seamlessly with MongoDB (a document DB) for stream output or aggregation. These slides cover using Fluentd together with MongoDB:
<a href="http://www.slideshare.net/doryokujin/an-introduction-to-fluent-mongodb-plugins" rel="nofollow">http://www.slideshare.net/doryokujin/an-introduction-to-flue...</a>
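For a concrete idea of what that integration looks like, here is a minimal sketch of a Fluentd output section using the fluent-plugin-mongo plugin; the tag pattern, database, and collection names are placeholders:

```
# Route events tagged mongo.** into a MongoDB collection
# (assumes fluent-plugin-mongo is installed; names are illustrative)
<match mongo.**>
  type mongo
  host localhost
  port 27017
  database fluentd    # target database
  collection test     # target collection
</match>
```

Each JSON event record Fluentd receives under that tag is then inserted as a document, so you can aggregate it directly in MongoDB.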
Excellent, and I am going to try it. But I still think something like Flume has an advantage, mainly because of the Hadoop ecosystem. For instance, you can feed the log data into HBase and use Hive to write high-level, abstracted queries that run on Hadoop. I am only guessing, but it seems plugins are on the way for various systems, though not for Hadoop.
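To illustrate what "high-level abstracted queries" buys you: once the logs are in a Hive table, an aggregation like counting errors per host compiles down to a MapReduce job automatically. A sketch, assuming a hypothetical `access_logs` table with `host` and `status` columns:

```
-- Hypothetical schema; Hive turns this into a MapReduce job on Hadoop
SELECT host, COUNT(*) AS errors
FROM access_logs
WHERE status >= 500
GROUP BY host;
```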
Update:
Also, Flume can consume any data stream, for instance the Twitter stream, so it is not limited to log analysis.