So, 'Big Data' has been simplified to 'Problems Solvable by MapReduce'? *sigh*

Not every problem can be reduced to a completely cacheable batch job, trivially parallelizable across all of your data. 'Big Data' isn't about breaking your batch processing into three layers; it's about being smart and knowledgeable enough in compsci, statistics, calculus, text processing, regexes, machine learning, business analysis, *et cetera*, to design an effective system that harvests *useful* insights from a large bank of atomic, messy, inconsistent data, with an appropriate level of availability and consistency.

The real work is not in using or configuring Hadoop; it's in figuring out what information would bring greater-than-marginal value to a business, and how to compute it efficiently from an existing corpus of data.

There's no silver bullet. Remember?

EDIT: I think the following is particularly disingenuous: "The lambda architecture solves the problem of computing arbitrary functions on arbitrary data in real time by decomposing the problem into three layers"

This is such a ridiculous promise that it put me in a strongly skeptical mood for the rest of the article.
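For anyone who hasn't read the article: the three-layer promise boils down to something like the sketch below. This is a minimal illustration of the merge step, with names (batch_view, realtime_view, query) that are mine, not the article's, and it assumes the function being computed decomposes cleanly over old and new data:

    # Lambda architecture's query path, in miniature.
    # Batch layer: Hadoop periodically recomputes a view over all data.
    batch_view = {"clicks:2013-01": 10000}
    # Speed layer: incremental updates cover data since the last batch run.
    realtime_view = {"clicks:2013-01": 42}

    def query(key):
        # Serving layer: merge the precomputed view with recent deltas.
        # The '+' only works because counts decompose; an arbitrary
        # function has no such cheap merge, which is exactly my objection.
        return batch_view.get(key, 0) + realtime_view.get(key, 0)

    print(query("clicks:2013-01"))  # 10042

The merge is trivial for counts and other associative aggregates; for an arbitrary function over arbitrary data, no such decomposition is guaranteed to exist, let alone one computable in real time.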