I'm a physicist, so I'm one of the people this guy says standard deviation is still good for. However, despite some "oddities" (pointed out by others here) in his article, I'm more than willing to admit that a simpler, easier-to-understand term would be helpful for explaining many things to the general public. Hell, it would be helpful for explaining things to <i>journalists</i>, whom we then trust to explain things to the public!<p>Look at any reputable news site or paper. Odds are it posts articles based on polls several times a day. How many report confidence intervals or anything of the sort? These are <i>crucial</i> for interpreting polls, but are left out more often than not. Worse yet, many stories make a big deal about a "huge" shift in support for some political policy, party or figure when the previous month's figure is actually well within the confidence interval of the current month's poll!<p>Standard deviation, confidence intervals, etc. are all ways of expressing uncertainty, and it's become abundantly clear that the average journalist, to say nothing of the average person, has no clue what these concepts mean. If the goal is to communicate with the public, then we really need to take a step back and appreciate the stupendously colossal wall of ignorance we're about to butt our heads against. When we talk about the general public, we should keep in mind that rather a lot of people know so little about the scientific method that they interpret the impossibility of proving theories as justification for giving religious fables equal footing in schools. This kind of ignorance isn't a nasty undercurrent lurking in the shadows. It's running the show, as evidenced by many state laws in the U.S.!
There is absolutely <i>no</i> hope of explaining uncertainty to most of these people.<p>There <i>is</i> hope of explaining basic statistics to journalists, if only because they are relatively few in number and it's a fundamental part of their job to understand what they are reporting. Yes, I just said that every journalist who has reported a poll result, scientific figure, etc. without the associated uncertainty has <i>failed</i> to adequately perform their job. We need to make journalists understand <i>why</i> they are failing. If simplifying the way we report uncertainties will assist with this, then I'm all for it. Bad journalism is a root cause of a great deal of ignorance, but it's not an insurmountable task to fix it.<p>If you are a scientist who speaks to journalists about your work, make sure they include uncertainties. If you are an editor, slap your peons silly if they write a sensationalistic poll piece when the uncertainties say it's all a bunch of hot air. If you are a reader, please mercilessly mock bad articles and write numerous scornful letters to the editor until those editors pull out their beat-sticks and get slap-happy. We should not tolerate this kind of crap from people who are <i>paid</i> to get it right.
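To make the poll-shift point concrete, here's a minimal sketch of the standard 95% margin-of-error calculation for a polled proportion. The numbers (46% last month, 49% this month, 1,000 respondents) are hypothetical, not from any real poll, but they show how a headline-grabbing three-point "surge" can sit entirely inside the margin of error:

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from a sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical polls: 46% support last month, 49% this month, n = 1,000 each.
last, current, n = 0.46, 0.49, 1000
moe = margin_of_error(current, n)
print(f"Margin of error: +/-{moe:.1%}")  # roughly +/-3.1 points
print("Shift within margin of error?", abs(current - last) <= moe)  # True
```

A journalist armed with just this one formula could tell a real trend from sampling noise before writing the headline.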