Parallelism giveth, and concurrency taketh away.<p>Concurrency, as the term is used by people who think hard about these things, is about what you have to do to keep a system coherent and balanced while parallel activities are going on: the overhead of coordinating parallel activity that keeps you from getting an ideal Nx speedup. This includes work you wouldn't need to do at all with one thread, and also the stalls you take while coordinating access to shared resources.<p>It doesn't include useful work that isn't done in parallel because it's inherently serial, or work that gets thrown away because it was unnecessarily done more than once. That is all the domain of Amdahl's law.
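To make the "coordination overhead" half concrete, here's a rough Go sketch (the workload, the counts, and the function names are all invented for illustration): the same total number of increments, done once with per-goroutine counters and once funneled through a single mutex-guarded counter. The second version runs all of its goroutines in parallel and still won't get an ideal Nx speedup, because a growing share of the time goes into lock handoffs and cache-line traffic rather than useful work.

    // Rough illustration of coordination overhead: the same work, with and
    // without a shared resource to fight over. Numbers are arbitrary.
    package main

    import (
        "fmt"
        "runtime"
        "sync"
        "time"
    )

    const perWorker = 1_000_000

    // uncoordinated: each goroutine increments its own counter, so there is
    // nothing to coordinate beyond the final WaitGroup.
    func uncoordinated(workers int) time.Duration {
        start := time.Now()
        var wg sync.WaitGroup
        counts := make([]int64, workers)
        for w := 0; w < workers; w++ {
            wg.Add(1)
            go func(w int) {
                defer wg.Done()
                var local int64
                for i := 0; i < perWorker; i++ {
                    local++
                }
                counts[w] = local
            }(w)
        }
        wg.Wait()
        return time.Since(start)
    }

    // coordinated: the same increments, but every one of them has to take a
    // shared mutex first. The stalls here are the "concurrency" cost the
    // comment above is talking about.
    func coordinated(workers int) time.Duration {
        start := time.Now()
        var wg sync.WaitGroup
        var mu sync.Mutex
        var total int64
        for w := 0; w < workers; w++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                for i := 0; i < perWorker; i++ {
                    mu.Lock()
                    total++
                    mu.Unlock()
                }
            }()
        }
        wg.Wait()
        _ = total
        return time.Since(start)
    }

    func main() {
        workers := runtime.NumCPU()
        fmt.Println("workers:     ", workers)
        fmt.Println("independent: ", uncoordinated(workers))
        fmt.Println("shared lock: ", coordinated(workers))
    }

None of the extra time in the second run is Amdahl-style serial work; it's pure coordination.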
I think a better title would be "What is the difference between concurrency and parallelism?" (that is, the title of the OP), with a comment here on HN saying "questions like this one on SO cover material that should be taught in CS-101". Just my 2¢.
It's a little overly succinct, but I've heard:<p>Parallelism is for many things working at the same time. Concurrency is for many things waiting at the same time.
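A rough Go sketch of that distinction, with made-up numbers and function names: the "waiting" half starts a thousand goroutines that just block, and even confined to a single logical processor it finishes in roughly the time of one wait; the "working" half is pure CPU and only speeds up when it actually gets more cores.

    // Many things waiting vs. many things working, roughly.
    package main

    import (
        "fmt"
        "runtime"
        "sync"
        "time"
    )

    // manyThingsWaiting: 1000 goroutines each block for 100ms. With
    // GOMAXPROCS(1) only one of them can run at a time, but all of them
    // can wait at the same time, so the whole thing takes about 100ms.
    func manyThingsWaiting() time.Duration {
        prev := runtime.GOMAXPROCS(1)
        defer runtime.GOMAXPROCS(prev)
        start := time.Now()
        var wg sync.WaitGroup
        for i := 0; i < 1000; i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                time.Sleep(100 * time.Millisecond) // stand-in for network or disk I/O
            }()
        }
        wg.Wait()
        return time.Since(start)
    }

    // manyThingsWorking: CPU-bound loops have nothing to wait on; they only
    // finish sooner if they actually execute on separate cores at once.
    func manyThingsWorking(cores int) time.Duration {
        prev := runtime.GOMAXPROCS(cores)
        defer runtime.GOMAXPROCS(prev)
        start := time.Now()
        var wg sync.WaitGroup
        for i := 0; i < runtime.NumCPU(); i++ {
            wg.Add(1)
            go func() {
                defer wg.Done()
                x := 0
                for j := 0; j < 200_000_000; j++ {
                    x += j
                }
                _ = x
            }()
        }
        wg.Wait()
        return time.Since(start)
    }

    func main() {
        fmt.Println("1000 sleepers, one processor:", manyThingsWaiting())
        fmt.Println("CPU work, one core:          ", manyThingsWorking(1))
        fmt.Println("CPU work, all cores:         ", manyThingsWorking(runtime.NumCPU()))
    }

The first program benefits from concurrency with no parallelism at all; the second needs real parallelism to go any faster.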