The big reason is that different languages do different things well, and new languages try to capture the good traits of other languages or language families. Maybe you like the JVM and the wealth of packages available for it, but you hate writing Java code and want a Lisp-like language instead: that's how you get Clojure. Or maybe you like C's syntax and native performance, but you want the type safety, memory safety, and concurrency support that modern languages offer: that's Rust. Or maybe you just love Ruby's syntax and idioms but want the performance of machine code: that's the motivation behind Crystal, a newer, less popular language.

It's all about the transmission of ideas. "I'm going to take traits A and B from language X, and traits C and D from language Y, all while avoiding traits E and F from language Z." You try to isolate the good ideas from other languages and suppress the bad ones -- but your new language is sure to have some bad ideas of its own, so somebody is bound to reinterpret it later if it gets sufficiently popular. No language is perfect for every single use case, and none ever will be, so this will probably continue forever.
We have no idea what we are doing.

More seriously, languages are linked to cognition and thought processes. I wouldn't go so far as to say they're a serialisation or marshalling of thought, but you could see a language as the rules that dictate the semantics of communication.

Like human languages, they simultaneously evolve and cross-pollinate as we improve our 'vocabulary of semantics'.
Anything that is "general purpose" can have many, many variations. This is exactly the case with general-purpose programming languages. A special-purpose language (e.g., SQL) doesn't lead to the same proliferation.