>Right now, most of what the computers in the world do is to execute tasks we basically initiate. But increasingly our world is going to involve computers autonomously interacting with each other, according to computational contracts. Once something happens in the world—some computational fact is established—we’ll quickly see cascades of computational contracts executing. And there’ll be all sorts of complicated intrinsic randomness in the interactions of different computational acts.<p>I don't think it takes a math genius to see why this is a bad idea. In the same way that trading algorithms can get into feedback loops that crash markets, these "computational contracts" can cause cascading failures that hurt society as a whole. This is why human intelligence is so critical in running society: it has the ability to question whether its "programming" is correct and having the intended effects, and to adjust accordingly. Computational contracts have no such introspection, by definition. They resolve when the rules are satisfied, for better or worse.<p>And all of that isn't even considering the attack surface they present to malicious actors.
I find Stephen Wolfram an interesting person. On one hand, he is undeniably exceptional and created an impressive computational system. On the other hand, the "Wolfram language" is really a pretty poor design as far as programming languages go, and would not even be noticed if it weren't for the giant standard library that ships with it, called Mathematica. I use the "Wolfram language" because I have to, not because I want to.<p>In other words, if I could easily use the Mathematica library from Clojure, I wouldn't give the Wolfram Language a second glance. I can't think of a single language aspect that would tempt me to use the Wolfram Language [context: I've been a Mathematica user since 1993] over Clojure. I have (much) better data structures, a consistent library for dealing with them, transducers, core.async, good support for parallel computation, pattern matching through core.match, and finally a decent programming interface with a REPL, which I can use from an editor that does paren-matching for me (ever tried to debug a complex Mathematica expression with mismatched parens/brackets/braces?).<p>This is why the man is a contradiction: his thoughts are undeniably interesting, but his focus on and anchoring in the "Wolfram Language" is jarring.
Not directly related to the article, but going through the comments, I felt the vibes and wanted to express my theory about what it is that irritates many of the commenters here.<p>The highest pursuit is the search for truth. This includes the challenging act of discarding things that we'd really like to be true, but are false. The success of science can be attributed to extreme selfless intellectual honesty. The more ego gets in the way, the more the truth is compromised. Intelligence can be used in service of finding truth or in feeding our delusions. My view on the distinction made here between genius and madness is that they correspond to the degree to which intelligence serves truth or delusion. Therefore I'd expect the most outstanding scientists in history to have been very humble (perhaps someone with better knowledge of the personalities of great scientists can shed more light on this).<p>And to keep things consistent - I might be wrong and thus welcome challenges to this theory :)
I had this thought when I was 13 and first learned the basics of programming: "Hey dad, if everything is determined by physics and chemistry, and computers are real good at math, can't we analytically deduce the future?"<p>I eventually learned this was just the 13-year-old's version of turning "why's the sky blue?" into "but why is Rayleigh scattering a thing?", and that there are limits both to human understanding of science and to computation itself: a computer with sufficient accuracy to model the world would by definition need as much memory as the world, and would have to model itself <i>in</i> it. I moved on from the idea shortly after.<p>Is Stephen Wolfram just an overgrown child? Maybe, unironically, that's what being a genius is about.
To be honest I haven’t understood much of this.<p>Am I also the only one who’s very skeptical of AI? I see no correlation between what we call “biological thinking” and computation.<p>Even though I don’t know much of the theory behind AI, to me it’s similar to saying that since we have lots of simple calculators, we can arrange them together in some specific way and emergent intelligence will arise.<p>Sure, but you can say that about anything: let’s arrange a bunch of forks together and intelligence might emerge.<p>And actually, from a math point of view, you could get lucky arranging some forks together and have intelligence, since intelligence seems to emerge from an arrangement of atoms.<p>I don’t see why computation gets a shot at intelligence while anything else doesn’t. What’s so unique about computation?
I watch his live CEOing YouTube streams occasionally. They are quite interesting, but at some point I feel like I am stuck in a work meeting, with no real objections (understandably) being posed to Wolfram's proposals.<p>I also think the word "computation" is used a bit too grandiosely by Wolfram, as is evident in the writing here.<p>I do admire Wolfram for even broadcasting his CEOing meetings, though. He goes into detail to a level you would not necessarily expect a CEO to. Credit where credit is due, he is not shy. Many CEOs would not bother.<p>I mean, technically, everything is a computation, but we reserve that term for genuinely complex things, not everything. Using it "willy-nilly" deflates the word's impact.
The idea of small rules leading to large patterns is not something Wolfram invented, but he seems to think he did. Conway's Game of Life predates his work, as does Mandelbrot's, and then there is the Dürer pentagon.<p>Maybe there is something I am missing, though?
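The "small rules, large patterns" point is easy to demonstrate concretely. Here is a minimal sketch of a Game of Life step in Python (the toroidal wrap-around and grid size are my own illustrative choices, not anything from the article):

```python
from collections import Counter

def step(live, width, height):
    """Advance one generation on a toroidal grid.
    `live` is a set of (x, y) coordinates of live cells."""
    # Count the live neighbors of every cell adjacent to a live cell.
    counts = Counter(
        ((x + dx) % width, (y + dy) % height)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbors,
    # or has 2 live neighbors and is already alive.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

# A "blinker": a vertical line of three cells oscillates with period 2,
# flipping to a horizontal line and back.
blinker = {(1, 0), (1, 1), (1, 2)}
print(step(blinker, 5, 5))            # the horizontal phase
print(step(step(blinker, 5, 5), 5, 5) == blinker)  # back to the start
```

Two short rules (birth on 3 neighbors, survival on 2 or 3) are enough to generate gliders, oscillators, and arbitrarily complex structures, which is exactly the phenomenon at issue.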
So he's comparing his eponymous programming language to the arrival of mathematical notation 400 or so years ago, and the advent of written language.
Perhaps Wolfram is parodying his main idea in his own work output: from a basic origin he's creating monstrous bloat and complexity, as a kind of demonstrative self-referential proof.<p>There are people who write code until it works, and people who rewrite until it doesn't. Room enough for both.
If anyone is interested, Wolfram does what he calls “Live CEOing”, and streams his sprints on YouTube: <a href="https://www.youtube.com/user/WolframResearch" rel="nofollow">https://www.youtube.com/user/WolframResearch</a>
Think of all the people in prison, justly or unjustly, under our current system where humans have to interpret and carry out our legal code. We may extend these practices into every domain at negligible cost. Why do we desire authority without conscience? I can think of two reasons: one, we do not wish to hold ourselves accountable; two, we desire to see the strong afflict the weak.
Wolfram might be a genius, but I personally find his writing and lectures extremely dull. The only thing he seems to write about is how everything is "computational", and how we need to embrace his idea that computation is what the whole universe is about. For example, in this blog post alone we can find the following word counts:<p><pre><code> computer - 19x
computation* - 96x
computational - 63x
</code></pre>
Make everything computational!<p><pre><code> computational intelligence - 2x
computational contracts - 7x
computational universe - 7x
computational language - 18x
computational thinking - 2x
computational fact - 1x
computational acts - 1x
computational equivalence - 8x
computational irreducibility - 6x
computational system - 1x
computational process - 1x
computational work - 1x
computational essays - 2x
computational law - 1x
</code></pre>
Throwing all these words around may sound smart, but they lose whatever meaning or relevance they were supposed to carry when overused in such a larger-than-life manner.<p></rant>
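For what it's worth, a tally like the one above takes only a few lines of Python. This is my own sketch, not the commenter's actual method, and the sample text and phrase list are illustrative; note that substring counting means "computation" also matches inside "computational", consistent with the overlapping totals above:

```python
import re
from collections import Counter  # not required here, but handy for top-N variants

def count_phrases(text, phrases):
    """Count case-insensitive, non-overlapping occurrences of each phrase."""
    lowered = text.lower()
    return {p: len(re.findall(re.escape(p.lower()), lowered)) for p in phrases}

sample = "Computational language is a computational idea about computation."
print(count_phrases(sample, ["computation", "computational", "computational language"]))
# -> {'computation': 3, 'computational': 2, 'computational language': 1}
```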
> And, yes, there’s only basically one full computational language that exists in the world today—and it’s the one I’ve spent the past three decades building—the Wolfram Language.<p>:/