Initially I thought that this could be one of those automatically generated nonsensical but scientific sounding texts (<a href="https://pdos.csail.mit.edu/archive/scigen/" rel="nofollow">https://pdos.csail.mit.edu/archive/scigen/</a>)<p>But upon closer reading I now think it's actually the reverse. The author apparently tried very hard to obfuscate the fact that he does have something meaningful to say by using utterly nonsensical sounding language.
The content of this whole wing of thinking is "you have to have a brain-like system to have a brain-like thing". Which, fair. But what's crazy-making about it is that people have decided this sort of insight says something about mathematics and metaphysics, which it <i>does not</i>.<p>For example, this:<p><i>This observation which they refer to as the “hard problem of content” or the “covariance-is-not-content principle” is that systems acting on covariance information, while acting on information, do not constitute content-bearing systems, because to bear content is to embody claims about how things stand, when in fact they merely embody capacities to affect the world.</i><p>is just complete nonsense, and to extract the charitable reading I put in quotes above, you have to read closely for paragraph after paragraph to see that what's going on is the word "content" is reserved to mean "things brain-like things do in a brain-like way to other brain-like things in a context built for brain-like things."<p>Again, okay! But: it's <i>wildly</i> misleading to frame this as being about mathematical logic or the metaphysics of symbols, syntax and semantics.
the title seems to be an unfortunate pun. from a quick look at the article, it's mostly cogsci stuff, where "proposition" has a very different meaning from the one "Propositions as types" normally refers to:<p><i>"The proposition is a concept borrowed by cognitive psychologists from linguists and logicians. The proposition is the most basic unit of meaning in a [mental] representation."</i>
[1]<p>the article seems to mostly be talking about AI and the problems of making it "actually refer to the real world". so I think the title could be paraphrased as something like "Information about the world ([mental] propositions) is hard to represent with symbolic, digital things (types)".<p>[1](<a href="http://www.bcp.psych.ualberta.ca/~mike/Pearl_Street/Dictionary/contents/P/proposition.html#targetText=University%20of%20Alberta%20Dictionary%20of%20Cognitive%20Science%3A%20Proposition&targetText=The%20proposition%20is%20a%20concept,judged%20either%20true%20or%20false" rel="nofollow">http://www.bcp.psych.ualberta.ca/~mike/Pearl_Street/Dictiona...</a>)
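For anyone arriving from the type-theory side, the sense of "Propositions as types" that the title puns on (the Curry-Howard correspondence) can be sketched as follows. This is a hypothetical illustration, not from the article: a proposition is read as a type, and any total program of that type counts as a proof of it.

```typescript
// Curry-Howard sketch: types as propositions, programs as proofs.

// "A and B implies A" corresponds to the type [A, B] -> A.
// Any total implementation is a proof of the proposition.
function andElim1<A, B>(pair: [A, B]): A {
  return pair[0];
}

// "A implies (B implies A)" corresponds to A -> (B -> A).
function constProof<A, B>(a: A): (b: B) => A {
  return (_b: B) => a;
}
```

Note the contrast with the cogsci usage quoted above: here a "proposition" is a formal type inhabited by programs, not a unit of mental content referring to the world — which is exactly why the title reads as a pun.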
I'm very confused about what this blog post is trying to argue. Most of what it discusses isn't related to the Curry-Howard correspondence at all?<p>At the risk of sounding anti-intellectual, perhaps Orwell's "Politics and the English Language" [0] is relevant here?<p>[0] <a href="http://www.orwell.ru/library/essays/politics/english/e_polit" rel="nofollow">http://www.orwell.ru/library/essays/politics/english/e_polit</a>
On what grounds is it argued that humanity operates on "content" grounds? If the stuff in my head operates by purely syntactic, formal, biological, chemical, or physical methods, then the introduction of semantics or "content" is a red herring, it seems to me, even if you argue that multiple humans are the magic factor.
Sorry. This text is absolutely useless to me. What is it trying to say? Don't get it, although I agree that propositions are not necessarily types.
This nonsensically-titled article is somewhere between a rehash of "the map is not the territory / one cannot pass from the informal to the formal by purely formal means" and Sokal-esque gobbledygook.
I couldn't follow this, and I'm usually capable of following technical papers. The jargon was as thick as a jungle; it reads as auto-generated. Use plainer language next time, please.