I like this and have a similar frame of mind. I wish more languages would make their sum types true duals of their structs, each variant carrying a single payload datum. Instead we have tuple variants, struct variants, and so on, complicating matters to save at best a character of syntax (and careful design could eliminate even that).<p>The author may be interested in coeffects as the dual of effects. They are pretty much as stated: coeffects are the context/environment that the program executes in. This link has some prototypes of languages that let you create coeffects in your code: <a href="https://tomasp.net/coeffects/" rel="nofollow">https://tomasp.net/coeffects/</a><p>I was going to bring up covariance vs contravariance, but the author mentions they were skipped for brevity.<p>Finally, I think it's unfortunate to simplify to the point of ignoring consumed/produced values. Consider how "inout" parameters aren't much different from a mutable reference (though the change in binding may be more explicit in a pure functional language). From the article it's pretty clear why inout/reference parameters need to be invariant (they sit in both covariant and contravariant positions at once). And there is a whole lot that this kind of reasoning can bring to understanding Rust's borrow checker, and possibly to creating something simpler with the same strengths. But for that we'd need to model immutable references, unique mutable references, and dare I say also volatile references (the environment IS volatile, whether it's your physical peripherals or your application database, and IMO not having a language construct for that makes life harder than it needs to be).
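A minimal Python sketch of why mutable (inout-like) positions must be invariant: if a mutable `list[Cat]` could be passed covariantly where a `list[Animal]` is expected, a writer could smuggle a `Dog` into it. The class names are illustrative, not from the article.

```python
from dataclasses import dataclass

@dataclass
class Animal:
    name: str

@dataclass
class Cat(Animal):
    pass

@dataclass
class Dog(Animal):
    pass

def add_dog(animals: list[Animal]) -> None:
    # Writing into the list uses the type parameter contravariantly.
    animals.append(Dog("rex"))

cats: list[Cat] = [Cat("tom")]
# If list[Cat] were covariantly usable as list[Animal], this would type-check...
add_dog(cats)  # type: ignore[arg-type]
# ...and now reading the "list of cats" back (the covariant use) is unsound:
intruder = cats[-1]
print(type(intruder).__name__)  # Dog
```

Because reads need covariance and writes need contravariance, a mutable reference can only soundly be both if the type parameter is invariant, which is exactly what a type checker like mypy enforces for `list`.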
There is a similar duality between "require" and "provide" that I've been trying to work with more recently. The idea is that we can require some behavior (verified via a test) and also provide that behavior (via a fake), which might mean the two can be unified: a test is the dual of a fake, so maybe you can derive tests and fakes from the same code. That's the hope, anyway.<p>Identifying a duality means there might be an opportunity to transfer or unify concepts, but the duality might not be strong enough, or pure enough, to be very useful. The obvious dual of a precondition is not an effect but a postcondition (like require/provide, import/export, in/out, etc.). So I'm having a hard time seeing the fruit of calling precondition and effect a dual.
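A hypothetical sketch of that unification, under the assumption that the behavior can be captured as an input/output table: the same contract acts as a test (require) against a real implementation and as a fake (provide) for its collaborators. All names here are illustrative.

```python
# A contract maps example inputs to expected outputs. The same table can
# *require* behavior (used as a test) or *provide* it (used as a fake).
upper_contract: dict[str, str] = {"abc": "ABC", "": "", "Hi": "HI"}

def check(impl, contract: dict) -> None:
    """Use the contract as a test: require the behavior."""
    for arg, expected in contract.items():
        assert impl(arg) == expected, (arg, expected)

def fake_from(contract: dict):
    """Use the contract as a fake: provide the behavior."""
    def fake(arg):
        return contract[arg]  # raises KeyError outside the specified domain
    return fake

check(str.upper, upper_contract)   # the real implementation passes the test
fake = fake_from(upper_contract)   # a fake derived from the same contract
check(fake, upper_contract)        # the fake trivially satisfies it too
print(fake("abc"))                 # ABC
```

The fake is correct by construction on the contract's domain, which is the appeal: one artifact, read in two dual directions.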
Besides those mentioned in the article, almost everything in functional programming has a dual, usually named by prefixing "co-": cofunctor, coapplicative, comonoid, comonad, &c. All things with reversed arrows and varying degrees of usefulness.
There exists a rich theory about duality in computation, a forgotten twin of lambda calculus: sequent calculus. <a href="https://ps.cs.uni-tuebingen.de/publications/ostermann22introelim/" rel="nofollow">https://ps.cs.uni-tuebingen.de/publications/ostermann22intro...</a> I recommend you check it out if you are at least a little curious about duality in programming.<p>I've been thinking about duality at the core of programming language design for a while now, motivated by asynchronous computation and mainly focused on the question: how do you build programming abstractions from input-output duality? It is fascinating to see the common abstractions evolve naturally from there.<p>As a preliminary result I wrote a theory of computation: <a href="http://perma-curious.eu/e3lli/core/" rel="nofollow">http://perma-curious.eu/e3lli/core/</a> It describes how to go from input-output duality to an advanced lisp dialect.
A formal account of what the author describes as effects and preconditions: <a href="https://arxiv.org/abs/1001.1662" rel="nofollow">https://arxiv.org/abs/1001.1662</a>
The reason null values are a problem is that most compilers do not track the difference between a nullable and a non-nullable type. If they did, null values would indeed be similar to Nothing in the Maybe type.
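A small sketch of that point using Python's `Optional` (the function and data are illustrative): once the type distinguishes `Optional[str]` from `str`, a checker like mypy forces the caller to handle the `None` case before using the value, exactly as pattern matching on Maybe would.

```python
from typing import Optional

def find_user(user_id: int) -> Optional[str]:
    users = {1: "alice", 2: "bob"}
    return users.get(user_id)  # None signals "no such user", like Nothing

# The nullable type forces the caller to discharge both cases:
name = find_user(3)
if name is None:                     # the "Nothing" branch
    greeting = "hello, stranger"
else:                                # the "Just name" branch
    greeting = f"hello, {name}"

print(greeting)  # hello, stranger
```

With tracking, `null` stops being a landmine hiding inside every reference type and becomes an ordinary, visible case in the type.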