Languages that use types for elaborate proofs always seem to me to prove only the most uninteresting properties of programs, namely the parts that deal with data transformations. Not that data transformations aren't important, but the properties they need to preserve are usually easy to be convinced of without any assistance from the compiler.<p>At the end of the day, programs are written for their side effects (unless your program is a compiler, which is the classic example of an "interesting", non-trivial data transformation, and possibly the only one). What I'm most interested in is proofs that, say, in a concurrent environment, a function that closes a socket will never be called as long as there are pending tasks to write to the socket. Or, because I write concurrent data structures, I'm interested in proving that a certain function will eventually release all the locks it acquires. Are there any such languages?<p>I've heard of effect systems, but I couldn't find enough information on them. Are effect systems capable of the kinds of proofs I'm interested in? Are there practical programming languages with powerful effect systems?<p>EDIT: Some Googling yielded this: <a href="http://lambda-the-ultimate.org/node/4768" rel="nofollow">http://lambda-the-ultimate.org/node/4768</a> tl;dr: Not yet.<p>EDIT: At least the lock proof seems to be implemented as one of Java 8's pluggable type systems: <a href="http://types.cs.washington.edu/checker-framework/current/checker-framework-manual.html#lock-checker" rel="nofollow">http://types.cs.washington.edu/checker-framework/current/che...</a><p>It lets you add an annotation, @EnsuresLockHeld(locks...), to a method (Java 8's pluggable types are intersection types), which proves that certain locks will be held when the method returns, and even @EnsuresLockHeldIf(locks..., result), which proves that a method holds certain locks if its boolean result equals the one specified in the annotation.
I really like these "learn you a" books.<p>I don't know why, but I just like the freaking pictures. They keep me going. It's like a children's book, except it teaches me the subject I like: programming languages.
I'd like to show you the immense beauty of the Agda standard library: <a href="http://agda.github.io/agda-stdlib/html/README.html" rel="nofollow">http://agda.github.io/agda-stdlib/html/README.html</a> (all clickable, and you'd better have proper unicode fonts).
This is the first time I've read about 'mixfix' operators; one less thing I'll have to invent, I guess. <a href="https://www.google.fr/search?q=mixfix+notation" rel="nofollow">https://www.google.fr/search?q=mixfix+notation</a>
It lost me at this point:<p><i>Type the following into an emacs buffer, and then type C-c C-l (use \_1 to type the subscript symbol ₁):</i><p><pre><code> proof₁ : suc (suc (suc (suc zero))) even
proof₁ = ?
</code></pre>
On my newly created emacs setup from the earlier instructions, it's not clear what "type into an emacs buffer" means. Do I create a new file with C-x C-f? Create a new buffer with C-x b? I tried both, and promptly lost agda-mode.<p>I then tried restarting emacs and opening a new .agda file so I'd get agda-mode. I loaded the earlier module and then tried loading this, but it didn't seem to know about <i>suc</i> in the context of this buffer/file. So now I'm lost.
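For anyone else stuck here: the definitions of ℕ and _even from earlier in the tutorial need to be in the same file as the proof. A minimal sketch of a self-contained file (save it as Even.agda so the module name matches; this assumes an implicit-argument version of STEP, so adjust if your STEP takes its number explicitly) that should load with C-c C-l:<p><pre><code> module Even where

 data ℕ : Set where
   zero : ℕ
   suc  : ℕ → ℕ

 data _even : ℕ → Set where
   ZERO : zero even
   STEP : ∀ {x} → x even → suc (suc x) even

 -- Four is even: apply STEP twice starting from ZERO.
 proof₁ : suc (suc (suc (suc zero))) even
 proof₁ = STEP (STEP ZERO)
</code></pre>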
If I wrote a natural number calculator in Agda, would all my numbers be represented as chains of successions from zero, or can the compiler convert them to two's-complement integers as we know and love them?<p>In case the compiler can do that conversion: is it programmed to do that for some subset of numeric types, or can it infer an optimal binary representation of a value somehow?<p>On the other hand, if they really were represented as chains of successors, then I presume this language is intended as purely academic work and has no application in a production environment. Am I wrong?
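For reference, here is the unary representation in question, sketched in Agda. As I understand it (and depending on Agda version), the BUILTIN pragma below tells the compiler to recognize this type, so decimal literals desugar to suc/zero and the compiled code uses an arbitrary-precision integer representation rather than actual chains of constructors:<p><pre><code> data ℕ : Set where
   zero : ℕ
   suc  : ℕ → ℕ

 {-# BUILTIN NATURAL ℕ #-}

 -- With the pragma, this literal is sugar for
 -- suc (suc (suc (suc zero))) at the type level,
 -- but is not stored that way at runtime.
 four : ℕ
 four = 4
</code></pre>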
<p><pre><code> If we can construct a value of a certain type, we have simultaneously constructed a proof that the theorem encoded by that type holds.
data _even : ℕ → Set where
</code></pre>
I do not understand how to interpret this passage. _even, given a number, returns a type? In this case, zero even is a type, but we have not created any value with that type. What am I missing?
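A sketch of what I think resolves this (assuming the tutorial's ZERO and STEP constructors): the data declaration introduces a whole family of types, one per number, and the constructors listed under it are exactly how values of those types get built. ZERO itself is a value of type zero even:<p><pre><code> data _even : ℕ → Set where
   ZERO : zero even                          -- a value of type (zero even)
   STEP : ∀ {x} → x even → suc (suc x) even  -- builds values of larger types

 -- The constructor ZERO inhabits (zero even),
 -- so this value is the proof that zero is even.
 zeroIsEven : zero even
 zeroIsEven = ZERO
</code></pre>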
Recently learned this language in a functional programming course. Very cool language. The Curry-Howard correspondence shows that programs correspond to mathematical proofs (and types to propositions), and you can really feel it here. I could see this language being very useful in math-y applications such as cryptography and verification.
I always wondered about dependent types. Even though this was a little too technical for me, at least now I have some idea of what they really are.<p>By the way, Z notation rocks.
Ok, that's neat. Is there a proof checker and an editor/IDE other than Emacs' Agda mode?<p>Because I'm probably going to stick with Coq for its tooling.
My thanks OVERFLOWS. For those who are 'older but not quite wiser', Learn You an Agda is HELPFUL as a strategy.<p>Think Eric Raymond, and too bad that Perl is just another dead language; your wisdom, though, is recompense.<p>When you are older it sometimes becomes harder to exercise to the point of vomiting ('ad nauseam' in Latin), but the pain is much easier. Don't you do three hundred (300) pushups a day?<p>So, the routine is simple. Get stuck running Haskell (another bump on the road to the true language of Coq) - go ahead and flame - do the pushups. Then laugh, because it relaxes you, and then, refreshed, back to Agda.<p>PS: Engineering school was arduous, and so was graduate school. The personal private books/research were just hobbies, and no, I will not mention my name. Even the so-called Yourdon 'death marches' are just a bump on the road. Go ahead and flame. Just take code from arXiv, run it in Haskell and Agda, and there are only five explanations:
1.) the programmer is stupid - yup, that's me
2.) scientific fraud - always possible
3.) the 'compiler' or the strange AMD CPU is fickle
4.) the language paradigm is 'slightly broken'
5.) all of the above<p>Fair warning, and serpents be here: Haskell, Agda and even Coq are DEFINED as EXPERIMENTAL RESEARCH languages. Of course, you can always go Python (which version?) or JavaScript, which is the bubble gum and string that holds the Internet together, along with Bash's Shellshock. Is this why they call it a bit of 'hacking'?<p>PPS: I learned to ride a bicycle in a big city and survived with most of my fingers intact.
Ah yes, dependent types. They tried to teach this at my university because some researchers there were working on it. Too bad they didn't bother to teach something more useful, like what malloc and free do.<p>But let's see. Perhaps the field has become less wonkish and more relevant to real programmers in the last ten years.<p><i>Most language tutorials start with the typical “Hello, World” example, but this is not really appropriate for a first example in Agda. Unlike other languages, which rely on a whole lot of primitive operations and special cases for basic constructs, Agda is very minimal - most of the “language constructs” are actually defined in libraries.</i><p>That sounds reasonable ...<p><i>Agda doesn’t even have numbers built in, so the first thing we’re going to do is define them</i><p>That's much less reasonable ...<p><i>This is a very elegant way to define infinitely large sets. This way of defining natural numbers was developed by a mathematician named Giuseppe Peano, and so they’re called the Peano numbers .... Now we can express the number one as suc zero, and the number two as suc (suc zero), and the number three as suc (suc (suc zero)), and so on.</i><p>Oh dear. This is a new use of the word elegant I have not encountered before.<p>I was expecting the next stage of the tutorial to describe how to make numbers in Agda resemble number systems actually used by humans, or even machines, but no such luck. They stick with a Lisp-style Peano representation throughout the entire thing. Having defined an entirely unworkable method of representing integers because their standard library apparently doesn't do so (?!) they then go on to prove things like 1 + 1 == 2.<p>Yeah, I think I'll skip. Perhaps in another few decades someone will have bothered to make a dependently typed language that doesn't use untypeable characters as part of its syntax, and has an integer type built in, and we'll all get awesome statically checkable code for free. 
Until then I'm gonna stick with FindBugs.
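To be fair to the thing being dismissed, here is what the Peano representation and the 1 + 1 == 2 proof actually look like, sketched in Agda (assuming the tutorial-style definitions of ℕ, _+_, and propositional equality; the stdlib has its own versions). Since suc zero + suc zero computes to suc (suc zero) by the definition of _+_, the proof is just refl:<p><pre><code> data ℕ : Set where
   zero : ℕ
   suc  : ℕ → ℕ

 _+_ : ℕ → ℕ → ℕ
 zero  + n = n
 suc m + n = suc (m + n)

 data _≡_ {A : Set} (x : A) : A → Set where
   refl : x ≡ x

 -- 1 + 1 ≡ 2 holds by computation alone.
 onePlusOne : (suc zero + suc zero) ≡ suc (suc zero)
 onePlusOne = refl
</code></pre>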