When const generics are fully implemented (they are partially usable on nightly Rust now), this will enable an even stronger version of this pattern, giving you the full power of enums to represent the state.<p><pre><code> enum SenderState {
ReadyToSendHello,
HasSentHello,
HasSentNumber,
HasReceivedNumber
}
struct Sender<const S: SenderState> {
...
}
impl Sender<{SenderState::ReadyToSendHello}> {
...
}
</code></pre>
One can then give the individual states extra parameters:<p><pre><code> HasSentNumber {
number: u32
}
</code></pre>
(Note that in this case that doesn't make much sense: the number that is sent is associated data rather than an actual type parameter. There is no real difference, at the type level, between HasSentNumber { number: 3 } and HasSentNumber { number: 6 }, and having the compiler generate two types for this would be unnecessary. It is only an example of the syntax.)
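Until const generics land, the same shape can already be expressed on stable Rust by using zero-sized marker types in place of the enum variants. A minimal sketch (all names hypothetical, loosely mirroring the SenderState example above):

```rust
// Zero-sized marker types stand in for the SenderState variants.
struct ReadyToSendHello;
struct HasSentHello;
// A state can still carry associated data as a field.
struct HasSentNumber { number: u32 }

struct Sender<S> { state: S }

impl Sender<ReadyToSendHello> {
    fn new() -> Self { Sender { state: ReadyToSendHello } }
    // Each transition consumes the old state and returns the next one,
    // so stale handles can never be reused.
    fn send_hello(self) -> Sender<HasSentHello> {
        Sender { state: HasSentHello }
    }
}

impl Sender<HasSentHello> {
    fn send_number(self, number: u32) -> Sender<HasSentNumber> {
        Sender { state: HasSentNumber { number } }
    }
}

fn main() {
    let s = Sender::new().send_hello().send_number(3);
    println!("sent number: {}", s.state.number);
    // Sender::new().send_number(3); // compile error: wrong state
}
```

Calling `send_number` before `send_hello` fails to type-check, which is the whole point of the pattern.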
A good sign in a type system, for me, is that the types accurately model behavior I find in day-to-day code, to the point where moving back to a language that lacks the feature feels weirdly unsafe or not expressive enough. Going back to C APIs, with their states that need to be read via bit masks or special integer return values, is just painful after using discriminated unions. Likewise, being able to have multiple owners of a value feels a little weird after using Rust.<p>Also, what's the deal with the fi ligature in the code? It throws off the kerning and serves no purpose.
We do something like this to encode our Redux states at compile time (using TypeScript, obviously). Previously, using regular JS, the Redux devtools made tracking down incorrectly implemented reducers/state transitions reasonably straightforward -- but you still had to trigger a bug before you knew you had to track it down, implement tests, etc.<p>This kind of design pattern measurably saves us time; it reduces the volume of unit tests we need to write/update when we make changes (we still test, just... with less fear), and it prevents newer developers from making mistakes. I haven't played with Rust (beyond a couple of toy projects) yet, and articles like this remind me I'm looking forward to sinking my teeth in over the Christmas break.
This is one of my favorite patterns from Haskell, which I've known as "type-level programming". One of the most common examples is a "sized vector"[1], which allows compile-time bounds checking. This is achieved by annotating each vector type with a phantom type variable representing its size.<p>Nice to see something like this in Rust! One thing that's a bit of a bummer, and I'm sure there are very good reasons for this, is that we HAVE to use every type argument of a struct in its definition. If this restriction were to be relaxed, we wouldn't need the "state" field in the struct at all, and we could make the state type variable truly "phantom".<p>[1] <a href="https://www.schoolofhaskell.com/user/konn/prove-your-haskell-for-great-safety/dependent-types-in-haskell" rel="nofollow">https://www.schoolofhaskell.com/user/konn/prove-your-haskell...</a><p>EDIT: Typos.
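For what it's worth, `std::marker::PhantomData` is Rust's standard escape hatch here: it "uses" the type parameter to satisfy the compiler while occupying zero bytes at runtime, so the state variable stays effectively phantom. A rough sketch with hypothetical `Connection` states:

```rust
use std::marker::PhantomData;

struct Open;
struct Closed;

// The state parameter is truly phantom: no runtime field, zero size.
struct Connection<State> {
    _state: PhantomData<State>,
}

impl Connection<Closed> {
    fn new() -> Self { Connection { _state: PhantomData } }
    fn open(self) -> Connection<Open> { Connection { _state: PhantomData } }
}

impl Connection<Open> {
    fn close(self) -> Connection<Closed> { Connection { _state: PhantomData } }
}

fn main() {
    // PhantomData adds no size: both states are zero-sized types.
    assert_eq!(std::mem::size_of::<Connection<Open>>(), 0);
    let c = Connection::new().open().close();
    let _ = c;
    // Connection::new().close(); // compile error: close() needs Open state
}
```

So the restriction isn't really relaxed, but the workaround compiles away to nothing.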
I wrote something similar in C++, completely compile time that disappears at runtime: <a href="https://www.fluentcpp.com/2019/09/24/expressive-code-for-state-machines-in-cpp/" rel="nofollow">https://www.fluentcpp.com/2019/09/24/expressive-code-for-sta...</a><p>Although the language itself doesn't guarantee that the value is not used again after a move, good static analyzers will provide a warning in that case, so it can still be safely used.
Basically, they are using Rust to encode a state machine using types. This is brilliant! The nice thing is that the transitions are determined at compile time so there can be performance benefit as well as compile time correctness testing.<p>Really neat techniques!
Another example, implementing IMAP with Rust's affine types.<p><a href="https://insanitybit.github.io/2016/05/30/beyond-memory-safety-with-types" rel="nofollow">https://insanitybit.github.io/2016/05/30/beyond-memory-safet...</a>
Maybe not everyone will know this, but typestate was one of the original 'headline features' of Rust when Graydon Hoare started it. It was based on the Strom/Yemini paper[0] for the NIL language. There's a mention of it in this SO answer from 2010[1] and in the LtU discussion[2].
I'm not sure why it was de-emphasised; I think it didn't work as well as anticipated in the 'real world'.<p>[0]: <a href="http://www.cs.cmu.edu/~aldrich/papers/classic/tse12-typestate.pdf" rel="nofollow">http://www.cs.cmu.edu/~aldrich/papers/classic/tse12-typestat...</a><p>[1]: <a href="https://stackoverflow.com/questions/3210025/what-is-typestate" rel="nofollow">https://stackoverflow.com/questions/3210025/what-is-typestate</a><p>[2]: <a href="http://lambda-the-ultimate.org/node/4009" rel="nofollow">http://lambda-the-ultimate.org/node/4009</a>
Scala and Haskell have the same thing, and it's an amazing feature. The proper type is that of<p><pre><code> IndexedState[A, B, C]
</code></pre>
Which signifies a type-level state machine moving from state A to state B while emitting a value of type C.<p>I can only speak for Scala, but I'm assuming Haskell has singleton and literal types as well, meaning that code like this works great.<p><pre><code> object DoorOpen
object DoorClosed
class Door {
  def open: IndexedState[DoorClosed.type, DoorOpen.type, Unit] = ???
  def close: IndexedState[DoorOpen.type, DoorClosed.type, Unit] = ???
}
val d = new Door
for {
  _ <- d.open  // works
  _ <- d.close // works
  // _ <- d.close // compile error
} yield ()
</code></pre>
By my understanding of the article, it uses the borrow/move semantics to implement the state transition. Is this generalizable to arbitrary state machines, or only to a simple two-state one?
This is a really useful pattern when you want to have rigidly defined/enforced state transitions to ensure that the data is never in an invalid state for a given operation.<p>It's pretty awful to deal with when you're unsure of what the state machine should look like, or if there needs to be a lot more flexibility in how the data is accessed. Maintainability nightmare.<p>An example of this I ran into is a data processing pipeline architecture where each vertex of the processing graph had a processing function called in a loop on its own dedicated thread. Using the type state pattern helped clearly define the "life cycle" of each vertex and enforce it, which provided for some powerful synchronization guarantees (e.g. we could provide <i>some</i> elements of memory safety even when loading things through shared libraries). If you dug into it you could break things, but that would be more work than just following the pattern.
Are there other languages that can pull off the trick done with close() on the file there? A type with a method that makes the value unusable at compile time after it's called!<p>That is pretty neat.
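The mechanism in Rust is simply that close() takes self by value, so the value is moved into the call and the compiler rejects any later use. A minimal sketch with a hypothetical FileHandle type:

```rust
struct FileHandle { name: String }

impl FileHandle {
    fn open(name: &str) -> FileHandle {
        FileHandle { name: name.to_string() }
    }

    fn read(&self) -> String {
        format!("contents of {}", self.name)
    }

    // Taking `self` by value moves the handle into close(),
    // so the caller can never touch it again.
    fn close(self) {}
}

fn main() {
    let f = FileHandle::open("data.txt");
    println!("{}", f.read());
    f.close();
    // f.read(); // compile error: use of moved value `f`
}
```

Any language with affine or linear types can do this; in most other languages the best you get is a runtime check or a static-analyzer warning.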
> 2. we have seeked in a file that was already closed.<p>> The second error, however, is much harder to catch. Most programming languages support the necessary features to make this error hard, typically by closing the file upon destruction or at the end of a higher-order function call, but the only non-academic language that I know of that can actually entirely prevent that error is Rust.<p>Why not simply add an `is_closed` flag and throw an error if it is?
This is one of my favorite features of the language, and I believe it's fairly unique. It can make for some slick and safe state-machine-like code.<p>I do wish that we had reliable RVO so that this could come at zero cost.
This line caught my eye.<p>my_file.open(); // Error: this may fail.<p>1) If this can fail, then it should be a compile error not to test the result code.<p>2) IMHO it would be nice if there were something like Python's with statement to correctly close a file.<p><pre><code> with open(filename, 'r') as f:
f.read()
# f.close() invoked automatically here
</code></pre>
This prevents trying to close a file that is not opened.<p>The idea of encoding a state machine into the types seems interesting.
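Rust reaches the same guarantee through RAII rather than a dedicated statement: a value's Drop impl runs when it leaves scope, so any block behaves like Python's with. A sketch with a hypothetical Resource guard type:

```rust
struct Resource { name: String }

impl Resource {
    fn acquire(name: &str) -> Resource {
        println!("opened {}", name);
        Resource { name: name.to_string() }
    }

    fn use_it(&self) -> usize { self.name.len() }
}

// Drop runs automatically at end of scope, like leaving a `with` block,
// even on early return or panic.
impl Drop for Resource {
    fn drop(&mut self) {
        println!("closed {}", self.name);
    }
}

fn main() {
    {
        let r = Resource::acquire("data.txt");
        println!("used: {}", r.use_it());
    } // r dropped (closed) here
}
```

std::fs::File works exactly this way: its Drop impl closes the descriptor, so you cannot forget to close a file, and a file that was never opened simply never exists as a value.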