My opinion, 9+ years after first learning Go, several of them using it for a full-time job:

Putting the end first, my rule of thumb for using generics in Go is: don't go down the OOP road of over-planning and programming with fancy type work. 99% of the time, the common Go programmer won't need to write any generics. Instead, just focus on actually solving the problem and manipulating the data like you normally would. If you encounter a place where code is repeated and complicated enough to be worth a new function, move it into one. If you find yourself repeating multiple functions that differ only in their data types, turn them into one generic function (rough sketch at the end of this comment).

Generics are an incredibly useful addition to the language that I'll almost never use. To be more precise, Go has had some generics this whole time: maps, slices, arrays, and channels all take type parameters, and they've covered the vast majority of my needs. There are a few times where I've wanted more general generics, though (sketches of a couple of these at the end):

- The sort and heap packages are rough to use. You have to write a handful of nearly identical methods just to get them working on a custom type. The generic versions (not coming in 1.18's standard library, iirc) will be much easier to use.

- I was writing an Entity-Component-System game for fun and needed a weird custom container. I turned to code generation, which really turned out to be necessary anyway because it did more than any (non-metaprogramming) generics could do.

- We had one very complicated, multi-goroutine concurrent data structure that needed to be used for exactly 2 different types. Others were writing the code and were very afraid of using interface{}, despite there being only a handful of casts. In reality, if a cast caused a bug, it would be found immediately. There's a strong hesitation rooted in type-safety dogma that isn't actually risky in practice. Still, generics would've been the preference here.

- I was parsing WASM files, where there's a common pattern for arrays: the encoding gives the length of the array, then that many objects in a row. It led to a lot of minor code repetition. Replacing that with a generic function that took a function to parse a single object and returned a slice of those objects was a nice, but relatively minor, win.

On the other hand:

I've never really been bothered by having to write sets as map[int]struct{}. There was one case where I saw someone move set operations out into a separate library. I eventually found, to my dismay, that the combination of how the set library was used and how it was implemented made a performance-critical part of the code several orders of magnitude slower than it needed to be. Had that code been inlined more idiomatically, the flaw would have been obvious much sooner.

I really don't like seeing map/reduce/filter-style functional programming creep into Go. That kind of code tends to need more dramatic rewrites for minor conceptual changes than direct procedural code does. And, as with the set example, how you iterate over and manipulate objects can have large performance implications that those functions hide away.
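
To make the rule of thumb above concrete, here's a minimal sketch of the kind of collapse I mean, using a hypothetical contains helper (pre-generics code tended to have one copy of this per element type):

    package main

    import "fmt"

    // Before 1.18 this tended to exist once per element type, e.g.
    // containsInt(xs []int, x int) and containsString(xs []string, x string).
    // One generic function covers every comparable element type.
    func contains[T comparable](xs []T, target T) bool {
        for _, x := range xs {
            if x == target {
                return true
            }
        }
        return false
    }

    func main() {
        fmt.Println(contains([]int{1, 2, 3}, 2))       // true
        fmt.Println(contains([]string{"a", "b"}, "c")) // false
    }

(This particular one now ships as slices.Contains in the 1.21 standard library, which fits the point: most of the time someone else has already written the generic function for you.)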
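
On the sort point, the boilerplate-versus-generic contrast looks roughly like this; a sketch, assuming a trivial user type (the generic slices package eventually did land in the standard library in 1.21, before that it lived under golang.org/x/exp):

    package main

    import (
        "cmp"
        "fmt"
        "slices"
        "sort"
    )

    type user struct {
        name string
        age  int
    }

    // The classic sort package wants a named slice type with three
    // methods just to sort users by age.
    type byAge []user

    func (a byAge) Len() int           { return len(a) }
    func (a byAge) Less(i, j int) bool { return a[i].age < a[j].age }
    func (a byAge) Swap(i, j int)      { a[i], a[j] = a[j], a[i] }

    func main() {
        users := []user{{"bob", 40}, {"ann", 25}}

        sort.Sort(byAge(users))

        // The generic version is a single call with a comparison function.
        slices.SortFunc(users, func(a, b user) int {
            return cmp.Compare(a.age, b.age)
        })

        fmt.Println(users)
    }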
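
And the WASM case, sketched with hypothetical names rather than the real parser: the binary format length-prefixes its arrays, so one generic "read a count, then that many of T" function replaces a near-identical loop per element type:

    package wasm

    import (
        "encoding/binary"
        "fmt"
        "io"
    )

    // decodeVec reads a LEB128 length prefix, then that many elements,
    // delegating the per-element work to decodeOne. Each element type
    // (valueType, funcType, ...) keeps its own small decoder; only the
    // "N items in a row" loop is shared.
    func decodeVec[T any](r io.ByteReader, decodeOne func(io.ByteReader) (T, error)) ([]T, error) {
        n, err := binary.ReadUvarint(r)
        if err != nil {
            return nil, fmt.Errorf("reading vector length: %w", err)
        }
        out := make([]T, 0, n)
        for i := uint64(0); i < n; i++ {
            v, err := decodeOne(r)
            if err != nil {
                return nil, fmt.Errorf("element %d: %w", i, err)
            }
            out = append(out, v)
        }
        return out, nil
    }

    // One example element decoder: a single value-type byte.
    type valueType byte

    func decodeValueType(r io.ByteReader) (valueType, error) {
        b, err := r.ReadByte()
        if err != nil {
            return 0, err
        }
        return valueType(b), nil
    }

    // Usage: types, err := decodeVec(r, decodeValueType)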
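
For reference, the inline set idiom mentioned above, as a minimal sketch; it keeps the allocation and lookup behavior in plain sight instead of behind a library:

    package main

    import "fmt"

    func main() {
        // A set of ints is just a map with empty-struct values:
        // struct{} takes no space, and membership is one map lookup.
        seen := make(map[int]struct{})

        for _, id := range []int{3, 1, 3, 7} {
            if _, ok := seen[id]; ok {
                continue // duplicate
            }
            seen[id] = struct{}{}
            fmt.Println("first time seeing", id)
        }
    }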