Hey. I am currently writing a blog post about my experience with Elixir. I aim to write about the things I liked, but I also want to describe the things I hated or disliked, because I've noticed that people tend to praise the new tech they're using but rarely point out the bad parts. And the truth is that learning a new technology is often a very time- and money-consuming process.
* Is there a clear reason the new tech exists? What differentiates it from its competitors? This alone rules out like 90% of new front-end web frameworks / widgets / plugins.

* Bonus points for tools whose authors have made the effort to explicitly compare them with competitor tools, *particularly* ones that acknowledge points where the competitor might have the advantage. ("Our new tech is better than existing old tech in every possible way" gets the side-eye from me; "Our new tech is better than existing old tech for these particular purposes, but old tech may still be more appropriate for these other purposes" goes a tremendous way towards confirming that the new tech has a real reason to exist.)

* Is there documentation? Is it any good? This is a really low bar, but far too many new tools have no documentation at all ("just check out the source code") or have minimal, incomplete, or tautological docs ("bar.foo(): executes the foo method of bar"). A message board or IRC channel is nice, but not a substitute.

* How big is the API surface? Does it need to be that big? I tend to avoid tools where there are six different ways to do the same thing -- looking at you, Angular -- it suggests the developers are unfocused or in disagreement, and it makes it harder to find support or documentation on any particular issue. Same thing if the API has undergone major breaking changes or paradigm shifts between versions (looking at you, Angular...).

* What does the tag look like on Stack Overflow? This serves as a good indicator of whether the tech is too new or obscure to bother with, what the common pain points are, the average skill/knowledge level of its users, and whether help will be available if I get stuck when using it.

* Is there a relatively simple way to try it out? I'm much more likely to experiment with something where I can clone a repo and get going with simple but nontrivial example code; if I have to reconfigure half the settings on my machine just to get a hello world, I'm not going to bother.
What I want to see is at least a few really gnarly bugs that were eventually solved. I want to make sure the tech is backed by people who are 1) highly skilled and 2) actually motivated to solve the issues. In short, the kind of people Mickens refers to in his infamous essay ( https://www.usenix.org/system/files/1311_05-08_mickens.pdf ).

I have built this standard up after a few mistakes, where we chose a tool just to realize that pretty much everyone working on it was an extremo ultimate rockstar ninja, better known as a first-year CS dropout with a god complex.

NB: I'm not saying all first-year dropouts are bad, or that having a degree is in any way mandatory in this field. All I'm saying is that when our site is down, I'd like to have a few people with 1980s MIT Electrical Engineering degrees on my team. The kind of people who know what a process is, how TCP passes packets, et cetera.
Almost all tech decisions come with a set of tradeoffs. I wouldn't trust articles praising a certain technology, because they are probably ignoring the tradeoffs that come with it.

Elixir is great for a chat application. At the same time, it's probably not the right choice for a machine learning project.

Of course, there are a number of absolutes that I look for regardless of the context:

* How good is the documentation?

* Is it actively maintained?

* Is it mature?

* etc.
The first new "tech" I learned was in the early 1980s, so I benefit from the perspective of seeing how my choices have panned out over the short to long term.<p>And now this might seem weird, but I have had good results following tech trends that attracted the widespread interest of hobbyists. These tend to be things that are easier to obtain, install, and use, and that have communities built up around them. The hobbyists will tend to weed out the things that are just too painful to use.<p>Examples over the years include Turbo Pascal, Visual Basic, PIC microcontrollers, Arduino, and Python.<p>Some of these are proprietary technologies, of course, but their vendors didn't abuse us too much (until Visual Basic went overboard with dot-net, whereupon I switched to Python). I got a good solid decade or more out of each of these things.<p>Of equal or greater importance to the use of these tools is what I can learn. I don't mind throwing a vendor a few bucks if I will use their tool to expand my knowledge, and if that knowledge is applicable to a broader range of things. For instance through Python and the community of developers, I've learned new disciplines that have made me a better programmer in any language.
What’s the goal of learning/using a new tech? Is it for a task? Or is it for your own learning?

If it’s for a task, I’d say:

1. Really spend some time understanding what you’re trying to solve.

2. (If applicable) What pain point are you experiencing with the existing solution?

3. Find tech (old or new) that might be a good solution, and understand the limitations/trade-offs you'll be making by choosing this tool.

Then, make a decision.

If it’s for learning, it ultimately comes down to what you are trying to get out of the learning experience, and whether it fits your goal. (It is totally OK to learn a new tech just because it sounds cool -- learning more about something cool is a type of goal as well.)
The job boards.

My time is limited for learning any new-to-me technology. If it doesn’t either lead to me making more money in the future or help me remain competitive, I don’t learn it.
1. Is there something interesting/novel/intriguing? Something has to capture my interest and show significant value. I mostly write in Clojure, so the bar is set pretty high for this one. New syntax doesn't impress me; building large reliable systems does.

2. Is it elegant? In the general sense: things that are elegant are often also good designs.

3. Do people use it? This is a hard one, and stems from 25+ years of experience. No matter how beautiful or impressive the new technology is, there will be problems, and I am too old to iron them out myself. I've got things to do. So, tough as it may sound, these days I will not start using things that have not seen reasonable adoption. This does not mean I won't read about them, just that I won't use them. And my criteria for "reasonable adoption" are not "everybody and their dog uses it"; I just need to see a user base. Also, the threshold is higher for databases. Practical examples: Clojure and ClojureScript are fine, Datomic is not. RethinkDB just fell below the threshold and I have to migrate; FoundationDB is barely getting to the threshold.

Taking Elixir as an example, it fulfills all three criteria. I know what it is, I've seen it in action, I read about it, and I keep it in the back of my mind as a tool I might want to use when needed.
Maybe I’m strange by HN standards, but unless it’s something I’m just learning for fun, it must be used by 2-3 major companies or projects before I’ll consider it worth seriously picking up. It must solve a problem I have that I can’t solve with technology I already know. It must be relatively approachable and have at least minimal documentation.
Interestingly, blog posts of the form "I/We started using X. It's so much better than Y, which we used before!" tend to attract more attention than a nuanced analysis. Probably because a positive blog post is inherently more exciting than one that goes over the subtle but important details.

The most important thing is an analysis of which use cases Technology X is good for, and why. Every technical decision is a list of pros and cons. If it fits the use case perfectly, that is the most important factor.

After that, the most important factor I consider is community and momentum.

It's possible a technology is immature and lacks good documentation. BUT -- if it has a rapidly growing community and momentum, these 'cons' will disappear rapidly.
Here's my check-list based on personal experience:

1. Is it road-tested for a few years at least? Unless you are in R&D or a high-risk start-up, don't volunteer to be a guinea pig. 5 years is my rule of thumb.

2. Are there successes in similar organizations? One size does NOT fit all. Make sure it's useful in your particular organization in terms of domain (subject matter), culture, and company size.

3. Do the benefits over-emphasize a few narrow factors while ignoring others? There's rarely a free lunch; most decisions are about balancing various trade-offs. There are probably downsides that vendors or fans don't want you to know about, or failed to notice due to enthusiasm bias.

4. Does it over-extrapolate current trends? For example, just because more UIs are going mobile does not mean every business is throwing out their mice and big monitors. You may be limiting yourself by trying to make your UIs both mouse-friendly and finger-friendly even though most actual business users will be on a desktop. It's not always good to keep up with the Tech Kardashians; they are not always rational or timely.

5. Does it require a big learning curve or lots of prerequisites? If the new technology turns out to be mostly a fad instead of a real improvement, a long learning curve or expensive investments will drain your time and budget. Look for incremental improvements first.

6. Vague buzzwords or promises: a lack of specifics and realistic examples is a sign you are being had.

7. Experimenting is fine & recommended, but don't do it on production projects. If possible, introduce it to production gradually.
This might sound a bit esoteric, but it works for me every time: First of all, it's a feeling. Tech A feels more 'right', or resonates better, than tech B (where both are in the same category).
Of course, this is not the only metric. I'll look at the longevity prospects of the tech and how much it breaks with 'tradition', and I'll try to focus on the negatives people raise about the tech. But if more than one solution presents itself and I'm at an impasse, feeling wins.

I also acknowledge that one day I might wake up to a stinking pile of tech-crap, but even then, at least I'll know that at the time it felt right.

For example, many years ago, in a decade far far away (the 90s), I went with ColdFusion as a tech stack. Back then it was that, Perl, or maybe Tcl. ColdFusion felt right because it allowed very rapid prototyping with a clear syntax and batteries included. There was nothing like it. Fast forward a few years and that tech was so smelly it made everyone nauseous, but I knew it made perfect sense at the time, and by then other options had presented themselves.
Basically, these key factors:

* Are there any tutorials / code examples? (To see if I like its API/philosophy AND whether there is a way to pick it up.)

* Are there any practical, working projects out there (used by companies, etc.)? (Otherwise it may not be useful for bigger projects.)

* Is it in active development? (Otherwise there is a risk that it will cease to be useful.)

* How does it match up against other tools? (E.g., maybe it is easy to pick up, but so are all the other frameworks.)

Also, I did write some comparisons. Vide: https://deepsense.ai/keras-or-pytorch/ (it got popular here).
What kind of information do you look for before learning/using new tech?

I look for a book. Books in general are usually miles better than technical blogs, and you can find a ton of information about a subject in one place, so there's the convenience factor (not that you couldn't create some program that indexes your bookmarks, but you'd still have the problem of providing meaningful titles and organization for your bookmarks).

I also find that books are generally peer reviewed, especially textbooks, so the BS is kept to a minimum. That makes for a more boring read, but it's more accurate.

That being said, I like language analysis posts. People tend to bring up a lot of things I haven't thought of, and there was one done on C a while back on HN that looked very good: https://eev.ee/blog/2016/12/01/lets-stop-copying-c/ (tbh, I skimmed the article because I didn't have a lot of time to read it at the moment).
I’m surprised this hasn’t been mentioned:
Can I read the code?

This applies to everything open source, including languages. If you open up the codebase and go “wtf”, that’s a problem. If you open up the codebase and go “I don’t understand this, but it looks clean and with some effort I could understand it”, that’s gold.

This is especially true for libraries; you get a sense of the right size and the right amount of complexity in dependencies.
Two things, really.

One, usability over existing systems. It is a difficult one to actually answer, but I find people talking about new technologies all the time, and when you ask them "Okay, what can we do with it that older technology couldn't?", mostly I hear murmurs or barely justifiable answers. But if the explanation is sound, I go for number 2.

Does the person talking about it have significant exposure to the problem space? Mostly you will see people talking about how X is great without ever having had to work with the nuances of the older tech Y.

Now, this process has some bias built in. As a supporter of the older tech Y, it is entirely possible to never find a reasonable explanation. The only way around it is to talk to as many people as you can.
This is an excellent question. I think this may be a fault of mine, but I tend to let myself be dragged into new technologies rather than actively pursue them. (What I'll offer in my defense is that I also tend to pursue roles where I'll be pushed to use new things.)

The rationale is that in a world with limited time, I'd rather focus on solving problems external to the technology itself. If I have something in my toolbox that will work, then it's generally the easiest thing to use, rather than learning something new. There's less of an immediate learning curve and fewer of the issues associated with being an early adopter.

Where this changes is in situations where either the investment to learn a new technology is low enough or the potential return is high enough that the learning might be expected to produce a high ROI.

So what that means practically is that I'm looking for things where I either have an immediate commercial need to know it or a strong feeling it's likely to be useful in a way that none of my existing toolset will fulfill.

From the perspective of something like a programming language, this is part of the reason I've tended to like Lisp-family languages as an adjunct to C-family languages. They're different enough that they're more likely to be complementary to each other, and it's likely to be easier to make choices about what code goes in which language.
I think your post would be valuable. I specifically look for (and sometimes make) lists of annoyances with any language I work with. A lot of the time, you can't tell at a glance whether a compiler is buggy, some parts don't feel ergonomic, there is limited support/ecosystem, etc. I don't use the info to make my decision; I use it to temper my expectations.
Same questions as for learning old technology (e.g. Lisp): why is it used, what is it good for, how can it change how a problem is approached, what domain is it intended for?

And of course, compare it with existing, known tech. Very rarely does an old or new tech measure up in such a comparison... unless it offers something not present elsewhere.

And of course, bug reports (*). If there's a long queue of bug reports, I won't touch it until a significant number are addressed. (When I worked in commercial Drupal dev, this is how we picked modules to use.)

If there are no bug reports, or there's any kind of online reputation for ignoring or denying bug reports, it gets avoided. This is not unusual in projects with particularly fragile egos in charge, and such projects are therefore unreliable.

(*) Bug reports include feature requests, documentation requests, and similar.
I always look for articles that talk about issues that arose with that specific tech during / after development.

Edge cases and workarounds, things you cannot do at all, maintenance, etc. -- these are, IMO, the most interesting bits when you are evaluating a new technology.
I look for what a technology is good at and what it isn't so suited for. All technology has its advantages and disadvantages, and I want to find out what those are. What sort of applications is it suitable for? What operating systems does it run under? What does it not do well?

I'm also learning Elixir and writing about it at https://inquisitivedeveloper.com/. In the first couple of posts, I talk about what Elixir is really good at and what it isn't good at. I'm generally positive about it, but I do grumble when I encounter something I feel could use improvement or doesn't make sense to me.

For example, Elixir is great for concurrent and scalable software. I'd use it to build a web service or game server, but it is unsuitable for a game client, physics simulations, or OS development. It's just not low-level enough for those purposes.

To sugarcoat it is just to set your reader up for disappointment further down the line, when they hit the limitations and realize that it's not suitable for their needs.

I also keep a general awareness of what technologies are out there and what they're good for. I've never used Redis, for example, but I know what it's good for. The same applied to RabbitMQ: I knew what it was good at even though I didn't use it. That lasted until I encountered a situation where it would be useful and I ended up introducing it into my organization.
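To make the concurrency point a bit more concrete, here's a minimal sketch of the kind of thing that's routine on the BEAM: each unit of work runs in its own lightweight process. The module and function names (ConcurrencyDemo, handle_request/1) are made up purely for illustration, and it assumes a recent-ish Elixir with Task.async_stream available.

    # A rough sketch, assuming Elixir >= 1.4 (for Task.async_stream/3).
    # ConcurrencyDemo and handle_request/1 are hypothetical names, not from any real project.
    defmodule ConcurrencyDemo do
      # Stand-in for one independent unit of work, e.g. serving one client.
      def handle_request(id) do
        Process.sleep(10)
        {:ok, id}
      end

      # Run n requests, each in its own lightweight BEAM process.
      def run(n) do
        1..n
        |> Task.async_stream(&handle_request/1, max_concurrency: n)
        |> Enum.count()
      end
    end

    # Spawning tens of thousands of these processes is unremarkable:
    # ConcurrencyDemo.run(10_000)

That "one cheap process per request" model is what makes the web service / game server use case feel natural, and it's also why a tight single-threaded physics loop isn't really the sweet spot.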
How useful is this tech? Some tech is just pointless, or part of someone's scammy business.

How Eternal is the knowledge? Math is Eternal in the true sense, while that new webpage framework will be around for maybe 2-5 years, and is thus not worth learning. (Although I am sometimes happy that people sacrifice their lives on stuff like that, I would never do it.)

Vendor lock-in? I steer clear of that.

Is it being maintained? (Some things are okay not being maintained, though. It depends on the type of tech.)

Surveillance potential. I am somewhat paranoid in my own personal life.

When procuring network-attached tech, I usually dig through exploit-db and the CVEs and try to form a picture of how their security is being handled. Small companies are difficult to judge this way because they have little recorded history. (So I didn't buy any MikroTik router, for example, because of their shenanigans around security.)

Does the creator of the tech try to keep my hands away from digging into the machinery? Like hiding that it is really just a Linux/BSD box underneath? I stay away from that, if possible. (It's not always possible; almost all tech runs on Linux nowadays.)

I can be bribed to ignore any of these things that I usually avoid. For example, I learned MS Windows and some of their tech because I got bribed.
When I’m choosing a new technology, it’s almost always for a specific project, and I’m almost always choosing between 2-3 fairly similar options (Django or Rails? React or Vue or Angular? Swift or React Native? Redis or RabbitMQ? Java or Scala? Etc.)

There are few cases where one technology is objectively and universally better than another. Most of the time, each has strengths and weaknesses. Often there is overlap between use cases, but one is better for one type of usage and the other for another type (such as Redis/Rabbit). In other cases they’re pretty much interchangeable, but one may match more closely to my particular mental model (my experience with React/Vue/Angular), or one may be easier to get started with.

With that in mind, I’m usually looking for balanced information about specific trade-offs. What is this technology good at that the most similar alternatives aren’t as good at? Where do the alternatives excel where this tool struggles? How do the philosophies of the tools differ? What’s the learning curve like? What limitations did you stumble over only after using them for a while? How do the ecosystems compare?
Basically, I'm interested in knowing that adopting this tech will make my life happier and less complicated.

* Look at the open issues -- is it full of buggy edge cases and "3d chess"?

* Quality and maturity -- stuff works, as documented, under all possible conditions. Maintaining existing features / "tech debt" is prioritized over new features.

* What problems does it solve? What problems might it introduce? Does it actually solve the problems it's claimed to solve?

* Is it designed in a way that will totally break a possible future use case?

* How much of a "custom ecosystem" is there (bad), and how much of it is "open standards based" interfaces (good)?

* Is it getting more or less complicated to use over time? Can you learn it as a "simple cognitive model", or do you learn it as a "huge collection of random facts", e.g. "a cognitive edge case LUT"?

* Documentation must be okay (no Elasticsearch).

* History and sustainability -- how long has it been around, is it updated regularly (regularly != often), do updates break stuff (no pipenv)?
It depends how core the tech in question is to what I'm doing.

Programming languages are their own category. They tend to be not just a project investment but a career investment. I generally at least get a feel for the language by reading some projects written in it and looking at code that does things I know how to do in the languages I already know. If the language has some known strengths (e.g. "Go is good for servers", "Rust is good for system programs", …), I may be tempted to use the language for one such project as my first experience with it.

Software tech, things like what framework/DB engine to use, proprietary services such as AWS/GCE, and whatnot: those I'll find when looking up how to solve a technical challenge I'm having at a particular moment. Reading up on it (documentation, known use cases, success stories, failure stories) is the only way to really decide. Then it's a matter of choosing the best-fitting solution from the ones I found.

Important criteria:

- How well does this solve the problem at hand?

- Is it open source?

- If closed source, is it hard lock-in, or does it have open source tooling? (E.g. Redshift uses postgres-compatible tooling/syntax, a huge benefit.)

- How good is the code? (And what's it written in?)

- How good is the documentation?

- How popular is it? (= how likely am I to find help when faced with an obscure bug) / Is it an easy tech to hire for?

- How much do I trust its maintainers? (Especially: how much do I trust them to keep maintaining it and keep the project healthy?)

- Is the pricing/licensing compatible and affordable for my use case?

The most difficult part of this process is knowing when not to use your own shitty hammer when there are much better hammers available for the current nail. It's easy to get stuck in a mindset of "The current tech, which I know, doesn't work really well for this use case, but it can be made to work somewhat and that's good enough, no reason to look up better alternatives".

I'm always blown away when I discover a new piece of equipment I might have completely missed were it not for actually googling/asking around for solutions to a new problem. Immediate discoverability of solutions kinda sucks when you aren't sure what to look for.
One that I've found increasingly important:

Documentation that specifically addresses how to use it in multiple different tech stacks. More generally, not assuming that everyone has the exact same pipeline as the core developer(s), and some evidence that they've made good-faith efforts to make the tool work in other situations.
Another small datapoint, but I’ll shamefully admit I pretty much only adopt new tech (programming languages, popular libraries, trendy design patterns) if I hear a lot of people talking about them or using them. Programming for me isn’t so much about choosing the right tool for the job, as I as a single human will always be the bottleneck regardless of what tools I use. I’d rather maximize the chances that other people (hopefully better than I am) will come along or be hired who can help. I’ve experimented with most programming languages or interesting frameworks as I’ve heard about them, but I don’t really ever use them in projects as it feels counterproductive towards my goals. What point is there in choosing a slightly better niche technology that 1000 people know when the mainstream equivalent has 100,000 or even millions?
For the most part, most tech is good enough for everything.

A lot of the things people hate about a tech are workable. It's better to pick something unusually good at one thing, even with multiple flaws (e.g. PHP/JS), than something with no flaws that isn't particularly good at anything.
I'm actually quite interested in finding out which London tech companies use which languages/frameworks/tools. I write PHP because it's what I work with, but I'd like to learn the language that will help me find the most jobs. Anyone know of a tool that does that?
1. Does it satisfy my use case?

2. What makes it better/worse than other solutions?

3. How is the project run / what is the maintainer situation like?

4. What is the community like? Will I have a hard time finding someone else to work on this project competently if I use this tech?

5. How long has it been around / will it be around in 5 years?
The very first thing I look for is the license. If it's not GPL, preferably GPLv3+, or a similarly compatible license like Apache 2, it is highly likely I will immediately close the tab and move on. The next thing I look for is what language it was written in, closely followed by when the last commit was, and then how well done the documentation/website is, the goal being to get a general understanding of how much the dev cares about the little things. Then I tend to add it to my list of interesting things and compare it to alternatives in the same category. If, after being compared to existing similar solutions, it still seems worthy, it gets tested, and if still deemed worthy it goes into the main list of my software stack.
At one point in 2011, I quipped: "There is no dearth of new languages. Everyone who knows how to lex/parse can build one. And they will build one."

It's the same with any other piece of technology for me. It's easy to build tech. But in my opinion, when I look at new tech, here is what makes it genuinely better tech for me:

- Cross-platform, solves toolchain issues

- Provides a great debugger/observability interface

- Integrates well with native/OS/hardware; open interface

- Has decent basic primitives and not the kitchen sink

- Well designed, so that the core is stable and compatible for 5 years. "It's the design, stupid."
That's it for me.
How good the documentation is. If I have to go to Stack Overflow or a community forum to try to figure out something that should be in the docs, that's a warning sign.

What scope does the technology cover? I believe in the rule of "do one thing and do it well". If it's supposed to do A but can also be used for B and C, that's a warning sign to stay away.

Is it something an engineer built or a hacker built? Not saying a hacker can't be an engineer and such, but long-term vs short-term goals for technology need to be well planned out and executed. I use a ton of technology built by hackers, but anything I put in production is built by engineers.
First, I like to see what language-feature(ish) boxes it ticks. I like to see something that excites me :P

Actor model? Type-classes? Algebraic data-types? Immutability by default? Type-level programming?

Second, what is the tool support? Do I get an IDE? A language server? A config for vim/emacs? At least syntax highlighting? What do the error messages look like? A compiler? Testing frameworks? Lint?

Third, I like to evaluate practicality. Can I imagine maintaining a CLI app in this? A web server? Would I be able to connect to databases? Easily serialize data structures?
Community momentum, corporate support, and how significant the changes were between the last two big versions. That last point was a big determinant for me when I was deciding whether to learn React or Vue. The Vue 1 -> 2 upgrade broke a ton of community packages and rendered a lot of tutorials unhelpful. In the end I chose React over Vue, and only _sometimes_ regret it :)

Edit: also, an active Reddit community. I don't ask a lot of questions there, but it's a reasonably good metric of the health of something, IMO.
Adding to what's already said in other replies:

Pros:

* Is it tech I wanted before I knew it existed?

* Tech that is clear about its limitations / being unfit for case XYZ.

* A community with low frustration levels, where clear questions get clear answers.

Cons:

* Tech that is just the opinionated alternative of the month, even if it caught on.

* The tech's website can't (or forgets to) explain the problem it solves in simple layman's terms.

* Not actively maintained (active maintenance is a must if it faces external dependencies: OS, browsers, libraries, etc.).

* Docs are inconsistent/unversioned, or old versions are not kept available.
The questions I ask, that everyone asks, are: "What's in it for me?" and "Why should I care?"

The features? I don't care. I need to understand the problem(s) it solves. I need to understand the benefits __to me__.

Time is important. Absolutely! And while a free trial is helpful, it is not a replacement for crisp and transparent communication. I cannot afford to spend half a day trialing something only to find out Solution X isn't a good fit.
* Does it address an area that I find wanting in currently used tools/tech?

* Does it introduce a new way of doing something compared to how I'm currently doing it, and is it plausible that it's substantially better?

* Does it do something similar to what I'm currently using, but with significantly higher performance or reliability?

If any of the above are true, I will investigate and follow it, and try it out on tests/projects matching its maturity.
I always try to find some negative but thorough anecdotes about the new tech: either where it is known to fall short or, even better, some unexpected issues.
Mostly how the new tech integrates with the existing tooling for the target platform.

Having fewer IDE features available, the lack of a graphical debugger, additional build systems, having to manually write FFI wrappers, not exposing all the features available in the platform languages, and an impedance mismatch when creating libraries to be consumed by the platform languages are all issues that I might count as negatives when evaluating new tech.
Do I need to give the creator my credit card to test it out? That's normally a hard pass. Do I need to sign up for a mailing list? Or log into something? Pass, pass, pass. Does the software opaquely datamine me? Is Linux supported as a first-class citizen? Not 'eventually', but now, and feature-complete.
There is no way for me to reply to all of these answers, as I have work to do myself. But I really would like to thank all of you for answering my questions. Now I know not only what information to include in my post, but also what kinds of things to keep in mind when looking for new tech :)
What yak shaving does it reduce/eliminate? Am I currently having to do that kind of yak shaving?

There's way more tech out there than I have time to learn. If it doesn't make some of my problems go away, then I don't have time for it...

... unless my employer pays for me to learn it.
It usually comes down to how easily I can dive in, unless it's something super interesting.

I examine how easy it is to start playing with the capabilities of the technology today and the possibilities it could be used for.

You can't learn to swim by study, research, and reviews alone.
One of the big things for me is the maturity of the language. Also, what sort of support it has. For example, Microsoft is backing C# so that probably has a bright future.
* What problem does this technology solve?

* How big is the learning curve?

* How is it better than the competitors?

* How costly is it to use?

* Is it mature enough?
1. Who uses it, and what have they done with it?

2. Is it actively maintained? (How) are issues dealt with?

3. How popular is it, and what is the trajectory?

I don't want to be the first person to run into all the issues. I'll let the kids do that; they love shiny new technology.

I don't want to invest in a stopgap technology (like CoffeeScript). My gut feeling is that Elixir is one of these technologies: it has some good ideas that will probably show up elsewhere in due time.

I do value "XYZ sucks" posts to some degree, but unless I already know the technology reasonably well, I probably won't be able to estimate the impact of the issues.