Your question is rather more complex than simply "easier" or "harder." There are a number of axes that have shifted over time.

I would say that it has become cheaper, faster, and easier to identify software vulnerabilities that are low-hanging fruit, and conversely more difficult, time-consuming, and expensive to identify vulnerabilities at higher levels of abstraction or complexity. The ease of exploitation ("hacking") has shifted commensurately for each of those complexity classes.

Overall, I would say that it has become *easier* to find and exploit software vulnerabilities in websites in the *aggregate*, and significantly more difficult than in the past to find and exploit vulnerabilities in *specific, individual websites*. I view this as the result of a confluence of factors, some of which have to do with the commoditization of many parts of the security industry (including human skillsets), and some of which have to do with the proliferation of ever-increasing amounts of software:

1. As you mentioned, there are many sophisticated tools for quickly identifying and exploiting vulnerabilities, especially low-hanging fruit. This makes it easy to find vulnerabilities in websites that have received no prior attention, and significantly more difficult in websites that have already been scrutinized.

2. New software is constantly produced, and existing software is constantly augmented, both in sheer lines of code and in overall feature complexity. This increases the total number of vulnerabilities and makes them easier to find in the aggregate.

3. Software continually undergoes changes in abstraction across the industry. There are entire categories of software developers who know nothing about compiled languages and have never developed in them. This makes it easier to introduce vulnerabilities through a poor understanding of what the code is doing "under the hood," and harder to find someone capable of exploiting or patching those vulnerabilities once they are introduced.

4. The rise of bug bounties has created a gold rush to be the first person to identify a security vulnerability, which has significantly raised the level of competition. This has made it harder overall to find vulnerabilities in applications that have received previous attention.

5. Automated prevention and identification of vulnerabilities has made it easier to stop them before they appear. Simple SQL injection flaws and cross-site scripting errors can be found with fully automated software (a short sketch of this bug class follows below); even complex cross-site scripting in the DOM can be reasonably identified through automated means. Cross-site request forgery may cease to be a serious threat within the next five years due to simple browser changes (e.g., SameSite cookie defaults). This has made it harder to find these vulnerabilities, but mostly because fewer of them are being introduced in the first place.

6. Security has become increasingly mainstream, which means that (much like bug bounties) there are teams constantly trawling through monolithic, open source codebases to find serious deficiencies. As technical debt from outdated, insecure modes of software development is paid down, it becomes harder to find vulnerabilities, though vulnerabilities certainly appear to be *increasing* simply because findings are so much more mainstream these days.

I still find things like cross-site scripting when I'm on security assessments, but it's frankly harder than it used to be.
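To make point 5 concrete, here's a minimal sketch of the pattern scanners catch most reliably: string-built SQL next to its parameterized fix. This is illustrative Python against a made-up users table, not code from any particular scanner or application.

    import sqlite3

    # Toy in-memory database standing in for a real application backend.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER, name TEXT, email TEXT)")
    conn.execute("INSERT INTO users VALUES (1, 'alice', 'alice@example.com')")

    def get_user_vulnerable(username):
        # Injectable pattern: attacker input is spliced into the SQL text,
        # so username = "' OR '1'='1" returns every row in the table.
        # This is the low-hanging fruit scanners now flag automatically.
        query = "SELECT id, email FROM users WHERE name = '" + username + "'"
        return conn.execute(query).fetchall()

    def get_user_safe(username):
        # Parameterized form: the driver binds the value separately from
        # the query text, so input can never change the query's structure.
        return conn.execute(
            "SELECT id, email FROM users WHERE name = ?", (username,)
        ).fetchall()

    print(get_user_vulnerable("' OR '1'='1"))  # leaks every row
    print(get_user_safe("' OR '1'='1"))        # returns nothing

Modern frameworks and ORMs default to the parameterized form, which is a large part of why this class of bug is drying up at the source.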
In contrast, things like insecure crypto, insecure direct object references, and API auth logic errors are on the rise and have been for a few years now (sketched at the end of this comment).

Net competition to find vulnerabilities has increased across every sector of the industry, from the work bug bounty hunters are doing, to the work I do in consulting, to the work Google's Project Zero does. The tools you mention have been disproportionately developed for the more competitive areas of security (bug bounties and web/mobile app sec), while the more complex and specialized vulnerabilities of the sort Tavis Ormandy finds still require a phenomenal level of manual research and discovery work before they can be found.
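For the insecure direct object reference point, here's a minimal sketch of the bug class. It's a hypothetical Flask-style endpoint with made-up routes and data; the flaw is a missing ownership check, which no amount of input sanitization will catch.

    from flask import Flask, abort, session

    app = Flask(__name__)
    app.secret_key = "dev-only"  # needed for session access in this toy

    # Hypothetical in-memory store standing in for a real database.
    INVOICES = {
        101: {"owner": "alice", "total": 40},
        102: {"owner": "bob", "total": 95},
    }

    @app.route("/invoices/<int:invoice_id>")
    def get_invoice_vulnerable(invoice_id):
        # IDOR: the handler trusts the ID in the URL and never verifies
        # that the logged-in user owns this invoice, so any authenticated
        # user can walk the ID space and read everyone's invoices.
        invoice = INVOICES.get(invoice_id)
        if invoice is None:
            abort(404)
        return invoice

    @app.route("/v2/invoices/<int:invoice_id>")
    def get_invoice_fixed(invoice_id):
        invoice = INVOICES.get(invoice_id)
        if invoice is None:
            abort(404)
        # The fix is an authorization check, not input validation: the
        # ID is perfectly well-formed, it just isn't yours.
        if invoice["owner"] != session.get("user"):
            abort(403)
        return invoice

The reason these are rising while SQLi and XSS recede is visible in the sketch: tools are good at spotting malformed input, but "this ID is valid, just not yours" is a business-logic question, and answering it still takes a human who understands the application.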