Ask HN: How has software engineering changed since you started your career?

7 points by alxmdev about 8 years ago

2 comments

nostrademons, about 8 years ago
Started programming professionally in 1998 at the tail end of the Windows monopoly / beginning of the web boom.

Your choice of languages is much broader. When I started, your option was basically C++; Java if you wanted to be hip, and Visual Basic if you wanted to be lazy. Pascal/Ada/COBOL/C/assembly were approaching obsolescence; Python/Perl/Ruby/JavaScript existed, but very few people used them. Now you have a huge selection of languages to choose from on the server, and some of them actually incorporate academic research from the last 10 years.

Almost everything was in-process. Programming meant managing the CPU and memory of a single box. There was this hot new thing called "CORBA" that was supposed to enable distributed programming, but nobody used it successfully. Now everything is external databases, distributed computing, SOA, microservices, cloud computing, and other multi-box approaches.

Similarly, networking & distributed computing are critical skills today.

The job description of "software engineer" has specialized. When I started, there was no such thing as "front-end" or "back-end"; you were expected to do everything end-to-end, including the UI, algorithms, networking code, etc. Everybody was full-stack. Since Windows had a monopoly, that was your client code; there was none of this division into Android vs. iOS vs. web on the frontend. And since programs weren't distributed, you usually didn't have a frontend vs. backend distinction. Occasionally some devs might specialize in the persistence layer or the UI, but you all worked in the same code.

There was, however, a professional divide between PC vs. workstation vs. mainframe programming. The people *we* would've been asking this question of, 20 years ago, were mainframe programmers, and they used a totally different software stack from consumer Windows apps.

Distribution, packaging & marketing were much more difficult. Your software had a "ship date" when a finished binary had to be sent off to your publisher for packaging on a CD-ROM, and there was a "code freeze" a couple of weeks before then, at which point no new features could be added or changed and only bugfixes could make it in. You had to spend time writing InstallShield scripts so you'd have a nice installer wizard that'd dump a whole bunch of shit on their computer. Once you shipped, that was it: you couldn't update other than by releasing a new version. No continuous deployment, and everything within the organization was synchronized around hitting the ship date. To sell it, you needed either face-to-face sales or advertising, and you needed distribution deals with retailers. Virality was a thing with this newfangled web stuff, but it didn't really exist for desktop software; the closest you got was shareware, with things like WinZip and WinAmp. (I'm going a few years before my first job here; when I started programming, the Internet was available, and you usually distributed software by dumping a .exe file on your website for registered users to download. Most of my coworkers remembered this earlier stage, though, and a lot of our engineering practices were built on the assumption of cutting a physical product.)

Markets are much bigger. For a sense of the scale: the Apple II series sold roughly 5-6 million units in its 17-year production run. Today, the Apple Watch sold that many *in its first quarter* and is widely considered a failure. A product like WhatsApp can get 300M active users in 4 years now, while it took *30 years* for the total PC market to reach 150M units/year.

Some things that *haven't* changed: your skills would still go obsolete every 5 years. You still needed intense concentration to build anything new. You could still make a lot of money by owning software that lots of people used. Software was still a security mess; the threat model has changed from viruses & trojans to data breaches and botnets, but the overall security situation is probably about the same. You still had kibitzers who would look at your code and declare you incompetent. You still didn't have nearly enough time to add all the features you wanted.
makecheck, about 8 years ago
(I’m thinking across a span of about 20 years.)

We have moved up one or two levels of abstraction. It is now not only *possible* to perform complex tasks in scripting/shell environments but *easy* to do so (huge standard libraries, etc.), and code written at a high level is now “fast enough” in most cases. Also, the nature of those tasks has gone up a level or two; networking, for instance, is no longer a neat feature and is more like a core competency for a language.

We’ve spent a long time on parallel processing and finally have some pretty neat constructs for doing so (in *mainstream* languages and not just side projects; see the sketch below). Well overdue, and it leads to more natural coding in a lot of cases.

We have a more “international” coding environment. It is far more common now to see at least a *preference* for stuff like UTF-8, if not full support for it.

We’ve successfully open-sourced certain tools that are critical to development. For instance, nowadays you don’t *really* expect to pay money for a compiler, and you *probably* are using an open-source revision control system (though some proprietary ones still exist). Generally, there is a greater expectation that a community of some sort will exist around crucial infrastructure tools.

Hardware, obviously, is way better. Unfortunately, while this has clearly allowed for some incredible advances, it has also enabled extremely sloppy coding. Many modern programs allocate massive amounts of memory and carry other wasteful “features”, and they work only because so many users have over a gigabyte of RAM to make up for it.
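[Editor's note: a minimal sketch of the kind of high-level parallel construct makecheck describes, using Python's standard-library concurrent.futures. The URLs and worker count are illustrative placeholders, not anything from the thread; this is one possible example, not the commenter's code.]

    # Parallel I/O with a thread pool from the standard library alone:
    # no explicit thread management, locks, or callbacks required.
    from concurrent.futures import ThreadPoolExecutor
    import urllib.request

    def fetch(url: str) -> int:
        """Return the size in bytes of the response body at `url`."""
        with urllib.request.urlopen(url) as resp:
            return len(resp.read())

    # Placeholder URLs; any reachable HTTP endpoints would do.
    urls = ["https://example.com", "https://example.org"]

    # executor.map runs the I/O-bound fetches concurrently and yields
    # results in input order, reading like an ordinary map() call.
    with ThreadPoolExecutor(max_workers=4) as executor:
        for url, size in zip(urls, executor.map(fetch, urls)):
            print(f"{url}: {size} bytes")

The point of the sketch is the shape of the code: the sequential version is one `map()` call, and the parallel version is the same call routed through an executor, which is roughly the "more natural coding" the comment refers to.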