Very first bullet:<p>> AI can now generate code faster, and often better, than humans.<p>The author doesn't know what they're talking about. Feel free to skip this one.
Goldman Sachs does not look for coders who have studied philosophy, for the simple reason that Goldman Sachs is not a person but a firm. It has 16,000 engineers, which means many, many different groups. The "strats" are about 25% of that, and they are split into numerous groups too. Groups are made of people, and individual people have their own interviewing styles. There might be a few people who look for coders who have studied philosophy, but it's not a general trend. Goldman actually does have interviewing guidelines; you need to pass internal training to even be allowed to interview candidates. And none of those guidelines mentions anything about philosophy. They are the typical HR interviewing guidelines you can find anywhere: technical competence, problem solving, business acumen, ethical conduct, communication, initiative, collaboration, going the extra mile, etc, etc.
Do these people really think writing code is the hard part? The hard part is knowing what to write. AI may be a multiplier there, but that knowledge isn't obtained by asking AI the right questions.
Amazing accidental perspective on how little the typical business person or journalist understands about programming and gen AI. You can see that they think the hard part of programming is writing code, which, of course, is false. The hard part is writing the correct code and maintaining it.<p>I can't wait until prompt engineering as a profession completely dies out. It's total vaporware, and its existence promotes problematic ideas about what AI is capable of.