The primary audience here isn't average people, or even engineers at the company. The primary audience is investors: the idea is to trick idiots into dumping their money into *this* overvalued company rather than the infinite other ones that also claim to solve groundbreaking problems with AI.

In the past, the signal was *hiring* (or at least the appearance of hiring). But that's an expensive signal and not sustainable for long in a post-ZIRP period. Bullshitting about AI (and before that, *blockchain*) is much cheaper, and seems to work just as well without actually needing to hire anyone or even pretend to.

Whether AI is actually used, or helps the bottom line, is irrelevant (it's not possible to conclusively say whether a piece of code was authored by AI, so the narrative will be tweaked as necessary to fit the market conditions at the time).
> We will add AI usage questions to our performance and peer review questionnaire

Not kidding: I'm actually afraid people will check AI usage and start nagging us with:

> "You are slow because you don't use enough AI. Look at Tom (single 28-year-old working every weekend on Adderall), he uses AI and he is fast. Gonna add a note to check up on your AI usage metrics in a month, hopefully it will improve."

Our company has Cursor, which I sometimes use, but 1. for lots of tasks the most precise language is the programming language itself, and 2. I don't love it, I prefer other editors, so I go for in-browser search plus AI.

If this letter were published by my CEO, I would either 1. ignore it, since CEOs are often out of touch with the actual day-to-day work, or feel they need to jump on the AI train to look visionary even when they're aware of the limitations, or 2. start looking for a job, because honestly, today it's a letter like this, and in 3 months it's a negative performance review because you didn't send enough queries to Cursor.
> all of us who did have been in absolute awe of the new capabilities and tools that AI can deliver to augment our skills, crafts, and fill in our gaps.

Am I fundamentally missing something about the experience upper management has with AI versus the experience of a mid/senior-level developer in the trenches? I also have access to Cursor and a few other LLMs.

They're handy for bouncing ideas around and for some simple tasks, but I've hardly felt like they were a force multiplier.

When it comes to coding, the number of times they've hallucinated a function that didn't exist or reached for a deprecated implementation makes them feel like a net neutral at best.

> We will add AI usage questions to our performance and peer review questionnaire.

> AI must be part of your GSD Prototype phase.

I can understand asking your devs and other employees to try AI in their workflows in an attempt to get some productivity gains. That sounds like a responsible CEO trying to milk some value out of his existing headcount. But it sounds absolutely dystopian to tie performance metrics to it or to rebuild the project planning process around AI.

I doubt every project fits that paradigm.
> > We will add AI usage questions to our performance and peer review questionnaire

Whatever happened to "never go full retard"?

Talk about being out of touch.

I'm down to try AI to enhance processes and to encourage people to do the same, but making it part of performance reviews is just silly.
I don't want to be rude, but it feels like this was written by ramblingwordsaladGPT.

This message should be 10 to 20x shorter, to the point, and clearly actionable. Instead it feels like we got the output of prompting "can you turn these few bullet points into a 3-season telenovela?"
We joke about AI at work a lot, but man, if our CEO told me I *had* to start using it and that my performance would be judged on that, yeah, I'm out. Why not let your developers decide what AI they want or need to use, and how much?
Was talking to my buddy who works there, and they have a section in the performance review for how eagerly you're using AI. Honestly, it sounds like that organization is run by cargo-cult clowns.
Wow, Shopify has fallen so far; it's just a trashy, dumb company now, no longer a tech company doing elite things. I wish companies didn't become mid so soon. Tobi really needs to step down. It's a horrible look and tremendously bad for morale. And he seems to be feeling the pressure: when you have to expect people will spread it in bad faith, you really have some skeletons in your closet.
*AI Can (Mostly) Outperform Human CEOs* - https://hbr.org/2024/09/ai-can-mostly-outperform-human-ceos - September 26, 2024
My company jumped on the AI bandwagon last year because some executives fell for the hype.

We jumped off the bandwagon this year. AI isn't useful for what we do, and it required too much handholding and reviewing. The only thing it was good for was taking meeting notes, and even there it tripped up on names... despite having access to all the names of the attendees.

LLMs are just the Big Blues of today.
If managers are trying to push LLMs in order to squeeze more productivity out of developers and potentially replace them, why aren't developers just returning the favour?

Surely the work output of most managers, which contains many more natural-language artifacts than the typical coder's output, should be a *better* fit for an LLM than code.
In the past, they would have just called this a hiring freeze. The CEO doesn't want to admit that the company isn't doing great, so the excuse is now "you must explain why the job can't be done by AI".

I mean, if the job really can be done by AI, he should go the extra step and lay off half of the employees; that's the sensible thing for any CEO to do. Why not do that today?

This is pretty dumb. I'm looking forward to the news when the CEO is ousted and AI can't even save him.
> I’ve seen many of these people approach implausible tasks, ones we wouldn’t even have chosen to tackle before, with reflexive and brilliant usage of AI to get 100X the work done.

Ah yes, the 100x engineer delusion. Why not 1000x?

If the engineer is 10x, the AI is 10x, and you buy them a laptop with 10x the flops, install lights with 10x the lumens in their office, and grind their coffee 10x finer, they could easily be a 100,000x engineer.
The general opinion on the post seems to be justifiably negative. Still, I have a few things where AI has helped, and one where I'm not sure how I could have done it before.

1. It's great for curing "starting trouble".

2. A specific case: brainstorming ways to lay out libraries to handle different cases.

3. When I code, an AI completer is usually faster than looking up docs, though with an LSP and familiarity with the language it's not that great.

4. When I'm setting up new software for the first time, I give the AI a few blog posts and books and then have it guide me through while explaining concepts.

5. As a Google substitute for looking things up. I get clear answers for most things that aren't too complex.

6. I've used it to explain concepts to me, e.g. the difference between AC and DC current.

7. I've used it to create throwaway apps which I don't care about maintaining.

8. I wanted to get a copy of a proprietary program to integrate into my application, and I couldn't get it. I had an LLM create a dummy version that produces the same output and did my integration against that (roughly the idea sketched below). It more or less worked when I substituted in the real tool.

As for the hype, yeah... genuinely a problem, but there's some meat in there.
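To make point 8 above a bit more concrete: it's essentially the old trick of coding against a stub that mimics the real tool's interface. A minimal sketch of what that could look like, assuming the proprietary program is a command-line tool that prints JSON; every name, field, and path below is invented for illustration, since the comment doesn't say what the program actually was:

```python
# Hypothetical stand-in for a proprietary CLI tool, so the integration code can
# be written and tested before the real binary is available. All names and
# output fields here are made up; only the "same output shape" idea matters.

import json
import subprocess
import sys


def fake_tool(input_path):
    """Emit output in the same JSON shape the real tool is believed to produce."""
    return json.dumps({"input": input_path, "score": 0.0, "labels": []})


def run_tool(input_path, real_binary=None):
    """Integration point: call the real binary if we have it, else the stand-in."""
    if real_binary:
        out = subprocess.run(
            [real_binary, input_path], capture_output=True, text=True, check=True
        ).stdout
    else:
        out = fake_tool(input_path)
    # Downstream code only depends on this parsed shape, not on which tool ran.
    return json.loads(out)


if __name__ == "__main__":
    # Develop against the fake, then pass the real binary's path once you have it.
    print(run_tool(sys.argv[1] if len(sys.argv) > 1 else "sample.txt"))
```

Once the real program shows up, swapping it in is just a matter of pointing `real_binary` at it and fixing whatever small differences the dummy got wrong.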
Reading that thread, it's depressing to see the number of people asking @grok to summarize an article of maybe 3-4k words. I think it's disgusting that the same companies that have wrestled our attention spans from us over the last 15 years are now pushing "companion AIs" to quicken the rate at which we consume content on their platforms.
>"Before asking for more Headcount and resources, teams must demonstrate why they cannot get what they want done using AI. What would this area look like if autonomous AI agents were already part of the team? This question can lead to really fun discussions and projects."<p>And there it is. None of us will be fired or laid off "because AI". The next person simply won't be hired. And then one day, that next person will be you or I.<p>Really this is a direct, perfect analogy to the industrial revolution. One small person operating a steam shovel could all of the sudden do the work (faster, with higher quality) that <i>for all of human history</i> required hundreds of strong men to do, and it changed the entire world practically overnight.<p>Businesses simply do not need the mass droves of SWEs who exist to type out CRUD code anymore. We will go the same way of the manufacturing workers of the 20th century, where the small percentage of those who could deeply specialize and adapt and master the processes were ok, but the vast majority never recovered, and ended up in minimum wage misery.
Gaslighting engineers with "you don't know how to work with AI" is like telling doctors "you don't know how to look up WebMD" for not finding a cure for cancer!