Ask HN: Is Prompt Engineering a Thing?

18 points by maddermusic, almost 2 years ago
I'm trying to research the subject but I don't see much evidence that companies are racing to hire prompt engineers. Matter of fact, I'm not entirely clear what prompt engineering entails. Words of wisdom from experts would be most welcome.

15 comments

jstx1 · almost 2 years ago
Two different questions.

> Is Prompt Engineering a Thing?

Yes, it's a dumb name for the skill of modifying your prompts and questions to the LLM in a way that produces better results than if you just asked for what you wanted plainly. As language models get better, this might become obsolete.

> I'm trying to research the subject but I don't see much evidence that companies are racing to hire prompt engineers.

Because it's not really a job. Think of it like using the Google search engine - being able to search well is something you can get better at, but being a "Google search-er" isn't a career or a job you'll see openings for.
goodside · almost 2 years ago
I'm a Staff Prompt Engineer @ Scale.

Is PE a fast-growing career? Not really. Lots of developers are writing prompts, but very few are doing it full time or making it their job title. I do like Karpathy's suggestion of "AI Engineer" to describe the growing ecosystem of engineering around LLMs, though; the name of that activity isn't settled yet, but it's many people's full-time job in practice. PE is probably the closest fit right now.

Most of the full-time PEs I can think of work for the LLM vendors themselves. In that environment it's a mix of API evangelism/docs, developing prompts/training for high-value customers, or in some cases helping ML teams maintain large prompt/completion corpora for tuning (a "prompt librarian").

I work for Scale, which provides labeling and RLHF data to LLM vendors. My job is a mix of the above, particularly the prompt librarian aspect, but with a focus on adversarial testing and red teaming.
muzani · almost 2 years ago
I think it's as much a skill as "typing" or "using email" or "writing 6-pagers". It's low level, and likely nobody is getting hired just to do this. But a copywriter who knows how to do it is valuable, far more than someone who knows prompt engineering and tries to apply it to copywriting. Domain expertise is important.

Most of the really bad "soulless" outputs people get stem from inadequate communication. "Write a poem about a dog." Are you surprised that it writes a generic fluffy dog poem? What about dogs do you want to say?

"Write a poem comparing loyal servants of tyrants to dogs, in a positive manner, but with a darker undertone." Now that gives you a much more interesting output.

As AI approaches reasoning levels similar to GPT-4's, it'll probably be much less beneficial to know how to talk to it. The AI ends up just needing enough context to figure out what you really want. It makes assumptions and asks for your input on those assumptions.

But chat models are a beginner's level, designed to handhold people into learning how to use AI. Completion models like OpenAI's `davinci` (not `text-davinci-`) are a lot more interesting, creative, and unhinged, but also harder to control. These require a higher level of skill.

There's also being aware of its limitations: how to get it to do math properly, how to prompt-hack and prevent prompt hacking, how to steer it towards a certain tone, how to pass it a table format it understands, how to get it to write things past 2,000 words, and being aware of security holes in code written by AI with certain prompts.
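To make the vague-versus-specific prompt point above concrete, here is a minimal sketch that sends both prompts to a raw completion model. It assumes the pre-1.0 `openai` Python package and its legacy completions endpoint; the model name, prompts, and sampling parameters are illustrative choices, not a recommendation.

```python
# Minimal sketch: the same request with a vague prompt and a specific one.
# Assumes the pre-1.0 `openai` package (pip install "openai<1.0") and an
# OPENAI_API_KEY set in the environment.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

PROMPTS = {
    "vague": "Write a poem about a dog.",
    "specific": (
        "Write a poem comparing loyal servants of tyrants to dogs, "
        "in a positive manner, but with a darker undertone."
    ),
}

for label, prompt in PROMPTS.items():
    # `davinci` is a raw completion model: it continues the text rather than
    # following instructions the way a chat model does.
    response = openai.Completion.create(
        model="davinci",
        prompt=prompt + "\n\n",
        max_tokens=200,
        temperature=0.9,  # higher temperature = more varied, less controlled output
    )
    print(f"--- {label} prompt ---")
    print(response.choices[0].text.strip())
```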
keiferski · almost 2 years ago
In terms of a career, probably not. It will likely just be an added skill required by already-existing jobs. Lawyers will need to know how to use ChatGPT for legal purposes, etc.

Personally, I have experimented with customizing prompts for creating Anki cards, and I guess you could call this prompt engineering:

https://neurotechnicians.com/p/generative-ai-and-anki-part-1-chatgpt
inconfident2021 · almost 2 years ago
I don't know what I can say about others; as for me, I am quite good at getting what I want from LLMs. I created a chat app using ChatGPT without ever having a background in frontend web development. I managed to do it in 2 days. Think of going from 0 to a fully functional client in 2 days without having any idea about frontend.

It's a prototype. It doesn't have many states. But it gets the job done!

I have various other experiences where prompting has helped me. The only issue is I think it will be irrelevant in the future, given how fast these models are improving.
tikkun · almost 2 years ago
In short, from the perspective of "can I train and get a job as a prompt engineer and just write great LLM prompts as a full-time job" - no.

From the perspective of "is there a skill to writing prompts that get good results from LLMs" - yes, definitely. It's just that ~no companies are truly hiring for that as a full-time role.

I wrote more on this a few weeks back here: https://llm-utils.org/How+to+become+a+prompt+engineer - but it basically says what I wrote above, just with some more details.
vunderba · almost 2 years ago
To a certain degree, writing a prompt is more of an art than a science. I would prefer that we called it "prompt crafting", as "engineering" implies a certain level of technical ability that in my experience is unnecessary.

As OpenAI has released better and better models (particularly with zero-shot learning), the barrier to writing a good prompt has gotten progressively lower.
jytechdevops · almost 2 years ago
I went to a hackathon where Riley Goodside gave an elaborate, hour-long presentation on his work at Scale AI. There were about 500 people in the building. Of the 500, 499 of them didn't listen to a single thing. Maybe it is, maybe it isn't. The CEO of the company also came out and said it didn't seem useful (paraphrasing his comment).
chewxy · almost 2 years ago
You can think of prompt engineering as mining the model for what it knows (I hesitate to call it knowledge). Thus prompt engineering is highly specific to the model. Good thing we only have a handful of models trained in very similar ways, huh? (That last sentence is sarcasm, and my displeasure is at the lack of diverse models.)
gabrielsroka · almost 2 years ago
George Hotz thinks so: https://youtu.be/dNrTrx42DGQ?t=9226

https://www.instagram.com/georgehotz/
msoad · almost 2 years ago
I personally don't think this can be a long-lasting trade or profession, if it is a thing at all, as you question.

Ideally, language models should understand a question that is not really well formed, like how Google can understand queries that lack a ton of context and still figure out what you meant.
m_rpn · almost 2 years ago
No need for prompt engineers when you can just implement an adversarial system to figure out the best prompts for you automagically.

Oooops, sorry prompt engineers, but if you try to steal our jobs from us, we human software engineers will retaliate XD.
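Tongue-in-cheek or not, the "search over prompts automatically" idea is easy to sketch. Everything below is hypothetical: the candidate templates, the tiny eval set, the scoring heuristic, and the `ask_llm` stub stand in for whatever model call and evaluation you would actually use; this is the shape of such a loop, not any particular library's API.

```python
# Hypothetical sketch of automatic prompt search: try several candidate
# prompt templates, score the model's answers against a tiny eval set,
# and keep the template that scores best.

CANDIDATE_TEMPLATES = [
    "Answer the question: {question}",
    "You are a careful assistant. Think step by step, then answer: {question}",
    "Answer in one short sentence: {question}",
]

# Tiny evaluation set of (question, expected substring) pairs.
EVAL_SET = [
    ("What is the capital of France?", "paris"),
    ("How many minutes are in two hours?", "120"),
]


def ask_llm(prompt: str) -> str:
    """Stub standing in for a real model call (e.g. an HTTP request to an LLM API)."""
    raise NotImplementedError("plug in your model call here")


def score(template: str) -> float:
    """Fraction of eval questions whose answer contains the expected text."""
    hits = 0
    for question, expected in EVAL_SET:
        answer = ask_llm(template.format(question=question))
        if expected in answer.lower():
            hits += 1
    return hits / len(EVAL_SET)


def best_template() -> str:
    # Exhaustive search over the candidates; a larger system might instead
    # mutate prompts or have a second model propose new ones.
    return max(CANDIDATE_TEMPLATES, key=score)
```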
j0hnyl · almost 2 years ago
Yes, absolutely. Prompt engineering is not just writing a detailed instruction that will be an input for an LLM; it also requires sourcing the data for the prompt and integrating it into whatever LLM workflow you are using. This can get highly technical.
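As a rough illustration of the "sourcing the data for the prompt" part, here is a hypothetical sketch that pulls a few records from a local database and splices them into a prompt template before the model call. The table name, fields, prompt wording, and the `ask_llm` stub are made up for the example, not taken from any real workflow.

```python
# Hypothetical sketch: build a prompt from sourced data rather than writing
# it by hand. The SQLite table and the `ask_llm` stub are placeholders for
# whatever data source and model call you actually use.
import sqlite3

PROMPT_TEMPLATE = """You are a support analyst. Summarize the main themes
in the following customer tickets in three bullet points.

Tickets:
{tickets}
"""


def fetch_recent_tickets(db_path: str, limit: int = 5) -> list[str]:
    """Pull the most recent ticket bodies from a local SQLite database."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT body FROM tickets ORDER BY created_at DESC LIMIT ?",
            (limit,),
        ).fetchall()
    return [row[0] for row in rows]


def ask_llm(prompt: str) -> str:
    """Stub standing in for the actual model call in your workflow."""
    raise NotImplementedError("plug in your model call here")


def summarize_tickets(db_path: str) -> str:
    tickets = fetch_recent_tickets(db_path)
    prompt = PROMPT_TEMPLATE.format(
        tickets="\n".join(f"- {t}" for t in tickets)
    )
    return ask_llm(prompt)
```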
pawelduda · almost 2 years ago
You'll be able to tell when leetprompt emerges, with stories of how someone failed an interview at FAANG because the interviewer found their prompt for a perfect boiled egg gave an answer off by 1 minute.

Otherwise it'll probably be a nice-to-have skill on top.
mustafa_pasi · almost 2 years ago
If you have to have a prompt monkey to figure out the best prompts, that defeats the whole point of having an LLM-powered knowledge system.