Okay, but I can't help noticing the bug in the Copilot-generated code. The generated code is:<p><pre><code>async function isPositive(text: string): Promise<boolean> {
  const response = await fetch('https://text-processing.com/api/sentiment', {
    method: "POST",
    body: `text=${text}`,
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
    },
  });
  const json = await response.json();
  return json.label === "pos";
}
</code></pre>
This code doesn't escape the text, so if the text contains the character '&' or other characters with special meaning in form URL encoding, the request will break. Moreover, these kinds of errors can cause serious security issues; probably not in this exact case, where the worst an attacker could do is change the sentiment analysis language, but this class of bug in general is rife with security implications. (A corrected sketch is at the end of this comment.)<p>This isn't the first time I've seen this kind of bug either -- and this class of bug keeps showing up in exactly the code people post to showcase how amazing Copilot is, so it seems like an inherent flaw. Is this really the future of programming? Is programming going to go from a creative endeavor of making the machine do what you want to a job that mostly consists of reviewing and debugging auto-generated code?
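<p>For what it's worth, the fix is a one-liner: let URLSearchParams build the request body so the escaping is handled for you. A sketch, keeping the rest of the generated code as-is (the endpoint and the response shape are just whatever Copilot assumed):<p><pre><code>async function isPositive(text: string): Promise<boolean> {
  const response = await fetch('https://text-processing.com/api/sentiment', {
    method: "POST",
    // URLSearchParams percent-encodes the value, so '&', '=', '+' etc. in `text` can't break the body
    body: new URLSearchParams({ text }),
    headers: {
      "Content-Type": "application/x-www-form-urlencoded",
    },
  });
  const json = await response.json();
  return json.label === "pos";
}
</code></pre>
(Strictly speaking the explicit Content-Type header is redundant here, since fetch sets it automatically when the body is a URLSearchParams.)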