Not remotely equivalent.
Calculators didn't hallucinate and lead students down a path of confusion and wrong answers.

For what it does get correct, it is a replacement for everything. Thought itself can be outsourced. Education's purpose isn't simply to have bodies in seats serving as proxies to a machine.

And if LLMs could fix the hallucination problem, they would be in a position to replace education itself rather than serve as a tool for the student.
Well, what happens when the kids get to university and have to handwrite essay-based exams?

I know these were big in the humanities, and where I came from they were sometimes worth 30-40% of the final grade.

This seems like setting the kids up for a rude awakening... and probably more mental health problems when they fall behind peers who had limited access to GPT and had to do things the old-fashioned way...

Just my opinion...
> And at the end of the day, who does this hurt?

> The students.

> Embracing innovation and adapting to modern educational tools can lead to more empowered and capable students, ready to face the challenges of the future.

Bullshit. Go train your own ChatGPT on only AI-generated content and see how well it performs at real-world tasks.

Calculators in the 80s weren't confidently wrong, and they didn't deliver their output with an authoritative tone. Nor did they make mental math obsolete: we all have calculators in our pockets, yet hardly anyone uses their phone to calculate change or ballpark a tip. Banning ChatGPT sets the same expectation as trusting your students not to plagiarize an answer from another student or an encyclopedia.