ChatGPT and other LLMs just turned the screws on an already broken system.<p>1. Most people get a degree to pass degree checks in jobs.<p>2. College is stupidly expensive.<p>3. Unlike with other goods and services, there is no way to get your money back on bad service/goods here.<p>4. Professors can basically do whatever they want.<p>5. Federal student loans are in the same bucket as criminal judgments of debt: they generally can't be discharged in bankruptcy.<p>When I look at all of these, logically, you expect a return on the cost. And the return isn't knowledge, but a diploma. And humans, being good tool users, use a tool to raise the chances of getting that (half-assedly promised, but never guaranteed) diploma.<p>Hopefully this will destroy universities as diploma mills and bring back proper academic rigor and study. But one can always hope, I guess.
Last semester I gave my students a take-home final exam. I told them they could use AI; I just asked that they document their usage of external sources. This was a precalculus class.<p>The first problem gave graphs of x(t) and y(t), the component functions of a parametric curve. The question was: estimate the times when the particle is at the origin.<p>I got an email from a student: "I put problem 1 into ChatGPT and it couldn’t answer it. What are we supposed to do then?"<p>My response: Think!<p>Students have mostly been trained that anything that can’t immediately be solved is too hard. They have no academic grit. It’s bad.
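For what it's worth, the problem is a one-liner once you notice that the particle is at the origin exactly when x(t) and y(t) are zero at the same time. A minimal numerical sketch (the component functions here are invented for illustration; the exam gave only graphs):

```python
import numpy as np

# Hypothetical component functions of a parametric curve (the real exam
# showed only their graphs). The particle is at the origin exactly when
# x(t) = 0 AND y(t) = 0 at the same time t.
def x(t):
    return np.sin(t)        # zeros at t = 0, pi, 2*pi

def y(t):
    return np.sin(2 * t)    # zeros at t = 0, pi/2, pi, 3*pi/2, 2*pi

t = np.linspace(0, 2 * np.pi, 100_001)
at_origin = (np.abs(x(t)) < 1e-3) & (np.abs(y(t)) < 1e-3)

# Collapse the grid points around each crossing into distinct times.
times = sorted({round(v, 2) for v in t[at_origin]})
print(times)  # only where the two zero sets overlap: t = 0, pi, 2*pi
```

The point is exactly the "think" part: reading two zero sets off a pair of graphs and intersecting them requires no machinery at all.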
When I read articles like this I don't know whether to think I have some decent job security or not. On the one hand, people graduating having used LLMs for a significant majority of their work are almost certainly deficient at thinking critically about computer systems and programming them (there's a good quote in the article stating this much better than I am here). On the other, if they're cheating so much to get well-paying jobs, will I, without cheating, relying on just the merits and knowledge from my career, be competitive enough to find work in the future? I've been wondering if anyone else feels this way.
For a number of reasons I got my university degree in Spanish (in a Spanish-speaking country), where very few people spoke English; I was one of two in total, as a matter of fact. I always felt like I was cheating my way through because I had access to infinitely more and better-quality resources, and all I had to do was translate them into Spanish and I was instantly 10 steps ahead of everyone else. With all the LLM bollocks these days, I'm suddenly feeling a lot less guilty.
Controversial opinion: perhaps getting free tutoring from LLMs is a good thing?<p>Some thoughts:<p>1. Rich kids (not me) got expensive tutoring. Now that is available to everyone. How is this not great? Was it cheating when rich people paid for it, too?<p>2. At my undergrad (Cornell), about 20% of TAs couldn't speak English (or did not want to). Some of the group office hours would tangent off into other languages. This was terrible for those who only understood the standard language of the school (English). ChatGPT is available to everyone.<p>3. You don't even need to pay exorbitant tuition to get the same level of support as everyone else in the world; this is a great democratizer. Isn't this what people have always wanted?
I found this a very depressing read. Worse than the one about how kids at even elite universities can't really read or engage with a book anymore.
As usual, this article totally conflates "to help with homework" with "to cheat on homework". If you use it like a better Google, learning from its output rather than directly using the text, that’s definitely not cheating by any definition.
The words of Lee in this essay highlight the problem. Paraphrasing: "I didn't do it because I did not feel like doing it, because it felt unimportant/irrelevant/dumb."<p>This is the hubris of youth talking.<p>While this kind of thinking will teach you some practical skills, ultimately you become an operator, and not an engineer.<p>CNC machine designers earn more than CNC machine operators.<p>Car engine designers earn more than car mechanics.<p>Computer chip engineers earn more than people who put computers together.<p>Production line engineers earn more than production line workers.<p>In software development, the titles are unclear, but there is a clear distinction between operators and engineers. Unfortunately we call them all programmers or developers or engineers, as the titles don't truly reflect the kind of work they are doing. When you work with someone though, you know which title they really deserve.<p>It's hard to put it into words, but going forward, the operators will have a harder time finding work as automated AI-based tools will replace them. The only solution is to learn to become engineers, but with the kind of mindset that Lee flaunts in the above article, that is not going to happen.<p>My advice: Learn the fundamentals. Don't cheat. Be curious and strive for deep understanding. It's tempting to cut corners, but already today I see a deluge of people who just don't have what it takes to engineer software.
Consider that college tuition is often equal to what an employer would pay you to do similar levels of work. The opportunity cost of attending college is therefore roughly 2x the tuition: the tuition you pay plus the wages you forgo. The system is exploiting students who need a piece of paper to get a job, so why shouldn’t students exploit it too?<p>* Mostly applies to degrees where "learning on the job" would have resulted in equivalent knowledge and skills.
> When I asked him why he had gone through so much trouble to get to an Ivy League university only to off-load all of the learning to a robot, he said, “It’s the best place to meet your co-founder and your wife.”<p>Like, dude, I wonder why that is?<p>I get that students see university mostly as a path to a degree/middle-class union card, and the college you go to as mostly a vetting process/sorting hat. Sure, I was 18 once too and didn't care about much past my nose either.<p>But, like, the reason it's a good place to meet people is <i>because</i> those people are attracted by the college itself.<p>Honestly, this is on the admin. They know the kids are all 'cheating' their asses off. I guess until some other feedback mechanism comes in, they aren't going to rock the boat.
Today, my course instructor told me that I work on my JavaScript assignments too slowly. He suggested that I use ChatGPT like everyone else instead of writing code by hand. ...<p>Mind you, we didn't have any JavaScript courses before this one, and we started head-first with Node.js.
> When I asked him why he had gone through so much trouble to get to an Ivy League university only to off-load all of the learning to a robot, he said, “It’s the best place to meet your co-founder and your wife.”<p>Roy understands society better than most people.
Amazingly, the title could come from any era! When I was an undergrad, several of my classes had cheating scandals.<p>If GPT-assisted writing is now table stakes, perhaps colleges need to adapt? Nobody can afford to waste time given how hyper-competitive everything is.
My university had rampant cheating. Students sitting exams for others was a big one, especially for the exams in rooms with 90 students. Finding someone to write papers for you was super easy, but CS cheating was cutthroat. Guard your code, because people will steal it, will break into the professor's office after hours, and will try to hack your computer, just to name some incidents that happened. Granted, I'm now 10 years out of uni, but even back then students felt more entitled to the degree and were willing to color outside the lines if that is what it took to get it.
I tell my students to cheat all they want. Their managers will need to hit their URA targets.<p>More seriously, students have been cheating for a long time. Chegg.com was used solely for cheating.<p>It is easier to detect LLM use than Chegg use. Chegg provided solutions from other students or the solution set. LLMs are sometimes so bad (e.g., because of an "accidental" typo in the question) that I can eventually detect who cheats, and I give no mercy when grading the in-person exams.<p>Disclosure: I use LLMs to grade, and add my personal touch, at scale.
At least for many engineering and science disciplines, one solution remains that forces students to actually try to learn the material: in-class, closed-book exams. Nothing is ideal, but this does force students to actually engage with the material and problems. They are welcome to use LLMs all they want to help them study (though they should be careful, given how often I catch the models making horrible mistakes in my discipline). But the assessment will be of them and their brain.
I feel a large part of this is the definition of "cheating". For example, I give take-home exams and I allow students to use whatever resources they want.<p>They can use ChatGPT, the textbook, Stack Overflow, etc. as much as they want, for as much time as they need. Access to these resources during an exam is typically considered "cheating" because using them would confer an unfair advantage, since tests are designed without them in mind. So then just give them access to the resources and make the test harder.<p>Instead of a question like "Here is some code; find the bug," ask them to implement a large system and to document the bugs they encounter along the way, and how they used ChatGPT to fix them. Instead of "What is the definition of TCP?", ask them to measure TCP vs. UDP using iperf.<p>If all the students are cheating, then you're not testing the right thing. Make the test fit the current information landscape better, and professors and students alike will have a better time. Stop testing information recall; start testing for information literacy, analysis, and integration. Stop asking students to answer multiple-choice questions; start asking them to build systems.
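The TCP-vs-UDP exercise doesn't even need iperf to get started. As a toy stand-in (this is not what iperf measures, and the echo servers here are invented for illustration), students can compare the per-round-trip cost of the two protocols over loopback:

```python
import socket
import threading
import time

N = 200  # round trips per protocol

def tcp_echo(srv):
    # Accept one connection and echo N small messages back.
    conn, _ = srv.accept()
    with conn:
        for _ in range(N):
            conn.sendall(conn.recv(64))

def udp_echo(srv):
    # Echo N datagrams back to whoever sent them.
    for _ in range(N):
        data, addr = srv.recvfrom(64)
        srv.sendto(data, addr)

def measure(kind):
    """Time N request/reply round trips over loopback; returns seconds."""
    sock_type = socket.SOCK_STREAM if kind == "tcp" else socket.SOCK_DGRAM
    srv = socket.socket(socket.AF_INET, sock_type)
    srv.bind(("127.0.0.1", 0))          # let the OS pick a free port
    port = srv.getsockname()[1]
    if kind == "tcp":
        srv.listen(1)                   # ready before the client connects
    echo = tcp_echo if kind == "tcp" else udp_echo
    server = threading.Thread(target=echo, args=(srv,), daemon=True)
    server.start()

    cli = socket.socket(socket.AF_INET, sock_type)
    if kind == "tcp":
        cli.connect(("127.0.0.1", port))
    start = time.perf_counter()
    for _ in range(N):
        if kind == "tcp":
            cli.sendall(b"ping")
            cli.recv(64)
        else:
            cli.sendto(b"ping", ("127.0.0.1", port))
            cli.recvfrom(64)
    elapsed = time.perf_counter() - start
    cli.close()
    server.join()
    srv.close()
    return elapsed

for kind in ("tcp", "udp"):
    print(f"{kind}: {1000 * measure(kind) / N:.3f} ms per round trip")
```

On loopback the two numbers come out close; over a real network, iperf's throughput, loss, and jitter reports tell the fuller story, and writing up why the results differ is exactly the kind of analysis such a test rewards.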
ChatGPT hit the mainstream market in my final year of undergrad.<p>I was indeed guilty of using it wholesale for one assignment, and for a sizable portion of my final practicum. The article lightly mentions one of the tells teachers use to distinguish LLM work from human work, and it rubbed me the wrong way too: an LLM gives arguments and counterarguments equal weighting, unless it is commanded to show partiality to one, in which case the output is overwhelming in substance (if not in language) toward that thesis.<p>Now, finishing my time in grad school, I've not used it, and I have actively discouraged group members from using it, as I feel there is an advantage in searching painstakingly for new, obscure ideas; LLMs tend to give the same recommendations to anyone who poses them the same problem. I used to do the same thing with Bing, in lieu of Google Search, to find more 'quirky' ideas to implement in my arguments.<p>There, I believe, lies an advantage for the semi-industrious knowledge student: travel not the beaten path, but the one that takes slightly more effort for proportionally greater rewards.
I graduated from a top Russian university 13 years ago. I absolutely cheated on a few exams I considered non-essential for whatever I imagined my future would be, including some that are directly part of a CS/AM major, like stats, after minimal studying effort (for the humanities requirements, I would not even flinch at putting in zero effort beyond typing the assignment into the chatbox). Thinking about it, I would do it again with ChatGPT today if for some reason I had to go and pass those exams again because employment required a formal diploma.<p>This is because I know my responsibilities would not involve stats, they know it, I don't enjoy stats, and I feel confident in my ability to learn it if the need arises, with the rest of the math as a basis; but the university requires stats for a diploma.<p>To solve the problem, universities just need to streamline the requirements in their programs. Or find another way to accommodate slightly more focused people.
My take as a current uni student.<p>This is all absolutely true. I have yet to meet anyone who doesn't use ChatGPT. Some people use it as a personal tutor; most use it as a shortcut to finish assignments. Regardless, everyone is using it. LLMs are great when used as a tool: bouncing ideas around and asking conceptual questions is a great way to learn (it really is as valuable as talking to a professor for a lot of questions). But how do you convince students to use LLMs as a learning tool instead of a cheating tool?<p>I think the solution is simple. Most people are shortcutting with LLMs because they want a good grade, and they want it easy. So, stop with the grading. The primary incentive for going to a class should be to pick up some new knowledge and skills, not to pick up a good grade.
This is why it is vitally important to provide a rich, varied educational experience.<p>If it is possible to pass a course using AI, it's partially the student's fault. But it is just as much the EDUCATOR's fault.<p>Get these students into groups. Make them interact spontaneously. Use gamification principles. Count these interactions in grading. Make them prove they have internalized the material.<p>Individual attention and personalized instruction were a vital part of my own education (college in the 80s and 90s). There are a thousand excuses for why schools think they can't provide them; it often reduces to budget shortfalls. But this can be done with existing tools and personnel. It's more a matter of teaching more engagingly, more creatively.
<i>> “There might have been people complaining about machinery replacing blacksmiths in, like, the 1600s or 1800s, but now it’s just accepted that it’s useless to learn how to blacksmith.”</i><p>Kids, this is why you need to go to school.
At 17 years old, I went to my 4-year state university not only for the diploma and education, but also substantially for the social experience that I never had in the previous years of my life.
So many professors in college are detached and tone-deaf about their syllabus, their approach to teaching, and their workload that the smart students adapt.<p>One of those adaptations is cheating. I have seen really hardworking students in grad school finally resort to cheating because the workload was insane and they realized everyone else was cheating.
Does it matter? I mean, yes it does, because students are potentially cheating themselves out of an education, and it's a gross waste of time and money.<p>But do any employers still seriously rely on degrees to select their employees? Many still rely on a degree to filter candidates before interviews, but even here we have only ourselves to blame, right? This is one thing we (businesses, employers) have complete control over. It does mean we now need to interview for more things, like a second language, general reading comprehension, or basic numeracy. But didn't we already get to that point? Where we can't even trust that whoever is interviewing is the person we're actually hiring?<p>So from the point of view of hiring, perhaps it doesn't matter anymore.<p>The article itself emphasizes one point where it does matter: college as ethics training is on indefinite leave. And interviews will have to test for ethics. A tough one.
Just look at the assholes that run this country.<p>They set the standards.<p>They lie and cheat their asses off every day.<p>You can't reasonably expect that people WON'T lie and cheat their asses off.
The definition of "cheating" will change... because those students are, in fact, <i>learning</i>.<p>They are learning <i>how to accomplish a lot more, a lot faster, with AI.</i><p>That's a valuable skill -- one of the most valuable skills that students can learn today.<p>It's not the <i>only</i> valuable skill they should learn, but it's certainly one of the most valuable.<p>Educators will have to adapt.