Didn't you previously say the professors are lazy? So both the students and the professors are lazy?
In my experience more students were lazy than professors, but this was 30-40 years ago, when universities were still mostly serious institutions with a few outliers (at least in the STEM and professional career paths like education, nursing, finance, and accounting). Based on my recent experience with new grads, things have changed.
I feel this may more be describing the nature of how we teach students, and how we hire professors, rather than the evils of AI.
Perhaps, except how would you teach large numbers of students differently in a way that prevents cheating with AI? With small teacher-to-student ratios it might be possible, but that isn't realistic. What do you propose?
If memory serves, the same thing was said about the portable pocket calculator: that it would render the need to learn arithmetic obsolete.
Yes, and how many people can do simple arithmetic in their head these days? How many cashiers (many with college degrees in the humanities) can make proper change without a cash register computing it? Right.
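To be concrete about the skill in question: making change is just repeated subtraction. Here's a minimal sketch of the arithmetic a cashier once did mentally (the denominations and the greedy approach are my own illustration, not anything from this thread):

```python
# Minimal sketch of change-making: greedy subtraction over
# US denominations, in cents (illustrative only).

DENOMINATIONS = [2000, 1000, 500, 100, 25, 10, 5, 1]

def make_change(price_cents: int, paid_cents: int) -> dict[int, int]:
    """Return {denomination: count} for the change owed."""
    owed = paid_cents - price_cents
    if owed < 0:
        raise ValueError("customer did not pay enough")
    change = {}
    for denom in DENOMINATIONS:
        count, owed = divmod(owed, denom)
        if count:
            change[denom] = count
    return change

# A $7.36 item paid with a $10 bill -> two $1 bills, two quarters,
# one dime, four pennies.
print(make_change(736, 1000))
```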
This is already being bandied about as a problem with respect to the bar exam, which various flavors of ChatGPT are already passing with nearly a 90% success rate (yes, even the essay sections!). This raises the question: does the bar exam really test understanding and the ability to practice law? I deal with legal issues day in and day out as (one of) my day jobs, and 90% of the work is not based on understanding the nuances of estoppel, because that is a well-trodden and extensively written-about topic, which can simply be looked up when needed (which is exactly why ChatGPT is good at it).
Sure, but even if AI can't (yet) replace your job, it can certainly replace many of your coworkers.
Computer science (and even electrical engineering) follow the same trajectory. Being able to partially differentiate a complex function isn't a skill that is useful in 99.999999% of work that people actually do,
True, but understanding the concepts involved does expand the mind and teaches us to solve difficult problems systematically.
and likely has already been done by thousands of people before, which is why ChatGPT can also do it easily.
I'm not sure that is why GPT is capable of doing it. A single instance in its training set is sufficient.
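For anyone who hasn't seen one since school, here is the kind of partial derivative being discussed, worked by hand (the function is a toy example of my own, not from the thread):

```latex
% Partially differentiate f with respect to x, treating y as a constant:
f(x, y) = x^2 e^{xy}
\qquad
\frac{\partial f}{\partial x} = 2x\,e^{xy} + x^2 y\,e^{xy} = x\,e^{xy}(2 + xy)
```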
In my 12 years of graduate education, I've done it countless times in the schooling context, and not a single time in the professional context.
Neither have I. But a STEM education can't predict exactly what you will (or won't) need. There are valid arguments that some topics are less useful, but broad exposure to different types of problems and different methods of solving them (or approximating solutions) is highly valuable. I never needed calculus or DiffEq in my career, but I worked with mechanical, optical, and electrical engineers who did.
In fact, I hadn't studied math extensively until it was time to write my PhD thesis, when I discovered the math I needed wasn't readily available, and thus I had to derive it (i.e., the entire point of a PhD). The number of people working day-to-day with novel mathematical concepts in science and engineering is a minuscule portion of the workforce, yet it's a focus in education.
Novel mathematical concepts weren't a focus in my undergraduate education. Most math we were taught had proven practical applications. In grad school I minored in math and took some pretty wild grad-level math classes. Even then, I ended up using some of that in my MS thesis (graph-theoretic approaches to pattern recognition).
ChatGPT will never be able to teach you how to abide by a deadline, how to deal with a co-worker who is a dickhead, how to make it through a code review,
But it might replace enough of middle management and your coworkers that those skills become irrelevant.
or even where to place the connectors on your PCB so that it fits into the case properly,
I'm sure it could.
or how to navigate complex political machinations between VPs, which in totality cover about 85% of working in any technology field.
Again, much of that becomes irrelevant when most of those humans are replaced by AI. Do you think an AI program manager won't see right through the typical BS excuses a lazy employee gives for being late, doing substandard work, etc.?