Yup. Plus I just don't think it has a large database of Circuits problems.

Also an engineer and musician, two fields that ChatGPT struggles with. It spits out a lot of common sense answers, but abstract thought is something it can't come up with, which isn't surprising.
It can manage concrete solutions to a degree, but any critical thinking request fails.
This. It sounds cute until life and death situations arise and you can’t ChatGPT your way out of it.
Because it can literally give you answers to circumvent needing to learn. Why bother writing an essay where my grammar is challenged when I can have GPT spit one out in seconds?
When ChatGPT first came out it was immediately used en masse for cheating in schools. That's a net negative for learning.
Who said anything about changes to testing/exams? I doubt anyone would be allowed to use ChatGPT in an exam.
We’ve had google since 1998, how has that restricted the “recall of facts”?
I used mathway.com a long time ago. It was ahead of its time really.
When I was getting my masters, folks got caught using ChatGPT to write their papers. Apparently the professors were told there wasn't anything in the guidelines about using AI, so they couldn't hold the students accountable. They had to add rules regarding AI and coursework to the guidelines the following semester. It was crazy that the school wasn't firm with that, because at the end of the day the students didn't write shyt. Clear case of plagiarism, but since it was technically original content from ChatGPT, they didn't consider it that. I would never have even thought about doing that at a graduate level and risking getting kicked out.
This is what happens when greed becomes more important than anything. Next generation gonna be dumb AF.
A lot of these higher education degrees and shyt gonna start looking really funny in the light. But these cacs Gon say it's all merit.
This has and always will be the case. I'm sure your job has the same kinds of people: a few who really know how to problem solve, the bulk who just know how to press buttons, and the others who are just there for a paycheck.
Yeah but that's after 12 years of school and a minimum of 4 years in college, some with 6 years and an M.S.
I'm late 30s and probably represent the median age of everyone in my department. We know how to cut corners but we also know how to do things the "hard way" with a pen, paper and calculator.
Problem is these kids only know the automated way, so they won't know how to check if something is wrong.
Example: we recently switched to a web-based platform and there are A LOT of issues with calculations, contract language, etc. We make it work because we know what we're doing and we know how to explain to IT what's wrong. Without the necessary knowledge, we'd just assume everything is OK, and mistakes alone would cost millions each quarter.
Who benefits from a whole generation of young adults not knowing how to problem solve? Likely someone that doesn't have their best interests at heart.
Yup. Plus I just don't think it has a large database of Circuits problems.
It's much easier for it to parse a bunch of LaTeX questions and resolve the logic from there, but things like Circuits, which involve a lot of techniques, simplifications, multiple approaches, lots of pictures, etc., it struggles with badly.
Sometimes with these circuit problems you solve one portion, then you need to redraw the circuit and find the new simplification. It doesn't do that, which causes it to crap out. For some reason it also doesn't recognize the difference between simple configurations.
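To make the "redraw and re-simplify" step concrete, here's a minimal sketch (my own illustrative code, not anything ChatGPT produced) of the repeated series/parallel resistor reduction a student would do by hand; the component values are made up:

```python
# Iterative series/parallel reduction of a resistor network,
# mirroring the "solve a portion, redraw, simplify again" workflow.

def series(*rs):
    """Equivalent resistance of resistors in series."""
    return sum(rs)

def parallel(*rs):
    """Equivalent resistance of resistors in parallel."""
    return 1.0 / sum(1.0 / r for r in rs)

# Example: R1 in series with (R2 || R3), a classic two-step reduction.
R1, R2, R3 = 100.0, 220.0, 330.0
step1 = parallel(R2, R3)   # first "redraw": collapse the parallel pair -> 132 ohms
req = series(R1, step1)    # second "redraw": combine with the series resistor
print(req)                 # 232.0 (ohms)
```

Each reduction produces a simpler "redrawn" circuit, which is exactly the bookkeeping the model seems to lose track of.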
Math I can see it being OK at; it's basically an equation solver like Wolfram that can use a little more context to get specific math problems correct. A Signals class will probably be easy for it, since that's straight-up math-heavy and basically a math class.
However, for things like Circuits, where it's a lot less "rote" than that and some critical thinking is required, it's struggling and can't solve its way out of a wet paper bag. I didn't even try to throw in any op-amps, so I'm sure it's not ready for any basic Electronics class either.
Right now, a glorified Google is the best description of its current capabilities, until it can show me the ability to solve problems like that.
The funny thing is, I actually want it to get better at solving these things so that I can delegate work to it, get correct answers quickly and check my understanding.
Kind of like what I mentioned, I don't think math is its main problem. I can throw Calculus, Linear Algebra, Differential Equations at it and I think it'd do fine for the most part, because it's basically rote: if X, do Y.

Interesting use case. I read a reddit post recently that basically said to have ChatGPT write a Python script to calculate X and Y, then instruct it to run the script against the problem. I haven't tried it but I wonder if that might be of some use to you.
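For what it's worth, here's a minimal sketch of the kind of script that approach might yield. This is a hypothetical example I wrote, not output from ChatGPT, and the function name and values are illustrative:

```python
# A tiny script of the sort you might ask ChatGPT to generate and run:
# compute the output of an unloaded resistive voltage divider.

def divider_vout(vin, r1, r2):
    """Output voltage across r2 in an unloaded divider: Vout = Vin * R2 / (R1 + R2)."""
    return vin * r2 / (r1 + r2)

vout = divider_vout(12.0, 1000.0, 2000.0)
print(vout)  # 8.0
```

The appeal is that the arithmetic runs in Python instead of being "guessed" by the model, so at least the number-crunching step is trustworthy even if the setup isn't.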
Google recently set a new high standard in math.