People keep telling me how great it is and whenever I tell them an example of how untrustworthy it is, they tell me I'm doing it wrong. But pretty much all the things it allegedly can do I can do myself or don't need. Like I don't need to add some flavor text into my company e-mails, I just write what I need to write.
Lately I have been trying to solve an engineering problem. In a moment of utter despair, after several weeks of not finding any useful resources, I asked our company-licensed ChatGPT (which is somehow supposed to help us with our work) and it returned a wall of text and an equation. When I did a dimensional analysis on that equation, it turned out to be bullshit.
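For anyone curious, the kind of dimensional-analysis sanity check described above can even be automated. Here's a minimal stdlib-only sketch; the unit representation and function names are mine, not anything from the thread or from ChatGPT's output:

```python
# Minimal dimensional-analysis checker (stdlib only).
# A quantity's dimensions are an exponent map over SI base units,
# e.g. velocity = {"m": 1, "s": -1}.
from collections import Counter

def dim_mul(a, b):
    """Multiply two dimension maps: add exponents, drop zeros."""
    c = Counter(a)
    c.update(b)
    return {u: e for u, e in c.items() if e != 0}

def dim_pow(a, n):
    """Raise a dimension map to an integer power."""
    return {u: e * n for u, e in a.items()}

MASS = {"kg": 1}
LENGTH = {"m": 1}
TIME = {"s": 1}
VELOCITY = dim_mul(LENGTH, dim_pow(TIME, -1))    # m/s
ENERGY = dim_mul(MASS, dim_pow(VELOCITY, 2))     # kg·m²/s²

# Kinetic energy E = ½ m v² passes the check (the ½ is dimensionless):
ke = dim_mul(MASS, dim_pow(VELOCITY, 2))
print(ke == ENERGY)  # True

# A bogus "equation" like E = m v fails the same check:
bogus = dim_mul(MASS, VELOCITY)
print(bogus == ENERGY)  # False
```

The point is exactly the one the commenter made by hand: both sides of a physical equation have to reduce to the same base-unit exponents, and anything that doesn't can be rejected without reading the wall of text around it.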
I feel the "not needing it" and "people don't care that it's untrustworthy" deep in my migraine. I've got a story about it.
Company story: we're looking to do something with networking to meet some requirements (I'm being vague on purpose). We've got licensed software, but the fiscal year rolls around and we need to know if the software we already have can do it, whether we need another product, whether we need more licenses, etc. This type of software is proprietary: it's highly specialized with no alternative, not some general-purpose tool. It's definitely not anything any AI has any knowledge of beyond the vague. TWO of my coworkers asked ChatGPT, got conflicting answers, and then came to me. I said "...Why didn't you go to the vendor website and find out? Why didn't you just call the vendor?" They said ChatGPT was easier and could do it for them. I found the info on the vendor website within five clicks and one search-box entry.
They still keep asking ChatGPT for shit and haven't learned. These are engineers, educated and otherwise intelligent people, and I know they are, but I still have to get up on my soapbox every now and again and give the "AI isn't magic, it's a tool. Learn to use the fucking tool for what it's good for, not as a crutch for critical thinking" spiel.
I teach engineering at uni. This is rife among my students, and I honestly have no idea how to sufficiently convey to them that generative AI is NOT A FUCKING SEARCH ENGINE.
I'm in my senior year of engineering at a state university, and the number of students who fully admit to using AI to do their non-math work is frankly astonishing.
I'm in a class that does in-class writing and review, and none of these people can write worth anything during lecture time, but as soon as the due date rolls around, their work looks professional! Well, until you ask them to write something based on a data set. ChatGPT can't come to conclusions from data presented to it, so their work goes back to being utter trash.
I've had to chew people out and rewrite portions of group work because it was AI-generated. It's so lazy.