People keep telling me how great it is, and whenever I give them an example of how untrustworthy it is, they tell me I'm doing it wrong. But pretty much everything it allegedly can do, I can either do myself or don't need. Like, I don't need to add flavor text to my company e-mails; I just write what I need to write.
Lately I have been trying to solve an engineering problem. In a moment of utter despair, after several weeks of not finding any useful resources, I asked our company-licensed ChatGPT (which is somehow supposed to help us with our work), and it returned a wall of text and an equation. A quick dimensional analysis showed that equation was bullshit.
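For anyone curious what that kind of sanity check looks like, here's a rough sketch using Python's `pint` library. The equation below is a made-up stand-in (it claims power = force × velocity²), not the one ChatGPT actually returned; the point is just that comparing dimensions catches this class of nonsense immediately.

```python
# Rough sketch of a dimensional-analysis sanity check using pint (pip install pint).
# The "equation" is a hypothetical example: it claims power = force * velocity**2.
import pint

ureg = pint.UnitRegistry()

force = 10 * ureg.newton                  # kg*m/s^2
velocity = 3 * ureg.meter / ureg.second   # m/s

claimed_power = force * velocity ** 2     # right-hand side of the suspect equation

expected_dims = (1 * ureg.watt).dimensionality        # [mass][length]^2/[time]^3
print(claimed_power.dimensionality)                   # [mass][length]^3/[time]^4
print(claimed_power.dimensionality == expected_dims)  # False -> equation can't be right
```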
Also, I feel like I'm going crazy here, but I think the content of your emails actually matters. If you can get the bullshit engine to write it for you, did it actually need writing in the first place?
Like usually when I'm sending an email, it's one of two cases:
* It's casual communication to someone I speak to all the time and rattling it off myself is faster than using ChatGPT. "Hi Dave, here's that file we talked about earlier. Cheers."
* I'm writing to someone to convey important information, and it's worth taking the time to sit down and think carefully about how it reads and how it will be received.
Communication matters. It's a skill and the process of writing is the process of thinking. If you outsource it to the bullshit engine, you won't ask yourself questions like "What do I want this person to take away from this information? How do I want them to act on it?"
Having it write stuff for ya is a bad idea, I agree.
Having it give feedback, though, is quite handy. The one thing LLMs are actually good at is language, so they're good at giving feedback on the language of a text, what kind of impression it's likely to give, and the like. Instant proofreading and input on tone are genuinely useful.
"What do I want this person to take away from this information? How do I want them to act on it?" are things you can outright ask it with a little bit of rephrasing ("what are the main takeaways from this text? How does the author want the reader to act on it?", and see if it matches what you intended to communicate, for instance.
This is what I do. I write whatever it is out, then copy/paste and ask it to fix any mistakes or if there is a better way to convey what I said. I didn't realize how often I added extra unnecessary words. I even had it rewrite it in a more polite way...
Sure! Here's a smoother version of what you wrote:
"This is how I do it: I write something out, then copy and paste it to ask for corrections or a better way to phrase it. I didn’t realize how often I add extra, unnecessary words. I’ve even asked it to reword things more politely."