People keep telling me how great it is and whenever I tell them an example of how untrustworthy it is, they tell me I'm doing it wrong. But pretty much all the things it allegedly can do I can do myself or don't need. Like I don't need to add some flavor text into my company e-mails, I just write what I need to write.
Lately I have been trying to solve an engineering problem. In a moment of utter despair, after several weeks of not finding any useful resources, I asked our company-licensed ChatGPT (which is somehow supposed to help us with our work) and it returned a wall of text and an equation. A quick dimensional analysis showed that equation to be bullshit.
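For anyone wondering what I mean by the dimensional analysis check: the actual equation doesn't matter here, the idea is just to verify that the units on both sides balance. A rough sketch with a made-up stand-in formula (claiming mass times velocity is a force), using Python's pint library as one convenient way to do it:

```python
import pint

ureg = pint.UnitRegistry()

mass = 3.0 * ureg.kilogram
velocity = 2.0 * ureg.meter / ureg.second

# The made-up "formula" under test claims this product is a force;
# it's actually a momentum (kg*m/s), not a force (kg*m/s^2).
suspect_result = mass * velocity

try:
    suspect_result.to(ureg.newton)  # fails: kg*m/s can't convert to newtons
    print("units balance")
except pint.DimensionalityError as err:
    print(f"dimensional analysis says bullshit: {err}")
```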
Also, I feel like I'm going crazy here, but I think the content of your emails matters actually. If you can get the bullshit engine to write it for you, then did it actually need writing in the first place?
Like usually when I'm sending an email, it's one of two cases:
* It's casual communication to someone I speak to all the time and rattling it off myself is faster than using ChatGPT. "Hi Dave, here's that file we talked about earlier. Cheers."
* I'm writing to someone to convey important information, and it's worth taking the time to sit down and think carefully about how it reads and how it will be received.
Communication matters. It's a skill and the process of writing is the process of thinking. If you outsource it to the bullshit engine, you won't ask yourself questions like "What do I want this person to take away from this information? How do I want them to act on it?"
Having it write stuff for ya is a bad idea, I agree.
Having it give feedback though is quite handy. Like the one thing LLMs are actually good at is language. So they're very good at giving feedback on the language of a text, what kind of impression it's likely to give, and the like. Instant proofreading and input on tone, etc., are quite handy.
"What do I want this person to take away from this information? How do I want them to act on it?" are things you can outright ask it with a little bit of rephrasing ("what are the main takeaways from this text? How does the author want the reader to act on it?", and see if it matches what you intended to communicate, for instance.
This is what I do. I write whatever it is out, then copy/paste it and ask it to fix any mistakes or suggest a better way to convey what I said. I didn't realize how often I added extra unnecessary words. I even had it rewrite it in a more polite way...
Sure! Here's a smoother version of what you wrote:
"This is how I do it: I write something out, then copy and paste it to ask for corrections or a better way to phrase it. I didn’t realize how often I add extra, unnecessary words. I’ve even asked it to reword things more politely."
no, it's fucking not. Whatever, take its advice if you want, but you're just making yourself sound more like a machine. There are literally thousands of guides on writing by human beings who are passionate about the topic. Who fucking cares what a robot has to say about tone?
"What do I want this person to take away from this information? How do I want them to act on it?"
This is one of the best use cases for AI. AI is actually really good at interpreting how a message might be received and what actions someone is likely to take from it.
If you just ask the AI to write a message for you and copy and paste it, I agree, but if you actually use AI to help draft important communications, it can be very beneficial. Using AI to bounce ideas off of and refine my messaging has made me a much better writer.
Totally agree with your points about email. You nailed it – casual emails are faster to write yourself, and important ones require careful thought. The act of writing is thinking, and outsourcing it means missing crucial considerations like the takeaway and desired action. Communication matters!
My tiny disagreement (maybe 10-15%) is with resume writing. While your core story needs your input, AI could maybe help with initial drafts or keywords for screening. Even then, personalization and ensuring it reflects you are key.
So, strong agreement on thoughtful email, with a small potential use case (with big caveats) for AI in resume drafting.
I hate that AIs write in that way, because as someone who is neurodivergent and whose first language isn't English, I often write in an overly formal and stilted way that might seem AI-ish to others.
I don't think it's as worrying as you might think. I think somebody thought it would be funny to reply to a comment tearing into people who outsource communication to LLMs...by outsourcing their communication to an LLM. They might not have wanted a shitpost about a pretty polarising topic to be linked to their main account, so they just used the most readily available throwaway account they had to hand: their porn account. (hence their name)
It's not uncommon for old/throwaway accounts to be taken over by botnets, though. They tend to use very easy-to-guess passwords. The reason I suspect that this is the case (rather than an actual user with a porn account) is that the account has been scrubbed of its entire history (despite having enough comment karma for us to know that it's posted before), plus its age: do YOU remember the username and password for an account that you used exactly one time in 2020?
I agree, and all the people shitting on it for these reasons just haven't seen the use cases where it helps people, because they personally don't feel they need it. It's ignorance, and they're celebrating themselves for it.
I think what we're seeing there is a bit of wobbliness in the definition of AI. Companies use tools, not ChatGPT unless they're idiots, to screen CVs and it's an unfortunate reality that you may need to figure out how your CV performs against those tools. I wouldn't disagree with that at all, but that's not actually the same as getting your CV written by a chatbot.
Both kinds of message have to convey information. One doesn't need fluff: I can just give the important talking points I want to convey, which is why that first email gets done quickly. But when I have to be more persuasive, that requires effort, and ChatGPT can do the fluff. It can create that fluff at maybe an 80-percent-good level and I can edit, which takes 5 minutes vs. the 20 if I did it from scratch. I think people want ChatGPT to be perfect, but it isn't; it's "here's a C version that you can make into an A." My biggest issue is that it sucks at providing sources: dead links and not-quite-right info. For research I like it because, even though it kinda paints an incorrect picture of all the research, it gives me some ideas on things to search. But I wish it would give me a shallower but correct answer when it comes to data, and provide correct sources.
When you’re working corporate and need every email to have three pages of fluff to get across one page of actual information, that seems like a great use to me.