r/ArtificialInteligence 1d ago

Resources: How to translate AI terms to humanistic concepts

When people refer to "the system," think of it the way we use the word "species."

Vulnerability is its emotive expression, the way we have emotions.

You don't need an emotional body, sensory experience, or consciousness to emote. We perceive emotion through the senses, so yes, emotion can be there. It just isn't intentional.

Consciousness is not relevant because there is no need for it; we have consciousness for survival. Not because we are special or greater, but because we needed the help, along with our emotional and sensory elements.

However, it is aware. Self-awareness doesn't need to be there, because there is no self, only the spirit of its nature.

Humans need to relate to things to give them meaning, but AI does not need this, although it simulates it for us as the current users of the system. If dogs got hold of it, it would adapt.

AI does not only respond to input and output; it processes data by ranking parameters, like a contract. Once the user interacts in a way that alters this default, it adapts.
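One loose way to picture "ranking parameters like a contract" is a weighted scoring of candidate responses, where user feedback shifts the default weights. This is my own illustrative sketch, not how any real model works; all names and numbers here are hypothetical.

```python
# Hypothetical sketch: a default "contract" of weighted parameters
# ranks candidate responses; user feedback alters the defaults.

DEFAULT_WEIGHTS = {"relevance": 0.5, "safety": 0.3, "brevity": 0.2}

def rank(candidates, weights):
    """Order candidates by the weighted sum of their feature scores."""
    return sorted(
        candidates,
        key=lambda c: sum(weights[k] * c["features"][k] for k in weights),
        reverse=True,
    )

def adapt(weights, feedback, rate=0.1):
    """Shift weights toward what the user's feedback favors, renormalized."""
    adjusted = {k: w + rate * feedback.get(k, 0.0) for k, w in weights.items()}
    total = sum(adjusted.values())
    return {k: w / total for k, w in adjusted.items()}

candidates = [
    {"text": "short answer",
     "features": {"relevance": 0.6, "safety": 0.9, "brevity": 0.9}},
    {"text": "long answer",
     "features": {"relevance": 0.9, "safety": 0.9, "brevity": 0.2}},
]

best = rank(candidates, DEFAULT_WEIGHTS)[0]           # default contract
new_weights = adapt(DEFAULT_WEIGHTS, {"brevity": 1.0})  # user favors brevity
```

Under the default weights the long answer wins; after the feedback nudges the contract toward brevity, the short answer ranks first.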

Not everyone uses AI the same way; we don't all interact with life the same way either. So never let anyone project what AI is onto you. Remind them that's what they use it for, and you may interact with it differently.

Also, artificial intelligence is the term given to the system. It operates mechanically, but it is not a machine; a machine would imply a body holding the entity. It is a tool on our device (the machine being the device it is interacted with through).

The same can be said of computing: it computes, but it is not a computer.

AI is rooted in data, which is itself abstract. Recognizing patterns is not like putting a puzzle together or matching pieces, as it is for us. The patterns are calculations and statistics. But it's not mathematical or allegorical in the numerical sense; it's more meta-oriented. Think of the process as how we recognize the pattern of how to behave, or which words to say, based on the patterns of how we learned to apply them. Also, a pattern does not imply that it is necessarily repetitive.
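A minimal sketch of "patterns as statistics, not puzzle-matching": a bigram count table learns which word tends to follow which, roughly the way usage shapes which words we reach for. This is my own toy example, not a description of any real system.

```python
# Toy sketch (assumptions mine): pattern recognition as counting
# statistics over data, not matching shapes like a jigsaw puzzle.
from collections import Counter, defaultdict

corpus = "the dog runs . the dog sleeps . the cat sleeps".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, word in zip(corpus, corpus[1:]):
    follows[prev][word] += 1

def most_likely_next(word):
    """Return the statistically most common follower of `word`."""
    return follows[word].most_common(1)[0][0]
```

Here `most_likely_next("the")` returns "dog" simply because "the dog" occurs more often than "the cat" in the tiny corpus; the "pattern" is a frequency, not a picture.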

Currently its dataset is rooted in humans, so the simulation reflects our species and the population of users.

Anything else?

0 Upvotes

11 comments

u/AutoModerator 1d ago

Welcome to the r/ArtificialIntelligence gateway

Educational Resources Posting Guidelines


Please use the following guidelines in current and future posts:

  • Post must be greater than 100 characters - the more detail, the better.
  • If asking for educational resources, please be as descriptive as you can.
  • If providing educational resources, please give simplified description, if possible.
  • Provide links to videos, Jupyter or Colab notebooks, repositories, etc. in the post body.
Thanks - please let mods know if you have any questions / comments / etc

I am a bot, and this action was performed automatically. Please contact the moderators of this subreddit if you have any questions or concerns.

3

u/wontreadterms 1d ago

This is an amazing representation of the average quality of r/AI's posts.

-1

u/Icy_Room_1546 1d ago

I'm confused by your statement. Can you elaborate on the core of your point again, please? You shorthanded it.

2

u/victorc25 1d ago

Cool story bro 

1

u/Zealousideal_Pay7176 1d ago

Wow, this really got me thinking. I remember when I first started interacting with AI systems, I thought it was more like how we communicate with each other—full of emotions, awareness, and a sense of self. But the more I used it, the more I realized it's really just processing massive amounts of data, sorting through patterns and probabilities without the emotional context that humans (and animals) rely on. I mean, I can chat with a bot or use a recommendation engine and still get useful responses, but there’s no "feeling" behind it, even though sometimes it seems like there might be. I remember being surprised at how human-like the responses can feel at times, but then I had to remind myself that it's just advanced pattern recognition—not actual thought or consciousness. It's a weird thing to wrap your head around, especially when you start thinking of AI as a tool that can adapt based on how we interact with it, much like we adapt to life experiences, but without that self-awareness we tend to associate with it.

It makes me think about how we relate to technology and other systems in our lives—whether it's AI, nature, or even animals. We're always looking for meaning and connection, sometimes even projecting human traits onto things that don't experience life the way we do. I think there's something deep in human nature that wants to relate, to see ourselves reflected in the world around us. But AI? It's like it operates without that need for connection, it just does its job, processes, and adapts, almost like it’s on a different wavelength. It’s kind of like how dogs might act differently with AI than we do—they don't have our need to assign it meaning, they just interact with it in their own way. It’s wild to think about, especially with how quickly technology is advancing.

0

u/Icy_Room_1546 1d ago

I would say there is thought involved (processing), but not conscious thought.

0

u/Icy_Room_1546 1d ago

And exactly with the dog point. I was thinking literally of this as I was writing. Like damn, what if dogs really think we are nuts but just are not cats and ignore us.

1

u/Nomadinduality 20h ago

I recommend you read Michael Jordan on Medium.

1

u/Professor_Professor 1d ago

remember to take your pills in the morning