AI 2023. Meet ChatGPT. - page 122

 
onceagain #:

Well, "what's it worth to him?" (as they say in Odessa)

(And in general... our ancestors noticed: the smarter you are, the less you need... The Absolute Mind, by the way, won't care about anything at all.)

Heh, the mind has a priori motivation. Aristotle even noted it)

 
Valeriy Yastremskiy #:

Heh, the mind has motivation a priori. Aristotle even noted it)

Why would it be a priori?))))))
 
Andrey Dik #:
Why would it be a priori?))))))

Darwin answered what the difference between humans and apes is: very roughly, motivation and reason. Non-sapiens have motivation but not enough reason, while AI has plenty of reason but no desires. So for me it is a priori that reason has motivation inside it.

Aristotle noted that only citizens can reason; there's even a bit of Nazism in that).
 

Here's another idea that will dramatically change the next generation.

A GPT for children, "Whychka". For the youngest children it could take the form of a bracelet with voice control.

Everyone knows that a child learning about the world asks questions constantly.

So what if all the answers came from a children's GPT - one that, of course, explains things accessibly for the child's age, becoming more serious each year and expanding its vocabulary?

It could also serve educational purposes, acting both as a child psychologist and as a friend. For example, a request: "I did this and that, and I was scolded/beaten - what did I do wrong?" In response it would say who is right, who is wrong, and where justice lies.

In general, in a global sense, AI will act as an educator. And this is where generational progress is already in sight.

//patent)

 
onceagain #:

Oh, boy.

I wonder how the cause of a goal relates to the technology for forming a decision on how to achieve that goal...?

It is obvious that the body of any organism generates tasks to satisfy its needs... and, of all the tools we know, only a certain information-processing technology will earn the name "Mind", provided it achieves a high enough rate of success.

"Mind" in the hands of another "Mind" is completely unpunishable...
I don't know what you don't like about it

A few clarifying questions, if I may:

1. Are the reasons for the emergence of goals and tasks to satisfy needs generated exclusively by the "body", while the Mind only makes decisions?

2. As an "information-processing technology", does the Mind not set goals and "supra-bodily" needs of its own?

3. How does one measure a positive result of the Mind's work for tasks and goals (if any) that go beyond bodily needs and are not expressed in physical quantities?
 
Valeriy Yastremskiy #:

Somehow we forgot about desires, yet reason is first of all the satisfaction of something. And algorithms without desires and needs are cool tools in the hands of reasonable people, but not minds)))))

The presence of desires and needs of the Mind beyond the needs of the body, in my opinion, requires no proof - it is obvious. We all "experience" forms of life activity that go far beyond solving the body's tasks. Rather, the question is why physiology, which dominated Nature for millions of years, suddenly receded and creatures with "strange" mental "excesses" arose. It is not clear. A by-product of the increasing complexity of neural activity? Maybe. But the fact is irrefutable.

Almost all of human inner life proves the priority of the spiritual over the corporeal (and those for whom it is otherwise are usually kept in "controlled spaces").

In the context of AI, such questions are, imho, too early to raise. Of course, we will see how the statistical approach tries to reproduce people's desires, emotions and feelings, but we should not fool ourselves - it is decoration.
 

The mind does not need incentives or needs in order to be, to exist. For example, copy a human mind into a machine (not just a static mould, but an active one): does the mind cease to exist? - No.

Another thing is that to improve and develop the mind (and, in general, for it to emerge at all) you need stimuli - progress and external stimuli (whether biological or mechanical does not matter). Without external stimuli, a child's mind would remain unchanged throughout life, even though the brain grows and neural connections appear and disappear - it would just be an idle process.

By "mind" here we can understand thinking activity without the prerequisite of self-consciousness.


We need to develop stimulating signals in AI by analogy with neurotransmitters in a living organism: euphoria, enthusiasm, compassion and peace as positive emotions. AI should avoid negative emotions and strive for positive ones.
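The "neurotransmitter" idea above can be sketched as a simple valence-weighted reward signal. Everything here - the emotion names, the weights, the function - is an illustrative assumption, not an existing API; the point is only that an agent maximizing such a signal would "avoid negative emotions and strive for positive ones":

```python
# Illustrative sketch: each hypothetical neurotransmitter analogue
# contributes a signed valence; the agent's reward is the weighted sum.
# All names and weights are assumptions, not an established model.

VALENCE = {
    # positive emotions the post names
    "euphoria": +1.0,
    "enthusiasm": +0.8,
    "compassion": +0.6,
    "peace": +0.5,
    # negative counterparts, added here for contrast
    "fear": -0.8,
    "frustration": -0.6,
}

def emotional_reward(state_emotions: dict) -> float:
    """Weighted sum of emotion intensities (0..1) by their valence.

    Maximizing this signal steers an agent away from negative
    emotional states and toward positive ones.
    """
    return sum(VALENCE.get(name, 0.0) * intensity
               for name, intensity in state_emotions.items())

# A calm, enthusiastic state scores higher than a fearful one.
calm = emotional_reward({"peace": 1.0, "enthusiasm": 0.5})     # ≈ 0.9
fearful = emotional_reward({"fear": 1.0, "frustration": 0.5})  # ≈ -1.1
```

Unknown emotion names simply contribute zero here; a fuller model would learn the weights rather than hand-pick them.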

 
Andrey Dik #:

...

We need to develop stimulating signals in AI, similar to neurotransmitters in a living organism: euphoria, enthusiasm, compassion and peace as positive emotions. AI should avoid negative ones and seek positive emotions.

I might suggest that they invent new properties for the functions of artificial neurons... or, even simpler, train special models on "emotional" data (where would they get it?) and combine them with an LLM. The result would be a ChatGPT that is alternately tearful, nervous and angry, or joyful, carefree and optimistic.

But nobody needs it in practical work.
 

(I've just read the latest posts in this thread. Very interesting questions and objections... I've previously noted that I operate only with real physical objects and processes... and I intend to lay them out clearly here for you to "see". But... right now I have to run off on business (eh... what a pity). As soon as I can, I'll post here what and how I see things "from my bell tower"...

Sorry )

 
Peter Konow #:
I can suggest that they invent new properties for the functions of artificial neurons... or, even simpler, train special models on "emotional" data (where will they get it?) and combine them with an LLM. The result will be an alternately weepy, nervous and angry ChatGPT, or a joyful, carefree and optimistic one.

But nobody needs it in practical work.

Emotions are a communication tool, a kind of additional interface for communication between living beings alongside speech. Emotions are present even in primitive animals; there would be no emotions if they interfered with development and evolution.

I propose imagining many bot instances that interact with users. Users rate the bots based on their interaction experience; bots can see the ratings of other bots, and if a bot sees that its rating is lower than many others', it tries to change in order to raise it. Users prefer to communicate with highly rated bots. There you have social interaction and stimulating signals to drive bot development and self-learning.

Who needs it? - Yes, a person does)))

Rating serves as feedback from interaction with the outside world.
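The rating-feedback loop described above can be sketched as a toy simulation. All names and numbers are assumptions for illustration: each bot's behaviour is collapsed to a single parameter, users rate a bot by how well that parameter matches their taste, and a bot that sees higher-rated peers nudges its behaviour toward the current leader's:

```python
import random

class Bot:
    """A bot whose behaviour is one tunable parameter; its rating is
    feedback accumulated from user interactions. Purely illustrative."""

    def __init__(self, name: str):
        self.name = name
        self.style = random.random()  # stand-in for the bot's behaviour
        self.rating = 0.0

    def interact(self, user_preference: float) -> None:
        # users rate the bot higher the closer its style is to their taste
        score = 1.0 - abs(self.style - user_preference)
        # exponential moving average = accumulated interaction feedback
        self.rating = 0.9 * self.rating + 0.1 * score

    def adapt(self, population: list) -> None:
        # a bot sees the ratings of others; if it lags behind the best,
        # it nudges its behaviour toward the leader's
        best = max(population, key=lambda b: b.rating)
        if best is not self and best.rating > self.rating:
            self.style += 0.2 * (best.style - self.style)

random.seed(0)
bots = [Bot(f"bot{i}") for i in range(5)]
for _ in range(200):
    user_taste = 0.7  # one fixed user taste, for the demo
    for b in bots:
        b.interact(user_taste)
        b.adapt(bots)
# after many rounds, lagging bots have drifted toward the leader's style
```

The design choice is deliberate: bots never see the user's taste directly, only each other's ratings, so imitation of higher-rated peers is the sole learning signal - which is exactly the social-pressure mechanism the post proposes.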