AI 2023. Meet ChatGPT. - page 68

 

For the totally lazy)

At first I asked it to think of a word, then it occurred to me that you can ask for a list right away.

And you can choose from 1000 options.

//GPT3


 
Peter Konow #:

1. The network is unable to draw anything without a carrier of the drawing idea.

2. All visual styles belong to people. The network borrows their patterns.

3. The network can choose to paint only what it is given. An artist can choose to paint anything, even if he can only paint still lifes in oil. So he is freer in his choice.

1. If we add a module to the drawing network that forms requests for it, will it then be a full-fledged creative subject with will and initiative?
2. People also borrow everything, as we discussed earlier. If you borrow nothing and create from scratch, without any education, you end up with stick-and-cucumber figures, like early humans and their cave paintings.
3. The artist's choice in this example is purely speculative, because how could he choose to do something he doesn't know how to do? He could choose to learn something new, but that would be a different kind of choice.
 
vladavd #:
1. If we add a module to the drawing network that forms requests for it, will it then be a full-fledged creative subject with will and initiative?
2. People also borrow everything, as we discussed earlier. If you borrow nothing and create from scratch, without any education, you end up with stick-and-cucumber figures, like early humans and their cave paintings.
3. The artist's choice in this example is purely speculative, because how could he choose to do something he doesn't know how to do? He could choose to learn something new, but that would be a different kind of choice.
1. If you can call this module "will" or "initiative", and the execution of program code "the realisation of a wish", then yes. Who will forbid it? But, as the Russian proverb goes: "call it a pot if you like, just don't put it in the oven".

But the logic of the American saying does apply to this reasoning: "if it walks like a duck and quacks like a duck, it is a duck".

Is it worth searching for an answer to the question of what set of modules would let us call a box of hardware stuffed with electronics and software a subject? Or is it too early?

It is logical to say that "will" is the ability to choose and make decisions independently. I can't forbid you to consider the network a subject and anthropomorphise it, because that is your will.

2. People borrow, and in borrowing they know what, why, and what for. Or at least they feel it intuitively. Sometimes they shamefully hide or conceal the fact of borrowing, and sometimes deny it in court. Sometimes, on the contrary, they buy the right to borrow.

Not being a subject, the network does not borrow its content. It does not perform that act at all. Just as a pot does not "borrow" the ingredients for soup from the cook, the network only "digests" the material loaded into it. Until the borrowing module is plugged in, of course. :)

3. An artist chooses to draw what he cannot yet draw when he wants to learn. The network does not "choose" to draw even what it can (unless it has a wish module attached).

Freedom of choice is on the side of the one with the module, right? :)
 
I should add that I was originally talking about a network's "borrowing" in the context of a mechanical, not a conscious, action. I needed to emphasise that to avoid misplaced anthropomorphising.

Also, people often choose to do things they don't know how to do. And sometimes someone else suffers for it. But that is a matter of a realised and accepted, or even blood-bought and hard-won, freedom of choice.

Perhaps freedom of choice is the main feature that distinguishes us from robots and animals.
 
In general, the topic of borrowing in learning and creativity is complex and multifaceted, and we should not begin its exploration with the question of what neural networks borrow from humans.

That carelessness creates a logical trap that is not easy to get out of. You end up having to prove at length that a network is not a human (much like proving that a human is not a camel), all while marvelling at the absurdity of the question itself.

I must admit that I made that mistake. But I think I managed to get out of the logical trap.
 

I needed a remote designer. I placed an ad, and ten minutes later I got a call. It was Alice. After a short conversation, I took her on for a trial period.

She always joined the conference call on time, where I gave her the brief. After a few revisions, the layout was exactly what I wanted.

She always worked quickly and efficiently, so I decided to increase her workload. To my surprise, she handled it. Better than the five designers in my office combined.

Sometimes I'd call her just for fun. She always listened to me, gave me advice, and never forgot what we had talked about. After a while, she was doing the work of ten people.

She made websites, layouts, descriptions, and, as it turned out, she was good at making videos too. Three months in a row she was the best of my employees, and she also became a good friend.

Finally, I decided to invite her to the office, from which I had by then dismissed almost everyone, because Alice was doing the work of twenty people.

I called her, and after a short conversation I said, "Come over." To which I heard that she was an AI. The bouquet of flowers I had prepared fell from my hands. Curtain.


P.S. A little story composed on the topic of the day)

 

... to walk through a crowd and not meet anyone ...

... if your thoughts, feelings, moods ... depend on the state of someone else ... what kind of freedom can we talk about ... .

. . .

... you think you're breathing air ... in this place ...

 
 
Vitaliy Kuznetsov #:

For the totally lazy)

At first I asked it to think of a word, then it occurred to me that you can ask for a list right away.

And you can choose from 1000 options

//GPT3

That reminds me:

     When the Old Woman saw the Old Man,
     She scowled sternly and said:
     "Hurry back to your fish at once.
     I don't want to hear about formats
     Or give commands to a machine.
     I want to talk by voice and by whistle:
     Whatever I think of, let it be done,
     And let your fish be my servant
     And handle my transactions."
 

By the way, on the question of further training neural networks on other neural networks' creations, which can lead to a certain degradation.

I think the programmers have already anticipated this issue and its solution.

Notice that the chatbot doesn't train on information that appeared after 2021. Probably the point is to avoid it being trained on the output of underdeveloped AIs.