AI 2023. Meet ChatGPT.

 
Vitaliy Kuznetsov #:

That's right. You need the language model to understand the request and its emotional colouring, and I think it does a good job of that. I haven't seen a case where it didn't understand what I was talking about. And this language model will be used for plug-ins and internet access. So we are still at an early stage of AI.

At the same time, it is capable of dialogue and could perhaps even take on some of the role of a psychologist.

Therein lies the paradox.

Connecting plug-ins turns the model from "early-stage AI" into a language interface to a multitude of programs. By taking this step, the developers stop developing artificial intelligence technology. Instead, they move on to developing linguistic-interface technology.

Plugins are a dead end for developing a language model into an AGI.

Once a model becomes an interface, it will remain an interface forever.

If OpenAI has gone down this route, they have exhausted the potential of the technology.

 
Peter Konow #:

***

If OpenAI has gone down this route, they have exhausted the potential of the technology.

Or maybe they decided to save time? They will create more plugins, study their popularity and demand, and make progress. What they can reproduce themselves, they will reproduce; the rest they will buy out.

For example, DALL-E could have been treated as a separate product or attached to GPT-4 as a plugin, but in the end they combined them (at least in Bing).
 
Vitaliy Kuznetsov #:

Or maybe they decided to save time? They will create more plugins, study their popularity and demand, and make progress. What they can reproduce themselves, they will reproduce; the rest they will buy out.

For example, DALL-E could have been treated as a separate product or attached to GPT-4 as a plugin, but in the end they combined them (at least in Bing).

The most valuable thing about a language model is the result given to the user (not the dialogue, as one might think).

If the model can't calculate the result itself and calls a third-party application (via a plugin), the model becomes dependent on that application.

Plug-in applications are proprietary and owned by other companies. These companies can create their own language models and attach them as an additional linguistic interface alongside their main graphical one. Or they can move away from GPT-4 and plug into LLaMA, for example, or into the LLM of another IT giant.

By becoming an interface, the model loses its exclusivity and drops out of the AI race. Now, as a linguistic interface, it can be replaced by any other LLM.

I think this marks the end of OpenAI's role in the development of AGI.

That's not surprising, given that they are not the authors of the Transformer architecture. They have reached its limit and are changing direction towards immediate successes rather than global gains. They have no other option.
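Purely as an illustration of the "replaceable interface" argument above, here is a minimal sketch of the division of labour being described. Everything in it is an assumption made for the sake of the example: the CALL convention, the tool registry and the FakeLLM stand-in are hypothetical and are not OpenAI's actual plugin protocol. What it shows is that once the system is organised as a language front-end plus external tools, the LLM itself is just one swappable component behind a common interface.

from dataclasses import dataclass
from typing import Callable, Dict, Optional, Protocol


class LanguageModel(Protocol):
    """Any LLM that can turn a user request into either a direct answer or a tool call."""
    def complete(self, prompt: str) -> str: ...


@dataclass
class ToolCall:
    tool: str
    argument: str


# Registry of external applications exposed as plugins. Each one is owned by a
# third party; the language model only routes requests to it.
TOOLS: Dict[str, Callable[[str], str]] = {
    "calculator": lambda expr: str(eval(expr, {"__builtins__": {}})),  # stand-in for a maths service
}


def parse_tool_call(model_output: str) -> Optional[ToolCall]:
    # Crude convention for this sketch only: the model answers "CALL <tool>: <argument>"
    # when it cannot compute the result itself.
    if model_output.startswith("CALL "):
        tool, _, arg = model_output[5:].partition(":")
        return ToolCall(tool=tool.strip(), argument=arg.strip())
    return None


def answer(request: str, llm: LanguageModel) -> str:
    # The "linguistic interface" loop: the model understands the request,
    # but the actual result may come from an external, proprietary application.
    output = llm.complete(request)
    call = parse_tool_call(output)
    if call is None:
        return output  # the model produced the result itself
    result = TOOLS[call.tool](call.argument)  # the result comes from the plugin
    return llm.complete(f"User asked: {request}\nTool returned: {result}\nAnswer the user.")


class FakeLLM:
    """Toy stand-in so the sketch runs without any external service or vendor SDK."""
    def complete(self, prompt: str) -> str:
        if "Tool returned:" in prompt:
            return "The result is " + prompt.split("Tool returned:")[1].split("\n")[0].strip()
        if any(ch.isdigit() for ch in prompt):
            return f"CALL calculator: {prompt}"
        return "I can answer that myself."


print(answer("2+2", FakeLLM()))  # prints: The result is 4

Note that answer() accepts any object satisfying LanguageModel, whether GPT-4, LLaMA or an LLM from another vendor, which is exactly the interchangeability described above.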

 
Peter Konow #:

***

If the model can't calculate the result itself and calls a third-party application (via a plugin), the model becomes dependent on that application.

***

Maybe it's just a matter of monetisation. There is nothing stopping them from building all this into the basic version except money and time.

 
Vitaliy Kuznetsov #:

Maybe it's just a matter of monetisation. There is nothing stopping them from building all this into the basic version except money and time.

For example, how can you build Wolfram Alpha into GPT-4? Unrealistic. The number of applications is huge. It is impossible to "build" their functionality into the model. You can only plug them in from the outside.

Applications are proprietary, i.e. belong to other companies.
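In practice, "plugging in from the outside" means the model's runtime makes an ordinary network call to a service it does not own. Below is a minimal sketch assuming Wolfram|Alpha's public Short Answers API; the endpoint URL and the appid/i parameter names are written from memory, and the APP_ID value is a placeholder, so treat the details as assumptions rather than a definitive integration.

from urllib.parse import urlencode
from urllib.request import urlopen

WOLFRAM_SHORT_ANSWERS = "https://api.wolframalpha.com/v1/result"  # assumed endpoint
APP_ID = "YOUR-WOLFRAM-APPID"  # placeholder key issued by Wolfram, not by the LLM vendor


def wolfram_plugin(query: str) -> str:
    # Tool function the language model can route a request to.
    # All the mathematical capability stays on Wolfram's side of this call.
    url = WOLFRAM_SHORT_ANSWERS + "?" + urlencode({"appid": APP_ID, "i": query})
    with urlopen(url, timeout=10) as response:
        return response.read().decode("utf-8")

The model never absorbs the application's functionality: it only formulates the query string and interprets the returned text, which is exactly the dependence being discussed here.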

 
Peter Konow #:

***

Applications are proprietary, i.e. owned by other companies.

Until they're bought

 
Peter Konow #:

How do you build Wolfram Alpha into GPT-4? Unrealistic. The number of applications is huge. It is impossible to "build" their functionality into the model. You can only plug them in from the outside.

Applications are proprietary, i.e. belong to other companies.

What are we smoking?

Actually, it's exactly the other way round: Wolfram's speciality is natural-language queries. There is something like a GPT (trained on near-mathematical text) inside it, not the other way round.

Everything you need is already built in.

 
Vitaliy Kuznetsov #:

Until they're bought

Let's say, hypothetically, that these applications get bought and their source code starts being built into the language model. What prevents competitors (Google, for example) from doing the same? How will OpenAI beat them (even with Microsoft's money)? They had a temporary technological advantage, but they are losing it every day, because they are creating a reproducible technology and letting others catch up.

This is an objective process and OpenAI hardly has any alternatives. Soon similar linguistic interfaces with plug-ins from other companies will be put on the market, and OpenAI will get lost among them.

 
Maxim Kuznetsov #:

What are we smoking?

Actually, it's exactly the other way round: Wolfram's speciality is natural-language queries. There is something like a GPT (trained on near-mathematical text) inside it, not the other way round.

Everything you need is already built in.

I'm getting lost between the two "other way rounds". ) I didn't quite follow the thought.

The point is that you can't build one into the other. You can only connect them.

 
Peter Konow #:

I wanted to discuss this topic in more detail. There's a lot to think about.

...

1. As work with information speeds up, the processes that depend on the availability of information will also accelerate. The efficiency and productivity of scientists, engineers, programmers, designers and artists will increase. Students will learn faster because they will immediately receive information in the right context, with graphs and tables.

2. I think people will not lose their jobs en masse (yet), but their work will become much easier.

3. We are moving into a new stage of acceleration of world processes.

1. Is work with information accelerating? The availability of information in the "query and learn" sense does become greater when you compare a semantic search engine (as represented by the language models under discussion) with traditional keyword-indexed search; a toy sketch of that contrast follows at the end of this reply. But when it comes to availability in the sense of "query, learn and apply in a real task", i.e. practically useful knowledge, I don't see any qualitative leap. The limiting factor here is not the search engine at all, but man himself, with his dependence on the biochemical nature of thinking, memory and mind. In my opinion, the availability of information already far exceeds the individual's ability to process it (taking into account the volume, the noise, the false information and other obstacles, such as paid subscriptions to scientific databases).

To master something to a practically useful level, it is not enough just to hear an answer from a search engine (no matter how advanced or impressive it may be); you need to make an effort to grasp the real nuances, to develop skills and mastery, and so on. In other words, specialists do not need AI assistants, since they have already mastered their field, while beginners are still unable to solve real tasks at a specialist's level on the basis of answers from a search engine. As an example, recall the practice of some beginners who "program" from answers on Stack Overflow. It is of little use, even though the answers are readily available. And increasing the availability of such ready-made answers does not make the user any more capable of programming. To become an expert, you have to train your mind by solving real problems in your field, and do it yourself.

2. The time has come when specialists in some fields start competing not only with each other (which is not easy even now), but also with a cheap generator of their product. I don't think it will get any easier for these specialists. What good does it do them that technology makes their work easier, if they are given less and less of that work?

3. If we think carefully about which world processes are at work here, it may turn out that this acceleration, if it happens, is not in the interests of the Earth's population at all. However, I think it is better to postpone analysing the actors behind these world processes.
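Purely as a toy illustration of the contrast mentioned in point 1 (keyword-indexed versus semantic search), here is a small self-contained sketch. The "embedding" in it is a deliberately crude character-trigram count, chosen only so the example runs without any model; real semantic search uses learned sentence embeddings, so treat every detail here as an illustrative assumption.

from collections import Counter
from math import sqrt
from typing import List

DOCS = [
    "How to normalise a database schema",
    "Removing redundancy from relational tables",
    "Grilling vegetables on an open fire",
]


def keyword_search(query: str, docs: List[str]) -> List[str]:
    # Traditional index-style match: a document is returned only if it shares
    # a literal token with the query.
    q = set(query.lower().split())
    return [d for d in docs if q & set(d.lower().split())]


def embed(text: str) -> Counter:
    # Toy "embedding": character-trigram counts, standing in for a learned vector.
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def semantic_search(query: str, docs: List[str]) -> List[str]:
    # Rank every document by closeness in the embedding space,
    # rather than by exact token overlap.
    qv = embed(query)
    return sorted(docs, key=lambda d: cosine(qv, embed(d)), reverse=True)


print(keyword_search("redundancy in tables", DOCS))   # literal-overlap hits only
print(semantic_search("redundancy in tables", DOCS))  # all documents, ranked by similarity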