AI 2023. Meet ChatGPT. - page 147

 
Peter Konow #:
Weigh the volume of information in the sphere of Mind, and compare it with the volume of information related to physical matter.

Matter is information-poor. And now, how much information has the human intellect created over the course of civilisation's development?

So, who is the source of information?

information in a non-living system, as in computer science, is interpreted as a reduction of uncertainty (a measure of the orderliness of the system)

a simple example with increasing entropy in a closed environment (a vessel with gas).

This is probably the 5th time I've written this

the source of information is the material interaction of open systems.

It's like you've got Google turned off. Or you just like to encapsulate yourself in your ideas, losing information.
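For reference, the Shannon-style notion Maxim is using can be sketched in a few lines of Python (a minimal illustration with made-up coin-toss sequences, not anything from the thread): entropy quantifies uncertainty, and "information" is the reduction of that uncertainty when an outcome becomes known.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over observed symbol frequencies."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A fair coin: maximum uncertainty, about 1 bit per toss.
print(shannon_entropy("HTHTHHTTHT"))   # ~1.0 bit

# A heavily biased coin: less uncertainty, so each observed toss carries less information.
print(shannon_entropy("HHHHHHHHHT"))   # ~0.47 bit

# Reading "information" as the reduction of uncertainty: learning the fair coin's
# outcome removes ~1 bit of uncertainty; learning the biased coin's removes less.
```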
 
lynxntech #:

can you summarise the result point by point


Can I summarise?
 
Maxim Dmitrievsky #:

information in a non-living system, as in computer science, is interpreted as a reduction of uncertainty (a measure of the orderliness of the system)

a simple example with increasing entropy in a closed environment (a vessel with gas)

This is probably the 5th time I've written this

the source of information is the material interaction of open systems.

It's like you've got Google turned off. Or you just like to encapsulate yourself in your ideas, losing information.
on the contrary, the more information on a medium, the greater the entropy.

A disc that has been burned has more entropy than a blank disc because writing information to a disc introduces some degree of clutter into the data, and therefore increases the entropy of the system. When data is written to a disc, the magnetic field on its surface changes, creating groups of ordered and disordered zones of data that will be read and interpreted later. In addition, when data is written to a disc, errors occur that can be corrected by adding additional bits of information, which also increases entropy.
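As a rough numerical check of the disc claim (simulated bit streams only, not a real disc), the per-bit Shannon entropy of an all-zero "blank" medium is 0, while written data that looks statistically random approaches 1 bit per bit; this also anticipates the later point about a gig of ones versus a gig of mixed ones and zeros.

```python
import math
import random
from collections import Counter

def bit_entropy(bits):
    """Shannon entropy per bit of a 0/1 sequence."""
    counts = Counter(bits)
    n = len(bits)
    if len(counts) < 2:
        return 0.0          # a single repeated symbol carries no surprise
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

random.seed(42)
blank_disc   = [0] * 100_000                                   # unwritten medium: all zeros
written_disc = [random.randint(0, 1) for _ in range(100_000)]  # compressed/encrypted data looks random

print(bit_entropy(blank_disc))    # 0.0  -> fully ordered, no surprise per bit
print(bit_entropy(written_disc))  # ~1.0 -> close to maximal surprise per bit
```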

 
Maxim Dmitrievsky #:

information in a non-living system, as in computer science, is interpreted as a reduction of uncertainty (a measure of the orderliness of the system)

a simple example with increasing entropy in a closed environment (a vessel with gas)

This is probably the 5th time I've written this

the source of information is the material interaction of open systems.

It's like you've got Google turned off. Or you just like to encapsulate yourself in your ideas, losing information.

Google is evil)))) That is a different "information" then; the words sound the same, but what you have is entropy, a measure of orderliness, which someone for some reason rather unsuccessfully called information. What does it have to do with the information Peter is talking about? It doesn't correlate with a measure of orderliness at all. Besides, I just discussed this with my son; he graduated five years ago and is familiar with the topic, yet he was also unaware that the term "information" is used for the physical characteristics of gases or other substances. And he's on familiar terms with sopromat (strength of materials))))) I wasn't taught it either, although entropy itself was drilled into my head properly))))

 

Besides, more entropy means more different states (that's where our schooling ended), and therefore more information, as I understand you. And in Peter's sense of information, a gig of ones and a gig of mixed ones and zeros weigh the same, but their entropy will be different))).

Information as a measure of entropy in computer science is not what Peter is talking about. The concept of entropy came from physics, where one distinguishes between calm and changing states (stable or changing pressure corresponds to a different number of states of the molecules inside), and it was carried over into another field of science, informatics, where it began to be counted explicitly. But in fact entropy in computer science and in physics are somewhat different things. In physics it is a derived, calculated quantity: entropy is not measured, it is computed from the classical measured properties of substances, such as temperature, pressure, and mass. You can measure the force of pressure, but the stress inside a metal can only be calculated, just like the change in entropy of a gas when it expands.
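To make the distinction concrete, here is a small sketch using textbook formulas (nothing specific to this thread): the thermodynamic entropy change of an ideal gas is calculated from measured quantities such as the amount of substance and the volumes, while Shannon entropy is counted over the probabilities of discrete states; the two share the logarithmic form but belong to different disciplines.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def delta_S_isothermal(n_moles, V1, V2):
    """Entropy change for n moles of ideal gas expanding isothermally from V1 to V2:
    dS = n * R * ln(V2 / V1). Calculated from measured properties, not measured directly."""
    return n_moles * R * math.log(V2 / V1)

def shannon_entropy(probs):
    """Information-theoretic entropy in bits over a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 1 mol of ideal gas doubling its volume at constant temperature:
print(delta_S_isothermal(1.0, 1.0, 2.0))   # ~5.76 J/K

# A system with 8 equally likely states (e.g. a fair 8-sided die):
print(shannon_entropy([1/8] * 8))          # 3.0 bits
```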

 
Andrey Dik #:
conversely, the more information on the medium, the greater the entropy.

A disc that has been written to has a higher entropy than a blank disc because writing information to a disc introduces some degree of disorder into the data and therefore increases the entropy of the system. When data is written to a disc, the magnetic field on its surface changes, creating groups of ordered and disordered zones of data that will be read and interpreted later. In addition, when data is written to a disc, errors occur that can be corrected by adding more bits of information, which also increases entropy.

What you've copied there isn't entropy.
 
Valeriy Yastremskiy #:

Google is evil)))) That is a different "information" then; the words sound the same, but what you have is entropy, a measure of orderliness, which someone for some reason rather unsuccessfully called information. What does it have to do with the information Peter is talking about? It doesn't correlate with a measure of orderliness at all. Besides, I just discussed this with my son; he graduated five years ago and is familiar with the topic, yet he was also unaware that the term "information" is used for the physical characteristics of gases or other substances. And he's on familiar terms with sopromat (strength of materials))))) I wasn't taught it either, although entropy itself was drilled into my head properly))))

Informatics is taught in the 8th grade nowadays, including entropy. So much for "it wasn't taught" :)) It's a basic concept of information theory. It's even used in ML metrics.

Everything ties together very nicely there.

I didn't study at any physics department either, hello?
Google is good if you know how to use it.

 
Valeriy Yastremskiy #:
Besides, more entropy means more different states (that's where our schooling ended), and therefore more information, as I understand you. And in Peter's sense of information, a gig of ones and a gig of mixed ones and zeros weigh the same, but their entropy will be different))).
You're so stupid. The greater the entropy, the greater the loss of information. That's it, I'm not talking to you :) People are too lazy even to read.
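One charitable way to read "the greater the entropy, the greater the loss of information" is through conditional entropy on a noisy channel: the more uncertainty the noise injects, the less of the original message gets through. A minimal sketch with a textbook binary symmetric channel (an illustration, not something stated in the thread):

```python
import math

def h2(p):
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Binary symmetric channel: a uniformly random bit X is flipped with probability e.
# The residual uncertainty about X after seeing the output Y is H(X|Y) = h2(e),
# i.e. the information about X lost in transmission; I(X;Y) = 1 - h2(e) is delivered.
for e in (0.0, 0.1, 0.25, 0.5):
    lost = h2(e)
    delivered = 1.0 - lost
    print(f"flip prob {e:.2f}: lost {lost:.3f} bit, delivered {delivered:.3f} bit")
```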
 
Maxim Dmitrievsky #:
You're so fucking stupid. The greater the entropy, the greater the loss of information. That's it, I'm not talking to you :))

Nah, then they didn't finish teaching me)))))) I didn't find entropy in 8th-grade computer science)))))) Okay, in physics then it turns out the other way round. The measure of disorder is the number of different states, i.e. entropy. 0 is a calm state where the elements of the system are all alike; 1 is a turbulent state where there are too many variants of states to estimate them. And it is not infinitely many; it's just that the states of the elements change chaotically relative to each other.

And so yes, at entropy 1 we have zero exact information about the state of the matter and its elements: we have no way to calculate or estimate the states of the elements inside.
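Valeriy's 0-to-1 scale corresponds to entropy normalized by its maximum for the number of observed states; a small sketch with made-up symbol sequences shows that at the normalized maximum, guessing the state of any individual element is pure chance.

```python
import math
from collections import Counter

def normalized_entropy(symbols):
    """Shannon entropy scaled to [0, 1]: 0 = a single repeated state ('calm'),
    1 = all observed states equally likely (maximal disorder)."""
    counts = Counter(symbols)
    n = len(symbols)
    k = len(counts)
    if k < 2:
        return 0.0
    H = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return H / math.log2(k)

print(normalized_entropy("AAAAAAAA"))  # 0.0 -> the state of every element is known
print(normalized_entropy("AABBBBAA"))  # 1.0 -> two states, equally likely
print(normalized_entropy("ABCDABCD"))  # 1.0 -> four states, equally likely: no basis
                                       #        for predicting any single element
```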

 

Any tool is good if you know how to use it, and evil if you don't)))))

Well, the measure of orderliness, the ability to count these states, entropy as a measure of the number of states, and the measure of loss of information about the state of an object are all valid concepts, but from different sciences. In a discussion, it's better not to assume that everyone went to the same school))))))

We seem to agree: at entropy 1, the loss of information about the object is 100%)))).