Weigh the volume of information in the sphere of Mind, and compare it with the volume of information related to physical matter.
Information in a non-living system, as in computer science, is interpreted as a reduction of uncertainty (a measure of the orderliness of the system); a sketch of this follows these points.
A simple example: increasing entropy in a closed environment (a vessel with gas).
This is probably the 5th time I've written this.
The source of information is the material interaction of open systems.
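A minimal sketch of the first point, information as a reduction of uncertainty in Shannon's sense; the distributions below are invented purely for illustration:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Prior: four equally likely states of the system -> 2 bits of uncertainty.
prior = [0.25, 0.25, 0.25, 0.25]

# A measurement narrows it down to two equally likely states -> 1 bit left.
posterior = [0.5, 0.5]

# The information gained is the reduction of uncertainty.
print(shannon_entropy(prior) - shannon_entropy(posterior))  # 1.0 bit
```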
It's like you've got Google turned off. Or you just like to encapsulate yourself in your ideas, losing information.
Can you summarise the result point by point?
A disc that has been written to has higher entropy than a blank disc, because writing information to the disc introduces a certain degree of disorder into the data and therefore increases the entropy of the system. When data is written to a disc, the magnetic field on its surface changes, creating groups of ordered and disordered zones that will later be read and interpreted. In addition, errors occur during writing that can be corrected by adding extra bits of information, which also increases entropy.
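A toy illustration of the "errors can be corrected by adding additional bits" part. Real discs typically rely on Reed-Solomon codes; the classic Hamming(7,4) code below is only a minimal stand-in showing how 3 added parity bits let a single flipped bit be repaired:

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword.
    The 3 extra parity bits are the redundancy that makes correction possible."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    """Return the 4 data bits, correcting at most one flipped bit in the codeword."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s3 * 4 + s2 * 2 + s1      # 0 means no error, otherwise the 1-based bit position
    if pos:
        c[pos - 1] ^= 1             # flip the corrupted bit back
    return [c[2], c[4], c[5], c[6]]

word = hamming74_encode([1, 0, 1, 1])
word[5] ^= 1                        # simulate a single-bit write error
print(hamming74_decode(word))       # [1, 0, 1, 1] -- the original data is recovered
```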
Google is evil )))) Then that is a different kind of information: the words sound the same, but what you have there is entropy, a measure of orderliness, which someone for some reason rather unsuccessfully named "information". What does it have to do with the information Peter is talking about? That information does not correlate with a measure of orderliness at all. Besides, I have just discussed this with my son, and he is not aware either that the term "information" is used for the physical characteristics of gases or other substances, even though he graduated 5 years ago and knows the subject. As for strength of materials (sopromat), he is on his own ))))) I was not taught this either, although entropy was drummed into my head properly )))))
Besides, the greater the entropy, and therefore the more distinct states (that is where our schooling ended), the more information there is, as I understand you. And in Peter's sense of information, a gig of ones and a gig of mixed ones and zeros weigh the same, yet their entropy will be different))).
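To make the "a gig of ones vs. a gig of ones and zeros" comparison concrete, a small sketch scaled down to a few kilobytes (the sizes and data are invented for the example): both buffers occupy the same space on the medium, but their Shannon entropy differs.

```python
import math
import random
from collections import Counter

def entropy_per_byte(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte."""
    counts = Counter(data)
    n = len(data)
    return sum(c / n * math.log2(n / c) for c in counts.values())

size = 4096                                                  # stand-in for the "gig"
all_ones = b"\xff" * size                                    # nothing but ones
mixed = bytes(random.getrandbits(8) for _ in range(size))    # random ones and zeros

print(len(all_ones) == len(mixed))      # True: both weigh the same on the medium
print(entropy_per_byte(all_ones))       # 0.0 bits/byte
print(entropy_per_byte(mixed))          # close to 8 bits/byte
```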
Information as a measure of entropy in computer science is not what Peter is talking about. That concept of entropy comes from physics, where one distinguishes between calm and changing states (pressure is stable or it changes, the molecules inside have a different number of states), and it was carried over into another field of science, informatics, where it began to be counted explicitly. But in fact entropy in computer science and entropy in physics are somewhat different things. In physics it is an estimated, calculated quantity: entropy is not measured, it is computed from classical measurements of the properties of substances, such as temperature, pressure and mass. You can measure the force of pressure, but the stress inside a metal can only be calculated, just like the change in entropy of a gas when it expands.
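For the expanding-gas example: the entropy change is indeed computed rather than measured directly, e.g. from ΔS = nR ln(V2/V1) for an isothermal expansion of an ideal gas. A sketch with made-up numbers:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def delta_S_isothermal(n_mol: float, V1: float, V2: float) -> float:
    """Entropy change of an ideal gas expanding isothermally from volume V1 to V2."""
    return n_mol * R * math.log(V2 / V1)

# 1 mol of gas doubling its volume inside the vessel (numbers invented for illustration).
print(delta_S_isothermal(1.0, 0.010, 0.020))  # about 5.76 J/K
```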
Conversely, the more information on the medium, the greater the entropy.
You're so fucking stupid. The greater the entropy, the greater the loss of information. That's it, I'm not talking to you :))
Nah, then the schooling fell short )))))) I never came across entropy in 8th-grade computer science )))))) Okay, in physics it then turns out the other way round. The measure of disorder is the number of different states, i.e. entropy: 0 is a calm state, where the elements of the system are all alike; 1 is a turbulent state, where there are too many variants of states to estimate them. And it is not that there are infinitely many; it is just that the states of the elements change chaotically relative to each other.
And so yes, at entropy 1 we have exactly 0 information about the state of the matter and its elements: we have no way to calculate or estimate the states of the elements inside.
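If one wants the 0-to-1 scale used in the last two messages, the usual convention is Shannon entropy normalized by its maximum, log2 of the number of possible states; the states and probabilities below are invented for the example:

```python
import math

def normalized_entropy(probs) -> float:
    """Shannon entropy divided by its maximum, log2(number of states):
    0.0 = one certain state ("calm"), 1.0 = all states equally likely ("turbulent")."""
    h = sum(p * math.log2(1 / p) for p in probs if p > 0)
    return h / math.log2(len(probs))

print(normalized_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0: full knowledge of the state
print(normalized_entropy([0.25, 0.25, 0.25, 0.25]))  # 1.0: no information about the state
```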
Any tool is good if you know how to use it, and evil if you don't )))))
Well, the measure of orderliness, the ability to count these states, entropy as a measure of the number of states, and the measure of loss of information about the state of the object are all valid concepts, but they come from different sciences. In communication, it is better not to assume that everyone went to the same school))))))
We seem to agree: at entropy 1 we have a 100% loss of information about the object)))).