From theory to practice - page 295

 
Renat Akhtyamov:
So you're saying that the parallel lines intersect (support and resistance)?

The price! Well, or the integral of returns.

 
Alexander_K2:

The price! Well, or the integral of returns.

Here is one option for using entropy

npdeneqtest Nonparametric Test for Equality of Densities

npdeptest Nonparametric Entropy Test for Pairwise Dependence

npsdeptest Nonparametric Entropy Test for Serial Nonlinear Dependence

npsymtest Nonparametric Entropy Test for Asymmetry

npunitest Nonparametric Entropy Test for Univariate Density Equality


Here is the documentation. Here are the formulas: Entropy-Based Inference using R and the np Package: A Primer
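
A minimal sketch of how these tests might be called on a series of price increments (my own illustration, not taken from the attached documentation; the toy data, the boot.num value and the variable names are assumptions, so check the primer above for the exact arguments):

library(np)

set.seed(42)
price      <- 100 + cumsum(rnorm(500, sd = 0.5))   # toy price path
increments <- diff(price)                          # its increments (returns)

# Does the distribution of the increments differ from a Gaussian
# with the same mean and standard deviation? (entropy-based density equality)
gauss <- rnorm(length(increments), mean(increments), sd(increments))
print(npunitest(increments, gauss, boot.num = 99))

# Is there serial (possibly nonlinear) dependence in the increments?
print(npsdeptest(increments, lag.num = 2, boot.num = 99))

These are bootstrap tests, so they can take a while to run.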

Files:
np.zip  2331 kb
 
Alexander_K2:

I've only seen the calculation formula on Wikipedia:

It compares the current probability distribution with a normal distribution as a measure of chaos.

I don't get the idea.

What I understand from your phrase is that an ordered process cannot have a normal distribution.

May I ask what this statement is based on?

How are chaos and normality related?

P.S. I'm not disputing it, I'm just curious.

 
Nikolay Demko:

an ordered process cannot have a normal distribution.


It's a mystery! Shhhhhh...

That's why I read philosophical treatises on negentropy.

For example...

 
Nikolay Demko:

Well, at least give me a hint how to calculate this negentropy.

You keep throwing these words around, and yet you want to be understood.

P.P.S. I googled and searched around at my leisure, but couldn't find anything beyond generic phrases. Negentropy is the opposite of entropy, i.e. a measure of order. That's all I got out of an evening of googling.

I haven't googled it, but I can't imagine support lines crossing resistance lines; unless we're talking about moving averages...

 
Alexey Tarabanov:

I haven't googled it, but I can't imagine support lines crossing resistance lines; unless we're talking about moving averages...

They have the Copenhagen interpretation. Anything at all can happen in it.

 
Yuriy Asaulenko:

They have the Copenhagen interpretation. Anything at all can happen in it.

Shall we wait for the reaction of our esteemed colleagues?

 

Wikipedia says: the term [negentropy] is sometimes used in physics and mathematics (information theory, mathematical statistics) to refer to a quantity mathematically opposite to entropy.
https://ru.wikipedia.org/wiki/Негэнтропия

So it is possible to calculate entropy and simply take it with a negative sign to get negentropy.


Calculating entropy in R, the simplest of the dozens of methods out there (empirical entropy); the full snippet is gathered after step 5:

1) there is a series of numbers 2 7 4 3 7 9 4 4 4 4 4 1 3 10 3 8 4 9 10 7 which we are interested in.

2) count the number of repetitions of each value:
1: 1
2: 1
3: 3
4: 7
7: 3
8: 1
9: 2
10: 2
We get a new row of numbers: 1 1 3 7 3 1 2 2
count the sum of the numbers
sum = 1 + 1 + 3 + 7 + 3 + 1 + 2 + 2 = 20
(logically the sum == the length of the row in the first item)
and now divide:
1/20 1/20 3/20 7/20 3/20 1/20 2/20 2/20
We get:
0.05 0.05 0.15 0.35 0.15 0.05 0.10 0.10
which is the empirical probability distribution (the relative frequencies) of the numbers in the series from step 1.

3) Entropy = -sum(freq * log(freq)) (vectorized formula in R syntax; log() is the natural logarithm)
H = - (0.05*log(0.05) + 0.05*log(0.05) + 0.15*log(0.15) + 0.35*log(0.35) + 0.15*log(0.15) + 0.05*log(0.05) + 0.10*log(0.10) + 0.10*log(0.10) )
H would be entropy.

4) There is also an optional rescaling, if a different logarithm base is wanted:
H / log(2) for entropy in bits
or
H / log(10) for base-10 units.

5) Multiply H by -1 and you get negentropy.
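
The whole recipe, steps 1) to 5), as a short R snippet (my own sketch of the description above; the object names are arbitrary):

x <- c(2, 7, 4, 3, 7, 9, 4, 4, 4, 4, 4, 1, 3, 10, 3, 8, 4, 9, 10, 7)  # step 1

counts <- table(x)               # step 2: repetitions of each value
freq   <- counts / sum(counts)   # relative frequencies, sum(freq) == 1

H <- -sum(freq * log(freq))      # step 3: empirical entropy, natural log

H_bits <- H / log(2)             # step 4: optional rescaling to bits

negentropy <- -H                 # step 5: flip the sign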

 
Dr. Trader:

5) Multiply H by -1 and you get negentropy.

If it were that simple, I would not have asked the respected mathematicians to deal with this question.

Once again:

It is the difference between the current probability distribution of the increments and the Gaussian one with the same expectation and variance for a given sample size.

The problem is that the variance of the increments cannot be calculated by the usual formula (the only thing the math kids who have flooded this forum seem to know). You have to apply non-parametric methods :))
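
For what it is worth, here is a hedged sketch of one way to put that idea into numbers (this is my reading of the description above, not the poster's actual formula): estimate the entropy of the increments from a histogram, compare it with the entropy of a Gaussian of the same scale, and use a robust scale estimate (mad()) instead of the usual sample variance. The binning and the choice of mad() are assumptions.

negentropy_est <- function(x, bins = 30) {
  h     <- hist(x, breaks = bins, plot = FALSE)
  p     <- h$counts / sum(h$counts)
  width <- diff(h$breaks)[1]                 # hist() uses equal-width bins
  p     <- p[p > 0]
  H_x   <- -sum(p * log(p)) + log(width)     # histogram estimate of differential entropy
  s     <- mad(x)                            # robust scale instead of the usual sd()
  H_g   <- 0.5 * log(2 * pi * exp(1) * s^2)  # entropy of a Gaussian with scale s
  H_g - H_x                                  # near 0 for Gaussian data, larger when structured
}

set.seed(1)
negentropy_est(rnorm(1000))          # pure noise: close to zero
negentropy_est(sin((1:1000) / 10))   # strongly structured series: clearly positive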

 

Moreover, this formula has a profound philosophical and physical meaning. We have a measure of the structuredness of the process, its deviation from chaos.

Gentlemen! Cheerful old-timers and young people!

Negentropy is the parameter that determines the trend/flat state!!!

I present it to you.