A completely random process and FOREX

 
D.Will wrote:

I have decided to reduce the determinism of the pseudo-random number generator by shuffling the series of random numbers several times.

% shuffle: repeatedly swap two randomly chosen elements of r
for k = 1:1:10000
    i1 = fix(rand*N) + 1;   % random index in 1..N
    i2 = fix(rand*N) + 1;
    c = r(i1);              % swap r(i1) and r(i2)
    r(i1) = r(i2);
    r(i2) = c;
end

If you look above, I gave an example where I shuffled the whole sequence several times and plotted both the original and the shuffled series.




When shuffled like that: an eight without two. Where are the candelabras, gentlemen!?
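
As an aside (my sketch, not part of the original post): MATLAB already has an unbiased one-call shuffle, which sidesteps the whole index-pair debate below:

% Shuffle r in one call via a random permutation of 1..N
r_shuffled = r(randperm(N));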
 
Korey:
D.Will wrote:
[quotes the shuffle post above in full]
What's wrong here? Two randomly selected indices, and their contents swapped?

fix(rand*N)+1 returns an integer from 1 to N. In MATLAB, indexing starts at 1.

fuck
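
A quick empirical check of that range claim (my sketch; randi(N) is the built-in equivalent):

% Sample the index expression many times and verify it stays in 1..N
N = 10;
idx = arrayfun(@(k) fix(rand*N)+1, 1:100000);
fprintf('min = %d, max = %d\n', min(idx), max(idx));   % expect 1 and 10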
 
rand steps through its sequence one call at a time. The two indices are therefore taken from a pair of adjacent pseudo-generator outputs,
and adjacent outputs are known to be correlated, i.e. they lie within one period.
Try making a random number of rand calls between obtaining the indices, to break the periodicity.
 
Korey:
rand steps through its sequence one call at a time. The two indices are taken from a pair of adjacent pseudo-generator outputs, which are known to be correlated, i.e. they lie within period m. Try a random number of rand calls between obtaining the indices, to break the periodicity m.

We already discussed that.

The point of this setup is something else.
It would be one thing if the change were additive, i.e. summing some other series into this one.
But the nature of the change here is completely different.

If you believe the correlation in rand is so strong that it survives a permutation, then such a generator is of no value at all. Do you understand?

Just because it's pseudo-random doesn't mean you should paranoidly insert a random number of extra rand calls.

All those extra calls will also have a distribution correlated with the PRNG itself.

The fact is that shuffling the data leaves the nature of the sequence the same.

What correlation are you talking about????
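
To make the "mixing changes nothing" point concrete (my sketch, reusing the normrnd parameters from the code below): a permutation rearranges the same values, so every statistic of the marginal distribution is untouched.

% A permutation preserves the empirical distribution exactly
N = 1000;
r  = normrnd(0, 0.0077, 1, N);
rp = r(randperm(N));                 % same values, new order
disp(isequal(sort(r), sort(rp)));    % prints 1: identical value sets
fprintf('mean %.6f vs %.6f, std %.6f vs %.6f\n', mean(r), mean(rp), std(r), std(rp));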
 
Especially for Korey

close all;

N = 1000;
r = normrnd(0, 0.0077, 1, N);    % Gaussian increments

r1 = r;                          % keep an unshuffled copy

for i = 1:1:100000
    i1 = fix(rand*N) + 1;        % first random index
    for j = 1:1:1000             % burn 1000 rand calls in between
        rand;
    end
    i2 = fix(rand*N) + 1;        % second random index
    c = r(i1);                   % swap r(i1) and r(i2)
    r(i1) = r(i2);
    r(i2) = c;
end;

figure;
%r = r - 0.5;
for i = 2:1:length(r)            % accumulate both series into "price" paths
    r(i) = r(i) + r(i-1);
    r1(i) = r1(i) + r1(i-1);
end

grid on;

plot(r);                         % shuffled series, accumulated
figure;
plot(r1);                        % original series, accumulated
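
(A side note, not in the original post: the accumulation loop is exactly MATLAB's cumsum, so it could be replaced by two one-liners.)

% Equivalent to the accumulation loop above
r  = cumsum(r);
r1 = cumsum(r1);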



Before: [plot of the accumulated series]

After: [plot of the accumulated series after shuffling]
It's even cooler =))






 
to D.Will

The way you form a random series is very similar to the algorithm of linear congruential generators. It has long been proven that this algorithm (and its various modifications) generates anything but a random series. This applies both to generating the sequence "as a whole" and to the random data generator itself (a remark: if I'm not mistaken, MATLAB implements exactly such an algorithm, but that is easy to check). Moreover, a computer can do anything except one thing: generate a truly random sequence. The use of neural networks is promising in this direction; with their help people manage to obtain, let's say, "maximally provably random variables" and, along the way, defend all sorts of clever doctoral theses. Autoregressive prediction models work well (in the sense of statistically well) on such series; you can try it and see for yourself.
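
For reference, a minimal sketch of the linear congruential scheme grasn is describing (the constants are the common Numerical Recipes choice; my assumption, not taken from the thread):

% Linear congruential generator: x(n+1) = mod(a*x(n) + c, m)
m = 2^32; a = 1664525; c = 1013904223;
x = 12345;                      % arbitrary seed
u = zeros(1, 10);
for k = 1:10
    x = mod(a*x + c, m);
    u(k) = x / m;               % normalized to [0,1)
end
disp(u);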

 

Correlation, exactly correlation.

G. Forsythe et al., Computer Methods for Mathematical Computations.
D. E. Knuth, The Art of Computer Programming, Vol. 2 (Seminumerical Algorithms).
In general, the standard random sequence generators were long ago recognized as unsuitable; when one is needed, you write your own.

 
grasn:
to D.Will

[quotes grasn's post above on linear congruential generators and autoregressive models]

Do you have a link? That would have to be neural networks with chaotic behaviour.
Autoregressive meaning y(n+1) = a0*y(n) + b*noise? How exactly are they good?
y(n+1) = a0*y(n) + a1*y(n-1) + ... + a5*y(n-5) + b*noise amounts to a linear neuron plus noise. What's good about that?

By the way, does your statement mean that the above process can be predicted?
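
For context, a minimal sketch of the AR(1) model D.Will writes out (my illustration; the coefficient 0.8 and noise scale are arbitrary): simulate y(n+1) = a0*y(n) + b*noise, then recover a0 by least squares.

% Simulate an AR(1) series and estimate its coefficient
N = 5000;
a0 = 0.8; b = 1.0;
y = zeros(1, N);
for n = 1:N-1
    y(n+1) = a0*y(n) + b*randn;
end
a0_hat = (y(1:end-1)*y(2:end)') / (y(1:end-1)*y(1:end-1)');
fprintf('true a0 = %.2f, estimated a0 = %.3f\n', a0, a0_hat);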
 

Actually, I meant the following, where k is a random number of idle cycles (the outer loop is as in your code):

for i = 1:1:100000
    i1 = fix(rand*N) + 1;
    k = fix(rand*100000) + 1;    % random number of idle calls
    for j = 1:1:k
        rand;                    % burn k rand calls to break the period
    end
    i2 = fix(rand*N) + 1;
    c = r(i1);
    r(i1) = r(i2);
    r(i2) = c;
end;

 
D.Will wrote:

[quotes the exchange above: grasn's post on linear congruential generators and D.Will's questions about autoregressive models and predictability]

I was given the material to read in person, hand to hand so to speak. But not because it's all secret; I think it can be found on the Internet.

"By the way, does your statement mean that the above process can be predicted?"

I thought I clearly wrote: "Autoregressive prediction models work well (in the sense of statistically well) on such series, you can try and see for yourself".

Again: such series are predicted very well (statistically) by AR models; try it and convince yourself. In my humble understanding, your generation by itself is worth next to nothing. Is that a model??? You yourself quite rightly pointed out that it is not a model. You must first create a model. To start, simply invent a condition under which the "price" is guaranteed never to go negative for any initial condition, and you will see that it is not so simple. Then investigate that; what you are doing now is bullshit in the literal sense. There are plenty of processes, both natural and technical, that resemble quotes. You can easily get a pseudo-random series that resembles quotes, complete with Fibo, levels and the other attributes.
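
One standard way to meet that non-negativity condition (my sketch; grasn does not prescribe a method) is to generate the walk in log-space, so the "price" stays positive from any starting point:

% Geometric random walk: the price is strictly positive by construction
N = 1000;
logp = cumsum(normrnd(0, 0.0077, 1, N));   % random walk of the log-price
p = 100 * exp(logp);                       % price = p0 * exp(logp) > 0
figure; plot(p); grid on;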

PS: If you want to get to the root of the phenomenon, then: fractals!!! :o)