Author's dialogue. Alexander Smirnov. - page 35

 

And if you calculate the array of quadratic weights in the init() function, you can get quite a nice result. Besides, the calculations can be optimized using IndicatorCounted(). Sure, it will hang for the first few seconds when the periods are long, but to hell with it...

 
Mathemat:

And if you calculate the array of quadratic weights in the init() function, you can get quite a nice result. Besides, the calculations can be optimized using IndicatorCounted(). Sure, it will hang for the first few seconds when the periods are long, but to hell with it...

I tried it. It calculates a moving regression from the ready-made array at the speed of an ordinary MA.
The only inconvenience is that the array ends up with dimensions A[][20] (there are no structures in MQL),
and I have to remember the numeric address of a cell, like on a BESM-3)))
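
For what it's worth, a minimal MQL4 sketch of the idea being discussed (the weight profile and the names Fit, W, MaPeriod are mine and purely illustrative, not the code from this thread): the weight array is filled once in init(), and start() recalculates only the bars that IndicatorCounted() reports as uncounted, so each bar costs a single dot product of length MaPeriod regardless of what polynomial the weights came from.

```mql4
// Sketch only: weights prepared once in init(), reused on every tick.
#property indicator_chart_window
#property indicator_buffers 1
#property indicator_color1 Red

extern int MaPeriod = 20;

double Fit[];   // indicator buffer
double W[];     // precomputed weights, W[0] applies to the newest bar of the window

int init()
{
   SetIndexBuffer(0, Fit);
   ArrayResize(W, MaPeriod);

   // Illustrative weight profile (quadratic in the bar index), normalized to sum to 1.
   // The actual regression weights would be filled in here in exactly the same way.
   double sum = 0;
   int i;
   for(i = 0; i < MaPeriod; i++)
   {
      W[i] = (MaPeriod - i) * (MaPeriod - i);
      sum += W[i];
   }
   for(i = 0; i < MaPeriod; i++) W[i] /= sum;
   return(0);
}

int start()
{
   int counted = IndicatorCounted();
   if(counted < 0) return(-1);
   if(counted > 0) counted--;                    // recount the still-forming bar
   int limit = Bars - counted;
   if(limit > Bars - MaPeriod) limit = Bars - MaPeriod;

   // Each bar is one dot product of length MaPeriod, whatever the order of the polynomial.
   for(int i = limit - 1; i >= 0; i--)
   {
      double s = 0;
      for(int k = 0; k < MaPeriod; k++) s += W[k] * Close[i + k];
      Fit[i] = s;
   }
   return(0);
}
```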
 
Mathemat:

Sure, it will hang for the first few seconds when the periods are long, but to hell with it...

I don't think there should be any noticeable sluggishness in the first calculation. But, as it looks now, we should first compute the outgoing values (there seems to be no point in storing them), then compute the reduced sums, then recurrently compute their new values and, finally, add the incoming values. All this for three sums (the sum proper plus the first and second derived sums). If the period is small, a full recalculation of the single sum that is actually needed would be enough anyway.
Generally speaking, such extreme speed tuning is justified only if the algorithm is meant for optimization in the tester, imho.
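
If I read that scheme correctly (this is my interpretation of the three sums, not necessarily the intended one), for a window of length N they would be S0 = sum of Close, S1 = sum of k*Close and S2 = sum of k^2*Close, with k = 0 for the newest bar of the window, and all three can be rolled from one bar to the next instead of being recounted in full. A stand-alone script with just the recurrences and a self-check:

```mql4
// Sketch of the recurrent scheme as I read it, not the poster's code.
// Window of bar i covers bars i .. i+N-1; weight index k = 0 is the newest bar.
int start()
{
   int N = 20;                          // window length, illustrative
   if(Bars < N + 2) return(0);
   int last = Bars - N - 1;             // oldest bar whose window and outgoing bar both exist

   // Full calculation once, for the oldest window.
   double S0 = 0, S1 = 0, S2 = 0;
   int k;
   for(k = 0; k < N; k++)
   {
      double p = Close[last + k];
      S0 += p;  S1 += k*p;  S2 += k*k*p;
   }

   // Roll towards bar 0: drop the outgoing bar, re-index the rest, add the incoming bar.
   for(int i = last - 1; i >= 0; i--)
   {
      double p_out = Close[i + N];      // leaves the window
      double p_in  = Close[i];          // enters it with weight k = 0
      // Order matters: S2 and S1 must use the old values of the lower sums.
      S2 = S2 + 2.0*S1 + S0 - (double)N*(double)N*p_out;
      S1 = S1 + S0 - N*p_out;
      S0 = S0 - p_out + p_in;
      // Here S0, S1, S2 describe the window of bar i; the regression value
      // for that bar would be assembled from these three sums.
   }

   // Self-check: a direct recount of S0 for bar 0 must match the rolled value.
   double check = 0;
   for(k = 0; k < N; k++) check += Close[k];
   Print("rolled S0 = ", S0, "   direct S0 = ", check);
   return(0);
}
```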
 
The interesting thing is that regardless of the order of the polynomial, the computation time will be about the same (if the array of weights is prepared in advance).
 
Dug up my old handiwork - a polynomial MA
Files:
 
Is it too difficult to give an explanation, Dimitri, especially about the meaning of the parameters? The handiwork is, to put it mildly, of very high quality.
 

It's not easy:-)

Polynomial: K0*X^0 + K1*X^1 + K2*X^2 + K3*X^3 + ..., the K coefficients are defined in the line K="1/5/6/1/-20" (K0=1, K1=5, ...). The argument X runs over the range from ArgumentMin to ArgumentMax, which produces a curve; this curve can be viewed with ControlMode=true, and it is then used as the weight coefficients for the moving average.

It would be more interesting to make a spline, because it is not easy to get the desired curve shape with this polynomial.
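
If I understand the description correctly, the weight curve could be generated roughly like this (only K, ArgumentMin, ArgumentMax and ControlMode are taken from the post; the parsing loop, MaPeriod, the normalization and the array names are my own guesses, not the attached code):

```mql4
// Sketch: turn the K string into a polynomial weight curve for a moving average.
#property indicator_chart_window

extern string K           = "1/5/6/1/-20";   // K0/K1/K2/... of the polynomial
extern double ArgumentMin = -1.0;
extern double ArgumentMax =  1.0;
extern bool   ControlMode = false;
extern int    MaPeriod    = 50;

double Coef[];     // parsed K0, K1, K2, ...
double Weight[];   // the curve sampled at MaPeriod points, then used as MA weights

int init()
{
   // Split the K string on '/' into polynomial coefficients.
   string s = K;
   int n = 0;
   ArrayResize(Coef, 0);
   while(StringLen(s) > 0)
   {
      int p = StringFind(s, "/");
      string item;
      if(p < 0) { item = s; s = ""; }
      else      { item = StringSubstr(s, 0, p); s = StringSubstr(s, p + 1); }
      ArrayResize(Coef, n + 1);
      Coef[n] = StrToDouble(item);
      n++;
   }

   // Sample K0 + K1*X + K2*X^2 + ... on [ArgumentMin, ArgumentMax] and normalize.
   ArrayResize(Weight, MaPeriod);
   double sum = 0;
   int i, j;
   for(i = 0; i < MaPeriod; i++)
   {
      double x = ArgumentMin + (ArgumentMax - ArgumentMin) * i / (MaPeriod - 1);
      double y = 0;
      for(j = 0; j < n; j++) y += Coef[j] * MathPow(x, j);
      Weight[i] = y;
      sum += y;
   }
   if(sum != 0) for(i = 0; i < MaPeriod; i++) Weight[i] /= sum;

   // With ControlMode=true the curve itself would be plotted instead of the MA,
   // per the description above; start() would then apply Weight[] like any LWMA.
   return(0);
}
```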

 
Is the curve some sort of weight function, i.e. the coefficients for the moving average?
 
Mathemat:
Is the curve some sort of weight function, i.e. the coefficients for the moving average?

Yes, it is.
 

The edge value (X1, the right-hand edge) of a cubic polynomial fitted by least squares to seven points of a series: (X7*(-2) + X6*(4) + X5*(1) + X4*(-4) + X3*(-4) + X2*(8) + X1*(39))/42. The series to check against is 0, 1, 8, 27, 64, 125, 216: substitute these seven values into the formula and the result should be 216, because the cubic polynomial reproduces a series consisting of cubes exactly. Source: M. Kendall and A. Stuart.


By the way, here is the same seven-point cubic polynomial, but giving the least-squares estimates for the middle points, i.e.

For X4 it will be (X7*(-2) + X6*(3) + X5*(6) + X4*(7) + X3*(6) + X2*(3) + X1*(-2))/21

For X3 it will be (X7*(1) + X6*(-4) + X5*(2) + X4*(12) + X3*(19) + X2*(16) + X1*(-4))/42

For X2 it will be (X7*(4) + X6*(-7) + X5*(-4) + X4*(6) + X3*(16) + X2*(19) + X1*(8))/42
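
These formulas are easy to check against the series of cubes quoted above; a throwaway script (the array layout is mine) should print 216, 27, 64 and 125:

```mql4
// Check of the four 7-point cubic LSM formulas against the series 0, 1, 8, ..., 216.
// X[1]..X[7] correspond to X1..X7 in the post, X1 being the newest point.
int start()
{
   double X[8];
   X[7] = 0;  X[6] = 1;   X[5] = 8;  X[4] = 27;
   X[3] = 64; X[2] = 125; X[1] = 216;

   double edge = (X[7]*(-2) + X[6]*4    + X[5]*1    + X[4]*(-4) + X[3]*(-4) + X[2]*8  + X[1]*39)/42.0;
   double mid4 = (X[7]*(-2) + X[6]*3    + X[5]*6    + X[4]*7    + X[3]*6    + X[2]*3  + X[1]*(-2))/21.0;
   double mid3 = (X[7]*1    + X[6]*(-4) + X[5]*2    + X[4]*12   + X[3]*19   + X[2]*16 + X[1]*(-4))/42.0;
   double mid2 = (X[7]*4    + X[6]*(-7) + X[5]*(-4) + X[4]*6    + X[3]*16   + X[2]*19 + X[1]*8)/42.0;

   // A cubic polynomial reproduces cubes exactly, so the smoothed values must
   // coincide with the originals: 216, 27, 64, 125.
   Print("X1 = ", edge, "  X4 = ", mid4, "  X3 = ", mid3, "  X2 = ", mid2);
   return(0);
}
```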


Generally speaking, these are interpolation (smoothing) formulas, so in order to extrapolate, for example to X0, i.e. one step into the future beyond the existing series, you have to find a different set of coefficients.
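
One generic way to get such coefficients (sketched below, not taken from Kendall and Stuart's tables) is simply to refit the cubic by least squares over the seven known points and evaluate it at the point you want; on the series of cubes the extrapolation to X0 must come out as 7^3 = 343:

```mql4
// Sketch: least-squares cubic fit over 7 points via normal equations, then
// evaluation outside the window. Positions t = 1..7 hold X1..X7, forecast at t = 0.
int start()
{
   double y[8];
   y[7] = 0;  y[6] = 1;   y[5] = 8;  y[4] = 27;   // the test series of cubes again
   y[3] = 64; y[2] = 125; y[1] = 216;

   // Normal equations M*b = r for the basis 1, t, t^2, t^3; stored augmented as [M | r].
   double M[4][5];
   int p, q, t, c;
   for(p = 0; p < 4; p++)
   {
      for(q = 0; q < 4; q++)
      {
         M[p][q] = 0;
         for(t = 1; t <= 7; t++) M[p][q] += MathPow(t, p + q);
      }
      M[p][4] = 0;
      for(t = 1; t <= 7; t++) M[p][4] += MathPow(t, p) * y[t];
   }

   // Gaussian elimination; the system is small and well-behaved, so no pivoting here.
   for(p = 0; p < 4; p++)
      for(q = p + 1; q < 4; q++)
      {
         double f = M[q][p] / M[p][p];
         for(c = p; c <= 4; c++) M[q][c] -= f * M[p][c];
      }

   // Back-substitution for the polynomial coefficients b0..b3.
   double b[4];
   for(p = 3; p >= 0; p--)
   {
      b[p] = M[p][4];
      for(q = p + 1; q < 4; q++) b[p] -= M[p][q] * b[q];
      b[p] /= M[p][p];
   }

   // The value at t = 0 is just b0; for the cubes it must be 343.
   Print("Extrapolated X0 = ", b[0]);
   return(0);
}
```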