I also tried to implement a similar algorithm back in 2013... But I used 7 indicators, and a ZigZag was used to form the training vector for the NS. The essence was the same: I was looking for reversal points... When I started using the ZigZag I had no idea what to do with it, until I accidentally came across some patterns. That radically changed my TS. Now my algorithm is much simpler:
1. Calculate patterns on the minute and hour timeframes over the last year;
2. Build a dictionary of turning points (pairs of "minute pattern - hour pattern");
3. Train the NS on the turning-point dictionary (150-160 pairs);
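The dictionary-building step can be sketched roughly as follows; this is my reading of the description, in Python, with hypothetical integer pattern codes standing in for the actual Merrill pattern extraction (which the post does not show):

```python
# Sketch: collect unique (minute, hour) pattern pairs plus per-pattern counters.
# Pattern codes are hypothetical stand-ins for Merrill pattern indices.

def build_dictionary(minute_patterns, hour_patterns):
    """Return the unique-pair dictionary and occurrence counters
    for the junior (minute) and senior (hour) timeframes."""
    dictionary = []        # unique pairs only, in order of first occurrence
    seen = set()
    minute_count = {}      # junior-timeframe counters
    hour_count = {}        # senior-timeframe counters
    for m, h in zip(minute_patterns, hour_patterns):
        # counters grow even when the pair is already in the dictionary
        minute_count[m] = minute_count.get(m, 0) + 1
        hour_count[h] = hour_count.get(h, 0) + 1
        if (m, h) not in seen:
            seen.add((m, h))
            dictionary.append((m, h))
    return dictionary, minute_count, hour_count
```

A repeated pair such as `(1, 5)` below is counted but stored only once:

```python
d, mc, hc = build_dictionary([1, 2, 1, 3], [5, 5, 5, 6])
# d  == [(1, 5), (2, 5), (3, 6)]
# mc == {1: 2, 2: 1, 3: 1}
```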
This is the result of my approach:
Disadvantages of my approach:
1) High risk of the TS: since the exact breakout price cannot be determined, the TS places 9 pending orders with lots 1, 1, 3, 6, 14, 31, 70, 158, 355;
2) The exit algorithm (trailing the TS) is difficult to implement;
So an NS can be used for trading; the only question is what to teach it...
P.S.: By patterns I mean A. Merrill's patterns (M & W).
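Merrill's M and W patterns classify five consecutive ZigZag pivots by their relative ordering rather than their absolute prices. A rough Python sketch of such an encoding (my own illustration, not Merrill's original numbering):

```python
def merrill_code(points):
    """Rank-encode five consecutive ZigZag pivots.
    Merrill's taxonomy distinguishes patterns by the relative ordering
    of the five points; here 1 marks the lowest point, 5 the highest.
    Treating a first leg up as M-type is this sketch's convention."""
    assert len(points) == 5
    order = sorted(range(5), key=lambda i: points[i])
    code = [0] * 5
    for rank, i in enumerate(order):
        code[i] = rank + 1
    shape = 'M' if points[1] > points[0] else 'W'
    return shape, tuple(code)
```

For example, pivots `[1, 5, 2, 4, 3]` start with an up-leg and encode as an M-type shape with rank code `(1, 5, 2, 4, 3)`.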
It's a smart approach. And the patterns were described simply as the position of the bars in the matrix, without taking into account the actual price delta - only the relative position?
I have an idea, to try the pattern indicators but with a different frame - the first five bars we analyse the indicators on the last 5 indicators, and the two indicators for trend analysis - we analyse in increments of 10 and take into account the absolute changes.
The zig-zag is a smart idea, but how do the peaks filter out from the flat wobbles there could be false points of trend change?
A sensible approach. And the patterns described simply as the position of the bars in the matrix, without taking into account the actual price delta - just the relative position?
I have an idea, to try the pattern indicators, but with a different frame - the first five bars we analyse the indicators on the last 5 indicators, and two indicators for trend analysis - we analyse in steps of 10 and at the same time take into account the absolute changes.
About the zig-zag is a smart idea, but how peaks filtered from flat wobbles could be false points of trend change?
I do it this way:
There is a dynamic array that stores only pairs of patterns (I call it the dictionary); if a pair of patterns occurs a second time, I don't write it down. There are also two counter arrays, for the senior and junior timeframes, which count how often each pattern participated in forming pairs, even if the pair was not written into the dictionary.
The training vector is formed from the dictionary; the weight of an individual pattern = pattern_counter / maximum_counter. That is, the pattern that participates in pair formation most often gets weight 1, and all other patterns get less than 1. This is the table you get after training the NS:
Structure of the NS: 64 input neurons, 4 hidden, 1 output. That is, one input neuron describes one pattern. The network takes 40-50 minutes to train, and the NS error does not exceed 0.00001.
Thus I have a model that can estimate the significance of a pair of patterns even if it was not in the dictionary before.
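The weighting rule pattern_counter / maximum_counter can be sketched like this (a minimal Python illustration; the 64-slot layout, 32 minute plus 32 hour patterns, is my assumption about how one neuron per pattern would be arranged):

```python
def pattern_weights(counter, n_patterns):
    """Turn occurrence counters into NS input weights in [0, 1].
    The most frequent pattern gets weight 1.0; patterns never seen
    keep weight 0.0. `counter` maps pattern index -> occurrence count."""
    max_count = max(counter.values())
    vec = [0.0] * n_patterns
    for pattern, c in counter.items():
        vec[pattern] = c / max_count
    return vec
```

So a counter of `{0: 4, 1: 2, 3: 1}` over 4 patterns yields the input vector `[1.0, 0.5, 0.0, 0.25]`.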
I struggled with flat sections and false peaks for a long time, and I am still working at the level of the ZigZag calculation. I slightly modified the code of the standard ZigZag, namely implemented a percentage ZZ on its basis. So far the code looks roughly as follows:
int MyCExtremum::GetCombiZigzag(const double &high[], // high price buffer
const double &low[], // low price buffer
const datetime &time[], // time buffer
int ExtDepth, // extremum search depth (first pass)
double ExtDeviation,// "threshold": hard step + % price change
int ExtBackstep // extremum search depth (second pass)
)
{
//--- value
int shift=0, whatlookfor=0, lasthighpos=0, lastlowpos=0, Deviat=1;
double lasthigh=0.0, lastlow=0.0, percent=0.0;
int rates_total = ArraySize(time); // size of the input timeseries
int limit = rates_total - ExtDepth; // calculation limit...
//+---------------------------------------------------------------+
//| VERY IMPORTANT CHECK AFFECTING CALCULATION CORRECTNESS! |
//+---------------------------------------------------------------+
if(ArrayIsSeries(high)) ArraySetAsSeries(high,false);
if(ArrayIsSeries(low)) ArraySetAsSeries(low,false);
if(ArrayIsSeries(time)) ArraySetAsSeries(time,false);
//+---------------------------------------------------------------+
//| INPUT PARAMETER CHECKS |
//+---------------------------------------------------------------+
if(rates_total<20)
{
Print(__FUNCTION__," ERROR: the input buffer is too small.");
return(-1);
}
if(ExtDeviation<0 || ExtDeviation>100)
{
Print(__FUNCTION__," ERROR: invalid Deviation. The value of Deviation must be in the interval [0..100].");
return(-1);
}
//--- check: Depth and Backstep
if((ExtDepth < ExtBackstep)||(ExtDepth < 2))
{
Print(__FUNCTION__+" ERROR: invalid Depth and Backstep. The value of Depth must be greater than Backstep.");
return(-1);
}
//--- prepare the ZigzagBuffer[] buffer
if(ArraySize(ZigzagBuffer)>0) ArrayFree(ZigzagBuffer); // delete old data
ArrayResize(ZigzagBuffer,rates_total, EXTREMUM_RESERVE);
ArrayFill(ZigzagBuffer,0,rates_total,0.0);
if(ArrayIsSeries(ZigzagBuffer)) ArraySetAsSeries(ZigzagBuffer, false);
//---
if(ArraySize(HighMapBuffer)>0) ArrayFree(HighMapBuffer); // delete old data
ArrayResize(HighMapBuffer,rates_total, EXTREMUM_RESERVE);
ArrayFill(HighMapBuffer,0,rates_total,0.0);
if(ArrayIsSeries(HighMapBuffer)) ArraySetAsSeries(HighMapBuffer, false);
//---
if(ArraySize(LowMapBuffer)>0) ArrayFree(LowMapBuffer); // delete old data
ArrayResize(LowMapBuffer,rates_total, EXTREMUM_RESERVE);
ArrayFill(LowMapBuffer,0,rates_total,0.0);
if(ArrayIsSeries(LowMapBuffer)) ArraySetAsSeries(LowMapBuffer, false);
//---
if(ArraySize(TimeBuffer)>0) ArrayFree(TimeBuffer); // delete old data
ArrayResize(TimeBuffer, rates_total, EXTREMUM_RESERVE);
ArrayFill(TimeBuffer, 0, rates_total, 0);
if(ArrayIsSeries(TimeBuffer)) ArraySetAsSeries(TimeBuffer, false);
//--- adjust Deviation
if(ExtDeviation < 1)
{
Deviat = 1;
}else
{
Deviat = (int)ExtDeviation;
}
//--- get the "fresh" lows and highs
if(GetHighMapZigzag(high,ExtDepth,Deviat,ExtBackstep) < 0) return(0);
if(GetLowMapZigzag(low,ExtDepth,Deviat,ExtBackstep) < 0) return(0);
//--- final rejection
for(shift=ExtDepth;shift<rates_total;shift++)
{
switch(whatlookfor)
{
case Start: // search for the first peak or trough
if(lastlow==0 && lasthigh==0)
{
if(HighMapBuffer[shift]!=0)
{
lasthigh=high[shift];
lasthighpos=shift;
whatlookfor=Sill;
ZigzagBuffer[shift]=lasthigh;
TimeBuffer[shift]=time[shift];
}
if(LowMapBuffer[shift]!=0)
{
lastlow=low[shift];
lastlowpos=shift;
whatlookfor=Pike;
ZigzagBuffer[shift]=lastlow;
TimeBuffer[shift]=time[shift];
}
}
break;
case Pike: // searching for a peak
if(LowMapBuffer[shift]!=0.0 && LowMapBuffer[shift]<lastlow && HighMapBuffer[shift]==0.0)
{
//---
ZigzagBuffer[lastlowpos] = 0.0;
TimeBuffer[lastlowpos] = 0;
//---
lastlowpos=shift;
lastlow=LowMapBuffer[shift];
ZigzagBuffer[shift]=lastlow;
TimeBuffer[shift]=time[shift];
//--- mandatory: leave the switch
break;
}
//--- handle "duality" (the bar is both a high and a low)
if(LowMapBuffer[shift]!=0.0 && HighMapBuffer[shift]!=0.0 && LowMapBuffer[shift]<lastlow)
{
//---
ZigzagBuffer[lastlowpos] = 0.0;
TimeBuffer[lastlowpos] = 0;
//---
lastlowpos=shift;
lastlow=LowMapBuffer[shift];
ZigzagBuffer[shift]=lastlow;
TimeBuffer[shift]=time[shift];
//--- mandatory: leave the switch
break;
}
if(HighMapBuffer[shift]!=0.0 && LowMapBuffer[shift]==0.0)
{
//--- check: % price change
percent = (HighMapBuffer[shift]-lastlow)/(lastlow/100);
if(percent > ExtDeviation)
{
lasthigh=HighMapBuffer[shift];
lasthighpos=shift;
ZigzagBuffer[shift]=lasthigh;
TimeBuffer[shift]=time[shift];
whatlookfor=Sill;
}
percent = 0.0;
}
break;
case Sill: // searching for a trough
if(HighMapBuffer[shift]!=0.0 && HighMapBuffer[shift]>lasthigh && LowMapBuffer[shift]==0.0)
{
//---
ZigzagBuffer[lasthighpos] = 0.0;
TimeBuffer[lasthighpos] = 0;
//---
lasthighpos=shift;
lasthigh=HighMapBuffer[shift];
ZigzagBuffer[shift]=lasthigh;
TimeBuffer[shift]=time[shift];
//--- mandatory: leave the switch
break;
}
if(HighMapBuffer[shift]!=0.0 && LowMapBuffer[shift]!=0.0 && HighMapBuffer[shift]>lasthigh)
{
//---
ZigzagBuffer[lasthighpos] = 0.0;
TimeBuffer[lasthighpos] = 0;
//---
lasthighpos=shift;
lasthigh=HighMapBuffer[shift];
ZigzagBuffer[shift]=lasthigh;
TimeBuffer[shift]=time[shift];
//--- mandatory: leave the switch
break;
}
if(LowMapBuffer[shift]!=0.0 && HighMapBuffer[shift]==0.0)
{
//--- check: % price change
percent = (lasthigh-LowMapBuffer[shift])/(lasthigh/100);
if(percent > ExtDeviation)
{
lastlow=LowMapBuffer[shift];
lastlowpos=shift;
ZigzagBuffer[shift]=lastlow;
TimeBuffer[shift]=time[shift];
whatlookfor=Pike;
}
percent = 0.0;
}
break;
default:
return(-1);
}
}
//--- return value of prev_calculated for next call
return(rates_total);
}
MyCExtremum is a class for calculating ZigZag...
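The core of the percentage filter, the Start/Pike/Sill state machine above, can be illustrated outside MQL5 with a stripped-down Python sketch over a single price series (the real code works on separate high/low buffers and two extremum-search passes, which this deliberately omits):

```python
def percent_zigzag(prices, deviation):
    """Minimal percentage ZigZag: extend the current leg while price
    keeps moving its way; confirm a pivot only when the reversal
    exceeds `deviation` percent, which filters small flat wobbles."""
    pivots = []                    # confirmed (index, price) pivots
    last_i, last_p = 0, prices[0]  # candidate extremum of the current leg
    direction = 0                  # 0 undecided, +1 up-leg, -1 down-leg
    for i in range(1, len(prices)):
        p = prices[i]
        change = (p - last_p) / last_p * 100
        if direction >= 0 and p > last_p:
            last_i, last_p = i, p          # extend the up-leg
            direction = +1
        elif direction <= 0 and p < last_p:
            last_i, last_p = i, p          # extend the down-leg
            direction = -1
        elif abs(change) > deviation:      # reversal beats the threshold
            pivots.append((last_i, last_p))
            last_i, last_p = i, p
            direction = -direction
    pivots.append((last_i, last_p))        # close the final leg
    return pivots
```

With a 1% threshold, the shallow dip to 101.5 below is ignored while the deeper moves become pivots:

```python
percent_zigzag([100, 101, 102, 101.5, 100, 103], 1.0)
# -> [(2, 102), (4, 100), (5, 103)]
```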
Andrey Emelyanov:
Structure of the NS: 64 input neurons, 4 internal, 1 output. That is, one input neuron describes one pattern.
The array is an interesting solution. Are there any differences in the statistics between pairs/periods, and how stable over time is the frequency of occurrence of the patterns that give a positive prediction result?
About the zig-zag: I also have a percentage solution, but I additionally use deeper history to calculate a reference section of the zig-zag, against which I compare the percentage change of the others.
As for analysing indicators with patterns - that's very interesting... I think there is less noise in indicators, but you have to choose them so that one suppresses the "low noise" and the other the "high noise"; then you get a multi-filter.
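One possible reading of the "reference section" idea, sketched in Python (the median reference and the `k` multiplier are my own assumptions; the post does not say how the reference is computed):

```python
def reference_filter(legs, history_legs, k=1.0):
    """Keep only ZigZag legs (percentage sizes) that exceed k times a
    reference leg size taken from deeper history; here the reference
    is simply the median of the historical leg sizes."""
    ref = sorted(history_legs)[len(history_legs) // 2]  # median reference leg
    return [leg for leg in legs if abs(leg) > k * ref]
```

For example, with historical leg sizes `[1, 1, 2, 3, 5]` the reference is 2, so only legs larger than 2% survive the filter.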
Are you hoping for results with this model? Your hidden layer acts as an intermediate compressor, not a classifier.
As everyone knows, A. Merrill's patterns do not give a definite answer as to whether a pattern will develop further (continue the trend) or morph into another pattern (price rebound). That is why I decided to look for the answer using two timeframes, one-hour and one-minute. I am collecting statistics on the recurrence of pairs and do not yet have a universal training dictionary. However, I am sure this connection must exist... Otherwise there would be no harmonic patterns: butterflies, bats, etc.
My baby is still dumb and dull, but it's getting somewhere... 8 input indicators, 1 output, 15 neurons in the hidden layer, a 2000-element input vector, 10,000 training epochs.
This is actually the 3rd or 4th version, and all of them get pretty much the same results. I guess I need more neurons and a larger input vector, but training takes a long time.
I have a rough idea of the pattern it should pick up; I've selected indicators from different timeframes, and the outputs seem to carry meaningful information.