Machine learning in trading: theory, models, practice and algo-trading - page 1624

 
Kesha Rutov:

Hmm, "0.5-0.7 seconds of calculation" is too much for an MLP. Maybe you retrain and then predict, on small datasets with a sliding window?

Let's take it in order:

1. What is the raw data (ticker(s), timeframe)?

2. What is the size of the training dataset (1k, 10k, 100k...)?

3. What kind of features?

4. What are the targets?

5. What kind of network?


That should be enough to start with...

1. candlesticks + indicators
2. 200-300k
3. that is know-how, I do not share it
4. binary classification: up/down
5.
 
Evgeny Dyuka:
1. candlesticks + indicators
2. 200-300k
3. that is know-how, I do not share it
4. binary classification: up/down
5.

How many features?

It is not the type of network but the way it is assembled in keras; what matters is the structure of the network, for example MLP (in keras that is only Dense layers) or some kind of mix. Better to post the network code here.

 
Kesha Rutov:

How many features?

Sequential is not the type of network but the way it is assembled in keras; what is needed is the structure of the network, for example MLP (in keras that is only Dense layers) or some kind of mix. Better to post the network code here.

Keras has all the layers that are in tensorflow.

 
Kesha Rutov:

How many features?

It is not the type of network but the way it is assembled in keras; what matters is the structure of the network, for example MLP (in keras that is only Dense layers) or some kind of mix. Better to post the network code here.

# imports assumed by the snippet below
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.optimizers import SGD

k = 1.0        # layer-width multiplier; undefined in the original snippet, example value
dropout = 0.2  # dropout rate; undefined in the original snippet, example value

def make_model(arr_size):
  sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)

  res = 2  # number of output classes
  act = "softmax"
  #act = "sigmoid"
  #opt = sgd
  opt = 'adam'

  model = Sequential()

  model.add(Dense(int((arr_size-res)*k), input_dim=(arr_size-res), activation='relu'))
  model.add(Dropout(dropout))

  #model.add(Dense(int((arr_size-res)*0.5*k), activation='relu'))
  #model.add(Dropout(dropout))

  #model.add(Dense(int((arr_size-res)*0.3*k), activation='relu'))
  #model.add(Dropout(dropout))

  #model.add(Dense(int((arr_size-res)*0.1*k), activation='relu'))
  #model.add(Dropout(dropout))

  model.add(Dense(res, activation=act))

  if res==1:
    ls = "binary_crossentropy"
  else:
    ls = "categorical_crossentropy"
  model.compile(loss=ls, optimizer=opt, metrics=['accuracy'])
  return model
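Since the targets in this thread are binary up/down classes fed to a 2-unit softmax, the labels need one-hot encoding before `model.fit`. A minimal sketch of that encoding in plain numpy (the function name `updown_targets` is mine, not from the thread):

```python
import numpy as np

def updown_targets(close):
    """One-hot encode next-bar direction: [1, 0] = down, [0, 1] = up.
    The last bar has no next close, so it is dropped."""
    diff = np.diff(close)            # next close minus current close
    up = (diff > 0).astype(int)      # 1 where the next bar closes higher
    onehot = np.zeros((len(up), 2), dtype=int)
    onehot[np.arange(len(up)), up] = 1
    return onehot

closes = np.array([1.10, 1.12, 1.11, 1.15])
print(updown_targets(closes))  # rows encode: up, down, up
```

With keras installed, `keras.utils.to_categorical(up, 2)` produces the same encoding.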
 
Vladimir Perervenko:

Keras has all the layers that are in tensorflow.

I know.

I said that an MLP in keras means only Dense layers.

 

Evgeny Dyuka:

code

Okay. So MLP.

Does arr_size-res have to be large?

 
Kesha Rutov:

Okay. So MLP.

Does arr_size-res have to be large?

arr_size is the number of input features; the code is written sloppily, I copied it as is, it was written for myself.
 
Evgeny Dyuka:
arr_size is the number of input features; the code is written sloppily, I copied it as is, it was written for myself.

Well, I asked you how many features, and you ignored it.

I propose an experiment: take the EURUSD series, split it 70/30, train on the first piece, generate the ML indicator on the second, and post it here along with the test series.
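The split step of this experiment could be sketched like so (plain numpy, no shuffling, so the 30% test piece stays strictly later in time; the arrays here are placeholders for whatever features and up/down targets are built from the EURUSD candles):

```python
import numpy as np

def chrono_split(X, y, train_frac=0.7):
    """Chronological train/test split: the first train_frac of the
    series is for training, the rest is held out for the indicator."""
    cut = int(len(X) * train_frac)
    return X[:cut], y[:cut], X[cut:], y[cut:]

# placeholder series standing in for EURUSD features and up/down targets
X = np.random.rand(1000, 10)
y = (np.random.rand(1000) > 0.5).astype(int)
X_train, y_train, X_test, y_test = chrono_split(X, y)
print(len(X_train), len(X_test))  # 700 300
```

The point of not shuffling is that any result on the held-out piece is genuinely out-of-sample in time, which is what the proposed test is meant to show.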

 
Kesha Rutov:

Well, I asked you how many features, and you ignored it.

I propose an experiment: take the EURUSD series, split it 70/30, train on the first piece, generate the ML indicator on the second, and post it here along with the test series.

I answered: 250-300 features.
 


Yu.I. Zhuravlev. Mathematical methods of forecasting