is there a way to load a file into the memory during optimization and each - page 3

 
Lorentzos Roussos #:

It's an array, that's the lowest level I could bring it down to, but it's 300 megabytes.

The code compiles if the array is all -1.0's, 0.0's and 1.0's (in other words, if there are 3 possible values per slot).

20x90x12000 slots 

But if it's 4-digit mixed values (i.e. any value between -1.0 and 1.0; they can't go above 1.0 or below -1.0 anyway), it cannot compile.

To simplify : 

  • 300 MB+ include
  • of a double array
  • compiles when the array varies between 3 values
  • does not compile otherwise
  • I'm allowing 4 digits in the array upon creation of the include
  • the values cannot go above 1.0 or below -1.0

It's just a double array include, a huge array, but it compiles in some cases.

Ah ok, 300 MB, I understand better.

Also, why do you want to load it only once?

Is your data file a binary file?

EDIT: By the way, how much time does it take to load this 300 MB file?

 
Alain Verleyen #:

Ah ok, 300 MB, I understand better.

Also, why do you want to load it only once?

Is your data file a binary file?

The original file I want to keep permanent is a complex structure.

In the test I perform calcs with that structure; that file (of the structure) is 40 MB, but it's 4 levels deep (in objects).

The double array was a partial precalc of some stuff, to avoid loading the structure (hence the size).

EDIT: 136 seconds (to compile)
 
Lorentzos Roussos #:

The original file I want to keep permanent is a complex structure.

In the test I perform calcs with that structure; that file (of the structure) is 40 MB, but it's 4 levels deep (in objects).

The double array was a partial precalc of some stuff, to avoid loading the structure (hence the size).

EDIT: 136 seconds (to compile)
So it compiles finally?
 
Lorentzos Roussos #: It's an array, that's the lowest level I could bring it down to, but it's 300 megabytes.

Then, to reduce the code file size, try to do some cosmetic clean-up, like removing trailing zeros and even the decimal point if it can be represented as a whole number.

 
Alain Verleyen #:
So it compiles finally?

It compiles when the values in the array are 1.0, 0.0 or -1.0.

 
Lorentzos Roussos #:

It compiles when the values in the array are 1.0, 0.0 or -1.0.

I am not sure what you did. Did you include the original data file as "text" data or "binary" data?
 
Fernando Carreiro #:

Then, to reduce the code file size, try to do some cosmetic clean-up, like removing trailing zeros and even the decimal point if it can be represented as a whole number.

Could it be the .mqh size?

 
Alain Verleyen #:
I am not sure what you did. Did you include the original data file as "text" data or "binary" data?

You mean how it was exported? The mqh was exported as FILE_TXT.

I'm talking about the workaround here.

The original file (if you mean the structure one) is a binary file. I'm loading doubles and integers and arrays, not strings that I then convert; that's what you're asking, I guess.

 
Lorentzos Roussos #:

You mean how it was exported? The mqh was exported as FILE_TXT.

I'm talking about the workaround here.

The original file (if you mean the structure one) is a binary file. I'm loading doubles and integers and arrays, not strings that I then convert; that's what you're asking, I guess.

Usually text data is bigger than binary data; well, it depends on the data, actually.

I am asking about the size of your original data file (binary). And incidentally, the time needed to load it? But this last question is just out of curiosity.

Why am I asking? Because if your "workaround" is bigger in size than your original data file (and impossible to compile anyway), you could consider importing the data as binary data and then converting it to your structures in the code.

Will it be faster than your initial load, test, load, test... impossible to know without trying.

 

Ok, the precalc loaded as a binary file, and not included in the executable, is faster than loading the original structure.

So :

~40 MB of the complex-structure binary file loaded per pass resulted in 755 passes in 11 minutes.

~170 MB of the precalcs (essentially a big double array) as a binary file loaded per pass resulted in 755 passes in 2:30 minutes.

So it's 4.78 times faster to load a straight array.

The solution Fernando suggested could blow them out of the water though, but I can let it run for 7 hours.