Need help! Can't solve the problem, I'm hitting hardware limitations

 

Sorry for the newbie question, but is there such a limit under x64? Here is the first article I came across (well, not the first, OK) on the RAM limitation for SQL SERVER 2008 under an x64 system: the database eats as much RAM as it can get.

Maybe you should give it a try.

P.S. This may be useful: Remove 4 GB memory limit on 32 bit Windows 8 / 8.1

 
komposter:

There is a large amount of information (about 20 GB in a text file).

So why text? Wouldn't it be easier to convert the data to binary form first? The size might well shrink to something manageable.

It's a shame there's so much data... If it were 10 GB, I'd move it to a RAM disk (in effect, straight into memory) and read as much of it as I liked.
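
As a rough illustration of the binary-conversion idea, here is a minimal C++ sketch. The Sequence layout, field names, and file names are all hypothetical, since the thread never specifies the record format:

    #include <cstdio>
    #include <cstdint>
    #include <vector>

    // Hypothetical fixed-size record; the real layout is unknown.
    #pragma pack(push, 1)
    struct Sequence
    {
        int32_t id;
        float   values[16];
    };
    #pragma pack(pop)

    int main()
    {
        std::vector<Sequence> records;   // filled by a one-off text parser
        // ... parse "data.txt" into records here ...

        // Dump the records as raw bytes; later runs read "data.bin"
        // directly, with no text parsing at all.
        FILE* out = std::fopen("data.bin", "wb");
        if (out != nullptr)
        {
            std::fwrite(records.data(), sizeof(Sequence), records.size(), out);
            std::fclose(out);
        }
        return 0;
    }

Besides shrinking the file (binary numbers are usually far smaller than their text form), fixed-size records let you seek straight to the i-th record with fseek, which matters once the data no longer fits in RAM.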

 
meat:

So why text? Wouldn't it be easier to convert the data to binary form first? The size might well shrink to something manageable.

So it looks like 20 gigabytes is not the limit.
 

Upgrade to the 64-bit version: up to 16 TB of memory will become available.

Store the file in binary form for faster reading.

Process the file in chunks sized to fit in RAM.

Try to preprocess the data to eliminate duplicate information (a sketch follows below).
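
On the deduplication point (item 4 above), a single pass that drops repeated records can shrink the working set before any other optimization. A minimal sketch, reusing the hypothetical Sequence record from the earlier example:

    #include <cstdio>
    #include <cstdint>
    #include <string>
    #include <unordered_set>

    #pragma pack(push, 1)
    struct Sequence { int32_t id; float values[16]; };   // hypothetical layout
    #pragma pack(pop)

    int main()
    {
        FILE* in  = std::fopen("data.bin",   "rb");
        FILE* out = std::fopen("unique.bin", "wb");
        if (in == nullptr || out == nullptr) return 1;

        std::unordered_set<std::string> seen;   // keys = raw payload bytes

        Sequence s;
        while (std::fread(&s, sizeof(s), 1, in) == 1)
        {
            // Treat the payload (everything except the id) as the key.
            std::string key(reinterpret_cast<const char*>(s.values),
                            sizeof(s.values));
            if (seen.insert(key).second)              // first occurrence
                std::fwrite(&s, sizeof(s), 1, out);   // keep one copy
        }
        std::fclose(in);
        std::fclose(out);
        return 0;
    }

The set itself needs memory proportional to the number of unique payloads (about 64 bytes of key per record here), so for a million sequences it stays well under 100 MB.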

 
komposter:

There is a large amount of information (about 20 GB in a text file).

...

And if I only had to go through these sequences once, I would do just that. But they have to be gone through repeatedly, shifting forward a bit each time.

...

What if you process them in chunks?

Read two chunks of, say, 1 GB each. Process the first chunk, and on the next pass ("...shifting forward a bit") start adding data from the second chunk, while cutting off the beginning of the first chunk (it is no longer needed, precisely because the window always shifts forward). When the second chunk runs out, read the third one and start adding from it, and so on. That way there are never more than two chunks (2 GB at most) in RAM, and hard disk accesses drop by an order of magnitude.
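
In code, that sliding two-chunk scheme might look roughly like the following C++ sketch; the chunk size, file name, and the process_window callback are all assumptions for illustration:

    #include <cstddef>
    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Hypothetical per-pass computation over the bytes currently in RAM.
    static void process_window(const std::vector<uint8_t>& window) { /* ... */ }

    int main()
    {
        const std::size_t CHUNK = 1ULL << 30;   // 1 GB per chunk (assumed)
        FILE* f = std::fopen("data.bin", "rb");
        if (f == nullptr) return 1;

        std::vector<uint8_t> window;   // never holds more than two chunks
        std::vector<uint8_t> next(CHUNK);

        std::size_t got = std::fread(next.data(), 1, CHUNK, f);
        window.assign(next.begin(), next.begin() + got);

        while (got > 0)
        {
            process_window(window);             // work on what is in RAM

            // Append the next chunk from disk...
            got = std::fread(next.data(), 1, CHUNK, f);
            if (got == 0) break;
            window.insert(window.end(), next.begin(), next.begin() + got);

            // ...and cut off the oldest chunk so the window stays at 2 GB.
            if (window.size() > 2 * CHUNK)
                window.erase(window.begin(),
                             window.begin() + (window.size() - 2 * CHUNK));
        }
        std::fclose(f);
        return 0;
    }

One wrinkle: if records straddle chunk boundaries, the cut has to land on a record boundary rather than at an arbitrary byte offset, which a real implementation would need to handle.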

 
GT788:

Upgrade to the 64-bit version: up to 16 TB of memory will become available.

Store the file in binary form for faster reading.

Process the file in chunks sized to fit in RAM.

Try to preprocess the data to eliminate duplicate information.

Which OS and version is that for? Even XP Pro x64 supports up to 128 GB of physical memory and 16 TB of virtual address space.

 
Silent:

Which OS and version is that for? Even XP Pro x64 supports up to 128 GB of physical memory and 16 TB of virtual address space.

That's right; for Windows 7 I found a maximum of 192 GB.
 
komposter:

There is a large amount of information (about 20 GB in a text file).

The information consists of the same kind of sequences, about a million of them.

It is necessary to go through all the sequences repeatedly and make some calculations.

The first thing that comes to mind is to read the entire contents of the file, fill an array of structures with it, and work with it in memory.

But that doesn't work: on the next resize, MT complains "Memory handler: cannot allocate 5610000 bytes of memory".

Task Manager shows that terminal.exe is using 3.5 GB of RAM (out of 16 GB physical). I assume this is because the process can only get 4 GB.

EA says "Not enough memory(4007 Mb used, 88 Mb available, 4095 Mb total)!!!".

And that is only 15.3% of the required amount (and I would like to increase the volume in the future as well).

I've run out of ideas.

Should I try to recompose these sequences so that I get many, many pieces, each containing only the information needed at a given moment?

Or try to compress the data further (I've already converted to floats, and to char types everywhere I can)? But that will give another 10-20% at most, and I need to reduce the volume by an order of magnitude...

Any advice, friends? I'll return the favor. :)
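
For a sense of why struct packing alone cannot deliver the order-of-magnitude reduction komposter needs, here is a small worked example; the field layout is invented purely for the arithmetic:

    #include <cstdio>
    #include <cstdint>

    // Invented record layouts, just to compare sizes.
    #pragma pack(push, 1)
    struct Wide   { double price[8]; int32_t volume[8]; };   // 96 bytes
    struct Packed { float  price[8]; uint8_t volume[8]; };   // 40 bytes
    #pragma pack(pop)

    int main()
    {
        std::printf("wide:   %zu bytes\n", sizeof(Wide));    // prints 96
        std::printf("packed: %zu bytes\n", sizeof(Packed));  // prints 40
        // Even this aggressive double->float, int->char packing is only
        // a ~2.4x reduction: 20 GB would still be roughly 8 GB, far
        // beyond the ~4 GB address space of a 32-bit process.
        return 0;
    }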

Have you considered using a database? Set up a database and load the data into it from the text file. You could perform some aggregation of the data in advance for future calculations.

Then run SQL queries from the Expert Advisor.

The DBMS could also be put on a separate server to increase the performance of the trading station.
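
As a very rough sketch of that setup, assuming a MySQL server and an invented table and column layout (the thread does not specify one), the one-off bulk load plus a pre-aggregating query could look like this through the MySQL C API:

    #include <mysql.h>   // MySQL C API; link with -lmysqlclient
    #include <cstdio>

    int main()
    {
        MYSQL* conn = mysql_init(nullptr);
        // Host, credentials, and schema name are placeholders.
        if (mysql_real_connect(conn, "db-host", "user", "pass",
                               "quotes", 0, nullptr, 0) == nullptr)
        {
            std::fprintf(stderr, "connect failed: %s\n", mysql_error(conn));
            return 1;
        }

        // One-off bulk load of the 20 GB text file into a table
        // (hypothetical table name and file path; LOCAL INFILE must
        // be enabled on both client and server).
        mysql_query(conn,
            "LOAD DATA LOCAL INFILE 'data.txt' INTO TABLE sequences");

        // Aggregate once on the server; the Expert Advisor then pulls
        // the small result set instead of re-reading the raw file.
        if (mysql_query(conn,
            "SELECT seq_id, AVG(value) FROM sequences GROUP BY seq_id") == 0)
        {
            MYSQL_RES* res = mysql_store_result(conn);
            while (MYSQL_ROW row = mysql_fetch_row(res))
                std::printf("%s %s\n", row[0], row[1]);
            mysql_free_result(res);
        }
        mysql_close(conn);
        return 0;
    }

Whether this beats the file-based approach depends on the access pattern: the win comes from pushing the aggregation to the server, not from raw read speed.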

 

Thank you all so much for your participation!

I'm off-line now for the weekend, but I'll be sure to reply to everyone in the morning and try to take advantage of the advice.

 
elugovoy:

Have you considered using a database? Set up a database, load data into it from a text file. It is possible to perform some kind of data aggregation beforehand for further calculations.

And then SQL queries from the Expert Advisor...

DBMS can be installed on a separate server to increase performance.

I've thought about it. I'm interested in opinions:

komposter:

Another thought is to move everything to a database (MySQL?) and work with that. The idea is that databases are designed for such volumes and for constantly churning through the data.

Are there any experts? Who has an opinion?

How much of a speed-up would it give compared to reading the file, and how much of a slowdown compared to working in memory?