Machine learning in trading: theory, models, practice and algo-trading - page 2807
Your script consumes almost 9 gigabytes of RAM on my sample, but it seems to work, the files are saved. I don't even know where the memory is consumed there, while the sample takes a little more than a gigabyte.
I also found out that the table headings (column names) are saved in quotes - how do I switch this off?
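In R this is controlled by the `quote` argument of `write.csv`/`write.table` (`quote = FALSE` writes headers and character fields unquoted), and `data.table::fwrite` only quotes when needed by default. The same idea, illustrated with Python's standard `csv` module since this excerpt doesn't include the original R call:

```python
import csv
import io

rows = [["close", "volume"], [1.25, 100], [1.30, 150]]

# QUOTE_ALL reproduces the complaint: every field, headers included, in quotes
buf_all = io.StringIO()
csv.writer(buf_all, quoting=csv.QUOTE_ALL).writerows(rows)

# QUOTE_MINIMAL quotes only when a field contains the delimiter or a quote
buf_min = io.StringIO()
csv.writer(buf_min, quoting=csv.QUOTE_MINIMAL).writerows(rows)

print(buf_all.getvalue().splitlines()[0])  # "close","volume"
print(buf_min.getvalue().splitlines()[0])  # close,volume
```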
What does this code do? To make it faster, you should convert all columns to the same data type (float32; float16 is not needed, it would be slower) and calculate the coRRRelation through fast arrays.
if we are talking about the real correction of the kaRma
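The advice above - cast everything to one compact numeric type and do the math on plain arrays - can be sketched like this (illustrative Python/NumPy, since the R script under discussion is not shown in this excerpt):

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-in for a table read from disk with mixed column types
data = rng.normal(size=(10_000, 50))

# cast the whole table to one homogeneous, compact dtype up front
x = np.ascontiguousarray(data, dtype=np.float32)

# one vectorised call produces the full 50x50 correlation matrix
corr = np.corrcoef(x, rowvar=False)
print(corr.shape)  # (50, 50)
```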
So?
R is probably bad )
what did you do to solve the problem?
Bad/good is too harsh a judgement.
Obviously either the package code is not memory-efficient (though it can be fast), or the script copies the whole table/selection many times.
And what you did was find the problem and report it to a professional, hoping for help.
As far as I understand, R has no concept of different numeric data types (int, float, etc.) at all. And while conversion will reduce the memory footprint, it will not affect the speed much. On video cards, yes, there would be a gain.
Everything is there. It affects the speed dramatically - dataframes are the slowest beasts, with the most overhead.
It's not about video cards, it's about understanding that no one in a sober state computes such things through dataframes.
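The claim about dataframe overhead is easy to check directly. A hedged comparison using pandas/NumPy as a stand-in for R's data.frame (absolute timings will vary by machine; only the equality of results is guaranteed here):

```python
import time

import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.normal(size=(100_000, 20)))

# correlation through the dataframe API
t0 = time.perf_counter()
c_df = df.corr()
t_df = time.perf_counter() - t0

# the same computation on a homogeneous float32 array
arr = df.to_numpy(dtype=np.float32)
t0 = time.perf_counter()
c_np = np.corrcoef(arr, rowvar=False)
t_np = time.perf_counter() - t0

print(f"dataframe: {t_df:.3f}s, array: {t_np:.3f}s")
print(np.allclose(c_df.to_numpy(), c_np, atol=1e-4))  # True
```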
Tip: Is it necessary to use vectors of 100,000 observations to see the correlation between them?
I am looking for highly correlated vectors, i.e. with correlation greater than 0.9. Your script has been running for more than a day and has not yet created a single file with the screening results. I don't know, maybe it's time to switch it off?
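The screening step itself - find all column pairs with correlation above 0.9 - is a cheap operation once the correlation matrix exists. A minimal sketch in Python/NumPy (hypothetical data, not the poster's sample):

```python
import numpy as np

rng = np.random.default_rng(1)
base = rng.normal(size=1000)
cols = np.column_stack([
    base,                                # column 0
    base + 0.1 * rng.normal(size=1000),  # column 1: near-copy of column 0
    rng.normal(size=1000),               # column 2: independent noise
])

corr = np.corrcoef(cols, rowvar=False)

# look only at the upper triangle so each pair is reported once
i, j = np.triu_indices_from(corr, k=1)
mask = np.abs(corr[i, j]) > 0.9
pairs = [(int(a), int(b)) for a, b in zip(i[mask], j[mask])]
print(pairs)  # [(0, 1)]
```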
Depends on the hardware and the sample size. If your processor is multi-core, parallelise the execution. Below is a parallel variant:
it ran four times faster than the serial one on my hardware and software.
Good luck
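The parallel variant itself is not preserved in this excerpt. As an illustration of the idea (in R the `parallel`/`doParallel` packages play this role), here is a sketch in Python that splits the columns across workers and stitches the correlation matrix back together; threads are used because NumPy's heavy math mostly runs outside the GIL:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def corr_block(block, full):
    """Correlation of one block of columns against every column."""
    k = block.shape[1]
    c = np.corrcoef(np.hstack([block, full]), rowvar=False)
    return c[:k, k:]

def parallel_corr(x, workers=4):
    # one chunk of columns per worker, computed concurrently
    chunks = np.array_split(x, workers, axis=1)
    with ThreadPoolExecutor(max_workers=workers) as ex:
        parts = list(ex.map(lambda b: corr_block(b, x), chunks))
    return np.vstack(parts)

rng = np.random.default_rng(0)
x = rng.normal(size=(2000, 16))
c = parallel_corr(x)
print(np.allclose(c, np.corrcoef(x, rowvar=False)))  # True
```

Whether this pays off depends on the BLAS build and core count; for very large tables, process-based workers over memory-mapped arrays avoid copying the whole sample per worker.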