Sorry mate, it's customary to have your own spoon.
A couple of pages ago he posted an archive of exponentially processed tick data, and now it turns out they have no timestamps.
Yeah, embarrassing, what can I say... I'll start collecting ticks with timestamps next week...
Post a sample of the archive; writing a script for that is as simple as sending two bytes.
https://yadi.sk/d/snmT60R43RNUeL file AUDCAD_3DC.rar 247 Mb
Here are ticks for 3+ years (from 2014 until 28 October 2017) for AUDCAD, an instrument Alexander has already processed many times, from 3 brokers, one of which quoted 4 digits and whose ticks end on 26.02.2016. The ticks were taken from http://advancetools.net/index.php/instrumenty/tikovye-ob-emy/istoriya-tikov, unzipped and merged into 3 solid files. I didn't run any checks. Two of the three .csv files are over 2 GB each.
The resource does not say so explicitly, but according to my information, Igor Gerasko should be thanked for these ticks.
Nikolai, here is the ruble/dollar archive from the exchange.
Format:
Date Time in msec Bid Ask Last Volume
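A minimal reader for the format above could look like this. This is only a sketch: the whitespace-separated column layout and the field types are my assumptions based on the header line, not something stated in the archive itself.

```python
from dataclasses import dataclass

@dataclass
class Tick:
    date: str      # trade date as given in the file
    time_ms: int   # time of day in milliseconds
    bid: float
    ask: float
    last: float
    volume: float

def parse_tick(line: str) -> Tick:
    # Assumed column order: Date, Time in msec, Bid, Ask, Last, Volume,
    # separated by whitespace.
    date, time_ms, bid, ask, last, volume = line.split()
    return Tick(date, int(time_ms), float(bid), float(ask),
                float(last), float(volume))
```

If the real files use a different separator or extra columns, only the `split` call and the field list need adjusting.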
There are a lot of duplicate timestamps in the archive; I don't understand what the catch is. Like this.
Also, the archive is not sorted by time; the timestamps jump around, sometimes a minute ahead, sometimes even more.
Hence the question: should the data be processed so that only the timestamps matter, ignoring the fact that each record comes from a separate deal?
I think we should also check for complete duplicates, not only by time but also by price, volume and direction.
These are not duplicates; it's just market volume spread across several limit orders, and the prices there differ. It should be sorted by time, though; can you point to a specific time where the sorting is broken?
By the way, this is a question for Alexander: following your logic, you get several increments at one and the same time, so how should we calculate them?
Pardon me for butting in. Imho, it's fair to count it the way the exchange does under netting: average position price = total cost of all transactions / total volume of all transactions.
Here transaction cost = transaction volume * exchange rate. For example, if you bought 1.2 lots of EURUSD at 1.2025, then transaction cost = 120,000 * 1.2025 = $144,300.
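The netting formula above can be written as a small helper (function name and input layout are my own, for illustration):

```python
def netting_average_price(fills):
    """Average position price under exchange-style netting:
    total cost of all fills divided by total volume.

    `fills` is a list of (volume_in_units, price) pairs.
    """
    total_cost = sum(v * p for v, p in fills)
    total_volume = sum(v for v, _ in fills)
    return total_cost / total_volume

# The example from the post: 1.2 lots of EURUSD = 120,000 units at 1.2025,
# giving a transaction cost of 120,000 * 1.2025 = 144,300 USD.
cost = 120_000 * 1.2025
```

With a single fill the average price is just the fill price; with several fills it becomes a volume-weighted average.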
Oh, come on, Nikolai. I can see it's not a quick matter; I'll put the ticks together next week and check it myself. But if you are seriously interested, it will be interesting to see your results.
Do stock market data really suit you, Alexander?