Features of the mql5 language, subtleties and tricks - page 94
And how do you see the practical application of GetMicrosecondCount, given that in its current version it can spoil the whole work of a program? Describe a practical application.
For example, neither in C++ nor here do I see uses other than those described by Renat, i.e. measuring code execution time with microsecond accuracy.
I don't understand your insistence, frankly speaking
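For context, a minimal sketch of the measurement pattern being discussed. The script body and its dummy workload are illustrative, not code from this thread:

```
// Measure the execution time of a code fragment with microsecond resolution.
void OnStart()
  {
   ulong start = GetMicrosecondCount();

   // --- code under test (dummy workload) ---
   double sum = 0.0;
   for(int i = 0; i < 1000000; i++)
      sum += MathSqrt((double)i);
   // -----------------------------------------

   ulong elapsed = GetMicrosecondCount() - start;
   PrintFormat("elapsed: %I64u mcs (sum=%.1f)", elapsed, sum);
  }
```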
The scope of this function is quite broad, if you use some imagination.
I already mentioned the multitimer above, which lets you run several timers with different periods at the same time (a sketch of the idea follows below).
fxsaber has also already written about this.
Compared to the millisecond function, the microsecond one is useful not only for speed tests, but also for collecting various telemetry while an Expert Advisor runs.
If the accuracy of such telemetry is 16 ms (more precisely, 1/64 s = 15625 microseconds), that is a very large error.
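A minimal sketch of that multitimer idea, assuming one fast system timer polling several virtual timers with independent periods; all names here are illustrative, not the actual multitimer code:

```
#define TIMERS 3
ulong periods[TIMERS] = { 100000, 250000, 1000000 }; // periods in mcs
ulong next_fire[TIMERS];

int OnInit()
  {
   ulong now = GetMicrosecondCount();
   for(int i = 0; i < TIMERS; i++)
      next_fire[i] = now + periods[i];
   EventSetMillisecondTimer(10); // one real timer drives the virtual ones
   return(INIT_SUCCEEDED);
  }

void OnTimer()
  {
   ulong now = GetMicrosecondCount();
   for(int i = 0; i < TIMERS; i++)
      if(now >= next_fire[i])
        {
         next_fire[i] += periods[i]; // schedule the next shot
         PrintFormat("virtual timer %d fired at %I64u mcs", i, now);
        }
  }

void OnDeinit(const int reason)
  {
   EventKillTimer();
  }
```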
It is configured, but it doesn't help, and I don't understand why. My time server, though, is ntp2.stratum2.ru.
If you use GetTickCount for such long intervals, you shouldn't have any problems.
There will be problems if you use GetMicrosecondCount.
If using the microsecond function is a matter of principle, you'd better use this variant of the function (RealMicrosecondCount; a sketch of the idea follows below).
Information to take into account:
Approximate execution time of the functions:
- GetTickCount - ~2 ns
- GetMicrosecondCount - ~30 ns
- RealMicrosecondCount - ~40 ns
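The RealMicrosecondCount code itself is not reproduced on this page, so the following is only a guess at the idea: take the sub-millisecond detail from GetMicrosecondCount, but re-anchor it to GetTickCount whenever the two counters diverge, e.g. after a system clock adjustment:

```
// Hypothetical reconstruction, not the thread's actual function.
ulong RealMicrosecondCount()
  {
   static bool  init      = false;
   static ulong base_mcs  = 0;
   static uint  base_tick = 0;
   if(!init)
     {
      base_mcs  = GetMicrosecondCount();
      base_tick = GetTickCount();
      init      = true;
     }

   ulong mcs   = GetMicrosecondCount() - base_mcs;           // fine counter
   ulong ticks = (ulong)(GetTickCount() - base_tick) * 1000; // coarse, in mcs

   // If the counters diverge by more than two timer quanta (~32 ms),
   // the system clock has probably jumped: re-anchor to GetTickCount.
   if(mcs > ticks + 32000 || ticks > mcs + 32000)
     {
      base_mcs = GetMicrosecondCount() - ticks;
      mcs      = ticks;
     }
   return(mcs);
  }
```

The extra calls and the comparison would also explain why such a variant costs a little more than a bare GetMicrosecondCount, in line with the ~40 ns figure above.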
I'm using someone else's code; there are no such functions in it, but the desynchronization effect still occurs.
What is the probability of the local computer time changing between two calls to GetMicrosecondCount used to measure an interval in microseconds?
That depends, first of all, on the configured period of synchronization of the system time with Internet time. For example, I have synchronization set to once a day, and over that time more than 1 second of divergence accumulates. Some people synchronize once an hour, or even more often. From that you can estimate the probability.
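As a rough estimate under those assumptions: if the clock is synchronized once a day and the correction is applied as a single step, a measurement lasting t seconds has a probability of about t/86400 of spanning it. For a 100 ms measurement that is about 0.1/86400 ≈ 1.2e-6; an Expert Advisor taking, say, 10,000 such measurements a day would then hit a corrupted reading roughly once every three months.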
...because I already know about this feature of GetMicrosecondCount(), and that this function is more than an order of magnitude slower than GetTickCount.
If we compared QueryPerformanceCounter and GetTickCount directly, the difference would be much smaller.
Although, frankly speaking, I don't really understand all this talk about execution speed when we are discussing 2-20 nanoseconds. The difference can only really be felt by running a nearly empty loop calling this function a hundred million times, which is in itself a badly designed solution.
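For illustration, the kind of loop where a difference of tens of nanoseconds per call becomes visible at all; a sketch, not a rigorous benchmark:

```
// Compare the call overhead of the two counters over 10^8 iterations.
void OnStart()
  {
   const int N = 100000000;

   ulong t0 = GetMicrosecondCount();
   uint  a  = 0;
   for(int i = 0; i < N; i++)
      a += GetTickCount();           // ~2 ns per call
   ulong t1 = GetMicrosecondCount();

   ulong b = 0;
   for(int i = 0; i < N; i++)
      b += GetMicrosecondCount();    // ~30 ns per call
   ulong t2 = GetMicrosecondCount();

   PrintFormat("GetTickCount loop:        %I64u ms", (t1 - t0) / 1000);
   PrintFormat("GetMicrosecondCount loop: %I64u ms", (t2 - t1) / 1000);
   Print(a + b); // keep the results alive so the loops are not optimized away
  }
```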
And yes, a pure WinAPI function (whether GetTickCount or QueryPerformanceCounter) will be thrown off just the same if you change the date even by seconds while it is measuring. There is no protection of the kind you claim exists. Both the problem and the supposed solution are made up out of thin air.
So everything is correct: this is how WinAPI works, and that is the reality.
You're wrong. I specifically posted code here using WinAPI. Run it, change the clock while it runs, and look at the result.
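The WinAPI code cited earlier in the thread is not reproduced on this page; a minimal sketch of such a test might look like this (requires "Allow DLL imports"; the comparison logic is assumed, not the original):

```
// Compare QueryPerformanceCounter with GetMicrosecondCount while the
// system clock is changed by hand during the pause.
#import "kernel32.dll"
int QueryPerformanceCounter(long &count);
int QueryPerformanceFrequency(long &freq);
#import

void OnStart()
  {
   long freq = 0, c0 = 0, c1 = 0;
   QueryPerformanceFrequency(freq);
   QueryPerformanceCounter(c0);
   ulong m0 = GetMicrosecondCount();

   Sleep(30000); // change the system clock by hand during this pause

   QueryPerformanceCounter(c1);
   ulong m1 = GetMicrosecondCount();

   PrintFormat("QPC:                 %I64d mcs", (c1 - c0) * 1000000 / freq);
   PrintFormat("GetMicrosecondCount: %I64u mcs", m1 - m0);
  }
```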
It's not clear how to conduct any dialogue with you when your arguments are based on speculation and you don't even consider it necessary to acquaint yourself with the course of the discussion.
I constantly measure execution speed, just as Renat described, and I have never seen any problems. You seem to be winding yourself up over nothing, or you simply don't want to walk back what you wrote earlier; it happens, as the saying goes, that you write and write and then realize it was all in vain. No offense. The same multitimer you mentioned is easy to implement without any errors, but you have to pay for that (a sketch follows below); Renat also gave a description above in response to my question.
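A sketch of that "error-free but you pay for it" route, assuming it means driving the timers from GetTickCount: unsigned subtraction survives the 49.7-day wraparound and a system clock change does not affect it, but the resolution drops to ~16 ms (the helper name is illustrative):

```
// Returns true when period_ms has elapsed since last_fire; wrap-safe.
bool TimerElapsed(uint &last_fire, const uint period_ms)
  {
   uint now = GetTickCount();
   if(now - last_fire < period_ms) // uint arithmetic handles wraparound
      return(false);
   last_fire = now;
   return(true);
  }
```

Call it from OnTimer with one last_fire variable per virtual timer.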
The mouth of the Truth is mute to the uninitiated.
well, yes))
Are you sure you read the whole question?
...between two calls to GetMicrosecondCount...