Public discussion of the formula for calculating the cost of resources in the MQL5 Cloud Network - page 10

 
YuraZ:

The interesting thing is that incoming traffic is growing by leaps and bounds!

I have 32 cores spread across several stations; on one station alone, with 8 cores, 1.5 gigabytes were downloaded in just one day.

We will drastically reduce the amount of duplicate traffic when multiple agents run from the same directory. That is, once one core has downloaded the data for a symbol, it will be available to the other cores as well.

While the tests are running, we are balancing the many cloud computing processes.

 
Renat:

We will drastically reduce the amount of duplicate traffic when multiple agents run from the same directory. That is, once one core has downloaded the data for a symbol, it will be available to the other cores as well.

While the tests are running, we are balancing the many cloud computing processes.

Now that's better. Even for those with no traffic limit and plenty of disk space, it is still a heavy load...
 
Renat:

We will drastically reduce the amount of duplicate traffic when multiple agents run from the same directory. That is, once one core has downloaded the data for a symbol, it will be available to the other cores as well.

While the tests are running, we are balancing the many cloud computing processes.

That's great news,

Although it will not solve the general problem: one person has free Internet but electricity hits his pocket, another is hurt by traffic charges, and a third would gladly connect all the machines on his office LAN, but the boss would not understand such altruism.

All of these problems are solved at once by paying for the service.

I have no intention of making a business out of renting my machines, but I am not going to give them away at a loss either.

Again, the idea gets lost: I give my machine to others for free for a few months (and therefore with no accounting), and when I need computing power it turns out everyone's machines are busy! What then, resent the whole world?

 
Renat:

We will drastically reduce the amount of duplicate traffic when multiple agents run from the same directory. That is, once one core has downloaded the data for a symbol, it will be available to the other cores as well.

While the tests are running, we are balancing the many cloud computing processes.

Yes, that's certainly a relief! Thank you!

--

In principle, the maximum required volume is easy to calculate.

At the moment, the only worrying factor is the number of dealing-desk servers. One of my 8-core stations alone has already received data from 11 of them.

The information is essentially the same.

Today I see 7 gigabytes; yesterday it was 1.5 gigabytes.

I think I will exceed my monthly traffic limit of 15 GB per month.

Too bad; I will have to unplug my 32 cores and perhaps leave one core at most.
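As a rough sanity check (using the figures quoted above as assumptions, not any official MQL5 formula), the day the traffic cap is hit can be estimated like this:

```python
# Rough estimate from the figures in this thread (assumed, not official):
# 7 GB of traffic observed in one day, against a 15 GB/month ISP cap.
daily_gb = 7.0          # traffic downloaded today
monthly_cap_gb = 15.0   # monthly traffic limit

days_until_cap = monthly_cap_gb / daily_gb
print(f"At this rate the cap is hit after ~{days_until_cap:.1f} days")  # ~2.1 days
```

Of course, if the initial download is a one-off (as discussed below), the steady-state rate should be far lower than 7 GB/day.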

--

The idea itself is great! Testing runs at high speed for those who have unlimited traffic and plenty of disk space.

I don't have a problem with disk space but I do have a problem with traffic.

 
YuraZ:

Today I am already seeing 7 gigabytes; yesterday it was 1.5 gigabytes.

I think I will exceed my monthly traffic allowance and my speed will be throttled; my limit is 15 GB per month.

Too bad, I will have to disable my 32 cores, leaving probably no more than one.

You have been testing actively; I have only downloaded 2.5-3 GB so far (I have allocated 60-75 GB of disk for this).

And my number of dealing desks is a bit lower than yours.

 

It's not that bad at all!

In general, the required volume is a one-off download,

so the traffic spike will only occur at the beginning.

If you get through that period, the problem is not so serious.

And disk space can be allocated in advance, since the required volume can be estimated approximately!

---

So the only worrying factor is the number of dealing desks!

But if the developers solve the data-duplication problem, that will not be an issue either.

 
Interesting:

You have been testing actively; I have only downloaded 2.5-3 GB so far (I have allocated 60-75 GB of disk for this).

And my number of dealing desks is a bit lower than yours.

That's only on one machine!

I have more than 10 machines in the pool,

with 34 cores.

 
YuraZ:

It's not that bad at all!

In general, the required volume is a one-off download,

so the traffic spike will only occur at the beginning.

If you get through that period, the problem is not so serious.

And disk space can be allocated in advance, since the required volume can be estimated approximately!

---

So the only worrying factor is the number of dealing desks!

But if the developers solve the data-duplication problem, that will not be an issue either.

There is a "but", or rather even two:

1. There will be at least 100 popular dealing desks, and maybe more.

2. Each of them carries many more instruments than the developers' server, and at the moment the history there is not of very good quality. If they decide to revise the history, it will automatically be downloaded again.

PS

By the way, I allocated space on the assumption that there will be no more than 100 servers at 500 MB each.

But if the financial side suits me, I can easily increase the disk space by at least 10 times.
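That allocation works out as follows (a quick sketch using the post's own figures; the 1024 MB/GB conversion is my assumption):

```python
# Disk sizing from the post's assumptions: up to 100 dealing-desk
# servers, about 500 MB of history data each.
servers = 100
mb_per_server = 500

baseline_gb = servers * mb_per_server / 1024             # 50,000 MB total
print(f"Baseline: ~{baseline_gb:.0f} GB")                # ~49 GB
print(f"With 10x headroom: ~{baseline_gb * 10:.0f} GB")  # ~488 GB
```

So the 60-75 GB mentioned earlier comfortably covers the baseline, and a 10x increase stays within a single modern drive.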

 
Interesting:


PS

By the way, I allocated space on the assumption that there will be no more than 100 servers at 500 MB each.

But if the financial side satisfies me, I can easily increase the amount of disk space by about 10 times.

And what minimum payment for one average core per 24 hours would satisfy you?
 
YuraZ:


What minimum payment for one average core per 24 hours would satisfy you?