Moore is just advertising his brainchild; it seems he doesn't understand anything about the progress of science at all.
Piston aviation died in a similar way: it reached its ceiling, since such an engine has constant power. A jet engine, with constant thrust, is another matter.
Well, it's the same with processors, I guess. I think the answer lies in new materials. The voltage drop on a germanium diode is less than on a silicon diode (in that vein; I don't mean literally replacing silicon with germanium).
And anyway, what does today's "world economy" spend its increased hardware capabilities on? On releasing a new version of Windows? )) (Win10 eats more than XP.)
Well, it's commonplace. There are many examples in the world: the internal combustion engine, for instance. You cannot keep raising the compression ratio to increase efficiency (the more you raise it, the smaller the gain), nor keep making the engine higher-revving. There they have stalled, and serious breakthroughs are unlikely. Yet earlier one could have derived some comrade's law under which engine efficiency was rapidly increasing.
... and the new technology has not appeared, not even on the horizon.
What is this new technology for?
Is it for PR, to encourage impressionable people to replace their computers?
I suggested above comparing the BESM-4 with modern computers by the number of floating-point operations per second. No one can. Yet floating-point throughput is a better measure of progress on computational tasks.
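The comparison asked for here can at least be sketched. Below is a crude single-core micro-benchmark (a sketch, not a standard benchmark: the function name and the loop body are my own illustration, and interpreter overhead means it badly understates what the hardware can actually do):

```python
import time

def measure_flops(n_ops: int = 2_000_000) -> float:
    """Crude single-core estimate of floating-point operations per second."""
    x = 1.0000001
    acc = 1.0
    start = time.perf_counter()
    for _ in range(n_ops):
        acc = acc * x + 1.0  # two floating-point operations per iteration
    elapsed = time.perf_counter() - start
    return 2 * n_ops / elapsed

print(f"~{measure_flops():,.0f} interpreted FLOPS on one core")
```

Even this pessimistic number, measured on any laptop, dwarfs the throughput of 1960s machines, which is the point of the comparison.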
From my perspective as a user with very limited programming knowledge, I don't need gigahertz at all. I need to be sure that, without special individual effort, I get code that is as efficient as possible at the current stage of development. And gigahertz is clearly not in first place here.
I would put the following circumstance first. If my applied task involves some computationally heavy algorithm, e.g. optimization or matrix operations, then I will AUTOMATICALLY get the most efficient code possible, both in writing technique and in hardware usage, e.g. via multithreading. And while over the last 15 years clock speeds have grown by at most 100%, with my approach this automation, e.g. using the MKL library for matrices, yields performance gains of orders of magnitude.
The moral of my words: PC performance can't be measured in gigahertz; it should be measured by the execution speed of applied tasks.
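The MKL point can be illustrated concretely. NumPy's `@` operator dispatches matrix multiplication to an optimized BLAS backend (often MKL on Intel builds, though which backend is installed is an assumption about the reader's setup), while a hand-written triple loop runs in the interpreter:

```python
import time
import numpy as np

n = 150
a = np.random.rand(n, n)
b = np.random.rand(n, n)

# Naive triple loop: the same arithmetic, executed by the interpreter.
start = time.perf_counter()
c_naive = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        s = 0.0
        for k in range(n):
            s += a[i, k] * b[k, j]
        c_naive[i, j] = s
t_naive = time.perf_counter() - start

# Library call: dispatches to an optimized, often multithreaded, BLAS.
start = time.perf_counter()
c_blas = a @ b
t_blas = time.perf_counter() - start

assert np.allclose(c_naive, c_blas)
print(f"naive: {t_naive:.3f}s  blas: {t_blas:.5f}s  speedup: {t_naive / t_blas:.0f}x")
```

The "orders of magnitude" above come from exactly this kind of substitution, with no change in clock speed at all.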
The first GPU with a significant number of cores appeared in 2007 (Nvidia, 128 cores). Today the count has reached 3,000. Did your computer feel 128 times faster in 2007 than in 2006? Is it 3,000 times faster today than in 2006? It isn't. The cores are still used mainly in graphics, where parallelization is easy. In 2009-2010 I tried to program a GPU with 256 cores myself, and I immediately found out why most software doesn't use many cores: it is very difficult, because the developer has to decide manually which parts of the program can be parallelized. I did finish the new code, and it ran 3 times faster on 256 cores; I was even pleased with that. But when the next piece of code came along, I remembered the agony and stopped parallelizing. Of course, particular problems such as graphics and database handling will keep using multiple cores, but other programs will only be helped by a new compiler that automatically finds the parallelizable parts of a program, and as far as I know no such compiler exists.
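The manual work described here, deciding which parts are independent and splitting them up yourself, is visible even in a friendly high-level API. A minimal sketch (the function names and chunking scheme are my own illustration):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # The "parallelizable part" identified by hand: an independent reduction.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Manual decomposition: the programmer, not a compiler, splits the work.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum_of_squares(list(range(100))))
```

And even after this effort, CPython's GIL means pure-Python arithmetic like this gains little from threads; a real speedup needs processes or native code, which is precisely the extra friction the post complains about.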
I don't deny the potential of improving software and making computers faster on that basis. I argue that purchases of new computers and smartphones will fall in 2021-2022 due to the stalling of hardware improvements. Will you buy a new computer if it has the same cores, memory and frequency as your old one? Probably not; you will buy new software instead. All hardware manufacturers and related industries will go into recession, with mass unemployment.
I read somewhere that an American institute created a processor with 1,000 cores, but without a workable interaction with memory.
The number of cores will also stop growing, since running several cores in parallel consumes more power than one core. Shrinking the transistors not only increased the number of cores but also reduced per-transistor power, which is what has so far kept total power at roughly the same level.
Vladimir, let's remember the good old principle of the ES series (the ES EVM mainframes).
Now we see the same thing (in principle): handheld terminals and a connection to a cloud service with essentially limitless computing capabilities.
So in the future there will be computational farms renting resources to users, and the desktops and handhelds will only be terminals communicating with the mainframe.
Conclusion: Intel will be busy to the hilt, since even the processors already available will be needed more and more, and I don't foresee any stagnation in this industry (imho).
And there actually was such an OS on the ES mainframes.
A virtual machine system.