Using light to transmit information is nothing new; it has long been used in fiber-optic cables. Using it inside a chip is fairly new, but inconvenient: you have to modulate the digital signal onto the light, add photodiodes and LEDs, and an optical fibre is much thicker than a metal interconnect. There is no particular advantage. Photonic transistors and photonic memory do not yet exist, as far as I understand, although there are ideas for using photons as qubits in a quantum computer.
...
So, starting around 2021-2022, computers and cell phones will stop gaining processor performance. It will make no sense for people to buy a new computer, iPad or smartphone whose processor is no more powerful than the previous generation's. Sales of new devices will drop, and because these devices feed many industries, experts predict a worldwide recession. The market is likely to start reacting to the end of Moore's Law before 2020.
Is it easier to write a program for parallel CPU cores than for a GPU? The problem is the same: the programmer has to rack his brains deciding which pieces of a program can be parallelized, write special parallelization code, and so on. Most programmers don't bother and write single-threaded programs without frills. So what is the bottleneck here: a lack of cores, or a lack of programs that use multiple cores? I think it's the latter. Even if I give you a CPU with 3000 cores, you will still write single-threaded programs, since writing for 3000 cores is no easier than writing for 4. What is needed is a new compiler that can automatically detect pieces of code that can be parallelized. But again, progress on such a compiler depends not on the hardware but on programmers' willingness to write it.

Throughout this thread I have been arguing that the possibility of creating new hardware after 2020 is diminishing, because semiconductor technology is running out of room to keep shrinking transistor size and power consumption. New materials and new transistor designs are still over the horizon. Intel tried to launch its Knights Hill generation of processors on a 10nm process in 2016 and postponed that generation until late 2017. Samsung, too, has problems with its 10nm process for its application processors. Already at 10nm, transistors give only a small reduction in size and power compared to 14nm, and heat dissipation becomes a big problem. A leap in technology is needed.

One indicator of the technology is the price per transistor. That price was falling up to 28nm, and after that it started rising sharply. Many companies stopped at 28nm because of the price. So the move to 10nm, then 7nm, and finally 5nm will be accompanied not only by heat problems but also by a high price.
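To make the point concrete, here is a minimal sketch (standard library only, names are my own for illustration) of the extra work explicit parallelization demands: even for a trivial sum, the programmer, not the compiler, must decide how to split the problem, divide it into chunks, and merge the results.

```python
# Serial vs. hand-parallelized computation of sum(i*i for i in range(n)).
# The parallel version needs explicit chunking and a worker pool - none of
# which the compiler does for us automatically.
from concurrent.futures import ProcessPoolExecutor


def partial_sum(bounds):
    """Work done by one core: sum of squares over a half-open range."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))


def serial(n):
    """The straightforward single-core version most programmers write."""
    return sum(i * i for i in range(n))


def parallel(n, workers=4):
    """The same computation split by hand across `workers` processes."""
    step = n // workers
    chunks = [(k * step, (k + 1) * step) for k in range(workers)]
    chunks[-1] = (chunks[-1][0], n)  # last chunk absorbs the remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    assert serial(100_000) == parallel(100_000)
```

Note that the chunking logic, the worker count, and the final merge are all manual decisions; this is exactly the bookkeeping an auto-parallelizing compiler would have to infer on its own.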
For some reason nobody has mentioned NP-complete problems. There are many such problems, and for none of them is an efficient solution method known. Any 10x, 100x, 1000x increase in computer performance is useless for solving them efficiently. This is one of the fundamental problems of mathematics, not of silicon engineering. From this point of view, the proposed topic seems meaningless, and the problem conjured out of thin air.
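The scaling argument above can be illustrated with a toy brute-force solver for subset sum, a classic NP-complete problem (the function name is my own). Exhaustive search examines up to 2^n subsets, so a 1000x faster machine only buys roughly 10 extra input elements (2^10 ≈ 1000) before it is stuck again.

```python
# Brute-force subset sum: worst case tries all 2**len(numbers) subsets.
# Hardware speedups shift the feasible input size only by an additive
# constant - the exponential wall stays.
from itertools import combinations


def subset_sum_bruteforce(numbers, target):
    """Return a subset of `numbers` summing to `target`, or None."""
    for r in range(len(numbers) + 1):
        for combo in combinations(numbers, r):
            if sum(combo) == target:
                return list(combo)
    return None
```

No known algorithm avoids this exponential blow-up in the worst case, which is why the poster argues the bottleneck is mathematical rather than an engineering problem of the silicon.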
You don't need that kind of performance for sudoku.
What actually happens if the performance of a single piece of hardware stops growing? And what share of global GDP do the devices themselves make up? Probably less than, say, pharmaceuticals, or than software for that same hardware. Soon devices will be given away for free as an add-on to some super-popular gadget; branded smartphones are already sold for a symbolic price.
And production in China needn't drop either. If you can't catch Pokémon with one smartphone, catch them with two.
...
Soon the devices will be given away for free as an add-on to some super popular thing.
...
Plus
You don't need a supercomputer to catch Pokémon and watch cats.
I see a swallow catching flies outside my window.
Searching for the target, locking on to it, catching it, eating it - all in highly chaotic flight. Do we have a flying machine packed with gigahertz and gigabytes that can fly like that?
And it's all done by a computer the size of the tip of your little finger.
But apart from controlling the flight, the same swallow's computer controls all its internal processes, keeping all their unknowable totality in a certain equilibrium state!
I remember a computing device called an "arithmometer": you turn a crank and numbers pop up. That was 50 years ago. And nothing qualitative has happened in the computing world since then, only quantitative: the crank spins faster.
And yet the future of computing is just above our noses.