Global recession over the end of Moore's Law - page 10

 
Vladimir:
Using light to transmit information is not new; it is used in cables. Using it inside a chip is fairly new, but inconvenient: you have to modulate the digital signal onto the light, add photodiodes and LEDs, and the fibre itself is much thicker than a metal interconnect. There are no particular advantages. Photonic transistors and memory do not yet exist, as I understand it, although there are ideas about using photons as qubits in a quantum computer.
This is not new either - optocouplers, last century's technology. Back in the 80s they were used as switches and in microassemblies. A diode is structurally simpler than a transistor - one layer fewer. You just have to invest more in development, even if you don't see the future.
 
Vladimir:

...

So, starting around 2021-2022, computers and cell phones will stop increasing processor performance. It will make no sense for people to buy a new computer, iPad or cell phone if its processor has the same power as the previous generation of the device. Sales of new devices will drop. Because these devices affect many industries, experts predict a worldwide recession. The market is likely to start reacting to the end of Moore's Law before 2020.

You can't see the forest for the trees. CPUs are now powerful enough to handle all everyday tasks. Even now, users are not demanding more CPU performance: they prefer less powerful smartphones and tablets to bulky but powerful desktop computers.
 
Vladimir:

Is it easier to write a program for parallel CPU cores than for a GPU? The problem is the same: the programmer has to rack his brains over which pieces of the program can be parallelized, write special parallelization code, and so on. Most programmers don't bother and simply write single-core programs. So what is the bottleneck: the lack of cores, or the lack of programs that use multiple cores? I think it is the latter. Even if I give you a CPU with 3000 cores, you will still write single-core programs, since writing for 3000 cores is no easier than writing for 4. What is needed is a compiler that can automatically detect the pieces of code that can be parallelized. But progress on such a compiler depends not on the hardware but on programmers' willingness to write it.

Throughout this thread I have been arguing that the possibility of creating new hardware after 2020 is shrinking, because semiconductor technology is running out of room to reduce the size and power consumption of transistors, and new materials and new transistor types are still over the horizon. Intel tried to build its Knights Hill generation of processors on a 10nm process in 2016 and postponed that generation until late 2017. Samsung, too, has problems with its 10nm process for its application processors. Already at 10nm, transistors give only a small reduction in size and power compared to 14nm, and heat dissipation becomes a big problem. A leap in technology is needed.

One indicator of the technology is the price per transistor. That price was falling down to the 28nm node, after which it started to rise. Many companies stopped at 28nm because of the price. So the move to 10nm, then 7nm and finally 5nm will be accompanied not only by heat problems but also by high prices.
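To make the "special parallelization code" concrete, here is a minimal sketch in Python (illustrative only; the workload, pool size and numbers are invented): the single-core version is one line, while the parallel version forces the programmer to spot the independent pieces, split the work and merge the results.

```python
from multiprocessing import Pool

def work(x):
    # Stand-in for a CPU-heavy piece of the program whose iterations
    # are independent of one another.
    return sum(i * i for i in range(x))

if __name__ == "__main__":
    data = [200_000] * 16

    # Single-core version: trivial to write, uses one core.
    serial = [work(x) for x in data]

    # Multi-core version: the programmer must notice that the iterations
    # are independent, choose a worker count, and accept process-startup
    # and data-transfer overhead.
    with Pool(processes=4) as pool:
        parallel = pool.map(work, data)

    assert parallel == serial
```

An auto-parallelizing compiler would have to prove that independence by itself, which is exactly why such a compiler is hard to build.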

There are a number of tasks that are fundamentally impossible to parallelize. Parallelism is not a cure-all.
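The standard way to quantify that is Amdahl's law (textbook formula, quoted here for reference): if a fraction $p$ of a program's work can be parallelized and the rest is inherently serial, then $n$ cores give a speedup of at most

$$S(n) = \frac{1}{(1 - p) + p/n}, \qquad \lim_{n \to \infty} S(n) = \frac{1}{1 - p}.$$

Even a program that is 95% parallel ($p = 0.95$) tops out at a 20x speedup, no matter how many cores are added.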
 
For some reason nobody has mentioned NP-complete problems. There are many of them, and for none of them is there an efficient solution method. A 10x, 100x, or 1000x increase in computer performance is useless for solving them effectively. This is a fundamental problem of mathematics, not of silicon crystal engineering. From this point of view, the proposed topic seems pointless, and the problem pulled out of thin air.
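To put a number on "useless" (a back-of-the-envelope sketch; the 10^9 candidates per second and the one-hour budget are arbitrary assumptions):

```python
import math

def max_n(checks_per_sec, budget_sec=3600):
    # Largest n such that all 2**n candidates of a brute-force search
    # can be checked within the time budget.
    return int(math.log2(checks_per_sec * budget_sec))

print(max_n(10**9))          # ~41: instances of size ~41 fit in an hour
print(max_n(10**9 * 1000))   # ~51: a 1000x faster machine adds only ~10
```

Every doubling of hardware speed extends the reachable instance size by exactly one item, which is why this is a problem of mathematics rather than of silicon.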
 
Vasiliy Sokolov:
For some reason nobody has mentioned NP-complete problems.
Because:
Grover's algorithm - Wikipedia
  • ru.wikipedia.org
Grover's algorithm (GSA) is a quantum algorithm for solving the brute-force search problem, that is, finding a solution of an equation. The function is assumed to be given as a black box, or oracle, meaning that in the course of the solution we may only ask the oracle questions like "what does the function equal at this input?" and use the answer in further computations...
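For scale, a rough comparison of oracle-query counts (standard textbook figures: about N/2 classical queries on average versus about (π/4)·√N for Grover; N = 2^40 is an arbitrary example):

```python
import math

N = 2**40  # size of the search space
classical = N / 2                      # average queries, exhaustive search
grover = math.pi / 4 * math.sqrt(N)    # optimal Grover iteration count
print(f"classical ~ {classical:.2e} queries")  # ~5.5e+11
print(f"Grover    ~ {grover:.2e} queries")     # ~8.2e+05
```

That is a quadratic speedup, not an exponential one: Grover softens brute force but does not by itself make NP-complete problems tractable.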
 
Vasiliy Sokolov:
For some reason nobody has mentioned NP-complete problems. There are many of them, and for none of them is there an efficient solution method. A 10x, 100x, or 1000x increase in computer performance is useless for solving them effectively. This is a fundamental problem of mathematics, not of silicon crystal engineering. From this point of view, the proposed topic seems pointless, and the problem pulled out of thin air.
As computational power grows, the complexity of problems is negated, because it becomes possible to use an AI capable of complicating itself without limit. So all problems that have solutions will be solved not by raising computational power but by raising the capabilities of the Solver (the AI). This is a kind of qualitative transition from the material to the informational (the intangible). Quantum computers will apparently be a dead end of that same material line, on the way to a new qualitative transition: the emergence of a self-complexifying Solver.
 
Vasiliy Sokolov:
For some reason nobody has mentioned NP-complete problems. There are many of them, and for none of them is there an efficient solution method. A 10x, 100x, or 1000x increase in computer performance is useless for solving them effectively. This is a fundamental problem of mathematics, not of silicon crystal engineering. From this point of view, the proposed topic seems pointless, and the problem pulled out of thin air.

You don't need that kind of performance for sudoku.

What actually happens if the performance of an individual piece of hardware stops growing? And what is the devices' share of global GDP? Probably smaller than, say, pharmaceuticals, or software for that same hardware. Soon devices will be given away for free as an add-on to some super-popular gizmo. Branded smartphones are already sold for a symbolic price.

And the production in China should not go down. If you can't catch Pokémon with one smartphone, catch them with two.

 
Yuri Evseenkov:

...

Soon devices will be given away for free as an add-on to some super-popular gizmo.

...

Plus

You don't need a supercomputer to catch Pokémon and watch cat videos.

 
Vasiliy Sokolov:

I see a swallow catching flies outside my window.

Target search, target lock, capture, eating it - all in utterly chaotic flight. Do we have a flying machine stuffed with gigahertz and gigabytes that can fly like that?

And it's all done by a computer the size of the tip of your little finger.

But besides controlling the flight, that same swallow's computer also governs all its internal processes, keeping their whole unknowable totality in a certain equilibrium!

I'm old enough to have used a computing device called an "arithmometer": you turn a crank and numbers pop up. That was 50 years ago. And in essence nothing qualitative has happened in computing since then, only quantitative: the crank spins faster.

And yet the future of computing is right under our noses.

 
SanSanych Fomenko:

I see a swallow catching flies outside my window.

Target search, target lock, capture, eating it - all in utterly chaotic flight. Do we have a flying machine stuffed with gigahertz and gigabytes that can fly like that?

And it's all done by a computer the size of the tip of your little finger.

But besides controlling the flight, that same swallow's computer also governs all its internal processes, keeping their whole unknowable totality in a certain equilibrium!

I'm old enough to have used a computing device called an "arithmometer": you turn a crank and numbers pop up. That was 50 years ago. And in essence nothing qualitative has happened in computing since then, only quantitative: the crank spins faster.

And yet the future of computing is right under our noses.

The euphoria over neural algorithms faded back in the 1980s. Great hopes were pinned on them, and many problems were successfully solved with their help. But on the whole the idea failed: artificial intelligence was never created.