So in the future, compute farms will be built and dedicated to users, and desktops and handhelds will be just terminals for communicating with the mainframe.
Hence the moral: Intel will be busier than ever, because even the processors already available will be needed more and more, and I don't foresee any stagnation in this industry (imho).
The power consumption of the logic circuit element is calculated using the formula:
P = f*C*V^2
where f is the switching frequency, C is the load capacitance (the input capacitance of the next element plus the capacitance of the metal interconnect), and V is the supply voltage. Frequency stopped increasing 5-10 years ago and has stalled at 2-3 GHz. Shrinking transistors lowered the load capacitance (smaller input capacitance and shorter interconnects between transistors) and allowed a lower supply voltage. When I started in the industry the supply voltage was 5 V; then it dropped to 3 V, 1.5 V, 1 V, and now 0.5 V. Each new generation of silicon technology now brings a voltage reduction of only 0.1-0.2 V. When Moore's law stops, power stops decreasing and the number of cores stops growing.
Few people are aware of the fact that all integrated circuit technology was developed by Intel; every company in the world is copying Intel. They invented the FinFET 10 years ago and spent those 10 years putting it into production. My friends at Intel tell me they don't have any more ideas. Our company is funding research at various universities, but so far there is nothing. The world is on the verge of some pretty dire consequences of the end of Moore's law. In difficult economic times, world wars usually occur, leading to a surge of state investment in new technologies and their subsequent development for peaceful purposes. This was the case during WWII, when Alan Turing invented the computer to decode German military messages. 25-30 years ago, as a consequence of the computer revolution, there was a need to network computers, and the internet was born. In the last 10 years the internet has essentially changed little; today smartphones connect to it at almost the same speed as a home computer. I can't imagine what new technology will take the place of computers and the internet and allow worldwide economic growth to continue.
Can you recommend a CPU with 256 cores?
The limit of development in 2020
How can you be so sure that if you can't see a prospect, there isn't one?
I admit that it is possible to be a good expert in one industry and have a good idea of what awaits it on a 2-5 year horizon (although in the technology sector even that is too distant a horizon, imho).
But one cannot be aware of all research in all related fields, can one?
If a ceiling of gigahertz or gigabits/sec is reached, then some alternative (surely an order of magnitude more powerful) will be found and the world will continue to evolve. Provided, of course, that such development is actually needed.
It is like arguing about hydropower production and being upset that in 5 years all the rivers will be used as efficiently as possible, but not seeing that there are plenty of much more powerful alternatives, from heat pumps to nuclear energy.
Why the melancholy? )
Nvidia's latest has > 2000 cores
I can't find any processors from them, only cards
The card is the processor. You write code in Visual Studio 2010 in CUDA C, compile it for the GPU and run it. Writing code for the GPU is much harder than for the CPU: you need to add commands for allocating memory on the GPU (there is usually not much of it), transferring data from CPU memory to GPU memory, special parallelization constructs, then copying the data back, freeing the memory, etc. Lots of different subtleties, but you get to use 3000 cores. See here
https://developer.nvidia.com/how-to-cuda-c-cpp
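The steps listed above (allocate on the GPU, copy data in, launch the kernel, copy the result back, free the memory) can be sketched in CUDA C roughly like this; it is a minimal illustration of the workflow, not production code, and the kernel itself (elementwise vector addition) is just a stand-in example:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Kernel: each thread handles one array element; the GPU runs
// thousands of these threads in parallel across its cores.
__global__ void add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // 1. Allocate memory on the GPU.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);

    // 2. Transfer the input data from CPU memory to GPU memory.
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // 3. Launch the kernel: one thread per element, 256 per block.
    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);

    // 4. Copy the result back to CPU memory.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);

    // 5. Free the GPU memory.
    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

Every extra step here (the cudaMalloc/cudaMemcpy/cudaFree bookkeeping) is exactly the overhead the post is describing; on a plain CPU the same computation would be one loop.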
There is no melancholy; there is fear for the future, both my own and others'. It is certainly easy to live when you trust that scientists will find a solution to any problem: a new technology, a cure for cancer, a fix for global warming. The end of Moore's law is quite relevant; read the recent articles on the subject. My view may be pessimistic, but it is based on deep knowledge of semiconductor technology and, by virtue of my specialty, of the latest research in the field. It takes about 10 years to bring a new technology to mass production, and so far no such technology has appeared in the laboratories of companies or universities. So I expect 5-10 years of stagnation in computer technology, maybe even longer.
There is a worldwide organization, the ITRS (International Technology Roadmap for Semiconductors), made up of employees of the major semiconductor companies, which publishes a roadmap for the near future every two years (their vision of where the technology is going). The last one was the 2014 report. The next report should have been published this summer, and everyone in the field was looking forward to it, but it never came out. Instead the organization was renamed the International Roadmap for Devices and Systems (IRDS) and placed under the IEEE. The new organization will publish a roadmap covering computer and communication systems, software, etc. What will be included in that report is pretty vague.
http://www.eetimes.com/document.asp?doc_id=1329604
I asked about a processor, not an expansion card; they have different mounting slots.
GPU = graphics processing unit (produced mainly by Nvidia)
CPU = central processing unit (made by Intel or AMD)
Both are processors. Don't you get it? Call the GPU a card or whatever you want, but it is a processor with 3000 cores if you have the latest model. If you have a computer, it also has a GPU; read the documentation to find out what model you have and how many cores it has.