According to Moore's Law, the number of transistors we can squeeze onto a chip roughly doubles every couple of years; but one day we will not be able to shrink them any further, and the only way to build a faster computer will be to add more components. So in the long run, our computers could end up physically bigger again, like the first-generation vacuum-tube machines.
But now AI has killed this idea. AI (Artificial Intelligence) is a machine-learning process that uses the processor as its central brain: first it learns, then it applies what it has learned. It is like a human baby: the child learns many things from parents and school, and later applies whichever of those lessons he thinks works best in his own life.
In a computer, AI keeps learning and then delivers maximum output from minimum resources. AI can do a great many things; if we had to build conventional hardware capable of doing the same, the machine might need to be a hundred or even a thousand times bigger than the AI chip. So instead of growing the hardware, the AI in our computer learns and squeezes more output out of the same resources.
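As a rough illustration of that learn-then-apply cycle, here is a minimal sketch in Python using scikit-learn. The toy XOR data and the decision-tree model are my own assumptions, chosen only to keep the example tiny; they have nothing to do with any particular AI chip:

```python
# Minimal sketch of the "learn once, then apply" cycle described above.
# The toy data and model choice are illustrative assumptions only.
from sklearn.tree import DecisionTreeClassifier

# Training phase: the model "learns" from labeled examples,
# much like a child learning from parents and school.
X_train = [[0, 0], [0, 1], [1, 0], [1, 1]]  # toy inputs
y_train = [0, 1, 1, 0]                      # toy labels (XOR pattern)
model = DecisionTreeClassifier().fit(X_train, y_train)

# Inference phase: applying what was learned is cheap -- no more
# training work, just a quick pass through the fitted model.
print(model.predict([[1, 0]]))  # -> [1]
```

The point of the split is that the expensive learning happens once; afterwards, answering a new question reuses the stored knowledge at very low cost.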
For example, NVIDIA's recently released RTX 2080 graphics card has AI built in. When we record our voice, this AI can capture only the voice while filtering out outside disturbances. A dedicated electronic device built to do the same job would be huge; so AI is bending Moore's Law, not wholly killing it.
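To make that noise-removal idea concrete, here is a toy sketch in Python with NumPy and SciPy. To be clear, this is not NVIDIA's actual method (their cards use a trained deep network, which I don't have access to); it is a simple classical stand-in, spectral gating, that shows the basic idea of keeping the voice and muting the background:

```python
# Toy spectral-gating denoiser: estimate the noise floor per frequency
# band from a noise-only clip, then mute anything below that floor.
# A classical stand-in for illustration -- NOT NVIDIA's deep-learning method.
import numpy as np
from scipy.signal import stft, istft

def spectral_gate(audio, rate, noise_clip, margin_db=6.0):
    """Mute time-frequency bins that fall below the estimated noise floor."""
    # Per-frequency noise magnitude, estimated from the noise-only clip.
    _, _, noise_spec = stft(noise_clip, fs=rate)
    threshold = np.abs(noise_spec).mean(axis=1, keepdims=True) * 10 ** (margin_db / 20)

    # Zero out bins in the recording that do not rise above that floor.
    _, _, spec = stft(audio, fs=rate)
    mask = np.abs(spec) > threshold
    _, cleaned = istft(spec * mask, fs=rate)
    return cleaned

# Usage sketch: one second of a 440 Hz tone (standing in for "voice")
# buried in random background noise, sampled at 16 kHz.
rate = 16000
t = np.arange(rate) / rate
noise = np.random.randn(rate) * 0.1
noisy = np.sin(2 * np.pi * 440 * t) + noise
print(spectral_gate(noisy, rate, noise).shape)
```

Even this crude version needs an explicit noise estimate and a hand-tuned margin; the learned approach on the GPU figures out what counts as "voice" by itself, which is exactly the more-output-from-less-hardware point above.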