That is, unless we hit some roadblock when it comes to improving computer components. Silicon is already close to its physical limits, so something better, and just as cheap to manufacture, will be required to get much further than we are today. There are some promising technologies being explored, but no one has really made a functioning prototype with them yet.
If we actually do reach such a roadblock, I suspect we'll get much more cloud-based tech, maybe even a mini-cloud that you'd put somewhere in your house, which would distribute high-quality, very low latency video streams to our smaller devices, reducing their need for processing power. The main driver in semiconductor tech now is increasing performance per watt rather than total performance, although total performance does still creep upward a bit every generation. It's just not a lot anymore.
Due to the enormous power bills of supercomputers, this works just as well for them as making more powerful single chips: they can use several less powerful chips with a better performance-per-watt ratio, and it still pays off in the end because they can compute more with less power. It does take more space, however.
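The trade-off above is just arithmetic. Here's a toy sketch with made-up numbers (none of these figures describe real chips) of why more, slower chips can win on power:

```python
# Toy comparison of one fast chip vs. several slower, more
# efficient chips. All numbers are invented for illustration.

def perf_per_watt(perf_tflops, power_watts):
    """Throughput delivered per watt of power draw."""
    return perf_tflops / power_watts

# Hypothetical single high-end chip: 100 TFLOPS at 500 W.
big_chip_eff = perf_per_watt(100, 500)    # 0.2 TFLOPS/W

# Hypothetical efficient chip: 30 TFLOPS at 100 W.
small_chip_eff = perf_per_watt(30, 100)   # 0.3 TFLOPS/W

# Four small chips exceed the big chip's total throughput
# while drawing less total power...
total_perf = 4 * 30     # 120 TFLOPS vs. 100 TFLOPS
total_power = 4 * 100   # 400 W vs. 500 W

print(f"big: {big_chip_eff} TFLOPS/W, small: {small_chip_eff} TFLOPS/W")
print(f"4x small: {total_perf} TFLOPS at {total_power} W")
```

...at the cost of more boards, more interconnect, and more floor space, which is the "takes more space" part.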
I really doubt we'll have commercial direct brain interface chips in 50 years. Maybe in 100, though. The current technologies we have for brain interfaces are extremely primitive, and research into them is going slowly, since not a lot of people are interested in having their skulls cut open to be guinea pigs. It's basically only done with people who have severe disabilities and could gain a lot from even low-level, primitive interfaces. These interfaces are also big, require cables going through your skin, and sometimes cause bleeding around the plug.

