Most people don't need backwards compatibility to the end of time. As long as it runs their favorite OS and browser, it works fine.
Mobile phones are a great example: Intel refused to trim the fat in their architecture, even during their short-lived attempts at mobile x86.
Phones are already cumbersome to use, I don't need to run WordPerfect on one.
@polarisfm But is it really supporting old instructions that is slowing Intel down? I was under the impression that once you get past the decode stage, the insides of CISC and RISC chips look remarkably similar. It's just that after nearly 40 years of optimization, they've run out of areas left to optimize. ARM started later and smaller, so they are only now approaching the same limits that Intel hit a few years ago.
But I'm not so up to date on CPU design. Do you have some reference that shows that old op-codes are dragging down Intel's development?
@praxeology I should have been more clear, sorry.
It's a combination of things. I think mostly it was Intel's complacency for many years, putting R&D money into everything but CPUs. The sheer number of instructions has definitely caused other issues for them too, namely a ton of security problems.
There are so many instructions that haven't been properly audited. Intel certainly had the money and resources to do this, and to develop ways to properly mitigate issues that are cropping up now.
@praxeology I think one of the most important things is power draw. Devices are getting smaller and more portable, something x86 really wasn't designed for. Intel had astronomical amounts of money to throw at that problem, but they seemingly never committed to it. The Intel Atom CPUs they aimed at mobile phones and tablets, for example, were killed off very quickly.
Mismanagement hurt Intel the most, on top of ARM's efficiency. They could have rivaled ARM for longer if they hadn't been so docile.