But they can block Nvidia and ATI hardware easily enough anyway, so having their own API doesn't buy them control over hardware.
It works for Microsoft because years ago they managed to become top dog in gaming APIs, albeit through traditionally dubious Microsoft means: spreading FUD about OpenGL whilst implementing it really badly to back up their claims. I can understand why they don't want to give up that top spot.
The Mac Pro currently has ATI hardware - this will render that machine obsolete. They will block third-party GPUs via a macOS update so that only their own silicon works, at the same time as going ARM-only, thus controlling both the software and the hardware rather like Micro$oft did.
I have no issues with this, to be fair - there's one Apple product here, it is absolute rubbish, always has been, and it gets used for one purpose only.
A fair point. I was thinking when I wrote my previous post that it could just end in Apple discontinuing the Mac line completely and sticking to iPhone and iPad for the future.
Although that would mean the money put (and yet to be put) into all of the conversion work goes to waste, it could still end up that way in time.
They've already done that - their workstation user base is ~5 guys who're stuck with software that hasn't been ported to Windows yet. The likelihood of this persisting after the ARM switch is slim: the software hasn't gone to other x86 OSes because of the effort involved, but switching to Windows/Linux looks a lot easier than switching to ARM. There's also the big looming issue of Apple wanting 30% of revenue for expensive professional software, which will go down like a lead balloon.
The dedicated encoding card in the Mac Pro is actually the most open way of doing that (and far better than anything on Windows or Android). Everyone else is integrating fixed-function hardware for tasks like that (OK, mostly only decode) on-die in the APU or GPU, so having one small card to replace makes it easier to keep devices current. The equivalent stuff in phones is soldered, but that goes for literally every modern phone.
I admittedly don't have much to do with video production, but fixed-function encoding is absolutely not a replacement for software encoders - it's a fast and power-efficient method of encoding, but you pay for that in compression efficiency and/or quality.
With that said, I'm very interested to see how well some common video codecs will be implemented on ARM with presumably decent optimisation - in theory it could work quite well.
For all the chaos this is likely to cause, it will undoubtedly be an interesting one to watch, and even though I've no real interest in purchasing one, I'm keenly awaiting some deep dives on the hardware! And the software TBH...
I believe AMD have Apple-specific PCI IDs for the silicon used in Apple hardware, so Apple could still claim that those cards are Apple hardware.
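For illustration, gating on PCI IDs is trivial to do - here's a rough sketch of what a driver-side check might look like. The struct and function are made up, not Apple's actual driver code; the IDs themselves are real (0x1002 is AMD/ATI's PCI vendor ID, 0x106B is Apple's, which would appear as the subsystem vendor on an Apple-branded board):

[CODE]
#include <stdint.h>
#include <stdbool.h>

#define PCI_VENDOR_ATI       0x1002  /* AMD/ATI PCI vendor ID */
#define PCI_SUBVENDOR_APPLE  0x106B  /* Apple PCI vendor ID */

/* Hypothetical sketch of a driver-side check: the subsystem vendor ID
 * distinguishes an Apple-branded board from a retail card built around
 * the exact same GPU silicon. */
struct pci_ids {
    uint16_t vendor;            /* who made the chip */
    uint16_t device;
    uint16_t subsystem_vendor;  /* who built/branded the board */
    uint16_t subsystem_device;
};

static bool is_apple_branded_gpu(const struct pci_ids *id)
{
    return id->vendor == PCI_VENDOR_ATI &&
           id->subsystem_vendor == PCI_SUBVENDOR_APPLE;
}
[/CODE]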
In theory NEON should be capable of some decent performance in things like video encoding; it's just that traditionally it seems hard to find people who can code for NEON. I'm sure Apple can spend their way out of that sort of problem.
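To give a flavour of what NEON code looks like, here's a minimal sketch of a sum-of-absolute-differences kernel - the primitive encoders hammer during motion estimation. A toy example, not from any real encoder, assuming the buffer length is a multiple of 16 and an AArch64 target:

[CODE]
#include <arm_neon.h>
#include <stdint.h>
#include <stddef.h>

/* Sum of absolute differences over two pixel buffers. */
uint32_t sad_u8_neon(const uint8_t *a, const uint8_t *b, size_t len)
{
    uint32x4_t acc = vdupq_n_u32(0);
    for (size_t i = 0; i < len; i += 16) {
        uint8x16_t va = vld1q_u8(a + i);        /* load 16 pixels each */
        uint8x16_t vb = vld1q_u8(b + i);
        uint8x16_t d  = vabdq_u8(va, vb);       /* |a - b| per byte */
        acc = vpadalq_u16(acc, vpaddlq_u8(d));  /* widen and accumulate */
    }
    return vaddvq_u32(acc);                     /* horizontal sum (AArch64-only) */
}
[/CODE]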
That's pretty much my line of thinking - things like video encoders benefit greatly from tuning, as can be seen by how the likes of x264 and the AV1 encoders started out incredibly slow but gained speed throughout their development.
I wonder if ARM's SVE would be much use here? AVX2 gets used extensively, but I'm not sure how comparable they are.
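For comparison, here's the same toy SAD kernel sketched with SVE intrinsics - again just an illustration, not real encoder code. The interesting difference versus AVX2 is that it's vector-length agnostic: the hardware decides the register width (anywhere from 128 to 2048 bits) and the predicate handles the tail, so one binary covers every SVE implementation rather than needing a rewrite per width.

[CODE]
#include <arm_sve.h>
#include <stdint.h>
#include <stddef.h>

/* Vector-length-agnostic SAD. Build with e.g. -march=armv8-a+sve. */
uint64_t sad_u8_sve(const uint8_t *a, const uint8_t *b, size_t len)
{
    uint64_t sum = 0;
    for (size_t i = 0; i < len; i += svcntb()) {  /* svcntb() = bytes per vector */
        svbool_t pg = svwhilelt_b8((uint64_t)i, (uint64_t)len); /* masks the tail */
        svuint8_t va = svld1_u8(pg, a + i);
        svuint8_t vb = svld1_u8(pg, b + i);
        svuint8_t d  = svabd_u8_m(pg, va, vb);    /* |a - b| on active lanes */
        sum += svaddv_u8(pg, d);                  /* horizontal reduce */
    }
    return sum;
}
[/CODE]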
What annoys me is that every time a company switches to ARM, they have to make disposable devices. There is no reason why you can't have a similar infrastructure to x86 systems, with plug-in upgrades. It seems to go contrary to the environmental and right-to-repair concerns that are being brought up.
Apple didn't need an excuse to solder CPUs - they were doing that with Intel CPUs, and before it was commonplace, if I remember correctly.
What's another example you have in mind? I can't really think of any.
I agree about the whole repairability thing though, it's ridiculous how disposable some very expensive devices are made to be.
I was talking about sockets for other stuff like RAM. If you look at the last desktop Acorn, i.e. the Acorn Phoebe 2, you had expansion slots, slots for extra RAM, etc. The RiscPC had slot-in CPU cards:
https://www.retro-kit.co.uk/page.cfm...PC-processors/
This is not much different from some of the Pentium systems, which had slot-in CPU cards.
Even those 1980s systems had expansion slots. Compare that with the modern ARM systems, which have everything soldered onboard. They use ARM as an excuse to make sure the systems are not upgradeable in any way, when clearly this is not required. There is zero reason why a desktop ARM system can't have PCI/PCI-E expansion slots, extra slots for DIMMs or SODIMMs, etc.
I don't really see it that way TBH; Apple have made plenty of x86 stuff which is not upgradable, so it's not like they need an excuse. And upgradable ARM systems exist too: https://www.anandtech.com/show/15733...64-workstation
It's just that we haven't seen much ARM stuff on the desktop in recent years.