(Replying to PARENT post)
https://www.newegg.com/p/pl?d=pci+fiber+network+card
https://www.blackmagicdesign.com/products/decklink/techspecs...
https://www.pcworld.com/article/394512/the-best-pcie-40-ssd....
So yeah, high-end audio and video workflows can use plenty of PCI cards. There's more to PCI than GPUs.
(Replying to PARENT post)
But let's assume you really want more compute. Apple first has to design a standalone GPU, then it has to do a PCI or CXL core or some custom interconnect, then it has to redesign its OS for non-uniform memory access (doing all the fun work of hiding the memory communication), then it has to add primitives to its languages for you to manage memory (because to get performance you have to actively design for it), and then you need to get all the performance-critical applications to rewrite their stacks with this in mind.
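To make the "primitives to manage memory" point concrete, here's a toy Python sketch (every name here is hypothetical, not any real Apple or Metal API) of what the programming model shift looks like: with unified memory the GPU just reads the buffer, while a discrete GPU forces the programmer to stage every copy across the interconnect.

```python
# Toy model (hypothetical API, NOT any real Apple/Metal interface) contrasting
# unified memory with a discrete GPU that needs explicit host<->device copies.

class UnifiedSoC:
    """CPU and GPU share one address space: no copies needed."""
    def gpu_compute(self, buf):
        return [x * 2 for x in buf]          # GPU reads host memory directly

class DiscreteGPU:
    """Separate VRAM: every buffer must be staged across the interconnect."""
    def __init__(self):
        self.vram = {}
        self.bytes_transferred = 0
    def to_device(self, name, buf):          # explicit upload over PCIe/CXL
        self.vram[name] = list(buf)
        self.bytes_transferred += len(buf) * 8
    def gpu_compute(self, name):
        return [x * 2 for x in self.vram[name]]
    def to_host(self, result):               # explicit download back to host
        self.bytes_transferred += len(result) * 8
        return list(result)

data = list(range(1000))

# Unified: one call, zero copies.
out_unified = UnifiedSoC().gpu_compute(data)

# Discrete: the programmer orchestrates the transfers themselves.
gpu = DiscreteGPU()
gpu.to_device("data", data)
out_discrete = gpu.to_host(gpu.gpu_compute("data"))

assert out_unified == out_discrete
print(gpu.bytes_transferred)                 # bytes moved just for this one kernel
```

Same result either way; the difference is that the discrete model makes data movement the application's problem, which is exactly the rewrite burden being described.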
It's a massive lift! And for what? Only desktops can take advantage of it, and in Apple's product line only the Mac Pro. And that assumes you execute well and the device ends up more capable than just using the SoC. And we're talking about all of this on top of the massive lift Apple just finished: moving all their products onto Apple silicon. It's just a crazy ask for a v1 of a product.
(Replying to PARENT post)
But that $3,500 entry-level machine you upgrade into over your career isn't the goal of whatever the heck these are. Aside from maybe a Hollywood editor who wants to use a Mac and gets their way, I have no idea who these are for. I think the goal is just to look really cool in a fancy office.
(Replying to PARENT post)
Throughout this era (from, say, 1974 until 2022), the elements composed to create a personal computer were MSI, LSI, and VLSI integrated circuits mounted onto a PCB, wired up with traces on the board(s) and expansion card slots.
The M1-based Macs introduced a new era in personal computer architecture: the SoC, previously used for phones and embedded devices (and particularly the chiplet-based SoC), took over from the motherboard.
The elements composed to make a PC now are chiplets, wired up with silicon interposers and encapsulated into an SMT chip. The benefits in speed, bandwidth, and power usage of an SoC over VLSI-on-a-PCB are enormous, and are reflected in the performance of the M1-based Macs.
Where do expansion cards fit with an SoC-based model? They're slow, narrow-bandwidth, power-hungry devices. A GPU expansion card on an SoC-based computer might as well be accessed over Ethernet.
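Rough numbers back this up (theoretical peaks; real-world figures are lower). A PCIe 4.0 x16 slot tops out around 31.5 GB/s, while Apple quotes 800 GB/s of unified-memory bandwidth for the M1 Ultra, so an external GPU would sit behind a link roughly 25x narrower than the memory the on-die GPU enjoys:

```python
# Back-of-envelope bandwidth comparison (theoretical peaks, not measured numbers).
pcie4_x16 = 16e9 * (128 / 130) * 16 / 8 / 1e9  # 16 GT/s/lane, 128b/130b encoding, 16 lanes -> GB/s
m1_ultra_mem = 800.0                           # GB/s, Apple's quoted unified-memory bandwidth
ten_gbe = 10 / 8                               # GB/s, raw 10 Gb Ethernet line rate

print(f"PCIe 4.0 x16: {pcie4_x16:.1f} GB/s")                                   # ~31.5 GB/s
print(f"Unified memory is {m1_ultra_mem / pcie4_x16:.0f}x wider than PCIe")    # ~25x
print(f"PCIe 4.0 x16 is {pcie4_x16 / ten_gbe:.0f}x wider than 10 GbE")         # ~25x
```

The Ethernet line is hyperbole (PCIe is still ~25x faster than 10 GbE), but the same factor of ~25 separates PCIe from the SoC's own memory, which is the point.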
Of course, it's disruptive. People legitimately _like_ the ability to compose their own special combination of functions in their PC, and it allows a level of _end-user_ customization that isn't (currently?) possible with SoCs.
But retaining that level of customization means significant performance costs, and the gap is only going to grow as further "3D" chiplet assembly techniques become common.
The cost of creating an SoC from chiplets is high enough that a market of one (i.e., end-user customization) isn't possible. Right now, we get base, Pro, Max, and Ultra variants from Apple. It's possible we'll get more in the future, but ... it's fundamentally a mass-market technology.
The era of end-user hardware customization is very likely drawing to a close.
(Replying to PARENT post)
I'd say they should just kill the product off to stop all the incessant whining about what a ripoff it is, but I'm sure they'd counter that there's no such thing as bad publicity.
(Replying to PARENT post)
So yeah, a niche machine, but there is a lot more to PCI than graphics cards.
(Replying to PARENT post)
So... why expose PCI lanes in the first place, then? "Optimization" isn't really an excuse; as long as you can saturate the PCI bus, it should run smoothly. Some support would be better than nothing, especially if you're brave enough to pay $7,000 for a Mac Pro.
This whole thing is kinda baffling. Surely they haven't gotten rid of multi-GPU address space support; they still support Intel/AMD systems, after all.
(Replying to PARENT post)
Apple executive 2: "Great! What are they for?"
Apple executive 1: "They're decorations to make them look like prosumer systems with the hope of future expansion."
Apple executive 2: "Okay. Then what will we promise?"
Apple executive 1: "That's a tomorrow problem. We're selling hope here."
(Replying to PARENT post)
Apple wants you to upgrade your hardware by purchasing new devices, not components, because that sells more units.
If you view all their insulting decisions through that lens, everything makes more sense than under any other basic assumption.
(Replying to PARENT post)
But for most tasks that were known to exist 5+ years ago, when these decisions were made, their on-die GPU really is plenty fast, and it offers a larger memory pool than almost any GPU known to be coming at that time.
(Replying to PARENT post)
I'm curious to see how far gaming goes with their translation+GPU vs. the likes of Nvidia. Same problems Linux has with Wine: translating windoze DirectX, or the lack thereof.
(Replying to PARENT post)
It's 1989 all over again.
(Replying to PARENT post)
I don't know what to say. Is it because GPUs are upgradeable, and they are allergic to this, retreating from it like vampires from holy water? Or is it to keep the price of GPUs from muddying the waters of their careful pricing ladder, where the CPU and GPU are bundled?
It's insulting.