(Replying to PARENT post)

An Apple event; they bring Kojima in to talk about gaming on Mac; they introduce a gaming mode; they're excited about gaming on Mac; and they still don't offer dedicated GPUs? Not even in their workstations, to try to appeal to game developers?

I don't know what to say. Is it because GPUs are upgradeable, and they're allergic to that, retreating from it like vampires from holy water? Is it to keep the price of GPUs from muddying the waters of their careful pricing ladder, where CPU and GPU are bundled?

It's insulting

πŸ‘€gentleman11πŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

A lot of people in these comments don't seem to be really engaging with what was said. The unified memory architecture is fantastic: everything is in one place, easy and fast to access. Up to the limit of the memory and compute you can fit on one chip, it's massively advantageous. There is obviously a point where you hit that limit, but the disadvantage of having to shovel data back and forth means you need a really powerful GPU before a discrete one becomes worth having.
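To make that concrete, here's a minimal Metal sketch (mine, not the commenter's) of what unified memory buys you on Apple silicon: one allocation, visible to CPU and GPU alike, with no copies in either direction.

    import Metal

    // On Apple silicon the CPU and GPU share one physical memory pool,
    // so a .storageModeShared buffer is visible to both with zero copies.
    let device = MTLCreateSystemDefaultDevice()!

    var input: [Float] = [1, 2, 3, 4]
    let buffer = device.makeBuffer(bytes: &input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // ... encode and dispatch a compute kernel that writes into `buffer` ...

    // The CPU reads the GPU's results directly; no blit, no transfer.
    let results = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
    print(results[0])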

But let's assume you really want more compute. Apple first has to design a standalone GPU; then it has to build a PCIe or CXL core or some custom interconnect; then it has to redesign its OS for non-uniform memory, doing all that fun stuff to hide the memory communication; then it has to add primitives to its languages for you to manage memory (because to get performance you have to actively design for it); and then you need all your performance-critical applications to rewrite their stacks with this in mind.
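For a sense of what "actively design for it" means, this is roughly the discipline Metal already imposed on Intel Macs with discrete GPUs (the API calls are real; the surrounding code is an illustrative sketch): a managed buffer exists as two copies, one in system RAM and one in VRAM, and the programmer keeps them in sync by hand.

    import Foundation
    import Metal

    let device = MTLCreateSystemDefaultDevice()!
    let queue = device.makeCommandQueue()!

    // On a discrete GPU, .storageModeManaged keeps two copies of the data.
    let buffer = device.makeBuffer(length: 4096, options: .storageModeManaged)!

    // After the CPU writes, tell Metal which bytes to upload to VRAM.
    memset(buffer.contents(), 0, 4096)
    buffer.didModifyRange(0..<4096)

    // After the GPU writes, schedule an explicit copy back before the CPU reads.
    let cmd = queue.makeCommandBuffer()!
    let blit = cmd.makeBlitCommandEncoder()!
    blit.synchronize(resource: buffer)
    blit.endEncoding()
    cmd.commit()
    cmd.waitUntilCompleted()

Every performance-critical app on an expansion-GPU Mac would have to get this choreography right, which is exactly the rewrite described above.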

It's a massive lift! And for what? Only a desktop can take advantage of it, and in Apple's product line that means only the Mac Pro. And that assumes you execute well and the device ends up more capable than just using the SoC. And all of this on top of the massive lift Apple literally just finished: moving all their products onto Apple silicon. It's just a crazy ask for a v1 of a product.

πŸ‘€SilverBirchπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

The Mac Pro has a lot of significance in the professional media industry. Final Cut changed the landscape forever: a $3,500 entry-level Mac Pro, after a $1,000 software purchase and an Nvidia card upgrade, could compete with a $70k Avid station. Suddenly, as a college student, I could go work for myself, because I could (barely) afford the tools. I still work on a Mac to this day.

But that $3,500-entry-level, upgrade-it-into-your-career machine isn't the goal of whatever the heck these are. Aside from maybe a Hollywood editor who wants to use a Mac and gets their way, I have no idea who these are for. I think the goal is just to look really cool in a fancy office.

πŸ‘€imageticπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

Early personal computers used a basically passive backplane (e.g. the S-100 bus). After that, the Apple ][, IBM PC, and later Macintosh models used a motherboard with expansion cards.

Throughout this era (from, say, 1974 until 2022), the elements that were composed to create a personal computer were MSI, LSI, and VLSI integrated circuits mounted onto a PCB, and wired up with traces on the board(s) and expansion card slots.

The M1-based Macs introduced a new era in personal computer architecture: the SoC, and particularly the chiplet-based SoC, previously used for phones and embedded devices, took over from the motherboard.

The elements composed to make a PC now are chiplets, wired up with silicon interposers, and encapsulated into an SMT chip. The benefits in speed, bandwidth, and power usage of an SoC over VLSI-on-a-PCB are enormous, and are reflected in the performance of the M1-based Macs.

Where do expansion cards fit in an SoC-based model? They're slow, narrow-bandwidth, power-hungry devices. A GPU expansion card on an SoC-based computer might as well be accessed over Ethernet.
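Back-of-envelope numbers (mine, not the parent's) make that gap concrete: a PCIe 4.0 x16 slot carries roughly 32 GB/s per direction, while Apple quotes 800 GB/s of unified memory bandwidth for the M2 Ultra.

    // Rough, order-of-magnitude comparison using published specs,
    // not measurements: PCIe 4.0 x16 vs. M2 Ultra unified memory.
    let pcie4x16GBps = 31.5   // ~2 GB/s per lane x 16 lanes, one direction
    let m2UltraGBps  = 800.0  // Apple's quoted unified memory bandwidth

    // ~25x: data reached over the expansion bus is more than an order
    // of magnitude further away than on-package DRAM.
    print("unified memory is ~\(Int(m2UltraGBps / pcie4x16GBps))x wider")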

Of course, it's disruptive. People legitimately _like_ the ability to compose their own special combination of functions in their PC, and it allows a level of _end-user_ customization that isn't (currently?) possible with SoCs.

But retaining that level of customization carries a significant performance cost, and the gap is only going to grow as further "3D" chiplet assembly techniques become common.

The cost of creating an SoC from chiplets is sufficiently high that a market of one (i.e. end-user customization) isn't possible. Right now, we get base, Pro, Max, and Ultra variants from Apple. It's possible we'll get more in future, but ... it's fundamentally a mass-market technology.

The era of end-user hardware customization is very likely drawing to a close.

πŸ‘€__dπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

I suspect Apple would be perfectly content if they never sold another Mac Pro. I suspect most Mac Pros manufactured are for internal use. Their main products don't have a discrete GPU, so neither does this. It already had an F-U price. Whatever customers use them for, it's an irrelevant side hustle for Apple.

I'd say they should just kill the product off to stop all the incessant whining about what a ripoff it is, but I'm sure they'd counter that there's no such thing as bad publicity.

πŸ‘€sh34rπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

When pressed by Gruber, Ternus says that there are PCIe cards made specifically for Avid and Pro Tools, video capture cards, networking cards, and even some custom hardware configurations. The Mac Pro can also be loaded up with a lot of PCIe flash storage.

So yeah, it's a niche machine, but there is a lot more to PCIe than graphics cards.

πŸ‘€musictubesπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

> "Fundamentally, we've built our architecture around this shared memory model and that optimization, and so it's not entirely clear to me how you'd bring in another GPU and do so in a way that is optimized for our systems,"

So... why expose PCIe lanes in the first place, then? "Optimization" isn't really an excuse; as long as you can saturate the PCIe bus, it should run smoothly. Some support would be better than nothing, especially if you're brave enough to pay $7,000 for a Mac Pro.

This whole thing is kinda baffling. Surely they haven't gotten rid of the multi-GPU address space support; they still support Intel/AMD systems, after all.
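As far as I can tell the plumbing is indeed still there: Metal on macOS enumerates every GPU in the system as its own MTLDevice, including removable eGPUs. A quick sketch (the API calls are real; the output is illustrative):

    import Metal

    // macOS exposes each GPU as a separate MTLDevice, including
    // discrete cards and external (eGPU) enclosures.
    for device in MTLCopyAllDevices() {
        print(device.name,
              "unified:", device.hasUnifiedMemory,
              "removable (eGPU):", device.isRemovable)
    }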

πŸ‘€smoldesuπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

Apple executive 1: "Let's add last-gen PCIe slots that don't do anything."

Apple executive 2: "Great! What are they for?"

Apple executive 1: "They're decorations to make them look like prosumer systems with the hope of future expansion."

Apple executive 2: "Okay. Then what will we promise?"

Apple executive 1: "That's a tomorrow problem. We're selling hope here."

πŸ‘€sacnoradhqπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

Apple consistently fails to fully implement hardware protocols, because they only need the subset that their own hardware and peripherals use. This is yet another example in a very long list for the M1/M2 series. They're cut-corner hardware.
πŸ‘€superkuhπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

I suspect there's a more fundamental truth here, one that's reliable in practice.

Apple wants you to upgrade your hardware by purchasing new devices, not components, because that sells more units.

If you look at all their insulting decisions through that lens, everything makes more sense than it does under any other basic assumption.

πŸ‘€andrewmcwattersπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

What actually stops AMD/Nvidia from slotting a card into it, writing a driver, and letting people use it?
πŸ‘€photonbeamπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

This is the long-term result of Apple not wanting any further dependency on AMD/Nvidia, while at the same time not realizing that AI was going to hit and be very, very Nvidia-centric.

But for most tasks that were known to exist 5+ years ago, when these decisions were made, their on-die GPU really is plenty fast, and it offers a larger memory pool than nearly every GPU known to be coming at the time.

πŸ‘€benttπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

You'll take what Apple gives you, or go back to your adventures in plebeian life. Who needs an industry-leading GPU for graphics or compute?

I'm curious to see how far gaming goes with their translation layer plus their GPU versus the likes of Nvidia. It's the same problem Linux has with Wine: translating windoze DirectX, or the lack thereof.

πŸ‘€bastard_opπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

Maybe Apple will release PCI expansion cards itself that you can take from Pro to Pro.
πŸ‘€hyperhelloπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

If expandability is not what Apple wants, then what is the space inside the Mac Pro for?
πŸ‘€AraceliHarkerπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

There's no excuse not to support eGPUs, though.
πŸ‘€villgaxπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0

(Replying to PARENT post)

History repeats itself.

It's 1989 all over again.

πŸ‘€hdygdtfdπŸ•‘2yπŸ”Ό0πŸ—¨οΈ0