(Replying to PARENT post)
Today we have far fewer hardware platforms to target than during the explosive growth of the 80s.
Consider that most 'software' today is JavaScript interpreted by the web browser. It's not as if those portability concerns didn't exist in the 80s; if anything, it was harder back then, because you had to write your own interpreter.
---------
Many (maybe most?) video games seem to have been written against a VM, at least before Doom and high-performance 3D graphics.
I think console games were written in C/assembly for performance.
But the 'computer' games of that era predate the standardization on the IBM PC, or at least the point where the PC won and Microsoft achieved dominance. When you didn't know whether the Amiga, PC-98, IBM PC, Mac, or something else would win, it only made sense to write a VM.
SCUMM (Monkey Island and many others) comes to mind.
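The appeal is that the portable part is tiny: only the interpreter loop has to be rewritten per machine, while the game logic itself is bytecode. Roughly something like this (a made-up sketch with invented opcodes, not SCUMM's actual format):

    /* Made-up sketch of a tiny bytecode interpreter: port this loop per
       machine, and the same script bytes run everywhere. */
    #include <stdio.h>

    enum { OP_SAY, OP_GIVE_ITEM, OP_END };  /* invented opcodes */

    static void run(const unsigned char *pc) {
        for (;;) {
            switch (*pc++) {
            case OP_SAY:       printf("say line %d\n", *pc++); break;
            case OP_GIVE_ITEM: printf("give item %d\n", *pc++); break;
            case OP_END:       return;
            }
        }
    }

    int main(void) {
        const unsigned char script[] = { OP_SAY, 3, OP_GIVE_ITEM, 7, OP_END };
        run(script);
        return 0;
    }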
(Replying to PARENT post)
The opposing idea is represented more by arcade gaming, and later, stuff like Doom and Quake: The game is relatively intimate with the hardware in what it simulates, while the kind of definition that makes up a scene is more on the order of "put a monster here and a health pickup there", which aligns it towards being map data, instead of scripted logic.
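In other words, the Doom-style scene definition is closer to a table of records the engine walks; something like this (hypothetical field names, just to illustrate the "data, not script" point):

    /* Hypothetical entity records: the scene is data the engine walks,
       and what a "monster" or "pickup" does is hard-coded in the engine. */
    typedef enum { ENT_MONSTER, ENT_HEALTH } EntType;

    typedef struct {
        EntType type;
        int x, y;           /* map coordinates */
    } Entity;

    static const Entity map_things[] = {
        { ENT_MONSTER, 12, 40 },   /* put a monster here */
        { ENT_HEALTH,  20, 18 },   /* and a health pickup there */
    };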
(Replying to PARENT post)
Earthbound (SNES, 1994) contains TWO complete scripting systems, one for the dialog system (which is occasionally used for things it shouldn't be; most of shop logic is in it), and one for scripting sprite movement. The dialog script is actually quite impressive and easy to use; I'd consider implementing a similar system even in a modern RPG. The sprite movement script is trash, significantly harder to work with than games that use raw assembly. Apparently that movement script system was actually a common in-house library at HAL, dating back to the NES era, but I don't know too much about that history.
Also most of the game's assembly was actually compiled from C, which was almost unheard of for console games at the time.
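For anyone curious what a control-code dialog script looks like in the abstract, here's a hedged sketch; the codes are invented for illustration and aren't Earthbound's real format:

    /* Invented control codes, just to show the shape of the idea:
       plain text bytes interleaved with command bytes. */
    #include <stdio.h>

    enum { CC_END = 0x00, CC_PAUSE = 0x01, CC_PLAYER_NAME = 0x02 };

    static void run_dialog(const unsigned char *s) {
        for (; *s != CC_END; s++) {
            if (*s == CC_PAUSE)            printf("[wait for button]");
            else if (*s == CC_PLAYER_NAME) printf("NESS");
            else                           putchar(*s);  /* plain text byte */
        }
        putchar('\n');
    }

    int main(void) {
        static const unsigned char line[] = "Welcome home, \x02!\x01";
        run_dialog(line);
        return 0;
    }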
(Replying to PARENT post)
I don't think virtual machines and emulation are that new of a thing. Virtualizing x86 at full speed on consumer hardware has been a thing for, what, 15 to 20 years? And sure, that requires special processor features, but remember that the older systems that would need to be emulated had even lower computational demands. IIRC, a widely used piece of point-of-sale software from the 80s has been running in emulation, on POS hardware that far exceeds its requirements, for at least the last 25 years.
Also, my understanding is that lots of crucial government and business software runs on many layers of virtualization.
And my last recollection, from what I've gathered, is that until around the mid 90s a lot of operating systems were pretty much hypervisors that ran programs which were themselves virtual machines. Multitasking was simply being able to route hardware resources to a given program, which was sort of its own environment.
(Replying to PARENT post)
https://archive.org/details/byte-magazine-1977-11/page/n147/...
(Replying to PARENT post)
#rightinthefeels
(Replying to PARENT post)