Re: 32-bit color
04/11/17 03:37 AM
> > > MAME will output video in whatever your video card is set to. Most computers nowadays run on 32-bit color by default; only older systems still run on 256-color / 16-bit color.
>
> > That's not really true - MAME will pretty much always pump out 24-bit (8-bit per channel) texture data, which may get downsampled if you're running in a lower depth video mode. Older versions of MAME could output lower depths, or even 8-bit indexed (256 colours), but that hasn't been supported in a long time.
>
> Windows has always called it 32-bit mode, as opposed to 24-bit mode, which was slow and generally not widely supported (presumably due to alignment or something).
>
> Being a pedantic asshole about it doesn't make the project look friendly.
>
> As for what the original poster was asking, MAME only really works properly with those modes these days; things like the 8-bit palettized modes were removed a long time ago. Chances are if you're running a modern version on a modern OS, it's already running in '32-bit colour' mode.
The horrors of Intel's i810 chipset: 24-bit was the maximum setting in GDI/DirectDraw, but only 16-bit was available in OpenGL/D3D. Cue random emulators crashing on load when they couldn't set either a 32-bit or a 16-bit mode, because the user had a 24-bit desktop instead! To make it even worse, in Intel's infinite wisdom, every computer with the 82810 only came with PCI slots - board manufacturers had to specify the 82815 chipset to get an AGP slot. The i815's onboard video was exactly the same as the i810's, however.
The S3 ViRGE and a few other '90s PCI cards also maxed out at 24-bit on the desktop (the older Trio series had 32-bit output despite the ViRGE being based on the same design - go figure. Of course, I wouldn't put it past Microsoft to simply have shipped a shit driver in those days, as I distinctly remember GTA1 supporting 32-bit depth in pure DOS but only 24-bit in Windows on the same computer), but as with the Intel chipset, they would probably only do 16-bit in accelerated modes. Or, in the ViRGE's case, "decelerated" modes: apparently rendering in GDI was faster than rendering in DirectDraw, and D3D/OpenGL at anything larger than 320x240 was out of the question!