> Doesn't MAME keep the native resolution as well? When I play "Street Fighter II" and my desktop is 1360x768, then MAME's fullscreen resolution is still 1360x768 with the actual game's screen increased by MAME. It doesn't switch to a real screen resolution of 384x288. Would this even be possible? Doesn't a monitor have just a few predefined resolutions anyway? Like 640x480, 800x600, 1024x768, 1360x768? And isn't MAME configured to not change the screen's resolution unless you explicitly set it with -r and -switchres? Or am I completely wrong here? Or maybe I misunderstood what you were saying.
I don't know how you would do this on Windows, but on Linux it's possible to compute new video timing modes on the fly and update the video card to use those modes. The CRT monitor has to actually support the mode of course (which is to say, the timings have to be within the monitor's spec, but they can be any weird resolution and refresh rate that you want), but it's possible. I have some code that I intended to incorporate into my front end (should I ever actually get far enough in the thing to start adding these kinds of features) that does this. If the game was originally 384x288 @ 59.97 Hz, it would compute video timings to support this, and switch the monitor to those timings, before starting the game.
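On Linux the usual route for this is XRandR: compute a modeline, then feed it to `xrandr --newmode`, `--addmode`, and `--output ... --mode`. Here's a rough Python sketch of the timing arithmetic involved — note the blanking figures (25% horizontal, 18 extra lines) are simplified placeholders I've picked for illustration, not the real GTF/CVT formulas that a tool like `cvt` would use:

```python
def modeline(width, height, refresh_hz, h_blank_frac=0.25, v_blank_lines=18):
    """Rough modeline arithmetic: pad the active area with blanking,
    then derive the pixel clock the video card must generate.
    The blanking amounts here are simplified guesses, not GTF/CVT."""
    htotal = int(width * (1 + h_blank_frac))          # active + horizontal blanking
    vtotal = height + v_blank_lines                   # active + vertical blanking
    pixel_clock_mhz = htotal * vtotal * refresh_hz / 1e6
    hfreq_khz = pixel_clock_mhz * 1000 / htotal       # horizontal scan rate
    return htotal, vtotal, round(pixel_clock_mhz, 3), round(hfreq_khz, 2)

# The game's native mode from the example above:
print(modeline(384, 288, 59.97))
```

The pixel clock and h/v totals are the numbers you'd end up putting into an xrandr modeline (with sync pulse positions filled in as well, which I've glossed over here).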
However, really low spec timings like that can be unachievable on "modern" PC CRTs. They simply won't sync to a scan rate that slow. My test rig has a 21-inch Mitsubishi Diamondtron monitor from the early 2000s, and it couldn't sync to the natural resolution and refresh rate of some of the games that I tried this technique on. I ended up choosing a refresh rate that was a multiple of the original game rate to get the scan rate high enough that the monitor would accept it (for example, 384x288 @ 119.94 Hz instead of 59.97 Hz).
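The reason doubling the refresh rate works is that the horizontal scan rate the monitor has to sustain is just lines-per-frame times frames-per-second, and most PC CRTs have a floor of roughly 30 kHz on that. A quick sanity check (the ~306 total lines is my assumed figure of 288 active lines plus blanking, not a measured mode):

```python
def hscan_khz(vtotal_lines, refresh_hz):
    """Horizontal scan rate in kHz: total lines per frame x frames per second."""
    return vtotal_lines * refresh_hz / 1000

print(hscan_khz(306, 59.97))    # roughly 18.4 kHz, below the ~30 kHz floor of most PC CRTs
print(hscan_khz(306, 119.94))   # roughly 36.7 kHz, comfortably within range
```

So at the game's native 59.97 Hz the monitor would have to scan slower than it physically can, while the doubled rate lands back inside its spec.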
And then what I found was that the phosphors of the CRT weren't really designed with the level of bloom needed to make 384x288 look good. I ended up with really, really skinny lines of video alternating with really thick black empty spaces. I expect that the thickness of the scan lines is a function of the width of the beam and also the bloom of the phosphors. I guess it shouldn't have been surprising that a 'modern' CRT made to excel at rendering small pixels at 1600x1200 resolution wouldn't easily adapt to rendering big and chunky pixels at really low resolutions.
An alternative could be to double the resolution and render the scan lines manually, by explicitly drawing black on every other line. I didn't try that, though; I kind of gave up eventually and decided to come back to the problem later.
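For concreteness, the line-doubling idea amounts to something like this — a sketch operating on a frame represented as a plain list of rows of RGB tuples (the representation is my invention for illustration; a real renderer would do this on the GPU or with a scanline shader):

```python
def with_scanlines(frame):
    """Double each row of an RGB frame, replacing the second copy with black,
    so e.g. a 384x288 image becomes 384x576 with dark scanlines drawn in."""
    black = [(0, 0, 0)] * len(frame[0])
    out = []
    for row in frame:
        out.append(list(row))     # the original line of video
        out.append(list(black))   # the manually drawn scanline gap
    return out

frame = [[(255, 0, 0)] * 4, [(0, 255, 0)] * 4]  # tiny 4x2 test image
doubled = with_scanlines(frame)
print(len(doubled))  # 4 rows: each source row followed by a black row
```

You'd then run the monitor at 768x576 (or whatever the doubled mode is), which modern CRTs are much happier to sync to, and get the scanline look back artificially.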
Also, as an aside - what's up with the vaguely racist graphics being used on this site now ("EmuChat? Ain't nobody got time 4dat")? Kind of makes me embarrassed to be posting here ...