Re: I may get flamed for this, but this IS the loony bin so... (Retroarch Bounty)
10/09/18 01:20 AM
> The MAME info screen at game launch, or mame.info, should tell the hardware's default
> original lag like it does resolution and refresh.
>
> That would maybe stop people from doing silly tweaks with no idea about what length
> of delay they should seek for accuracy.
Actually it's more likely that MAME shouldn't even display resolution, since it seems to be a source of confusion for people.
Also, in most cases there's a single or double buffer on the sprites that doesn't necessarily apply to the tilemaps, palette, or anything else; just because the hardware lags the sprites by a frame doesn't mean it lags everything else (sometimes the software adds matching delays instead, to keep everything in sync). Sometimes these buffers are automatic and hardware-controlled; other times the copy is triggered by software.
You're asking for an arbitrary number that can be measured against any number of things, not something that can simply be derived from hardware specs.
The Cave CV1000 hardware, for example, has no inherent lag (it's a blitter; you could blit to the screen at any point and flip buffers at any point), but the way it's programmed (rendering to an offscreen buffer to avoid sprite glitches, etc.) contributes to making the games laggy.
> Ideally there would be a button to press ingame (like F11 for speed) so the delay
> information would appear in a corner of the screen, with the game's and the
> sync option if any used, for instance:
>
> [GAME: 2 frames, SYNC 3 frames]
>
> This could then be required to appear in a replay video for it to be trustable.
>
> Of course RA by its design could still get around this, unless there's a means to
> prevent even that.