> > However, using Donkey Kong and 1943, I found that the time from the caps lock LED coming on to the first apparent on-screen action (first jump sequence frame, or first fired shot frame) was consistently 3 frames.
>
> The important thing now is: How does it react when you use vsync in windowed and in fullscreen mode? Using it without any special options is just the base value. To check input lag, it's interesting to know how much longer it takes with vsync (and maybe triple buffering) in both windowed and fullscreen mode.
>
> > I also tried my desktop Linux system with an integrated AMD graphics card (on the motherboard) and SDLmame, can't remember what version, but a few versions old at least, and the delay was a consistent 7 frames.
>
> That's much too long.
>
> > I do wonder what the 'minimum' achievable delay is, I suspect it's not possible to do better than 3 frames.
>
> From a purely software point of view you can easily test it: pause the game, then press Shift + P + jump button. (That means: while in pause mode, MAME advances one frame, and on this frame the jump button is pressed.) Then continue pressing Shift + P (further frame advances) until Mario jumps. This way you can see how the game is programmed.
> In "Vs. Super Mario Bros." it takes one frame to jump: if the moment where you press Shift + P + Jump counts as frame 0, then the jump occurs at frame 1.
> At the character selection screen of "Street Fighter II" it doesn't even take that one frame: press the button and the fighter changes immediately, at frame 0.
> So, yes, the game itself reacts immediately at the software level.
>
> However, this still doesn't explain how long it would take for an actual physical gamepad to send the signal in real-world time. I'd be really curious whether three frames is the actual delay or the real arcade is faster.
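For reference, here's the arithmetic on what those frame counts mean in wall-clock time, assuming a roughly 60 Hz refresh rate (the real arcade rates vary slightly, so treat these as approximations):

```python
# Back-of-envelope: convert measured frame delays to milliseconds,
# assuming ~60 Hz refresh (actual arcade refresh rates vary slightly).
REFRESH_HZ = 60.0
FRAME_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per frame

for label, frames in [("MAME on OS X", 3), ("SDLmame on Linux", 7)]:
    print(f"{label}: {frames} frames = {frames * FRAME_MS:.1f} ms")
```

So 3 frames is about 50 ms and 7 frames is nearly 117 ms, which is why the 7-frame figure feels so sluggish.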
Well, I also wrote a program that does nothing but print out which key was pressed the moment the key press is detected, and that also took 3 frames from the caps lock LED coming on to the screen updating. However, my program was not optimized for the lowest possible latency:
- It may be possible for a program to acquire user input events faster than through the normal Cocoa event loop, perhaps by using lower-level APIs for watching HID events or something along those lines.
- I defer rendering to a different thread than the one that receives input, so there is some overhead in passing a message to the rendering thread telling it to update the window with the text for the newly pressed key. I would expect that to be measured in microseconds, but who knows.
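That cross-thread handoff cost is easy to sanity-check. This sketch (Python with a plain queue rather than Cocoa message passing, purely to illustrate the order of magnitude) times how long a message takes to travel from one thread to another:

```python
import queue
import threading
import time

def measure_handoff(n=200):
    """Mean time for a message to travel from one thread to another."""
    inbox, ack = queue.Queue(), queue.Queue()
    latencies = []

    def consumer():
        for _ in range(n):
            sent_at = inbox.get()                 # blocks until a message arrives
            latencies.append(time.perf_counter() - sent_at)
            ack.put(True)                         # tell the producer to send the next one

    t = threading.Thread(target=consumer)
    t.start()
    for _ in range(n):
        inbox.put(time.perf_counter())            # payload = send timestamp
        ack.get()                                 # keep the queue from backing up
    t.join()
    return sum(latencies) / n                     # mean one-way handoff, in seconds

avg = measure_handoff()
print(f"mean one-way thread handoff: {avg * 1e6:.0f} microseconds")
```

Even in an interpreted language the handoff comes out far below one 16.7 ms frame, so the threading overhead is very unlikely to account for any whole frame of the measured delay.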
I think I'll write a program that tries to update the screen as quickly as possible (using the shortest code path possible from event detection to screen update) and see what I can get.
Also, I have tried various programmatic settings in my program for adjusting the vsync and buffering behavior of OpenGL rendering, but they didn't seem to accomplish anything, on OS X at least. On Linux I've experienced wildly different behavior depending on the graphics card, driver, and version of GLX being used.
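To connect this back to the vsync question above: a crude latency budget suggests why the buffering mode should matter even when the settings appear to do nothing. The component values below are my own illustrative assumptions, not measurements, so this is only a back-of-envelope model:

```python
# Crude input-latency budget, in frames at ~60 Hz.
# All component values are illustrative assumptions, not measurements.
FRAME_MS = 1000.0 / 60.0

def latency_frames(poll=0.5, game_logic=1.0, extra_buffered_frames=0):
    """Rough frames from input event to visible change, assuming:
    - poll: average wait until the next input poll (half a frame)
    - game_logic: frames the game takes to react (1 for Vs. Super Mario Bros.)
    - extra_buffered_frames: 0 = immediate flip, 1 = vsync'd double buffer,
      2 = triple buffer in the worst case
    - plus 1 frame for the display to scan the image out
    """
    return poll + game_logic + extra_buffered_frames + 1

for mode, extra in [("no vsync", 0), ("vsync double buffer", 1), ("triple buffer", 2)]:
    f = latency_frames(extra_buffered_frames=extra)
    print(f"{mode}: ~{f:.1f} frames (~{f * FRAME_MS:.0f} ms)")
```

Under those assumptions the no-vsync case lands around 2.5 frames, which is at least consistent with the 3-frame floor I measured, and each extra buffered frame adds roughly 17 ms on top.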