
rubinstu
MAME Fan
Reged: 02/05/10
Posts: 20


Client-Server model for control
#390090 - 02/28/21 06:05 AM


Here is my idea. I would not be surprised if someone has already done it, or something similar. If that's the case, please do let me know!

The ultimate goal is to have a platform for controlling game play with algorithms, AI, machine learning, etc. I would like to embed a TCP-based server into MAME. The server has access to the same data and controls as the Lua engine - reading and writing data, controlling the inputs, etc.

As a completely separate software application (or really, many applications!), I would like to build a TCP-based client that could talk to the server and control the games remotely.

Compared to the built-in Lua plugin mechanism (which is excellent, BTW), this proposed method has the following advantages:
- Language independence. The client software can be written in just about any language (C++, Python, or whatever). If you can make a TCP client, you can control the game.
- Easy integration with third-party libraries. The client can integrate with third-party or novel software libraries like TensorFlow, OpenCV, etc.
- The client can run on a separate computer. Although not required, MAME (the server) can run on a different physical machine from the client. This may help with processing bottlenecks, debugging, etc.
- The client user interface is not constrained by MAME's capabilities.

I propose a simple data exchange protocol. It could be a home-brewed packet exchange, or it could leverage an established mechanism like MQTT (https://mqtt.org/), Protocol Buffers (https://developers.google.com/protocol-buffers), gRPC (https://grpc.io/), etc.
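Just to make the home-brewed option concrete, here is a rough sketch (in Python, purely illustrative) of what a length-prefixed JSON exchange might look like from the client side. The framing, the port number, and the "read_memory" command are my own placeholders, not anything MAME exposes today:

[code]
import json
import socket
import struct

# Minimal length-prefixed JSON framing: a 4-byte big-endian length, then a
# UTF-8 JSON payload. The message names used below are placeholders for
# whatever commands the embedded server would end up exposing.

def send_message(sock, message):
    payload = json.dumps(message).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_message(sock):
    (length,) = struct.unpack(">I", _recv_exactly(sock, 4))
    return json.loads(_recv_exactly(sock, length).decode("utf-8"))

def _recv_exactly(sock, count):
    data = b""
    while len(data) < count:
        chunk = sock.recv(count - len(data))
        if not chunk:
            raise ConnectionError("server closed the connection")
        data += chunk
    return data

if __name__ == "__main__":
    # Assumes the hypothetical in-MAME server is listening on localhost:1942.
    with socket.create_connection(("127.0.0.1", 1942)) as sock:
        send_message(sock, {"cmd": "read_memory", "addr": 0x1000, "len": 16})
        print(recv_message(sock))
[/code]

A real protocol would obviously need a fuller command set, versioning, and error handling; something like gRPC or Protocol Buffers would provide most of that plumbing for free.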

I have a vague idea that the server would operate in parallel with the Lua engine, and perhaps make use of the same underlying code. For me, this is where things get a little murky. I have written various machine-to-machine networking protocols, etc., and am confident that I could do a reasonable job with this except for the critical interface between the server components and MAME itself.

So, if anyone can help with the following, I would be very grateful!
- Where in the source code should I look to make the actual "hooks" into MAME?
- Has anyone done anything like this before? I have seen some older projects where MAME is compiled as a library, which is a good approach, but those projects are old and don't really have the hooks needed; they are more for front ends and for integrating MAME into general applications.
- Anyone interested in collaborating?
Ultimately, I would be pleased to open-source this code, or, if appropriate, have it merged into the official MAME codebase.

All thoughts and input are welcome!

-Stuart



MooglyGuy
Renegade MAME Dev
Reged: 09/01/05
Posts: 2261


Re: Client-Server model for control [Re: rubinstu]
#390091 - 02/28/21 01:27 PM


What's your plan for maintaining temporal coherency?

Even if you get MAME to run on a server backend, a user controlling the emulated machine is going to be issuing input events against a visual state that, most likely, is already temporally out-of-date by the time it appears in the client-side window.

By the time any packet containing input state from the client makes it back to the server, the local time of the emulated machine will have advanced even further, so applying the client inputs at that point will effectively be applying the input response after the point that the user intended for those inputs to be acted upon.
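To put rough numbers on it (purely for illustration): at 60 Hz a frame lasts about 16.7 ms, so a combined display, network, and processing round trip of even 50 ms would land the inputs roughly three frames after the state the user was actually reacting to.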



rubinstu
MAME Fan
Reged: 02/05/10
Posts: 20


Re: Client-Server model for control [Re: MooglyGuy]
#390094 - 02/28/21 06:26 PM


Moogly, the problem of timing in general is an important one. As you imply, lagging even a little can start to cause chaos pretty easily. I've thought about this:
In a "training" or other automated-play situation, the speed of the play doesn't really matter as long as the "player" (i.e. some bot or AI) and the game stay in sync. The emulator with the server running can operate in a synchronous mode: every n frames (where n is often 1), the server pauses the game, processes the inputs, posts the outputs, then resumes the game. In the simplest form, this could be just single-stepping the game from the server (rough sketch of that loop below).

When running the client and server on the same physical computer (which I imagine will be common), the network lag will be quite small, since the port connection isn't really going over the network; it's just inter-process communication (IPC). I am hoping that on a reasonably fast machine, with a simple game and a simple AI bot, training and playing can actually be faster than normal real-time play.

I have seen other bots train and play faster than real-time with the Lua engine.

Does this make sense? Thanks for your input. I'd like to hear more thoughts.

