Thursday, October 1, 2009

Automatic Testing: Client and Server

(This will be a mostly technical post.)

There is a lot of work planned for the engine after the launch of the open beta (which is, btw, just about here), so automatic testing is crucial to prevent regressions. Basically, every time you commit a change, you want to be able to run a completely automatic battery of tests that checks you didn't break anything. So far in the Intensity Engine, only the master server has had automatic tests, using Django's test client. Those tests are actually very comprehensive, and have already come in handy. There are also some unit tests for the JavaScript API. But for the client and server as a whole, writing automatic tests isn't as easy: they don't have a nice test setup like Django's, and what's more, the client is a GUI program using OpenGL (if it used a GUI toolkit like GTK or Qt, there would actually be some tools to help out).
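For flavor, here is a minimal sketch of what a test in the Django test-client style looks like; the URL and assertion are generic placeholders, not the master server's actual routes:

```python
# A minimal sketch of a Django test-client test, in the style used for the
# master server. The URL and expected response here are hypothetical.
from django.test import TestCase

class MasterServerSmokeTest(TestCase):
    def test_front_page_loads(self):
        # The test client issues requests without running a real HTTP server
        response = self.client.get('/')
        self.assertEqual(response.status_code, 200)
```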

The tests/ directory contains what I've done towards this goal over the last few days. The test setup uses pexpect, a very useful Python module that lets you communicate with processes, sending them input and checking their output. This, combined with the server's Python console, lets the server be tested in a nice fashion: start the server in a separate process, issue it commands - for example, check such and such a state variable of such and such a logic entity - and validate the output. Using Python's unittest module, each test creates the server environment (home directory, with downloaded assets and so forth) from scratch. So all of this together gives an appropriate way to test the server.
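To make that concrete, here is a rough sketch of how such a test could look. The server command line, the console prompt, and the get_entity/get_state_variable calls are placeholders I'm using for illustration, not necessarily the engine's real interface:

```python
# A sketch of the server-test setup: a fresh home directory per test, the
# server started via pexpect, and commands issued through its Python console.
# The command line, prompt, and console commands are illustrative assumptions.
import shutil
import tempfile
import unittest
import pexpect

class ServerConsoleTest(unittest.TestCase):
    def setUp(self):
        # Each test gets a brand-new home directory (settings, downloaded assets)
        self.home = tempfile.mkdtemp(prefix='intensity_test_')
        self.server = pexpect.spawn(
            'python intensity_server.py --home=%s' % self.home,
            timeout=60, encoding='utf-8')
        self.server.expect('>>>')  # wait for the interactive console prompt

    def tearDown(self):
        self.server.close(force=True)
        shutil.rmtree(self.home, ignore_errors=True)

    def test_entity_state_variable(self):
        # Query a state variable of a logic entity and validate the output
        self.server.sendline('print(get_entity(1).get_state_variable("hp"))')
        self.server.expect('>>>')
        self.assertIn('100', self.server.before)

if __name__ == '__main__':
    unittest.main()
```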

The client was a little harder. First off, I added an option to run a Python console, just like on the server (which took some refactoring of the console code). Then I had to write bindings for 'injecting' user events. That is, by issuing commands through the Python console, the test runner can manipulate the client as if it were an actual person: move the mouse, click, press a key, and so forth. This took some time to get working correctly, due to how SDL events behave and how the engine processes them. But it now appears to be working as it should.
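Driving the client from the test runner then looks roughly like this; the inject_* function names, the prompt, and the command line are illustrative placeholders rather than the actual bindings:

```python
# How the test runner might drive the client through its Python console.
# The injection functions named here are placeholders for the real bindings.
import pexpect

client = pexpect.spawn('python intensity_client.py --home=/tmp/test_home',
                       timeout=120, encoding='utf-8')
client.expect('>>>')  # wait for the client's console prompt

# Move the mouse and click, as a person would
client.sendline('inject_mouse_motion(400, 300)')
client.expect('>>>')
client.sendline('inject_mouse_click(button=1)')
client.expect('>>>')

# Press and release a key (e.g. forward movement)
client.sendline('inject_key(key="w", down=True)')
client.expect('>>>')
client.sendline('inject_key(key="w", down=False)')
client.expect('>>>')
```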

Currently there are four tests. Each of them starts up a master, a server, and a client, runs a map ('storming'), and then does one of the following:
  • Modify a state variable on the client and see that it propagates to the server
  • Modify a state variable on the server and see that it propagates to the client
  • Modify a state variable, restart the map, and see that it returned to the original value
  • Modify a state variable, upload the map, and see that the new value is used
In other words, these are high-level functional tests, and they already cover most of the basics: starting up the client and server, loading a map in each, network synchronization, and map uploading. In particular, they cover basically what someone starting out with the Intensity Engine would do (if they follow README-standalone.txt). So these tests are a good start, though of course many more should be added over time.
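As an illustration, a condensed sketch of the first test in that list might look like this (again, the command lines, prompts, and console calls are assumptions for the sake of the example):

```python
# A condensed sketch of the first functional test: change a state variable on
# the client and check that it reaches the server. Command lines, prompts and
# console commands are illustrative assumptions.
import time
import unittest
import pexpect

class ClientToServerPropagationTest(unittest.TestCase):
    def start(self, command):
        child = pexpect.spawn(command, timeout=120, encoding='utf-8')
        child.expect('>>>')  # wait for each process's Python console
        return child

    def test_client_change_reaches_server(self):
        master = self.start('python intensity_master.py')
        server = self.start('python intensity_server.py --map=storming')
        client = self.start('python intensity_client.py --connect=localhost')

        # Change the variable through the client's scripting console
        client.sendline('get_entity(1).set_state_variable("hp", 50)')
        client.expect('>>>')

        time.sleep(1)  # give the change a moment to propagate over the network

        # Ask the server for the same variable and check that it matches
        server.sendline('print(get_entity(1).get_state_variable("hp"))')
        server.expect('>>>')
        self.assertIn('50', server.before)

        for child in (client, server, master):
            child.close(force=True)

if __name__ == '__main__':
    unittest.main()
```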

The tests take about a minute to run, what with creating an entirely new environment for the master, server and client for each test, and starting them all. (Also, for some reason the injected SDL events are slower than I would expect.) But it's actually kind of funny to watch the tests running, with the mouse jumping around, clicking and keypress sounds, and so forth. (Although I imagine I will get tired of it soon enough, and just go do something else while the tests run...)
