While looking into the possibilities of the Source SDK, specifically the FacePoser program, I had an idea that may or may not be useful. The SDK contains an assortment of tools for creating realistic human characters: characters that look, move and emote like real people. It even includes a very impressive facial-expression modeller.

The characters created for games like Half-Life 2 are very convincing. So, my idea was to create a human character (dressed in a lab coat and carrying a clipboard) who would stand beside the model of whatever autonomic system I was simulating. As the simulation runs, the character would speak and emote various feedback cues to the user, like flailing his arms around when sensors fail. What could be a more intuitive form of feedback than the one everyone is already most used to interpreting?

The trick, of course, is to not let this become a more technologically advanced version of Clippy. The character, who could be thought of as an avatar for the autonomic system’s general health, would generally stay in the background. Much of the feedback he provides could even be picked up subconsciously as he walks around the car performing ‘checks’, all the while providing subtle auditory hints and contorting his face to show his level of contentment.
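To make that mapping concrete, here is a minimal sketch of how the avatar’s demeanour might be driven. Everything in it is hypothetical: the `Avatar` interface, the expression and gesture names, and the health-score thresholds are all placeholders; in a real implementation the calls would trigger Faceposer-authored scenes rather than print to the console.

```cpp
#include <iostream>
#include <string>

// Hypothetical avatar interface. In a real build these calls would
// queue up Faceposer-authored animations and facial expressions.
struct Avatar {
    void setExpression(const std::string& name) {
        std::cout << "avatar expression -> " << name << "\n";
    }
    void playGesture(const std::string& name) {
        std::cout << "avatar gesture    -> " << name << "\n";
    }
};

// Map a continuous health score (0.0 = failing, 1.0 = healthy) to a
// facial expression, so routine status can be read at a glance.
void updateDemeanour(Avatar& avatar, double health) {
    if (health > 0.8)      avatar.setExpression("content");
    else if (health > 0.5) avatar.setExpression("mild_concern");
    else if (health > 0.2) avatar.setExpression("worried");
    else                   avatar.setExpression("alarmed");
}

// Discrete failures get explicit gestures, reserved for events that
// genuinely need the user's attention.
void onSensorFailure(Avatar& avatar, const std::string& sensorId) {
    avatar.playGesture("flail_arms");
    std::cout << "sensor failed: " << sensorId << "\n";
}

int main() {
    Avatar avatar;
    updateDemeanour(avatar, 0.9);  // all systems nominal: avatar stays calm
    updateDemeanour(avatar, 0.4);  // degrading: the face shows it
    onSensorFailure(avatar, "wheel_speed_front_left");
}
```

The key design choice is the split: continuous health is conveyed ambiently through expression, while attention-grabbing gestures are held back for discrete failures, which is exactly what keeps the character from turning into Clippy.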

Of course, there are some faces that no amount of technology could ever emulate.