Open Source Software for Animatronic Characters
In my previous animatronics projects, I always had a lot of fun with the hardware, but I never put as much energy into programming the movements. That is a pity, because it is precisely the movements that bring the characters to life. One reason was that it really was programming: every movement had to be written in program code. That is quite tedious, the results often look choppy, and the code has to be recompiled and restarted every time you want to try something out.
For this reason, I developed the software "Animatronic Workbench" (AWB), which can be downloaded from GitHub as open source. It lets you design the movements of animatronic characters on screen, without programming knowledge. The first practical use of the software was with my animatronic Grogu:
Getting Started + Documentation
In the Animatronic WorkBench documentation you will find instructions on how to install the software and create a first project.
There you will also find the supported hardware and the FAQ.
There is also a guide to creating timelines and to exporting a project to ESP32 microcontrollers.
Dashboard
Roadmap
- Windows app
- Timeline editor
- ESP32 client
- Control via MIDI controller
- Control via mouse / keyboard (without a mandatory MIDI controller)
- Include servos in timeline
- Live mode
- Saving the movements in the microcontroller
- Autonomous mode
- Web interface for the client
- Control STS serial servos
- Control SCS serial servos
- Control PWM servos
- Support Wi-Fi handheld remote control
- Integrate sound / speech into timeline
- Project configurator instead of manual editing of JSON files
- Automatically generate "Hardware.h" from Project Configurator
- Integrate RGB LEDs
- Enable custom code in the ESP32 client
- Control LX serial servos
- Detailed instructions
- Video tutorial
- Undo function in the editor
Timeline Editor
The core of the "AWB Studio" app is the timeline editor. Here, the movements of the individual servos are designed as scenes on a timeline. The operation is intuitive and similar to that of video editing software.
A single scene is saved under a name and can be assigned to a state such as "Idle", "Sleep" or "Talk".
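AWB's internal data model is not documented here, but such a timeline can be pictured as one keyframe track per servo, with positions interpolated between keyframes. A minimal sketch in C++ (all type and field names are hypothetical, not AWB's actual model):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// A single keyframe: a target servo position at a point in time.
struct Keyframe {
    uint32_t timeMs;   // position on the timeline, in milliseconds
    uint16_t value;    // target servo position (e.g. 0..4095 for serial servos)
};

// One track per servo; a scene ("timeline") bundles several tracks
// and is tagged with a state name such as "Idle", "Sleep" or "Talk".
struct ServoTrack {
    uint8_t channel;               // servo ID on the bus
    std::vector<Keyframe> frames;  // sorted by timeMs
};

struct Timeline {
    std::string name;              // e.g. "Wave hello"
    std::string state;             // e.g. "Idle"
    std::vector<ServoTrack> tracks;
};

// Linear interpolation between the two keyframes surrounding timeMs.
uint16_t positionAt(const ServoTrack& track, uint32_t timeMs) {
    if (track.frames.empty()) return 0;
    if (timeMs <= track.frames.front().timeMs) return track.frames.front().value;
    for (size_t i = 1; i < track.frames.size(); ++i) {
        const Keyframe& a = track.frames[i - 1];
        const Keyframe& b = track.frames[i];
        if (timeMs <= b.timeMs) {
            float t = float(timeMs - a.timeMs) / float(b.timeMs - a.timeMs);
            return uint16_t(a.value + t * (b.value - a.value));
        }
    }
    return track.frames.back().value;
}
```

Playing a scene then simply means sampling `positionAt` on every track at the current playback time and sending the results to the servos.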
Live Control vs. Autonomous Storage of Movements in the Microcontroller
While editing, the changes can be transmitted directly to the animatronic character via USB or Wi-Fi. This allows you to follow the adjustments live on the character while working in the editor.
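AWB's actual wire protocol is not reproduced here; purely as an illustration, an ESP32-side receiver for such live updates could look like the following Arduino C++ sketch (the ESP32Servo library, the pin number, and the "channel:position" line format are all assumptions):

```cpp
#include <Arduino.h>
#include <ESP32Servo.h>  // assumption: a PWM servo driven via the ESP32Servo library

Servo servo;

void setup() {
    Serial.begin(115200);   // USB serial link to the editor PC
    servo.attach(13);       // hypothetical servo pin
}

void loop() {
    // Hypothetical line format from the editor: "<channel>:<position>\n"
    if (Serial.available()) {
        String line = Serial.readStringUntil('\n');
        int sep = line.indexOf(':');
        if (sep > 0) {
            int channel  = line.substring(0, sep).toInt();
            int position = line.substring(sep + 1).toInt();
            if (channel == 0) servo.write(constrain(position, 0, 180));
        }
    }
}
```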
The movement sequences can then be stored on the microcontroller, so that the character can move autonomously without a connected computer and only needs a power supply.
With the (optional) selector switch, a state (such as "Idle", "Sleep" or "Talk") can be chosen on the microcontroller, which then plays only the movements assigned to that state.
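How AWB exports and stores the movement data is not shown here; as a rough sketch of the idea, baked-in keyframes plus a selector-driven state switch could look like this (the pin, the data layout and the scene contents are invented for illustration):

```cpp
#include <Arduino.h>

// Hypothetical exported movement data: per state, a list of
// (time in ms, servo position) keyframes from AWB Studio.
struct Frame { uint32_t timeMs; int position; };

const Frame IDLE_SCENE[]  = { {0, 90}, {1000, 120}, {2000, 90} };
const Frame SLEEP_SCENE[] = { {0, 90}, {3000, 60} };

const int SELECTOR_PIN = 34;  // assumption: selector switch wired to an analog pin

// Map the selector position to a state index (0 = Idle, 1 = Sleep).
int readState() {
    return analogRead(SELECTOR_PIN) < 2048 ? 0 : 1;
}

void playScene(const Frame* scene, size_t count) {
    uint32_t start = millis();
    for (size_t i = 0; i < count; ++i) {
        // Wait until the keyframe's time on the timeline is reached.
        while (millis() - start < scene[i].timeMs) { delay(1); }
        // driveServo(scene[i].position);  // send to the PWM/serial servo here
        Serial.printf("t=%u ms -> position %d\n", scene[i].timeMs, scene[i].position);
    }
}

void setup() { Serial.begin(115200); }

void loop() {
    // Play only the scenes assigned to the currently selected state.
    if (readState() == 0) playScene(IDLE_SCENE, sizeof(IDLE_SCENE) / sizeof(Frame));
    else                  playScene(SLEEP_SCENE, sizeof(SLEEP_SCENE) / sizeof(Frame));
}
```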
Input Controller
The software uses MIDI mixer controllers as input devices; they connect via USB and can be bought for less than 100 euros. With these controllers, you can scrub through the timeline of the respective animation and control the servo rotations haptically with rotary knobs or sliders.
Through the return channel of some mixer models, the controller can even display the current servo positions while a timeline is playing back. If the animatronic character is connected to the computer via USB cable at that moment, the movement is also performed live on the figure.
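Since MIDI control-change values run from 0 to 127 while serial servos typically use a much finer position range, the mapping between fader and servo is a simple linear scaling in both directions. A small self-contained sketch (the 0..4095 range matches STS/SCS-style serial servos; the function names are hypothetical):

```cpp
#include <cstdint>
#include <cstdio>

// Map a MIDI control-change value (0..127) onto the servo's allowed range.
uint16_t midiToServo(uint8_t ccValue, uint16_t servoMin, uint16_t servoMax) {
    return servoMin + uint32_t(ccValue) * (servoMax - servoMin) / 127;
}

// The reverse direction, e.g. for motorized faders driven over the
// return channel so they mirror the current servo position.
uint8_t servoToMidi(uint16_t position, uint16_t servoMin, uint16_t servoMax) {
    if (position <= servoMin) return 0;
    if (position >= servoMax) return 127;
    return uint8_t(uint32_t(position - servoMin) * 127 / (servoMax - servoMin));
}

int main() {
    // A fader at mid-travel (CC value 64) on a servo limited to 1000..3000:
    std::printf("%u\n", midiToServo(64, 1000, 3000));  // prints 2007
    return 0;
}
```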
Web Interface for the Client
The ESP32 client microcontroller sets up a small, dedicated Wi-Fi network to which a smartphone or tablet can connect. Via a local website, the status of the servos (including temperature, load, etc.) can be read and movements can be started. This is especially practical when the animatronic character is no longer connected to the computer but is already standing on a stage or in a shop window, for example.
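The AWB client's own web interface is more elaborate, but the basic mechanism on an ESP32 is just a soft access point plus a tiny HTTP server, as in this sketch using the standard Arduino ESP32 `WiFi` and `WebServer` libraries (the SSID, page content and `/play` endpoint are made up for illustration):

```cpp
#include <WiFi.h>
#include <WebServer.h>

WebServer server(80);

// Placeholder page: in a real client these values would be read back
// from the serial servo bus (STS/SCS servos report temperature and load).
String statusPage() {
    return "<html><body><h1>Animatronic status</h1>"
           "<p>Servo 1: temp 41 &deg;C, load 12%</p>"
           "<p><a href=\"/play?state=Idle\">Play Idle</a></p>"
           "</body></html>";
}

void setup() {
    // Dedicated access point; a phone or tablet joins this network directly.
    WiFi.softAP("AWB-Animatronic");  // hypothetical SSID
    server.on("/", []() { server.send(200, "text/html", statusPage()); });
    server.on("/play", []() {
        // startTimelineForState(server.arg("state"));  // hook into playback here
        server.send(200, "text/plain", String("Started: ") + server.arg("state"));
    });
    server.begin();
}

void loop() {
    server.handleClient();
}
```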
There is also a short YouTube interview about the project from Maker Faire 2023: