Getting Started

To start using leap-c to solve a given problem, three steps are required: defining the environment, defining the controller, and configuring the training loop. We highly recommend consulting the examples for inspiration.

The environment (env.py)

First, define the gym environment with which the agent will interact. This module can contain explicit models (e.g., differential equations), but it can also serve as a bridge to more complex simulators (e.g., through the simulator’s API, the FMI standard, or the UDP communication protocol). For more information on the required steps, see gym’s documentation on getting started and building a custom environment.
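As a minimal illustration, the sketch below implements a small double-integrator environment against the gymnasium API. The dynamics, bounds, and reward are made up for this example and are not taken from leap-c.

```python
# env.py -- minimal sketch of a custom environment using the gymnasium API.
# The double-integrator dynamics, bounds, and reward below are illustrative only.
import gymnasium as gym
import numpy as np
from gymnasium import spaces


class PointMassEnv(gym.Env):
    """Point mass with state [position, velocity] and a scalar force as action."""

    def __init__(self, dt: float = 0.1):
        self.dt = dt
        self.observation_space = spaces.Box(-10.0, 10.0, shape=(2,), dtype=np.float32)
        self.action_space = spaces.Box(-1.0, 1.0, shape=(1,), dtype=np.float32)
        self.state = np.zeros(2, dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.state = self.np_random.uniform(-1.0, 1.0, size=2).astype(np.float32)
        return self.state.copy(), {}

    def step(self, action):
        pos, vel = self.state
        vel = vel + self.dt * float(action[0])      # explicit Euler integration
        pos = pos + self.dt * vel
        self.state = np.array([pos, vel], dtype=np.float32)
        reward = -(pos ** 2 + 0.1 * vel ** 2)       # quadratic state penalty
        terminated = bool(abs(pos) > 10.0)          # left the allowed region
        truncated = False                           # time limits via a wrapper
        return self.state.copy(), float(reward), terminated, truncated, {}
```

Such an environment can then be instantiated and stepped like any other gym environment.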

The controller (controller.py)

Then, create the controller that will be applied to the environment. Our examples use acados to provide an MPC controller. More information on the possibilities of acados can be found in the problem formulation document or the Python interface docs. Generally, our algorithms interact with the controller through the ParameterizedController interface.
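Since the bundled examples build the MPC controller with acados, the sketch below shows a small parametric OCP set up through the acados_template Python interface, with the mass of the system exposed as a runtime parameter. It only illustrates the acados side: the model, cost, and horizon are made up for this example, exact option names can differ slightly between acados versions, and the sketch does not show the leap-c ParameterizedController wrapper itself (consult the examples for that).

```python
# controller.py -- sketch of a parametric OCP built with the acados Python interface.
# The model (a point mass with an adjustable mass parameter), cost, and horizon are
# illustrative only; see the leap-c examples for how such a solver is wrapped behind
# the ParameterizedController interface.
import casadi as ca
import numpy as np
from acados_template import AcadosModel, AcadosOcp, AcadosOcpSolver


def build_ocp() -> AcadosOcp:
    x = ca.SX.sym("x", 2)        # state: position, velocity
    u = ca.SX.sym("u", 1)        # control: force
    xdot = ca.SX.sym("xdot", 2)
    p = ca.SX.sym("p", 1)        # runtime parameter: mass

    model = AcadosModel()
    model.name = "point_mass"
    model.x, model.u, model.xdot, model.p = x, u, xdot, p
    model.f_expl_expr = ca.vertcat(x[1], u[0] / p[0])
    model.f_impl_expr = xdot - model.f_expl_expr

    ocp = AcadosOcp()
    ocp.model = model
    ocp.solver_options.N_horizon = 20          # older acados versions use ocp.dims.N
    ocp.solver_options.tf = 2.0

    # Nonlinear least-squares cost tracking the origin
    ocp.cost.cost_type = "NONLINEAR_LS"
    ocp.model.cost_y_expr = ca.vertcat(x, u)
    ocp.cost.W = np.diag([1.0, 0.1, 0.01])
    ocp.cost.yref = np.zeros(3)
    ocp.cost.cost_type_e = "NONLINEAR_LS"
    ocp.model.cost_y_expr_e = x
    ocp.cost.W_e = np.diag([1.0, 0.1])
    ocp.cost.yref_e = np.zeros(2)

    # Control bounds and initial-state constraint
    ocp.constraints.lbu = np.array([-1.0])
    ocp.constraints.ubu = np.array([1.0])
    ocp.constraints.idxbu = np.array([0])
    ocp.constraints.x0 = np.zeros(2)

    ocp.parameter_values = np.array([1.0])     # default mass
    return ocp


ocp = build_ocp()
solver = AcadosOcpSolver(ocp, json_file="acados_ocp.json")

# Solve for a given initial state, optionally with an updated parameter value.
x0 = np.array([1.0, 0.0])
solver.set(0, "lbx", x0)
solver.set(0, "ubx", x0)
for stage in range(ocp.solver_options.N_horizon + 1):
    solver.set(stage, "p", np.array([2.0]))    # e.g. a heavier mass
status = solver.solve()
u0 = solver.get(0, "u")                        # first control input of the MPC plan
```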

Training

Finally, define the configuration of the training loop (training length, learning rates, update frequency, etc.) and watch the objective improve :). You can find example training configurations in the scripts folder.
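As a purely hypothetical illustration of the kind of options such a configuration collects, the snippet below gathers them in a dataclass. The field names are invented for this sketch; the authoritative option names are those used in the configurations under scripts.

```python
# Hypothetical training configuration; the field names below are invented for this
# sketch -- the actual options and entry points are defined in the scripts folder.
from dataclasses import dataclass


@dataclass
class TrainingConfig:
    total_steps: int = 100_000   # training length in environment interactions
    lr: float = 3e-4             # learning rate of the optimizer(s)
    update_freq: int = 4         # environment steps between gradient updates
    batch_size: int = 64         # samples drawn per gradient update
    seed: int = 0                # for reproducibility


cfg = TrainingConfig(total_steps=50_000, lr=1e-4)
```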

For hyperparameter optimization, you can also wrap the training loop in an HPO framework such as Optuna.
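A minimal sketch of such a wrapper is shown below. Here, run_training is a placeholder standing in for whatever function launches a training run and returns a scalar score; it is not a leap-c function.

```python
# Sketch of hyperparameter optimization with Optuna. `run_training` is a placeholder
# for the function that launches a training run and returns the metric to maximize.
import optuna


def run_training(lr: float, update_freq: int) -> float:
    # Placeholder: call the actual training entry point here and return, e.g.,
    # the mean evaluation return achieved with these hyperparameters.
    return -((lr - 1e-3) ** 2) - 0.01 * update_freq


def objective(trial: optuna.Trial) -> float:
    lr = trial.suggest_float("lr", 1e-5, 1e-2, log=True)
    update_freq = trial.suggest_int("update_freq", 1, 16)
    return run_training(lr=lr, update_freq=update_freq)


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```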