Transformer-based framework for integrating and interpreting multiscale, -rate and -modal data in mechatronic systems


In this project, we address industry's need to integrate data across temporal scales (multirate), spatial scales (multiscale), and data types (multimodal).

  • Companies possess models and extensive data spanning different time scales (multirate/multiscale), such as the 10 kHz PWM cycles of an inverter, 50 Hz motor waveforms, or car acceleration and deceleration figures. Interactions between these time scales are often oversimplified, and making models run in real time requires additional modelling effort. This compromises accuracy and hampers the interpretability of deviations between observed and modelled behavior.
  • Models, whether physics-based or data-driven, tend to be inflexible when handling heterogeneous data (multimodal). Sensor data often demands significant effort for preprocessing and synchronization to align with a model type's requirements.
  • Physics-based models can support data-driven modelling techniques on multiple levels, offering faster learning rates, higher reliability, and increased interpretability (if the phenomenon is also modelled). However, there are no clear mechanisms to easily identify the origin of discrepancies between the model and observed behavior.

Project goal

Transformer models have recently revolutionized the landscape of machine learning-based vision and forecasting, surpassing established methods such as long short-term memory (LSTM) and convolutional neural network (CNN) techniques. Transformers excel at handling various data types (multimodal), including vision, and at accommodating data at significantly different time scales (multirate/multiscale). Furthermore, integrating physics-based models has proven effective in reducing memory requirements and training time for transformers. The attention mechanism in transformers also offers a handle for interpretability: the attention weights indicate which inputs, and which embedded models, drive a given prediction.
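To make the role of the attention mechanism concrete, the following is a minimal, self-contained sketch (not the project's implementation) of scaled dot-product attention in NumPy. The returned weight matrix is exactly the quantity that can be inspected for interpretability: each row shows how strongly one query token attends to every input token.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core transformer operation. Returns the attended output and the
    softmax weight matrix, whose rows sum to 1 and indicate which
    inputs drive each output."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Hypothetical example: 2 query tokens attending over 4 input tokens
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 8))
K = rng.standard_normal((4, 8))
V = rng.standard_normal((4, 8))
out, w = scaled_dot_product_attention(Q, K, V)
assert out.shape == (2, 8)
assert np.allclose(w.sum(axis=1), 1.0)
```

In a full model this operation is wrapped with learned projections and multiple heads, but the interpretability argument rests on the same weight matrix `w`.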

In the proof-of-concept stage, applying transformers to describe dynamical mechatronic systems has shown promising results. This project aims to advance the state-of-the-art by building a comprehensive transformer-based framework for mechatronic systems. This framework seeks to simplify:

  1. Interconnecting multiscale/multirate data,
  2. Fusing multimodal sensor data into a single model, and
  3. Interpreting observations back to the models on their respective time scales.
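One possible way to interconnect multiscale/multirate data for such a framework, sketched here purely as an illustration (the function and channel names are hypothetical, not part of the project), is to merge signals sampled at different rates into a single time-ordered token stream instead of resampling everything to a common grid:

```python
import numpy as np

def tokenize_multirate(signals):
    """Merge channels sampled at different rates into one time-ordered
    stream of (timestamp, channel_id, value) tokens that a transformer
    could consume jointly. `signals` maps name -> (rate_hz, samples)."""
    tokens = []
    for ch_id, (name, (rate, samples)) in enumerate(sorted(signals.items())):
        t = np.arange(len(samples)) / rate      # per-channel timestamps
        tokens.extend(zip(t, [float(ch_id)] * len(samples), samples))
    tokens.sort(key=lambda tok: tok[0])         # interleave by timestamp
    return np.array(tokens)

# Hypothetical channels: a fast inverter-side signal and a slow vehicle-side one
signals = {
    "pwm": (10_000.0, np.sin(np.linspace(0.0, 1.0, 20))),  # fast rate
    "accel": (50.0, np.array([0.0, 0.1, 0.2])),            # slow rate
}
stream = tokenize_multirate(signals)
assert stream.shape == (23, 3)                 # 20 + 3 tokens, 3 fields each
assert np.all(np.diff(stream[:, 0]) >= 0)      # time-ordered
```

The channel id and timestamp fields play the role of positional/modality encodings, letting the model relate observations across rates without a shared sampling grid.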

In essence, the project aims to:

  1. Reduce the reliance on multiple simplified models of the same component at different scales, expediting the modelling process and streamlining multiscale/multirate predictions and analysis.
  2. Minimize the need for intensive preprocessing of data sources (such as vision) before applying them in a predetermined, inflexible model.
  3. Trace back discrepancies in models (both physics-based and data-driven) for multiple time scales, making them more easily interpretable on their respective time scales.

Interested in joining this project?

MULTISCALE_SBO is a Strategic Basic Research (SBO) project. We are looking for companies to join the User Group and work with us on the valorisation of the project.

Interested? Complete the form below and we will contact you as soon as possible.