
Out of Distribution Detection in Self-adaptive Robots with AI-powered Digital Twins

This repository implements a Transformer‑based digital twin approach for proactive out‑of‑distribution (OOD) detection in self‑adaptive robots (SARs), with built‑in interpretability.


Conceptual Overview & Motivation
This approach integrates forecasting, uncertainty estimation, and feature-wise reconstruction analysis into a unified Digital Twin-based OOD detection pipeline:

  • Digital Twin Model (DTM): A Transformer that predicts and reconstructs future system states.
  • Digital Twin Capability (DTC): Computes reconstruction error and MC-dropout-based forecast uncertainty.
  • Explainable OOD Detector: Flags OOD events based on reconstruction error and forecast uncertainty, while attributing the cause to individual state features.

This capability aligns with the Monitor and Analyze phases of the MAPE-K feedback loop, enabling not only detection but also explanation of potential anomalies. It lays the foundation for future integration with planning and adaptation mechanisms for self-adaptive systems.
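
To make the detector concrete, its decision logic can be sketched as follows. This is a minimal illustration, not the repository's actual implementation: the function name, thresholds, and array shapes are assumptions, and per-feature reconstruction error is used as a simple attribution signal.

import numpy as np

def detect_ood(recon_error, forecast_var, recon_threshold, var_threshold, feature_names):
    """Flag a window as OOD when reconstruction error or forecast
    uncertainty exceeds its calibrated threshold, and attribute the
    decision to the most contributing state features.

    recon_error, forecast_var: per-feature arrays of shape (n_features,).
    """
    is_ood = recon_error.mean() > recon_threshold or forecast_var.mean() > var_threshold
    # Rank features by their share of the total reconstruction error.
    contribution = recon_error / (recon_error.sum() + 1e-12)
    top = np.argsort(contribution)[::-1][:3]
    return {
        "ood": bool(is_ood),
        "top_features": [(feature_names[i], float(contribution[i])) for i in top],
    }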


DT Overview

The Transformer-based Digital Twin Model (DTM) takes historical input and produces both forecasted and reconstructed future system states. The Digital Twin Capability (DTC) analyzes these outputs using reconstruction error and uncertainty (via MC Dropout) to detect potential OOD instances. The Explainable OOD Detector combines this analysis to flag OOD states and attributes them to the most contributing system features.
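
A dual-head DTM of this kind can be sketched in PyTorch as below. This is a minimal sketch, assuming a shared Transformer encoder with separate linear heads for forecasting and reconstruction; the layer sizes, mean pooling, and default hyperparameters are illustrative, not the repository's actual architecture.

import torch
import torch.nn as nn

class DigitalTwinModel(nn.Module):
    """Sketch of a dual-head Transformer DTM: one head forecasts future
    states, the other reconstructs them, over a shared encoder."""

    def __init__(self, n_features, d_model=64, n_heads=4, n_layers=2, horizon=10, dropout=0.1):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads, dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.horizon = horizon
        self.n_features = n_features
        self.forecast_head = nn.Linear(d_model, horizon * n_features)
        self.recon_head = nn.Linear(d_model, horizon * n_features)

    def forward(self, x):                # x: (batch, seq_len, n_features)
        h = self.encoder(self.embed(x))  # (batch, seq_len, d_model)
        ctx = h.mean(dim=1)              # pooled context vector
        forecast = self.forecast_head(ctx).view(-1, self.horizon, self.n_features)
        recon = self.recon_head(ctx).view(-1, self.horizon, self.n_features)
        return forecast, recon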


Setup

  1. Clone the Repository
git clone https://github.com/Simula-COMPLEX/ODiSAR.git
cd ODiSAR
  2. Install Dependencies
pip install -r requirements.txt

Configuration

This project currently provides two use cases:

1. Ship Motion Prediction (NTNU use case)

  • Config file: config/config_train.py

2. Trajectory Prediction using Mobile Robots (PAL Robotics use case)

  • Config file: config/pal_config_train.py

Datasets for both cases will be made available soon.

Modify the config files to set:

  • INPUT_FEATURES, OUTPUT_FEATURES
  • DATA_PATHS, SEQUENCE_SETTINGS, TRAINING_PARAMS

Also, update config/config_inference.py to set:

  • TEST_DATA_DIR, MODEL_PATH, SCALER_PATH, etc.
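
As an illustration, the settings named above might look roughly like this. The variable names come from this README, while the structure and values are placeholders, not the repository's actual configuration:

# config/config_train.py -- illustrative placeholder values
INPUT_FEATURES = ["surge", "sway", "heave", "roll", "pitch", "yaw"]
OUTPUT_FEATURES = ["roll", "pitch"]
DATA_PATHS = {"train": "data/train.csv", "val": "data/val.csv"}
SEQUENCE_SETTINGS = {"input_length": 50, "forecast_horizon": 10}
TRAINING_PARAMS = {"epochs": 100, "batch_size": 64, "learning_rate": 1e-3}

# config/config_inference.py -- illustrative placeholder values
TEST_DATA_DIR = "data/test/"
MODEL_PATH = "checkpoints/dtm.pt"
SCALER_PATH = "checkpoints/scaler.pkl"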

Model Training

python main_train.py

Or, for the mobile robot navigation config:

python pal_main_train.py

This performs:

  • Phase 1: joint training for forecasting and reconstruction
  • Phase 2 (optional): fine-tuning of the reconstruction head only
  • Computation and saving of the reconstruction and uncertainty thresholds (see the sketch below)
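
How the thresholds are calibrated is not spelled out here; a common recipe, sketched below, takes a high percentile of in-distribution validation scores as the cut-off and writes the result to the thresholds.pkl file listed under Outputs. The percentile, function name, and dictionary keys are assumptions.

import pickle
import numpy as np

def calibrate_thresholds(recon_errors, forecast_vars, percentile=95, path="thresholds.pkl"):
    """Calibrate OOD thresholds from in-distribution validation scores.

    recon_errors, forecast_vars: 1-D arrays, one score per validation window.
    """
    thresholds = {
        "reconstruction": float(np.percentile(recon_errors, percentile)),
        "uncertainty": float(np.percentile(forecast_vars, percentile)),
    }
    with open(path, "wb") as f:
        pickle.dump(thresholds, f)
    return thresholds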

Inference & OOD Detection

python main_inference.py

Or, for the mobile robot navigation config:

python pal_main_inference.py

This performs:

  • Forecasting on the test data
  • Computation of the forecast reconstruction error and MC-dropout variance (see the sketch below)
  • Application of the saved thresholds to detect OOD instances
  • Generation of visualizations and JSON-based diagnostics
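
The MC-dropout variance is typically obtained by keeping dropout active at inference time and aggregating several stochastic forward passes. A minimal sketch, assuming the dual-head DTM interface from the earlier sketch (forecast returned first) and an illustrative sample count:

import torch

@torch.no_grad()
def mc_dropout_forecast(model, x, n_samples=30):
    """Run n_samples stochastic forward passes with dropout enabled and
    return the mean forecast and the per-feature predictive variance."""
    model.train()  # keeps dropout layers active; gradients stay off via no_grad
    # Each pass returns (forecast, reconstruction); keep the forecast.
    samples = torch.stack([model(x)[0] for _ in range(n_samples)])
    return samples.mean(dim=0), samples.var(dim=0)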

Outputs

  • forecast_results.csv: Ground truth vs forecasted values
  • forecast_errors.csv: MSE/RMSE per feature
  • thresholds.pkl: Saved thresholds for inference
  • ood_diagnostics.json: Confidence-aware OOD decision metadata
  • Plots:
    • Forecasted vs. ground-truth curves
    • Reconstruction/variance distributions
    • Quadrant-based OOD scatter

Notes

  • Ensure that your input and output features are clearly defined and that the combined feature set used for normalization contains no duplicates.
  • Inference reuses the training-time normalization statistics (see the sketch below).
  • The scripts support datasets with different feature combinations (e.g., ship dynamics, mobile robot navigation).
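
Reusing training-time statistics amounts to fitting a scaler once on the training data, persisting it, and only transforming at inference. A minimal sketch with scikit-learn; the scaler type, file name, and pickle serialization format are assumptions here, mirroring the SCALER_PATH setting:

import pickle
import numpy as np
from sklearn.preprocessing import StandardScaler

train_array = np.random.randn(1000, 6)  # stand-in for the real training matrix
test_array = np.random.randn(200, 6)    # stand-in for the real test matrix

# Training time: fit on training data only, then persist the statistics.
scaler = StandardScaler().fit(train_array)
with open("scaler.pkl", "wb") as f:
    pickle.dump(scaler, f)

# Inference time: load and transform; never re-fit on test data.
with open("scaler.pkl", "rb") as f:
    scaler = pickle.load(f)
test_scaled = scaler.transform(test_array)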
