hypercomplex-neural-network

Overview

This repository contains the implementation and experimental results for a comparative study of quaternion neural networks (QNNs) and their generalizations to Clifford-algebra-based Parameterized Hypercomplex Neural Networks (PHNNs), including PHB-cos models. Moving beyond quaternions yields tangible gains in accuracy, parameter efficiency, and interpretability.

Contents

  • report/ – LaTeX source and compiled PDF of the report.
  • code/ – Implementation of real, quaternion, PHC, and PHB-cos models, training scripts, and evaluation routines.
  • plots/ – Generated figures (accuracy vs parameters, validation accuracy, explanation IoU).
  • results/ – Aggregated CSVs and LaTeX tables from 5-seed experiments.

Tutorials

The folder tutorials/ contains a set of tutorials on the Parameterized Hypercomplex Multiplication (PHM) layer and the Parameterized Hypercomplex Convolutional (PHC) layer. Simple toy examples show how the matrices A that define the algebra rules are learned, demonstrating the effectiveness of the proposed approach.

  • PHM tutorial.ipynb – a simple tutorial showing how the PHM layer learns the Hamilton product between two pure quaternions.
  • PHC tutorial.ipynb – a simple tutorial showing how the PHC layer learns the Hamilton rule to organize filters in a convolution.
  • Toy regression examples with PHM.ipynb – a notebook containing some regression tasks solved with PHM layers.
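To give a feel for what the PHM tutorial learns, here is a minimal NumPy sketch (not taken from the notebooks): the PHM layer builds its weight as W = Σᵢ Aᵢ ⊗ Fᵢ, and with n = 4 and the Aᵢ hard-coded to the quaternion structure constants (which the tutorial learns from data instead), W reproduces the Hamilton product:

```python
import numpy as np

# Quaternion "algebra" matrices for the basis 1, i, j, k. The PHM tutorial
# learns these from data; here they are fixed so the construction can be checked.
I4 = np.eye(4)
Ai = np.array([[0, -1, 0, 0], [1, 0, 0, 0], [0, 0, 0, -1], [0, 0, 1, 0]], float)
Aj = np.array([[0, 0, -1, 0], [0, 0, 0, 1], [1, 0, 0, 0], [0, -1, 0, 0]], float)
Ak = np.array([[0, 0, 0, -1], [0, 0, -1, 0], [0, 1, 0, 0], [1, 0, 0, 0]], float)
ALGEBRA = np.stack([I4, Ai, Aj, Ak])

def hamilton_matrix(q):
    """Left-multiplication matrix of quaternion q = (a, b, c, d) as a PHM sum.

    With n = 4 the filters F_i are 1x1 blocks (the components of q), so each
    Kronecker product kron(A_i, F_i) is just q_i * A_i.
    """
    return sum(np.kron(A, np.array([[qi]])) for A, qi in zip(ALGEBRA, q))

i = np.array([0.0, 1, 0, 0])
j = np.array([0.0, 0, 1, 0])
print(hamilton_matrix(i) @ j)  # i * j = k -> [0, 0, 0, 1]
```

Multiplying W by the coefficient vector of a second quaternion then implements the Hamilton product as an ordinary matrix-vector product, which is exactly the structure the PHM layer recovers from examples.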

Key Results

  • PHC models (especially $n=6$) achieve higher accuracy with fewer parameters than quaternion and real-valued networks.
  • PHB-cos models provide state-of-the-art interpretability by aligning filters with inputs, yielding higher IoU with ground-truth ROIs than Grad-CAM/LIME explanations.
  • Parameter efficiency scales with $1/n$, validating the advantage of Clifford-based approaches over quaternions.
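The $1/n$ scaling follows from simple parameter arithmetic: a PHM fully connected layer stores $n$ algebra matrices of size $n \times n$ plus $n$ filters of size $(d/n) \times (k/n)$, i.e. $n^3 + dk/n$ parameters versus $dk$ for a dense layer. A minimal sketch, with an illustrative width $d = k = 768$ chosen for divisibility (not a value from the experiments):

```python
def phm_params(d, k, n):
    """Parameter count of a PHM fully connected layer mapping d -> k features:
    n algebra matrices A_i (n x n each) plus n filters F_i ((d/n) x (k/n) each)."""
    assert d % n == 0 and k % n == 0, "feature dims must be divisible by n"
    return n**3 + n * (d // n) * (k // n)

d = k = 768  # illustrative layer width, divisible by n = 1, 2, 4, 6
for n in (1, 2, 4, 6):
    dense = d * k
    phm = phm_params(d, k, n)
    print(f"n={n}: {phm} params, {dense / phm:.2f}x smaller than dense")
```

Since the $n^3$ algebra term is negligible next to $dk/n$ at realistic widths, the compression factor relative to a dense layer is close to $n$, matching the $1/n$ claim above.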

Installation

git clone https://github.com/Astraflaneur/hypercomplex-neural-network.git
cd hypercomplex-neural-network
pip install -r ./code/requirements.txt

Usage

To train a model (example: PHCNet with $n=6$ on CIFAR-10):

python train.py --model phc --n 6 --dataset cifar10 --epochs 200 --seeds 5

To reproduce plots and tables:

python make_plots_published.py --csv results.csv --outdir outputs

About

Exploratory analysis of Clifford algebras and hypercomplex neural networks in terms of efficiency and interpretability.
