We propose LOcal Model-Agnostic Time-series Classification Explanations (LOMATCE, pronounced "lom-te-see"), a method akin to LIME [1], for explaining deep-learning-based time series classifiers. LOMATCE uses parametrised event primitives (PEPs) to extract and parameterise events such as increasing trends, decreasing trends, local maxima, and local minima from time series data. These PEPs capture temporal patterns and facilitate the training of simple, interpretable models like linear regression [2]. Discussing time series in terms of these familiar events makes the explanations more intuitive and comprehensible.
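To make the PEP idea concrete, here is a minimal, illustrative sketch (not the package's actual implementation) that segments a series at its local extrema and parameterises each trend event by its endpoints, slope, and mean:

```python
import numpy as np

def extract_peps(ts):
    """Illustrative PEP extraction: interior local maxima/minima act as
    event boundaries, and each segment between boundaries becomes an
    increasing or decreasing trend parameterised by (start, end, slope, mean)."""
    ts = np.asarray(ts, dtype=float)
    boundaries = [0]
    for i in range(1, len(ts) - 1):
        is_max = ts[i] > ts[i - 1] and ts[i] > ts[i + 1]
        is_min = ts[i] < ts[i - 1] and ts[i] < ts[i + 1]
        if is_max or is_min:
            boundaries.append(i)
    boundaries.append(len(ts) - 1)

    events = []
    for s, e in zip(boundaries[:-1], boundaries[1:]):
        slope = (ts[e] - ts[s]) / max(e - s, 1)
        events.append({"type": "increasing" if slope >= 0 else "decreasing",
                       "start": s, "end": e,
                       "slope": slope, "mean": float(ts[s:e + 1].mean())})
    return events

# Three trend events: up to the local max at t=2, down to the local min at t=4, up again.
events = extract_peps([0, 1, 3, 2, 1, 2, 4])
```

A real implementation would also emit the maxima/minima themselves as events and smooth the series before detecting extrema; this sketch only conveys the parameterisation idea.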
- lomatce: This directory contains the code files.
- examples: Includes notebook examples for each dataset.
- utils: Contains utility files:
  - helper_class.py: Functions for clustering, explanation plots, etc.
  - test_dataloader.py: Dataloader for the test set.
- explainer.py: Core method implementation, from PEP extraction to applying interpretable models like linear regression to mimic deep learning inference.
- perturbation.py: Applies various perturbation strategies and generates neighbouring samples.
- lomatce_simulation.py: Runs the FCN model multiple times with random train-test splits to ensure robustness of results.
- lomatce_vs_baseline.py: Compares LOMATCE against LIME, SHAP, Integrated Gradients (IG), and a random baseline.
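To give a feel for what a replacement-based perturbation can look like (the actual strategies live in perturbation.py; this `perturb_random_segments` helper is a hypothetical sketch of a "random" replacement method), each neighbouring sample below is a copy of the series with one randomly placed segment replaced by noise:

```python
import numpy as np

def perturb_random_segments(ts, num_samples=100, seg_len=10, seed=None):
    """Sketch of a 'random' replacement strategy: each neighbouring sample
    copies the series and overwrites one random segment with Gaussian noise
    matched to the series' mean and standard deviation."""
    rng = np.random.default_rng(seed)
    ts = np.asarray(ts, dtype=float)
    samples = np.tile(ts, (num_samples, 1))
    for row in samples:
        start = rng.integers(0, len(ts) - seg_len + 1)
        row[start:start + seg_len] = rng.normal(ts.mean(), ts.std() + 1e-8, seg_len)
    return samples

neighbours = perturb_random_segments(np.sin(np.linspace(0, 6, 50)), num_samples=5, seed=0)
```

Labelling these neighbours with the black-box classifier yields the local dataset on which the interpretable surrogate is trained.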
Fig 1: The proposed XAI method for deep-learning-based time series classifiers using Parameterised Event Primitives (PEPs).
```
git clone https://github.com/yourusername/lomatce.git
cd lomatce
pip install -r requirements.txt
```

```python
from lomatce.explainer import LomatceExplainer

lomatce_explainer = LomatceExplainer(basic_dir='path/to/data_directory')

explanation = lomatce_explainer.explain_instance(
    origi_instance=your_ts_instance,
    classifier_fn=your_model_predict_function,
    num_perturbations=1000,  # Number of perturbations
    n_clusters=20,           # Number of event clusters
    top_n=15,                # Top features to show
    class_names=["Class1", "Class2"]
)

explanation.visualise(your_ts_instance, show_probas=True)

summary = explanation.get_explanation_summary()
print(summary)
```

The summary includes key information such as the local model prediction, the original (black-box model) prediction, and the local fidelity score.
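The local fidelity score reflects how closely the surrogate tracks the black-box model on the perturbed neighbourhood. One common way to measure this, shown below as a sketch on toy data (the variable names and use of scikit-learn's Ridge are illustrative, not the package API), is the R² of the surrogate's predictions against the black-box outputs:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

# Toy neighbourhood: binary masks indicating which events are kept in each
# perturbed sample, plus a synthetic "black-box" probability for each sample.
rng = np.random.default_rng(0)
event_masks = rng.integers(0, 2, size=(200, 6)).astype(float)
true_weights = np.array([1.5, -0.5, 0.0, 2.0, 0.0, -1.0])
blackbox_probs = event_masks @ true_weights + rng.normal(0, 0.05, 200)

# Fit the interpretable surrogate on the neighbourhood ...
surrogate = Ridge(alpha=1.0).fit(event_masks, blackbox_probs)
# ... and score how faithfully it mimics the black box locally.
local_fidelity = r2_score(blackbox_probs, surrogate.predict(event_masks))
```

A fidelity near 1.0 means the surrogate's coefficients can be trusted as a local explanation; a low score signals that the linear model fails to capture the classifier's behaviour around the instance.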
Here's an example of LOMATCE highlighting important regions of a time series:
Fig 2: Explanation highlights segment significance, relevance scores, and event types (e.g., increasing, decreasing, maxima, minima).
You can explore the example notebooks for each dataset in the examples/ folder.
```
python lomatce_simulation.py --model [model-name] --dataset [dataset-name] --num_runs [100] --class_labels [list-of-classes] --replacement_method random --num_samples 1000
```

For example:

```
python lomatce_simulation.py --model FCN --dataset Coffee --num_runs 100 --class_labels Arabica Robusta --replacement_method random --num_samples 1000
```
```
python lomatce_vs_baseline.py --dataset [dataset-name] --model [model-checkpoint] --class_labels [list-of-classes]
```