Dissimilarity-Preserving Representation Learning for One-Class Time Series Classification

Stefano Mauceri, James Sweeney, Miguel Nicolau, James McDermott

Research output: Contribution to journal › Article › peer-review

Abstract

We propose to embed time series in a latent space where pairwise Euclidean distances (EDs) between samples are equal to pairwise dissimilarities in the original space, for a given dissimilarity measure. To this end, we use auto-encoder (AE) and encoder-only neural networks to learn elastic dissimilarity measures, e.g., dynamic time warping (DTW), that are central to time series classification (Bagnall et al., 2017). The learned representations are used in the context of one-class classification (Mauceri et al., 2020) on the datasets of the UCR/UEA archive (Dau et al., 2019). Using a 1-nearest neighbor (1NN) classifier, we show that the learned representations enable classification performance close to that of raw data, but in a space of substantially lower dimensionality. This implies substantial savings in computational and storage requirements for nearest-neighbor time series classification.
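The core idea in the abstract can be sketched as a loss that penalizes the gap between pairwise Euclidean distances in the latent space and pairwise DTW dissimilarities in the original space. The sketch below, a minimal illustration only, uses a plain linear map `W` as a stand-in encoder (the paper itself uses AE and encoder-only neural networks, whose architectures are not specified here), together with a textbook DTW implementation:

```python
import numpy as np

def dtw(a, b):
    """Classic dynamic time warping distance between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Accumulate cheapest warping path into cell (i, j).
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def dissimilarity_preserving_loss(X, W):
    """Mean squared gap between latent-space EDs and original-space DTW.

    X: (n_series, series_len) batch of time series.
    W: (series_len, latent_dim) weights of a hypothetical linear encoder,
       a stand-in for the paper's neural encoders.
    """
    Z = X @ W  # latent embeddings
    loss, pairs = 0.0, 0
    for i in range(len(X)):
        for j in range(i + 1, len(X)):
            ed = np.linalg.norm(Z[i] - Z[j])       # pairwise ED in latent space
            loss += (ed - dtw(X[i], X[j])) ** 2    # match pairwise DTW
            pairs += 1
    return loss / pairs

def one_class_1nn_score(z_query, Z_train):
    """1NN anomaly score: distance to the nearest embedded training sample."""
    return float(np.min(np.linalg.norm(Z_train - z_query, axis=1)))
```

Training would then minimize `dissimilarity_preserving_loss` over the encoder parameters (here `W`) by gradient descent, after which `one_class_1nn_score` can rank test samples against the embedded one-class training set.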

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 35
Issue number: 10
DOIs
Publication status: Published - 2024

Keywords

  • Approximation algorithms
  • Autoencoders (AEs)
  • Convolutional neural networks
  • Measurement
  • Neural networks
  • one-class classification
  • representation learning
  • Task analysis
  • Time measurement
  • Time series analysis
  • time series classification
