Time Series Classification by Class-Based Mahalanobis Distances

7 October 2010
Zoltán Prekopcsák
Daniel Lemire
    AI4TS
arXiv: 1010.1526
Abstract

To classify time series by nearest neighbors, we need to specify or learn one or several distances. We consider variations of the Mahalanobis distance, which relies on the inverse covariance matrix of the data. Unfortunately, for time series data, the covariance matrix often has low rank. To alleviate this problem, we can either use a pseudoinverse, apply covariance shrinkage, or limit the matrix to its diagonal. We review these alternatives and benchmark them against competitive methods such as the related Large Margin Nearest Neighbor classification (LMNN) and the Dynamic Time Warping (DTW) distance. As expected, we find that DTW is superior, but the Mahalanobis distances are one to two orders of magnitude faster. To get the best results with Mahalanobis distances, we recommend learning one distance per class, using either covariance shrinkage or the diagonal approach.
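The per-class construction described in the abstract can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the shrinkage weight `alpha`, the use of a pseudoinverse, and the 1-NN rule that compares a test series to each training series under that series' class metric are all assumptions made for the example.

```python
import numpy as np

def shrunk_inverse_covariance(X, alpha=0.1):
    """Shrink the sample covariance toward its diagonal and invert it.

    X: (n_samples, n_timesteps) array of series from a single class.
    alpha: shrinkage weight in [0, 1]; alpha=1 keeps only the diagonal.
    """
    cov = np.cov(X, rowvar=False)
    diag = np.diag(np.diag(cov))
    shrunk = (1.0 - alpha) * cov + alpha * diag
    # Pseudoinverse guards against the low-rank case mentioned in the abstract.
    return np.linalg.pinv(shrunk)

def fit_class_metrics(X, y, alpha=0.1):
    """Learn one inverse covariance (one Mahalanobis metric) per class."""
    return {c: shrunk_inverse_covariance(X[y == c], alpha) for c in np.unique(y)}

def classify_1nn(x, X_train, y_train, metrics):
    """1-NN: compare x to each training series under that series' class metric."""
    best_label, best_dist = None, np.inf
    for xi, yi in zip(X_train, y_train):
        d = xi - x
        dist = d @ metrics[yi] @ d  # squared Mahalanobis distance
        if dist < best_dist:
            best_label, best_dist = yi, dist
    return best_label

# Toy usage with random data (two classes, series of length 50).
rng = np.random.default_rng(0)
X_train = rng.standard_normal((40, 50))
y_train = np.repeat([0, 1], 20)
metrics = fit_class_metrics(X_train, y_train, alpha=0.2)
x_test = rng.standard_normal(50)
print(classify_1nn(x_test, X_train, y_train, metrics))
```

In this sketch, the diagonal-only variant from the abstract corresponds to `alpha=1`, while intermediate values of `alpha` retain some of the cross-timestep correlations at the cost of a noisier, possibly rank-deficient estimate, which the pseudoinverse absorbs.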
