SDD: Shape-aware Data-driven Attention Mechanism for Time Series Analysis

  • Yanyun Cao
  • Rundong Zuo*
  • Rui Cao
  • Byron Choi
  • Jianliang Xu
  • Sourav S Bhowmick

*Corresponding author for this work

Research output: Chapter in book/report/conference proceeding › Conference proceeding › peer-review

Abstract

Multivariate time series (MTS) analysis has extensive applications in various areas such as human activity recognition, healthcare, and economics, among others. Recently, Transformer approaches have been specifically designed for MTS and have consistently reported superior performance. In this paper, we demonstrate a software system for a recent efficient shape-aware Transformer (SDD), where time-series subsequences (a.k.a. shapes) are made available to users for investigation. First, a time-series Transformer, called SVP-T, takes shapes, together with their variable and position information (VP information), as input to the training of a Transformer model. These shapes are computed from different variables and time intervals, enabling the Transformer model to learn dependencies simultaneously across both time and variables. Second, a data-driven kernel-based attention mechanism, called DARKER, reduces the time complexity of training Transformer models from O(N²) to O(N), where N is the number of inputs. As a result, training with DARKER offers about a 3x-4x speedup over vanilla Transformers. In this demo, we present the first system (SDD) that integrates SVP-T and DARKER. In particular, SDD visualizes SVP-T's attention matrix and allows users to explore key shapes that have high attention weights. Furthermore, users can use SDD to decide the shape input for training a new model, to further balance efficiency and accuracy.
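The O(N²)-to-O(N) reduction mentioned above comes from the standard kernelization trick for attention: replacing softmax(QKᵀ)V with φ(Q)(φ(K)ᵀV) for a feature map φ, so the N×N attention matrix is never materialized. The sketch below illustrates this generic trick with a simple ReLU-based feature map; it is not the actual DARKER mechanism (whose data-driven kernel is not specified in this abstract), and the function names are illustrative.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: materializes an N x N score matrix, O(N^2).
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_kernel_attention(Q, K, V):
    # Kernelized attention: reassociate the product as phi(Q) (phi(K)^T V),
    # which costs O(N * d * d_v) — linear in the number of inputs N.
    phi = lambda x: np.maximum(x, 0.0) + 1e-6   # illustrative feature map
    KV = phi(K).T @ V                           # (d, d_v), independent of N
    Z = phi(Q) @ phi(K).sum(axis=0)             # per-row normalizer, shape (N,)
    return (phi(Q) @ KV) / Z[:, None]

rng = np.random.default_rng(0)
N, d = 8, 4
Q, K, V = rng.normal(size=(3, N, d))
out = linear_kernel_attention(Q, K, V)
print(out.shape)
```

Because the N×N matrix is never formed, memory and time both scale linearly with sequence length, which is what enables the reported 3x-4x training speedup at this level of abstraction.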
Original language: English
Title of host publication: CIKM 2025 - Proceedings of the 34th ACM International Conference on Information and Knowledge Management
Place of Publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Pages: 6614–6618
Number of pages: 5
ISBN (Electronic): 9798400720406
ISBN (Print): 9798400720406
DOIs
Publication status: Published - 10 Nov 2025

Publication series

Name: CIKM: Conference on Information and Knowledge Management
Publisher: Association for Computing Machinery

User-Defined Keywords

  • efficient transformers
  • human-in-the-loop
  • time series analysis

