[2602.12744] Adaptive Structured Pruning of Convolutional Neural Networks for Time Series Classification

arXiv - Machine Learning · 4 min read

Summary

This article presents Dynamic Structured Pruning (DSP), a fully automatic method for structured pruning of convolutional neural networks in time series classification that achieves significant model compression while maintaining accuracy.

Why It Matters

As deep learning models become more prevalent in time series classification, their high resource demands pose challenges for deployment on limited-capacity devices. DSP addresses these issues by automating the pruning process, enhancing scalability and efficiency, which is crucial for real-world applications.

Key Takeaways

  • DSP automates the pruning of convolutional neural networks, eliminating the need for manual hyperparameter tuning.
  • The method achieves an average model compression of 58% for LITETime and 75% for InceptionTime without sacrificing accuracy.
  • DSP enhances deployment feasibility for deep learning models on resource-constrained devices.
  • The framework utilizes instance-wise sparsity loss and global activation analysis for effective pruning.
  • Validation across 128 UCR datasets demonstrates DSP's robustness and efficiency in time series classification.
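The article describes DSP's first stage only at a high level: an instance-wise sparsity loss added during training to push each input toward using few channels. The paper's exact loss is not given here, so the following is a minimal numpy sketch of one plausible form, an L1 penalty on per-instance, per-channel mean activations; the function name, shapes, and the weight `lam` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def instance_sparsity_loss(activations, lam=1e-3):
    """Hypothetical instance-wise sparsity penalty (illustrative only).

    activations: array of shape (batch, channels, time) -- feature maps
    from a 1D conv layer applied to a batch of time series.
    Returns a scalar that would be added to the classification loss.
    """
    # Mean absolute activation of each channel, separately per instance
    per_channel = np.abs(activations).mean(axis=2)      # (batch, channels)
    # L1 over channels encourages each instance to rely on few channels;
    # averaging over the batch keeps the scale independent of batch size.
    return lam * per_channel.sum(axis=1).mean()

# Toy example: 4 series, 8 channels, 100 time steps
acts = np.random.default_rng(0).standard_normal((4, 8, 100))
loss = instance_sparsity_loss(acts)
```

Because the penalty is computed per instance before averaging, channels that are active for only a few inputs still incur cost, which is what drives channel-level (structured) sparsity rather than unstructured weight sparsity.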

Computer Science > Machine Learning · arXiv:2602.12744 (cs) · [Submitted on 13 Feb 2026]

Title: Adaptive Structured Pruning of Convolutional Neural Networks for Time Series Classification

Authors: Javidan Abdullayev, Maxime Devanne, Cyril Meyer, Ali Ismail-Fawaz, Jonathan Weber, Germain Forestier

Abstract: Deep learning models for Time Series Classification (TSC) have achieved strong predictive performance, but their high computational and memory requirements often limit deployment on resource-constrained devices. While structured pruning can address these issues by removing redundant filters, existing methods typically rely on manually tuned hyperparameters, such as pruning ratios, which limits scalability and generalization across datasets. In this work, we propose Dynamic Structured Pruning (DSP), a fully automatic structured pruning framework for convolution-based TSC models. DSP introduces an instance-wise sparsity loss during training to induce channel-level sparsity, followed by a global activation analysis to identify and prune redundant filters without needing any predefined pruning ratio. This work tackles computational bottlenecks of deep TSC models for deployment on resource-constrained devices. We validate DSP on 128 UCR datasets using two different deep state-of-the-art archite...
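The abstract's second stage, a "global activation analysis" that prunes filters without a predefined ratio, is also only described at a high level. One way such a ratio-free criterion could work is to keep a channel only if its average activation over the whole dataset is non-negligible relative to the strongest channel, so the number of pruned filters falls out of the statistics rather than a preset percentage. The sketch below illustrates that idea; the function name, the threshold `eps`, and the decision rule are assumptions for illustration, not the paper's criterion.

```python
import numpy as np

def select_channels(dataset_activations, eps=0.05):
    """Illustrative ratio-free channel selection via global activation stats.

    dataset_activations: (num_instances, channels, time) -- activations of
    one conv layer collected over the whole training set.
    Returns a boolean mask over channels (True = keep the filter).
    """
    # Average absolute activation of each channel across all instances
    # and all time steps: a global importance score per filter.
    channel_score = np.abs(dataset_activations).mean(axis=(0, 2))
    # Keep channels above a small fraction of the strongest channel's score;
    # no pruning ratio is fixed in advance.
    return channel_score > eps * channel_score.max()

# Toy example: channel 2 is nearly dead after sparsity training
rng = np.random.default_rng(1)
acts = rng.standard_normal((16, 4, 50))
acts[:, 2, :] *= 1e-4          # simulate a channel suppressed by the loss
mask = select_channels(acts)   # channel 2 is flagged for pruning
```

In a real pipeline the mask would then be used to physically remove the corresponding filters (and the matching input channels of the next layer), which is what makes the pruning structured and yields actual speedups rather than sparse weight tensors.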

