[2602.16730] MMCAformer: Macro-Micro Cross-Attention Transformer for Traffic Speed Prediction with Microscopic Connected Vehicle Driving Behavior

arXiv - Machine Learning

Summary

The MMCAformer paper presents a novel transformer model that integrates macro and micro traffic data for improved traffic speed prediction, enhancing accuracy and reducing uncertainty.

Why It Matters

This research is significant as it addresses the limitations of traditional traffic prediction methods that rely solely on aggregated data. By incorporating microscopic driving behaviors from connected vehicles, the MMCAformer model offers a more nuanced understanding of traffic dynamics, which can lead to better traffic management and safety measures.

Key Takeaways

  • MMCAformer combines macro traffic flow data with micro driving behavior insights.
  • The model significantly improves prediction accuracy and reduces uncertainty.
  • Key features influencing traffic speed include hard braking and acceleration frequencies.
  • Performance enhancements are most notable in congested, low-speed conditions.
  • The approach represents a shift towards more granular traffic analysis using connected vehicle data.
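The reduced uncertainty noted above comes from the paper's probabilistic training objective: a Student-t negative log-likelihood, which lets the model output a predictive distribution (location, scale, and tail heaviness) rather than a point estimate. A minimal sketch of that loss, using standard Student-t density formulas (the parameter values below are illustrative, not from the paper):

```python
import math

def student_t_nll(y, mu, sigma, nu):
    """Negative log-likelihood of observation y under a Student-t
    distribution with location mu, scale sigma, and nu degrees of
    freedom. Smaller nu means heavier tails, i.e. more tolerance
    for outlier speeds."""
    z = (y - mu) / sigma
    return (
        -math.lgamma((nu + 1) / 2)
        + math.lgamma(nu / 2)
        + 0.5 * math.log(nu * math.pi)
        + math.log(sigma)
        + (nu + 1) / 2 * math.log(1 + z * z / nu)
    )

# The loss grows as the predicted location mu drifts from the
# observed speed y (units here are hypothetical, e.g. km/h).
print(student_t_nll(50.0, 50.0, 5.0, 4.0)
      < student_t_nll(50.0, 40.0, 5.0, 4.0))  # True
```

In practice the network would emit `mu`, `sigma`, and `nu` per prediction horizon and minimize this quantity averaged over the batch; the learned `sigma` then serves directly as an uncertainty estimate.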

Computer Science > Machine Learning
arXiv:2602.16730 (cs) [Submitted on 17 Feb 2026]

Title: MMCAformer: Macro-Micro Cross-Attention Transformer for Traffic Speed Prediction with Microscopic Connected Vehicle Driving Behavior
Authors: Lei Han, Mohamed Abdel-Aty, Younggun Kim, Yang-Jun Joo, Zubayer Islam

Abstract: Accurate speed prediction is crucial for proactive traffic management to enhance traffic efficiency and safety. Existing studies have primarily relied on aggregated, macroscopic traffic flow data to predict future traffic trends, whereas road traffic dynamics are also influenced by individual, microscopic human driving behaviors. Recent Connected Vehicle (CV) data provide rich driving behavior features, offering new opportunities to incorporate these behavioral insights into speed prediction. To this end, we propose the Macro-Micro Cross-Attention Transformer (MMCAformer) to integrate CV data-based micro driving behavior features with macro traffic features for speed prediction. Specifically, MMCAformer employs self-attention to learn intrinsic dependencies in macro traffic flow and cross-attention to capture spatiotemporal interplays between macro traffic status and micro driving behavior. MMCAformer is optimized with a Student-t negative log-likelihood loss to prov...
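The cross-attention step described in the abstract can be sketched with a small NumPy example: the macro traffic sequence supplies the queries and the micro driving-behavior sequence supplies the keys and values, so each macro time step gathers the behavioral context most relevant to it. All shapes and names below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(queries, keys, values):
    """Scaled dot-product attention where one sequence (queries)
    attends to another (keys/values), as in macro-to-micro fusion."""
    d = queries.shape[-1]
    weights = softmax(queries @ keys.T / np.sqrt(d))
    return weights @ values

rng = np.random.default_rng(0)
macro = rng.normal(size=(12, 16))  # 12 time steps of macro flow features
micro = rng.normal(size=(12, 16))  # hypothetical micro behavior embeddings
                                   # (e.g. hard-braking / acceleration stats)

fused = cross_attention(macro, micro, micro)
print(fused.shape)  # (12, 16) — one behavior-informed vector per time step
```

The self-attention over the macro sequence mentioned in the abstract is the same operation with `macro` used for queries, keys, and values alike; stacking the two (plus residual connections and normalization, omitted here) gives the basic transformer-encoder pattern the paper builds on.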
