[2602.19668] Personalized Longitudinal Medical Report Generation via Temporally-Aware Federated Adaptation

arXiv - Machine Learning · 3 min read

Summary

This article presents a novel framework, FedTAR, for generating personalized longitudinal medical reports using federated learning that accounts for temporal dynamics in patient data.

Why It Matters

The research addresses critical challenges in medical report generation by enhancing privacy through federated learning while improving the accuracy and coherence of reports over time. This is particularly relevant in healthcare settings where patient data privacy is paramount, and understanding disease progression is essential for effective treatment.

Key Takeaways

  • Introduces Federated Temporal Adaptation (FTA) to model temporal shifts in patient data.
  • Presents FedTAR, which integrates demographic personalization with time-aware aggregation.
  • Demonstrates improvements in linguistic accuracy and temporal coherence in medical reports.
  • Utilizes large datasets (J-MID and MIMIC-CXR) for validation of the proposed framework.
  • Establishes a robust method for privacy-preserving longitudinal modeling in healthcare.

Computer Science > Computer Vision and Pattern Recognition, arXiv:2602.19668 (cs). Submitted on 23 Feb 2026.

Title: Personalized Longitudinal Medical Report Generation via Temporally-Aware Federated Adaptation

Authors: He Zhu, Ren Togo, Takahiro Ogawa, Kenji Hirata, Minghui Tang, Takaaki Yoshimura, Hiroyuki Sugimori, Noriko Nishioka, Yukie Shimizu, Kohsuke Kudo, Miki Haseyama

Abstract: Longitudinal medical report generation is clinically important yet remains challenging due to strict privacy constraints and the evolving nature of disease progression. Although federated learning (FL) enables collaborative training without data sharing, existing FL methods largely overlook longitudinal dynamics by assuming stationary client distributions, making them unable to model temporal shifts across visits or patient-specific heterogeneity, ultimately leading to unstable optimization and suboptimal report generation. We introduce Federated Temporal Adaptation (FTA), a federated setting that explicitly accounts for the temporal evolution of client data. Building upon this setting, we propose FedTAR, a framework that integrates demographic-driven personalization with time-aware global aggregation. FedTAR generates lightweight LoRA adapters from demographic embeddings and performs temporal residual aggregation...
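The abstract names two mechanisms: generating lightweight LoRA adapter factors from a demographic embedding, and aggregating client updates with time awareness. The paper's actual design is not visible beyond the truncated abstract, so the following is a minimal numpy sketch of the general idea only. The fixed projection matrices standing in for a learned hypernetwork, the function names, and the exponential recency weighting in the aggregation step are all illustrative assumptions, not FedTAR's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)


def lora_from_demographics(demo_emb, d_model=16, rank=2):
    """Map a demographic embedding to low-rank LoRA factors A and B.

    Hypothetical hypernetwork: two random projection matrices play the
    role of learned parameters; in a real system these would be trained
    so that the adapter W + A @ B personalizes a shared backbone weight W.
    """
    w_a = rng.standard_normal((demo_emb.size, d_model * rank)) * 0.01
    w_b = rng.standard_normal((demo_emb.size, rank * d_model)) * 0.01
    A = (demo_emb @ w_a).reshape(d_model, rank)
    B = (demo_emb @ w_b).reshape(rank, d_model)
    return A, B


def temporal_residual_aggregate(global_w, client_residuals, timestamps, tau=2.0):
    """Combine client residual updates with recency-based weights.

    Illustrative time-aware aggregation: each client's residual is
    weighted by exponential decay of its age relative to the newest
    update, so recent visits influence the global weights more.
    """
    t_max = max(timestamps)
    weights = np.array([np.exp(-(t_max - t) / tau) for t in timestamps])
    weights /= weights.sum()
    residual = sum(w * r for w, r in zip(weights, client_residuals))
    return global_w + residual
```

As a usage example, an old update at t=0 and a fresh one at t=10 combine so the fresh residual dominates: `temporal_residual_aggregate(np.zeros(3), [np.zeros(3), np.ones(3)], [0, 10])` returns values close to 1.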
