[2602.22521] TFPS: A Temporal Filtration-enhanced Positive Sample Set Construction Method for Implicit Collaborative Filtering


Summary

The paper presents TFPS, a method for enhancing positive sample construction in implicit collaborative filtering through temporal filtration, improving recommendation accuracy.

Why It Matters

This research addresses a gap in collaborative filtering by optimizing positive sample construction, which is crucial for improving recommendation systems. By incorporating temporal information, TFPS enhances the model's ability to adapt to user preferences over time, potentially leading to better user experiences in various applications.

Key Takeaways

  • TFPS improves positive sample construction in collaborative filtering.
  • The method incorporates temporal information to enhance accuracy.
  • Extensive experiments demonstrate TFPS's effectiveness on real-world datasets.
  • TFPS can be integrated with various implicit collaborative filtering methods.
  • The approach provides theoretical insights into performance metrics like Recall@k and NDCG@k.
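Recall@k and NDCG@k are standard top-k recommendation metrics. As background for the takeaway above, here is a minimal sketch of how they are typically computed with binary relevance; this is generic metric code, not the paper's implementation:

```python
import math

def recall_at_k(ranked_items, relevant_items, k):
    """Fraction of a user's relevant items that appear in the top-k ranking."""
    hits = len(set(ranked_items[:k]) & set(relevant_items))
    return hits / len(relevant_items) if relevant_items else 0.0

def ndcg_at_k(ranked_items, relevant_items, k):
    """Normalized discounted cumulative gain with binary relevance:
    hits are discounted by log2 of their rank position."""
    dcg = sum(1.0 / math.log2(i + 2)
              for i, item in enumerate(ranked_items[:k])
              if item in relevant_items)
    ideal_hits = min(len(relevant_items), k)
    idcg = sum(1.0 / math.log2(i + 2) for i in range(ideal_hits))
    return dcg / idcg if idcg > 0 else 0.0

# Toy example: a model ranks five items for one user; items 1 and 3 are relevant.
ranking = [3, 7, 1, 9, 5]
relevant = [1, 3]
print(recall_at_k(ranking, relevant, 3))            # both relevant items in top-3 -> 1.0
print(round(ndcg_at_k(ranking, relevant, 3), 4))    # item 1 at rank 3 costs some gain
```

NDCG rewards placing relevant items earlier in the ranking, while Recall@k only counts whether they appear at all, which is why papers usually report both.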

Computer Science > Information Retrieval
arXiv:2602.22521 (cs) [Submitted on 26 Feb 2026]

Title: TFPS: A Temporal Filtration-enhanced Positive Sample Set Construction Method for Implicit Collaborative Filtering
Authors: Jiayi Wu, Zhengyu Wu, Xunkai Li, Rong-Hua Li, Guoren Wang

Abstract: The negative sampling strategy can effectively train collaborative filtering (CF) recommendation models based on implicit feedback by constructing positive and negative samples. However, existing methods primarily optimize the negative sampling process while neglecting the exploration of positive samples. Some denoising recommendation methods can be applied to denoise positive samples within negative sampling strategies, but they ignore temporal information. Existing work integrates sequential information during model aggregation but neglects time-interval information, hindering accurate capture of users' current preferences. To address this problem, from a data perspective, we propose a novel temporal filtration-enhanced approach to construct a high-quality positive sample set. First, we design a time decay model based on interaction time intervals, transforming the original graph into a weighted user-item bipartite graph. Then, based on predefined filtering operations, the weighted user-i...
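The abstract's core idea of weighting interactions by time interval and then filtering can be illustrated with a generic exponential decay. The decay function, half-life, and threshold below are illustrative assumptions for the sketch, not the paper's actual model:

```python
import math

DAY = 86400  # seconds per day

def decay_weight(interaction_ts, now_ts, half_life_days=30.0):
    """Exponential time decay: an edge's weight halves every `half_life_days`.
    (Illustrative choice; the paper's decay model may differ.)"""
    age_days = (now_ts - interaction_ts) / DAY
    return math.exp(-math.log(2) * age_days / half_life_days)

def build_weighted_edges(interactions, now_ts, threshold=0.25):
    """Turn (user, item, timestamp) triples into a weighted user-item
    bipartite edge set, then drop low-weight (stale) edges so that only
    recent interactions remain as positive samples."""
    edges = {(u, i): decay_weight(ts, now_ts) for u, i, ts in interactions}
    return {e: w for e, w in edges.items() if w >= threshold}

now = 1_700_000_000
interactions = [
    ("u1", "i1", now - 5 * DAY),    # recent  -> weight ~0.89, kept
    ("u1", "i2", now - 90 * DAY),   # stale   -> weight 0.125, filtered out
]
positives = build_weighted_edges(interactions, now)
print(positives)  # only the ("u1", "i1") edge survives the filter
```

The filtered edge set plays the role of the high-quality positive sample set: a downstream negative-sampling strategy would then draw positives from it instead of from the raw interaction log.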
