Question regarding Transformer's pipeline module [D]

Reddit - Machine Learning · 1 min read

About this article

```python
from transformers import pipeline

# Load a pipeline backed by a checkpoint already fine-tuned for
# Question Answering (SQuAD). The original snippet also loaded
# DistilBertModel separately, but that object was never used, so the
# checkpoint name is passed directly to pipeline() instead.
extractor = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

def get_emotion_cause(text, emotion):
    question = f"Show the reason why the text conveys {emotion} symptoms?"
    # The model extracts the 'cause' span from the context
    result = extractor(question=question, context=text)
    return result["answer"]
```
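For reference, a question-answering pipeline returns a dict with the keys `score`, `start`, `end`, and `answer`; `get_emotion_cause` keeps only the `answer` span. The values below are illustrative, not real model output:

```python
# Illustrative shape of a question-answering pipeline result
# (values are made up; the real pipeline computes them from the model)
result = {
    "score": 0.91,                     # model confidence for the span
    "start": 42,                       # character offset of the span start
    "end": 65,                         # character offset of the span end
    "answer": "its persistent insomnia",  # the extracted text span
}

# get_emotion_cause returns result["answer"], i.e. the extracted cause span
print(result["answer"])
```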


Originally published on May 03, 2026. Curated by AI News.

