[2509.14537] ClearFairy: Capturing Creative Workflows through Decision Structuring, In-Situ Questioning, and Rationale Inference

Summary

The paper introduces ClearFairy, a think-aloud AI assistant that captures decision-making in creative workflows by structuring reasoning into decision steps and inferring missing rationales, substantially improving the quality of recorded explanations in UI design.

Why It Matters

ClearFairy addresses a critical gap in creative workflows by making implicit decisions explicit, thus fostering better collaboration and knowledge sharing. The findings indicate a substantial improvement in the quality of rationales, which can enhance the effectiveness of generative AI tools in design applications.

Key Takeaways

  • ClearFairy makes creative decisions traceable by structuring reasoning into cognitive decision steps.
  • The system raises the share of strong explanations (rationales with sufficient causal reasoning) from 14% to 83%.
  • ClearFairy uses in-situ questioning to detect weak explanations and infer missing rationales.
  • A dataset of 417 decision steps is released to support future research.
  • The approach enhances generative AI agents in design tools like Figma.

Computer Science > Human-Computer Interaction
arXiv:2509.14537 (cs)
[Submitted on 18 Sep 2025 (v1), last revised 25 Feb 2026 (this version, v2)]

Title: ClearFairy: Capturing Creative Workflows through Decision Structuring, In-Situ Questioning, and Rationale Inference
Authors: Kihoon Son, DaEun Choi, Tae Soo Kim, Young-Ho Kim, Sangdoo Yun, Juho Kim

Abstract: Capturing professionals' decision-making in creative workflows (e.g., UI/UX) is essential for reflection, collaboration, and knowledge sharing, yet existing methods often leave rationales incomplete and implicit decisions hidden. To address this, we present the CLEAR approach, which structures reasoning into cognitive decision steps: linked units of actions, artifacts, and explanations that make decisions traceable with generative AI. Building on CLEAR, we introduce ClearFairy, a think-aloud AI assistant for UI design that detects weak explanations, asks lightweight clarifying questions, and infers missing rationales. In a study with twelve professionals, 85% of ClearFairy's inferred rationales were accepted (as-is or with revisions). Notably, the system increased "strong explanations" (rationales providing sufficient causal reasoning) from 14% to 83% without adding cognitive demand. Furthermore, exploratory applications ...
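To make the CLEAR structure concrete, the abstract's "cognitive decision steps" (linked units of actions, artifacts, and explanations) could be sketched as a simple data model. This is an illustrative sketch only: the class and function names, and the causal-marker heuristic for detecting weak explanations, are assumptions for exposition, not the paper's actual implementation (which relies on generative AI rather than keyword matching).

```python
from dataclasses import dataclass


@dataclass
class DecisionStep:
    """One cognitive decision step: a linked unit of action, artifact, and explanation."""
    action: str       # what the designer did (e.g., "moved the CTA button")
    artifact: str     # what was changed (e.g., a Figma frame identifier)
    explanation: str  # the designer's stated rationale

    def is_strong(self) -> bool:
        # Crude stand-in for "sufficient causal reasoning": look for causal
        # connectives. ClearFairy itself uses AI-based judgment, not keywords.
        markers = ("because", "so that", "in order to", "to avoid")
        return any(m in self.explanation.lower() for m in markers)


def needs_clarifying_question(step: DecisionStep) -> bool:
    # ClearFairy-style trigger: ask a lightweight in-situ question
    # only when the captured explanation is weak.
    return not step.is_strong()


weak = DecisionStep("moved CTA button", "frame-12", "it felt better here")
strong = DecisionStep("moved CTA button", "frame-12",
                      "because thumbs reach the bottom edge more easily")
```

Under this sketch, `needs_clarifying_question(weak)` returns `True`, prompting a follow-up question, while `strong` passes through unchanged.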
