[2509.14537] ClearFairy: Capturing Creative Workflows through Decision Structuring, In-Situ Questioning, and Rationale Inference
Summary
The paper introduces ClearFairy, an AI assistant that captures decision-making in creative workflows by structuring reasoning into decision steps and inferring missing rationales, substantially improving the quality of designers' explanations in UI design.
Why It Matters
ClearFairy addresses a critical gap in creative workflows by making implicit decisions explicit, thus fostering better collaboration and knowledge sharing. The findings indicate a substantial improvement in the quality of rationales, which can enhance the effectiveness of generative AI tools in design applications.
Key Takeaways
- ClearFairy makes decisions in creative workflows traceable by structuring reasoning into cognitive decision steps.
- The system increased the share of "strong explanations" (rationales with sufficient causal reasoning) from 14% to 83%.
- ClearFairy uses in-situ questioning to detect weak explanations and infer missing rationales.
- A dataset of 417 decision steps is released to support future research.
- The approach enhances generative AI agents in design tools like Figma.
Computer Science > Human-Computer Interaction

arXiv:2509.14537 (cs) [Submitted on 18 Sep 2025 (v1), last revised 25 Feb 2026 (this version, v2)]

Title: ClearFairy: Capturing Creative Workflows through Decision Structuring, In-Situ Questioning, and Rationale Inference

Authors: Kihoon Son, DaEun Choi, Tae Soo Kim, Young-Ho Kim, Sangdoo Yun, Juho Kim

Abstract: Capturing professionals' decision-making in creative workflows (e.g., UI/UX) is essential for reflection, collaboration, and knowledge sharing, yet existing methods often leave rationales incomplete and implicit decisions hidden. To address this, we present the CLEAR approach, which structures reasoning into cognitive decision steps (linked units of actions, artifacts, and explanations), making decisions traceable with generative AI. Building on CLEAR, we introduce ClearFairy, a think-aloud AI assistant for UI design that detects weak explanations, asks lightweight clarifying questions, and infers missing rationales. In a study with twelve professionals, 85% of ClearFairy's inferred rationales were accepted (as-is or with revisions). Notably, the system increased "strong explanations" (rationales providing sufficient causal reasoning) from 14% to 83% without adding cognitive demand. Furthermore, exploratory applications ...