[2502.01160] Scalable Precise Computation of Shannon Entropy

arXiv - AI · 4 min read

Summary

This paper presents a scalable tool, PSE, for precise computation of Shannon entropy, optimizing the process to enhance efficiency in quantitative information flow analyses.

Why It Matters

Understanding and quantifying information leakage is crucial in security and privacy contexts. This research improves the efficiency of entropy computation, which can significantly impact the development of secure systems and data protection methodologies.

Key Takeaways

  • The PSE tool significantly outperforms existing precise methods for Shannon entropy computation.
  • Introduces a novel knowledge compilation language, ADD[∧], which combines Algebraic Decision Diagrams with conjunctive decomposition for efficient processing.
  • Demonstrates roughly a tenfold efficiency gain on benchmarks compared to previous tools.

Computer Science > Artificial Intelligence
arXiv:2502.01160 (cs)
[Submitted on 3 Feb 2025 (v1), last revised 18 Feb 2026 (this version, v3)]

Title: Scalable Precise Computation of Shannon Entropy
Authors: Yong Lai, Haolong Tong, Zhenghang Xu, Minghao Yin

Abstract: Quantitative information flow analyses (QIF) are a class of techniques for measuring the amount of confidential information leaked by a program to its public outputs. Shannon entropy is an important method to quantify the amount of leakage in QIF. This paper focuses on programs modeled as Boolean constraints and optimizes the two stages of Shannon entropy computation to implement a scalable precise tool, PSE. In the first stage, we design a knowledge compilation language called ADD[∧] that combines Algebraic Decision Diagrams and conjunctive decomposition. ADD[∧] avoids enumerating the possible outputs of a program and supports tractable entropy computation. In the second stage, we optimize the model counting queries used to compute the probabilities of outputs. We compare PSE with the state-of-the-art probabilistic approximately correct tool EntropyEstimation, which was shown to significantly outperform previous precise tools. The experimental results demonstrate that PSE solved 56 more benchmarks than EntropyEstimation out of a total of 459. For 98% of t...
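To make the quantity being computed concrete: the Shannon entropy of a program's outputs is H = -Σ_o p(o) log2 p(o), where p(o) is the probability that the program produces output o under uniformly random secret inputs. The sketch below is a deliberately naive baseline (not the paper's ADD[∧] or optimized model-counting approach): it brute-force enumerates all inputs, counts the models mapping to each output, and computes the entropy. PSE's contribution is precisely avoiding this enumeration at scale; the function and example programs here are hypothetical illustrations.

```python
from itertools import product
from math import log2

def output_entropy(program, n_inputs):
    """Shannon entropy of a program's output distribution, computed by
    brute-force model counting: enumerate all 2^n_inputs secret inputs
    (assumed uniform), tally each observable output, and sum -p*log2(p)."""
    counts = {}
    for bits in product((0, 1), repeat=n_inputs):
        out = program(bits)
        counts[out] = counts.get(out, 0) + 1
    total = 2 ** n_inputs
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A program revealing only the parity of a 3-bit secret leaks 1 bit:
parity = lambda bits: sum(bits) % 2
print(output_entropy(parity, 3))  # 1.0

# A program revealing the whole secret leaks all 3 bits:
identity = lambda bits: bits
print(output_entropy(identity, 3))  # 3.0
```

The exponential loop over inputs (and, implicitly, over outputs) is exactly the bottleneck the paper's two-stage optimization targets.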
