Yuwen Huang’s homepage

Information Theory · Statistical Physics · Optimization · Quantum Information

Structured inference and optimization
with rigorous guarantees

I am a postdoctoral researcher working at the intersection of information theory, statistical physics, machine learning, optimization, and quantum information. My research develops provable and scalable methods for inference, counting, and optimization by combining probabilistic graphical models, combinatorics, Bethe and graph-cover techniques, tensor-network representations, ideas from statistical physics, and distributed quantum computation. [ISIT2020] [ITW2022] [ISIT2024] [TIT-Sub] [Quantum2026]

A central theme of my work is to use structural insight to design algorithms that are both mathematically rigorous and computationally practical. I am also interested in how these ideas can lead to future applications in machine learning. [ICML2025]

Current focus

Graphical-model methods, counting and inference problems, Bethe and statistical-physics perspectives, optimization with guarantees, tensor-network methods, and distributed quantum systems. [ISIT2024] [TIT2024] [TIT-Sub] [Quantum2026]

Future direction

Bringing these structure-aware tools into machine learning, especially where uncertainty, efficiency, and provable behavior all matter. I also continue to explore research directions in combinatorics and inference, quantum information processing, and their potential applications in learning theory. [ICML2025] [Quantum2026]

Theory to Impact

Theory

Information theory and statistical physics

[ISIT2020] [ITW2022]

Algorithms

Inference, counting, and optimization with structure and guarantees

[ISIT2024] [TIT2024] [TIT-Sub]

Systems

Quantum networks

[Quantum2026]

Impact

ML applications and efficient computation

[ICML2025]

Research Framework

My work brings together four technical directions. The common objective is to develop scalable methods for structured inference, counting, and optimization with firm theoretical guarantees.

Probabilistic graphical models

Bethe methods, graph covers, message passing, statistical-physics perspectives, and structure-aware formulations for inference and counting. [ISIT2020] [ITW2022] [ISIT2024] [TIT-Sub]
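To give a concrete flavor of the message-passing idea, here is a minimal, generic sketch of sum-product belief propagation on a three-variable chain. The potentials and values are made up for illustration; this is textbook material, not code from any of the cited papers.

```python
import numpy as np

# Chain of three binary variables with pairwise potentials (a tree, so
# sum-product message passing computes exact marginals).
psi12 = np.array([[2.0, 1.0], [1.0, 2.0]])  # factor between x1 and x2
psi23 = np.array([[1.0, 3.0], [3.0, 1.0]])  # factor between x2 and x3

# Messages into x2 from both neighbors.
m1_to_2 = psi12.sum(axis=0)   # sum over x1; indexed by x2
m3_to_2 = psi23.sum(axis=1)   # sum over x3; indexed by x2

# The belief at x2 is the normalized product of incoming messages.
b2 = m1_to_2 * m3_to_2
b2 /= b2.sum()

# Brute-force check against the exact marginal of the joint distribution.
joint = np.einsum('ij,jk->ijk', psi12, psi23)
p2_exact = joint.sum(axis=(0, 2))
p2_exact /= p2_exact.sum()

print(b2, p2_exact)  # on a tree the two marginals agree
```

On graphs with cycles the same message updates are only approximate, which is where Bethe and graph-cover analyses come in.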

Optimization and decision-making

Structure-exploiting algorithms with provable guarantees in convex and nonconvex settings. [TIT2024] [ICML2025]

Tensor networks and quantum-enabled methods

High-dimensional representations and quantum-inspired tools for compact, efficient computation. [ISIT2021] [Quantum2026]
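The compression idea behind tensor networks can be sketched in a few lines. The example below builds a rank-one product vector of length 2**n and recovers a tensor-train (matrix-product-state) factorization with O(n) parameters via sequential SVDs; the vector and tolerance are illustrative choices, not tied to any cited work.

```python
import numpy as np

# A rank-one product vector of length 2**n: exponentially large to store
# directly, but exactly representable with n small tensor-train cores.
n = 10
v = np.array([1.0, 2.0])
full = v
for _ in range(n - 1):
    full = np.kron(full, v)          # 2**n entries

# Tensor-train decomposition by sequential SVDs, truncating to the
# numerical rank at each bond.
cores, rest, r = [], full.reshape(1, -1), 1
for _ in range(n - 1):
    mat = rest.reshape(r * 2, -1)
    U, s, Vt = np.linalg.svd(mat, full_matrices=False)
    keep = int((s > 1e-10 * s[0]).sum())
    cores.append(U[:, :keep].reshape(r, 2, keep))
    rest = s[:keep, None] * Vt[:keep]
    r = keep
cores.append(rest.reshape(r, 2, 1))

print([c.shape for c in cores])  # every bond dimension collapses to 1
```

For generic vectors the bond dimensions grow, and the art lies in exploiting structure so that low-rank truncation remains accurate.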

Distributed quantum computing and networks

Structure-aware quantum architectures for scalable optimization, inference, and analytics. [ICML2025] [Quantum2026]

Shared Goal: Provable and scalable methods for inference, counting, and optimization in modern structured systems.

Potential Applications

These methods are motivated by core theoretical questions, but they also point toward practical applications in machine learning, large-scale analytics, and emerging quantum platforms.

latent structure · dependencies · constraints

Structured machine learning

Methods based on graphical models and combinatorial structure can support learning problems where dependencies, constraints, and latent structure should be modeled explicitly rather than ignored.

Potential role: more structured and interpretable ML pipelines.

approximation quality · confidence · decision support

Uncertainty quantification and reliable analytics

Scalable approximate inference can be useful when decisions must be made under uncertainty and exact probabilistic computation is too expensive.

Potential role: reliable uncertainty estimates in complex systems.

compact representations · efficient computation

Model compression and high-dimensional computation

Tensor-network and quantum-inspired ideas may provide principled ways to represent large models or high-dimensional objects more compactly.

Potential role: efficient representations for large-scale ML models.

networked processors · coordination · scalability

Distributed quantum optimization

Distributed quantum systems and quantum networks suggest new computational settings where structure-aware decomposition may be essential for scalability.

Potential role: scalable optimization on emerging quantum platforms.

Research Interests

Probabilistic graphical models and combinatorial inference

Scalable approximate inference, uncertainty quantification, and counting problems arising in structured learning and data analysis. [ISIT2023] [TIT2024] [TIT-Sub]

Optimization for learning and sequential decision-making

Structure-exploiting algorithms for machine learning and decision problems, with emphasis on rigorous guarantees. [ICML2025]

Tensor-network, quantum-inspired, and quantum-enabled methods

Methods for high-dimensional inference, compact representations, and efficient computation in large structured systems. [ISIT2021] [Quantum2026]

Distributed quantum computing and quantum networks

Emerging computational platforms for scalable optimization, inference, and data-intensive analytics. [Quantum2026]