Yuwen Huang’s homepage

Information Theory · Optimization · Quantum Information

Structured inference and optimization
with rigorous guarantees

I am a postdoctoral researcher working at the intersection of information theory, machine learning, optimization, and quantum information. My research develops provable and scalable methods for inference, counting, and optimization by combining probabilistic graphical models, combinatorics, Bethe and graph-cover techniques, tensor-network representations, and distributed quantum computation.

A central theme of my work is to use structural insight to design algorithms that are both mathematically rigorous and computationally practical. I am also interested in how these ideas can lead to future applications in machine learning.

Current focus

Graphical-model methods, counting and inference problems, optimization with guarantees, tensor-network methods, and distributed quantum systems.

Future direction

Bringing these structure-aware tools into machine learning, especially where uncertainty, efficiency, and provable behavior all matter.

Research Framework

My work brings together four technical directions. The common objective is to develop scalable methods for structured inference, counting, and optimization with firm theoretical guarantees.

Main Research Directions

Provable and Scalable Inference, Counting, and Optimization

- Graphical Models — Bethe methods · Graph covers · Inference and counting
- Optimization — Structure-exploiting algorithms · Convex and nonconvex settings
- Tensor Networks — High-dimensional representations · Quantum-inspired computation
- Quantum Systems — Distributed quantum computing · Quantum networks

Shared goal: provable and scalable methods for inference, counting, and optimization in modern structured systems.

Potential Applications

These methods are motivated by core theoretical questions, but they also point toward practical applications in machine learning, large-scale analytics, and emerging quantum platforms.

latent structure · dependencies · constraints

Structured machine learning

Methods based on graphical models and combinatorial structure can support learning problems where dependencies, constraints, and latent structure should be modeled explicitly rather than ignored.

Potential role: more structured and interpretable ML pipelines.

approximation quality · confidence · decision support

Uncertainty quantification and reliable analytics

Scalable approximate inference can be useful when decisions must be made under uncertainty and exact probabilistic computation is too expensive.

Potential role: reliable uncertainty estimates in complex systems.
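As a toy illustration of this trade-off (not taken from my papers — the model and potentials below are made up), loopy belief propagation gives approximate marginals on a small cyclic graphical model, which can be checked against exact marginals from brute-force enumeration:

```python
# Illustrative sketch only: loopy belief propagation on a tiny pairwise
# model vs. exact marginals by brute force. Model and potentials are
# invented for this example.
import itertools
import numpy as np

edges = [(0, 1), (1, 2), (0, 2)]                              # 3 binary variables on a cycle
psi = {e: np.array([[2.0, 1.0], [1.0, 2.0]]) for e in edges}  # pairwise potentials
phi = [np.array([1.5, 1.0]), np.ones(2), np.ones(2)]          # unary potentials

def exact_marginals():
    # Enumerate all 2^3 states -- feasible only for tiny models.
    p = np.zeros((2, 2, 2))
    for x in itertools.product([0, 1], repeat=3):
        w = np.prod([phi[v][x[v]] for v in range(3)])
        for (i, j) in edges:
            w *= psi[(i, j)][x[i], x[j]]
        p[x] = w
    p /= p.sum()
    return [p.sum(axis=tuple(k for k in range(3) if k != v)) for v in range(3)]

def bp_marginals(iters=100):
    # m[(i, j)] is the message from variable i to variable j.
    m = {(i, j): np.ones(2) / 2 for e in edges for (i, j) in (e, e[::-1])}
    for _ in range(iters):
        new = {}
        for (i, j) in m:
            pot = psi[(i, j)] if (i, j) in psi else psi[(j, i)].T  # pot[x_i, x_j]
            incoming = phi[i].copy()
            for (a, b) in m:  # messages into i from neighbors other than j
                if b == i and a != j:
                    incoming *= m[(a, b)]
            msg = pot.T @ incoming
            new[(i, j)] = msg / msg.sum()
        m = new
    beliefs = []
    for v in range(3):
        b = phi[v].copy()
        for (a, bb), msg in m.items():
            if bb == v:
                b *= msg
        beliefs.append(b / b.sum())
    return beliefs

exact = exact_marginals()
approx = bp_marginals()
```

Brute force costs time exponential in the number of variables, while message passing scales with the number of edges; on loopy graphs its answers are approximate, which is exactly where accuracy guarantees become valuable.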

compact representations · efficient computation

Model compression and high-dimensional computation

Tensor-network and quantum-inspired ideas may provide principled ways to represent large models or high-dimensional objects more compactly.

Potential role: efficient representations for large-scale ML models.
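The core idea can be sketched in its simplest form (a made-up example, not code from my work): a large matrix that is close to low rank can be stored through a truncated factorization with far fewer parameters, which is the two-dimensional special case of the compression that tensor networks perform on higher-order objects:

```python
# Illustrative sketch only: low-rank (truncated-SVD) compression of a
# matrix that is approximately rank 8. Sizes and ranks are arbitrary
# choices for this example.
import numpy as np

rng = np.random.default_rng(0)
# Construct a 256x256 matrix that is rank 8 plus tiny noise.
A = rng.standard_normal((256, 8)) @ rng.standard_normal((8, 256))
A += 1e-6 * rng.standard_normal((256, 256))

U, s, Vt = np.linalg.svd(A, full_matrices=False)
r = 8
A_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]        # rank-r reconstruction

full_params = A.size                           # 65536 stored numbers
compressed = U[:, :r].size + r + Vt[:r].size   # two thin factors + singular values
rel_err = np.linalg.norm(A - A_hat) / np.linalg.norm(A)
```

Here the factored form uses well under a tenth of the storage while reconstructing the matrix to high relative accuracy; tensor-train and related decompositions extend this principle to tensors whose explicit storage would be infeasible.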

networked processors · coordination · scalability

Distributed quantum optimization

Distributed quantum systems and quantum networks suggest new computational settings where structure-aware decomposition may be essential for scalability.

Potential role: scalable optimization on emerging quantum platforms.

Research Interests

Probabilistic graphical models and combinatorial inference

Scalable approximate inference, uncertainty quantification, and counting problems arising in structured learning and data analysis.

Optimization for learning and sequential decision-making

Structure-exploiting algorithms for machine learning and decision problems, with emphasis on rigorous guarantees.

Tensor-network, quantum-inspired, and quantum-enabled methods

Methods for high-dimensional inference, compact representations, and efficient computation in large structured systems.

Distributed quantum computing and quantum networks

Emerging computational platforms for scalable optimization, inference, and data-intensive analytics.