Haiyu Zhang
Hi! I am a second-year master's student in mathematics at Southern University of Science and Technology (SUSTech) in Shenzhen, China. My advisor is Yifei Zhu. I received my B.S. from SUSTech in 2024.
My research interests lie primarily in the mathematical foundations of machine learning. I am also interested in designing and improving theoretically grounded machine learning algorithms and models. My previous work focused on the intersection of machine learning with geometry and topology.
Email: 12432036@mail.sustech.edu.cn
Papers
Topology-enhanced machine learning for consonant recognition,
Pingyao Feng*, Qingrui Qu*, Haiyu Zhang, Siheng Yi, Zhiwang Yu, Zeyang Ding and Yifei Zhu,
Preprint.
Presentations
Fractal Structure and Generalization Bounds with Intrinsic Dimension for Stochastic Optimization Algorithms
The concept of "intrinsic dimension" (ID) is motivated by Leo Breiman's first question: why don't heavily parameterized neural networks overfit? Existing studies suggest that the fractal dimension of weight trajectories, regarded as an ID, can serve as an intrinsic measure of "effective" hypothesis complexity and shed light on the generalization mystery. A rigorous framework for generalization bounds, established via PAC-Bayesian theory, has demonstrated a positive correlation between ID and the generalization gap.
This talk reviews these existing results, elaborates on the limitations of current research, and proposes a research plan based on these findings.
Understanding Neural Networks: A Perspective on Representability and Interpretability
This talk reviews work on neural network approximation theory, explaining the roles of depth and width in function approximation and the representability of different architectures (MLPs, RNNs, Transformers, etc.). Furthermore, it presents some results on visualizing the training process of neural networks from geometric and topological perspectives, and discusses the grokking phenomenon.
Topological Data Analysis and Topological Deep Learning: with Applications to Image Data
This talk summarizes the foundational work by Carlsson et al. on Topological Data Analysis (TDA) and Topological Deep Learning in image processing, and presents results from reproducing and improving the relevant models.
It begins by reviewing the groundbreaking TDA study of natural image patches in Carlsson et al. 2008, which extracted a submanifold with the topology of the Klein bottle. The talk then presents results reproducing the work of Gabrielsson and Carlsson 2019 on the topology of neural networks using persistent homology. It shows that CNN kernels progressively learn topological structures (circles, Klein bottles) that align with the intrinsic topology of high-contrast image patches. These findings suggest that learned convolution kernels act as discrete first- and second-order differential operators (e.g., Prewitt, Laplacian), enabling the extraction of local features such as edges and textures by taking inner products with local patches of the image.
Building on this insight, and inspired by Love et al. 2023, an optimized topological CNN model for image classification is proposed. This model improves performance by embedding the topological structure into the neural network and sampling convolution kernels from non-uniform probability distributions defined on these manifolds.
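The edge-detection mechanism described above can be illustrated with a minimal sketch (not the talk's actual model): a hand-written Prewitt kernel slides over a synthetic image, and its inner product with each local patch responds strongly where the patch straddles a step edge.

```python
import numpy as np

# Illustrative example only: a 3x3 Prewitt-style kernel, which a learned
# convolution kernel may approximate, acts as a discrete first-order
# differential operator responding to vertical edges.
prewitt_x = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]], dtype=float)

# Synthetic 5x5 image with a vertical step edge: dark left, bright right.
image = np.zeros((5, 5))
image[:, 3:] = 1.0

def conv2d_valid(img, kernel):
    """Slide the kernel over the image, taking the inner product with
    each local patch ('valid' correlation, no padding)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

response = conv2d_valid(image, prewitt_x)
# Patches entirely inside the flat dark region give response 0;
# patches overlapping the step edge give the maximal response.
```

The same inner-product view is what lets one study the kernels themselves as points on a manifold of image patches.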
Entropy of Measure-Preserving Endomorphisms
This talk primarily references the first chapter of the book Conformal Fractals: Ergodic Theory Methods. It introduces the motivations, definitions, and properties of the conditional entropy of partitions and the measure-theoretic entropy of measure-preserving endomorphisms.
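For reference, the central definitions discussed in the talk are the standard ones: for a measure-preserving endomorphism $T$ of a probability space $(X,\mathcal{F},\mu)$ and a finite partition $\xi$,

```latex
% Entropy of a partition:
H_\mu(\xi) = -\sum_{A \in \xi} \mu(A)\log\mu(A).

% Entropy of T with respect to \xi, via joins of preimage partitions:
h_\mu(T,\xi) = \lim_{n\to\infty} \frac{1}{n}\,
  H_\mu\!\Big(\bigvee_{i=0}^{n-1} T^{-i}\xi\Big).

% Measure-theoretic entropy of T:
h_\mu(T) = \sup_{\xi} h_\mu(T,\xi).
```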
Conferences and workshops
International Workshop on Algebraic Topology (IWoAT), SUSTech, 2025.
The 4th Annual Centre for Topological Data Analysis Conference (Spires 2024), University of Oxford, 2024. Presented a poster at the poster session.
Greater Bay Area Topology Conference, SUSTech, 2024.
Teaching
Teaching assistant for Applied and Computational Topology, 2025 Fall
Teaching assistant for Calculus, 2024 Fall