Yang Song Dec 10, 2023: The Strategic Explorations team at OpenAI is recruiting! Contact me if you are a hardcore researcher/engineer with a passion for advancing fundamental methodologies in the training and inference of diffusion models, consistency models, or large language models. Dec 10, 2023: I have postponed my start date as an Assistant Professor at Caltech EE CMS until January 2026.
publications - Yang Song Consistency models (CMs) are a powerful class of diffusion-based generative models optimized for fast sampling. Most existing CMs are trained using discretized timesteps, which introduce additional hyperparameters and are prone to discretization errors.
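To make concrete what "discretized timesteps" involves, here is a minimal sketch of the timestep grid discrete-time CMs typically use (the Karras et al. spacing adopted in the original consistency models paper). The step count, spacing exponent, and endpoints are exactly the kind of extra hyperparameters the abstract refers to; the default values below are illustrative, not prescriptive.

```python
import numpy as np

def karras_timesteps(n_steps=18, eps=0.002, T=80.0, rho=7.0):
    """Discretized timestep grid on [eps, T] with Karras et al. spacing.

    n_steps, rho, eps, and T are extra hyperparameters introduced by
    discretization; a coarse grid also incurs discretization error,
    which is what continuous-time CMs aim to avoid.
    """
    i = np.arange(n_steps)
    return (eps ** (1 / rho)
            + i / (n_steps - 1) * (T ** (1 / rho) - eps ** (1 / rho))) ** rho
```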
blog | Yang Song Generative Modeling by Estimating Gradients of the Data Distribution. This blog post focuses on a promising new direction for generative modeling.
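For context, the "gradients" in the title are the score of the data distribution; the blog post's recipe is to estimate the score with a model and then sample with Langevin dynamics, roughly:

```latex
\[
  s_\theta(\mathbf{x}) \approx \nabla_{\mathbf{x}} \log p(\mathbf{x}), \qquad
  \mathbf{x}_{t+1} = \mathbf{x}_t + \frac{\epsilon}{2}\, s_\theta(\mathbf{x}_t)
  + \sqrt{\epsilon}\,\mathbf{z}_t, \quad \mathbf{z}_t \sim \mathcal{N}(\mathbf{0}, \mathbf{I}).
\]
```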
Sliced Score Matching: A Scalable Approach to Density and Score Estimation. An overview of our UAI 2019 paper on Sliced Score Matching. We show how to use random projections to scale up score matching, a classic method for learning unnormalized probabilistic models, to high-dimensional data. Theoretically, sliced score matching produces a consistent and asymptotically normal estimator under some regularity conditions. We apply sliced score matching to training deep energy-based models.
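As a rough illustration of the random-projection trick, here is a sketch of the vanilla sliced score matching objective in PyTorch; `score_net` is a hypothetical module mapping x to s_theta(x) of the same shape, and the paper also discusses a variance-reduced variant not shown here.

```python
import torch

def sliced_score_matching_loss(score_net, x, n_projections=1):
    """Vanilla sliced score matching objective (Song et al., UAI 2019).

    The expensive trace-of-Jacobian term in score matching is replaced
    by v^T (d s / d x) v for random v, computed with one extra backward
    pass instead of a full Jacobian.
    """
    x = x.detach().requires_grad_(True)
    total = 0.0
    for _ in range(n_projections):
        v = torch.randn_like(x)                  # random projection v ~ N(0, I)
        s = score_net(x)                         # s_theta(x), shape (batch, dim)
        sv = (s * v).sum()                       # sum over batch of v^T s_theta(x)
        grad_sv = torch.autograd.grad(sv, x, create_graph=True)[0]
        hvp = (grad_sv * v).sum(dim=-1)          # v^T nabla_x (v^T s_theta(x))
        sq = 0.5 * (s * v).sum(dim=-1) ** 2      # 1/2 (v^T s_theta(x))^2
        total = total + (hvp + sq).mean()
    return total / n_projections
```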
Yang Song. Computer Science Department, Stanford University, Stanford, CA 94305. Email: songyang@stanford.edu. Website: yang-song.github.io. Research Topics: Machine Learning; Generative Models; Inverse Problem Solving; AI Safety. Education: Stanford University, Ph.D. in Computer Science, June 2022 (expected); M.S. in Computer Science, July 2020; advised by Professor Stefano Ermon. Tsinghua University
cv - Yang Song The personal website of Yang Song. Organizer: NeurIPS 2022 Workshop on Score-Based Methods. Area Chair: CVPR (2023), NeurIPS (2023). Journal Reviewer: JRSS-B, JMLR, TMLR, IEEE TPAMI, ACM TKDD, Medical Image Analysis, IEEE TIFS, IEEE TNNLS, IEEE TDSC, IET CV. Conference Reviewer / Program Committee: NeurIPS (2019-2022, 2016), ICML (2019-2023), ICLR (2019-2023), AISTATS (2020-2022), UAI (2020
Consistency Models - Yang Song Intuition: the probability flow ODE defines a one-to-one mapping between noise and data, and consistency models learn to estimate this mapping. Definition and parameterization: the model needs to satisfy the boundary condition at t = 0. Sampling: one-step or multi-step. Training:
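Filling in the equations these slide headings point to, here is a sketch following the consistency models paper (which, strictly, anchors the boundary at a small t = epsilon > 0 rather than t = 0):

```latex
\[
  f_\theta(\mathbf{x}_t, t) \approx \mathbf{x}_0
  \ \text{ for all } (\mathbf{x}_t, t) \text{ on one PF-ODE trajectory}, \qquad
  f_\theta(\mathbf{x}, 0) = \mathbf{x} \ \text{(boundary condition)},
\]
\[
  f_\theta(\mathbf{x}, t) = c_{\mathrm{skip}}(t)\,\mathbf{x} + c_{\mathrm{out}}(t)\,F_\theta(\mathbf{x}, t),
  \qquad c_{\mathrm{skip}}(0) = 1,\ c_{\mathrm{out}}(0) = 0,
\]
```

so the boundary condition holds by construction; one-step sampling draws noise $\mathbf{x}_T$ and returns $f_\theta(\mathbf{x}_T, T)$.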
repositories - Yang Song Most of my research has code open-sourced on my GitHub account.
Accelerating Natural Gradient with Higher-Order Invariance An overview of our ICML 2018 paper, Accelerating Natural Gradient with Higher-Order Invariance. The natural gradient update loses its invariance due to the finite step size. In this paper, we study the invariance of natural gradient from the perspective of Riemannian geometry and propose several new update rules to improve its invariance.
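For reference, the update in question: the natural gradient preconditions the loss gradient with the inverse Fisher information matrix, and its invariance to reparameterization is exact only in the limit of an infinitesimal step size $h$, which is the gap the paper addresses:

```latex
\[
  \theta_{k+1} = \theta_k - h\,F(\theta_k)^{-1}\nabla_\theta \mathcal{L}(\theta_k),
  \qquad
  F(\theta) = \mathbb{E}_{p_\theta}\!\left[\nabla_\theta \log p_\theta(\mathbf{x})\,
  \nabla_\theta \log p_\theta(\mathbf{x})^{\top}\right].
\]
```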