Poster #P17




Tailoring Denoising Diffusion Probabilistic Models to Stochastic Thermodynamics

Daniel Nagel, Tristan Bereau



Stochastic thermodynamics has proven useful in understanding the dynamics of complex systems in nonequilibrium states. In particular, entropy production is an important concept within this framework due to its close relation to the Hamiltonian of the system and its significant role in the Crooks fluctuation relation. However, computing it is costly because it depends on estimating time-dependent probability distributions, which limits its application to smaller systems. To address these challenges, machine learning techniques have emerged as valuable tools. While many of these techniques, in particular denoising diffusion probabilistic models (DDPMs),¹ are inspired by statistical mechanics, we aim here to strengthen this link. By rewriting the DDPM score in terms of stochastic-thermodynamic quantities, we explore the potential to impose physical constraints within the machine learning model, thereby improving learning efficiency while maintaining consistency with nonequilibrium statistical mechanics. Our work represents a step toward an integrated framework that combines the strengths of machine learning and stochastic thermodynamics, offering a new perspective for studying complex systems on a larger scale with greater computational efficiency.
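To make the DDPM score concrete, the following minimal sketch illustrates the standard forward/reverse diffusion of Ho et al.¹ on a one-dimensional toy example. It is not the authors' thermodynamic reformulation: the data distribution is a standard normal, so the score of the noised marginal is analytic and no neural network is needed; the noise schedule and step counts are illustrative choices.

```python
import numpy as np

# Toy 1-D DDPM (illustrative only). Data x0 ~ N(0, 1); the forward kernel
# q(x_t | x_0) = N(sqrt(abar_t) x_0, (1 - abar_t)) then gives a marginal
# q_t = N(0, 1) at every t, so the score is known in closed form.

rng = np.random.default_rng(0)
T = 200
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule (assumed)
alphas = 1.0 - betas
abar = np.cumprod(alphas)            # cumulative product \bar{alpha}_t

def forward_sample(x0, t):
    """Draw x_t ~ q(x_t | x_0) in one shot."""
    eps = rng.standard_normal(np.shape(x0))
    return np.sqrt(abar[t]) * x0 + np.sqrt(1.0 - abar[t]) * eps

def score(x, t):
    """Analytic score grad_x log q_t(x) for x0 ~ N(0, 1)."""
    var = abar[t] * 1.0 + (1.0 - abar[t])   # marginal variance (= 1 here)
    return -x / var

# Reverse (ancestral) sampling driven by the score.
x = rng.standard_normal(10_000)      # start from the prior N(0, 1)
for t in reversed(range(T)):
    mean = (x + betas[t] * score(x, t)) / np.sqrt(alphas[t])
    if t > 0:
        x = mean + np.sqrt(betas[t]) * rng.standard_normal(x.shape)
    else:
        x = mean

print(x.mean(), x.std())             # samples should again be ~ N(0, 1)
```

Because the reverse drift is built from the score of the time-dependent marginal, this is exactly the quantity that the abstract proposes to re-express through stochastic-thermodynamic observables such as entropy production.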


  1. J. Ho, A. Jain, and P. Abbeel, Advances in Neural Information Processing Systems 2020, 33, 6840.





 Daniel Nagel

  •   Heidelberg University