Project DeepSPIN is organising the Workshop on Interactions Between Nonextensive Entropies, Machine Learning, Language, and Physics on July 5, 2022, which will be held in IST's Abreu Faro Amphitheatre, in the Interdisciplinary Complex.
Nonextensive statistical mechanics is a generalization of the standard Boltzmann–Gibbs theory of statistical mechanics, inspired by the seminal work of Constantino Tsallis. This generalized theory has had a very strong impact in many disciplines and a wide range of applications, including statistical mechanics, thermodynamics, information geometry, statistics, machine learning, and natural language processing.
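As a quick illustration (not part of the workshop materials), the Tsallis entropy that underlies this generalization is S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Boltzmann–Gibbs/Shannon entropy −Σᵢ pᵢ ln pᵢ in the limit q → 1. A minimal sketch, with the function name chosen here for illustration:

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) for q != 1.

    At q = 1 it reduces to the Boltzmann-Gibbs/Shannon entropy
    -sum_i p_i * ln(p_i) (natural log, with the constant k set to 1).
    """
    if q == 1.0:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Uniform distribution over 4 outcomes.
p = [0.25] * 4

print(tsallis_entropy(p, 1.0))  # Shannon limit: ln 4 ≈ 1.386
print(tsallis_entropy(p, 2.0))  # S_2 = 1 - sum(p_i^2) = 0.75
```

For q near 1 the value approaches the Shannon entropy continuously, which is why the parameter q is often read as a "degree of nonextensivity".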
This interdisciplinary workshop will feature Constantino Tsallis, as well as speakers from several disciplines, ranging from mathematics and physics to machine learning and language, who will discuss their use of nonextensive entropies and Tsallis statistics in various applications: Mário Figueiredo, Ben Peters, André Martins, José Mourão, and Frederico Fiuza.
11:00 - 12:00 | "Why is it easier to understand what is energy than what is entropy?" by Constantino Tsallis
14:00 - 14:30 | "Tsallis entropies and kernel methods" by Mário Figueiredo
14:30 - 15:00 | "Tsallis entropies and entmax for language generation" by Ben Peters
15:00 - 15:45 | "From Sparse Modeling to Sparse Communication" by André Martins
16:00 - 16:30 | "Training deep neural networks: replace gradient descent by the Feynman path integral and possible extension to the nonextensive formalism" by José Mourão
16:30 - 17:00 | "Accelerating the understanding of nonlinear dynamical systems using machine learning" by Frederico Fiuza
The workshop will also be streamed via Zoom. Here's the link: https://videoconf-colibri.zoom.us/j/91599759679
Tsallis, Constantino. "Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World." Springer, 2009.
To learn more about this project, follow the link: