Tsallis entropy
Tsallis entropy is a generalization of the standard Boltzmann-Gibbs entropy. It was introduced in 1988 by Constantino Tsallis as a basis for generalizing the standard statistical mechanics, and is identical in form to Havrda-Charvát structural α-entropy within Information Theory.
It is widely used in the study of complex systems because it underpins nonextensive statistical mechanics, a nonadditive generalization of the Boltzmann-Gibbs theory whose predictions and consequences have been tested in such systems.
The formal definition is:
\[{\displaystyle S_{q}(\{p_{i}\})={k \over q-1}\left(1-\sum _{i}p_{i}^{q}\right),}\]where $q$ is a real parameter sometimes called the entropic index and $k$ is a positive constant. In the limit ${\displaystyle q\to 1}$, the usual Boltzmann-Gibbs entropy is recovered.
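The definition above can be sketched directly in code. The following is a minimal illustration (the function name and the choice of $k=1$ are mine, not from the source); it applies the closed-form expression for $q\neq 1$ and falls back to the Boltzmann-Gibbs form $-k\sum_i p_i \ln p_i$ at $q=1$, which is the $q\to 1$ limit:

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """Tsallis entropy S_q = k/(q-1) * (1 - sum_i p_i^q) for a
    discrete distribution p. At q = 1 the q -> 1 limit is used,
    which is the Boltzmann-Gibbs / Shannon entropy."""
    if q == 1.0:
        # Limit q -> 1: -k * sum_i p_i ln p_i (terms with p_i = 0 contribute 0)
        return -k * sum(pi * math.log(pi) for pi in p if pi > 0)
    return k / (q - 1.0) * (1.0 - sum(pi ** q for pi in p))

p = [0.5, 0.5]
print(tsallis_entropy(p, 2.0))  # 0.5
print(tsallis_entropy(p, 1.0))  # ln 2 = 0.693...
```

For a fair coin, $S_2 = 1-(0.25+0.25) = 0.5$, while values of $q$ close to 1 approach the Boltzmann-Gibbs result $\ln 2$, illustrating the recovery of the standard entropy in the limit.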
See also
Boltzmann-Gibbs entropy, Rényi entropy
Material
- http://tsallis.cat.cbpf.br/biblio.htm
Papers
- Plastino, A. R., & Plastino, A. (1993). Tsallis’ entropy, Ehrenfest theorem and information theory. Physics Letters A, 177(3), 177-179.
- Abe, S. (2000). Axioms and uniqueness theorem for Tsallis entropy. Physics Letters A, 271(1), 74-79.
- Curado, E. M. F., & Tsallis, C. (1992). Generalized statistical mechanics: connection with thermodynamics. Journal of Physics A: Mathematical and General, 25(4), 1019.
- dos Santos, R. J. (1997). Generalization of Shannon’s theorem for Tsallis entropy. Journal of Mathematical Physics, 38(8), 4104.
- Martínez, S., Nicolás, F., Pennini, F., & Plastino, A. (2000). Tsallis’ entropy maximization procedure revisited. Physica A: Statistical Mechanics and its Applications, 286(3), 489-502.