Abstract
We shall prove that the celebrated Rényi entropy is the first example of a new family of infinitely many multi-parametric entropies. We shall call them the Z-entropies. Each of them, under suitable hypotheses, generalizes the celebrated entropies of Boltzmann and Rényi. A crucial aspect is that every Z-entropy is composable (Tempesta 2016 Ann. Phys. 365, 180–197. (doi:10.1016/j.aop.2015.08.013)). This property means that the entropy of a system composed of two or more independent systems depends, over the whole of the associated probability space, only on the entropies of the component systems. Further properties are also required to describe the composition process in terms of a group law. The composability axiom, introduced as a generalization of the fourth Shannon–Khinchin axiom (postulating additivity), is a highly non-trivial requirement. Indeed, in the trace-form class, the Boltzmann entropy and the Tsallis entropy are the only known composable cases. However, in the non-trace-form class, the Z-entropies arise as new entropic functions possessing the mathematical properties necessary for information-theoretical applications, in both classical and quantum contexts. From a mathematical point of view, composability is intimately related to the formal group theory of algebraic topology. The underlying group-theoretical structure crucially determines the statistical properties of the corresponding entropies.
1. Introduction
Since the pioneering work by Boltzmann, Clausius and Gibbs, the notion of entropy has been widely investigated for its prominence in thermodynamics and statistical mechanics, in both classical and quantum contexts [1–3]. The research activity inaugurated by Shannon, Khinchin and Rényi recognized the foundational role of entropy in modern information theory [4–8]. In particular, new entropic functions, designed for different purposes, have been introduced (see, for example, [9]).
The purpose of this paper is to show that the theory of generalized entropies can be mathematically interpreted, and widely extended, by means of an approach based on formal group theory [10]. Since the seminal paper by Bochner [11], formal groups have been intensively investigated in the last decades, because they play a prominent role in several branches of mathematics, especially algebraic topology [12–15], combinatorics [16], the theory of elliptic curves [17], arithmetic and analytic number theory [18–22].
In order to relate the notion of entropy to group theory, we shall discuss the mathematical requirements that an entropic function has to satisfy. A natural set of conditions is represented by the first three Shannon–Khinchin (SK) axioms (see appendix A for their formulation). These axioms correspond to the requirements of continuity, expansibility (adding an event of zero probability does not affect an entropy) and of the maximization of entropy on the uniform distribution.
At the same time, another crucial property that, in our opinion, any entropy must satisfy is that of composability [23]. Essentially, it requires that the entropy of a system obtained by composing two independent systems A and B can always be expressed in terms of the entropies of A and of B only, for any possible choice of the probability distributions of A and B.
This property is at the heart of the notion of entropy. Indeed, it is related to the requirement, pointed out in [24], that entropy is usefully defined on macroscopic states of a given system, without the need for any knowledge of the underlying microscopical dynamics.
As clarified in [23], composability is even more demanding: we should be able to compose two independent systems in a commutative way, and three independent systems in an associative way. Also, if we compose a system with another one in a zero-entropy configuration, the entropy of the compound system should be equal to the entropy of the first system. In other words, a group-theoretical structure is needed. All these requirements were encoded in [23] in the composability axiom. It generalizes the fourth Shannon–Khinchin axiom (additivity of an entropy) and allows a large family of entropic functions to be constructed.
Concerning the mathematical form that generalized entropies can assume, we distinguish two different classes.
The first one is the trace-form class. It contains entropies of the form $S[p]=\sum_{i=1}^{W}F(p_i)$, where $p=\{p_i\}_{i=1}^{W}$ is a probability distribution and $F$ is a suitable real function. The standard case of the Boltzmann–Gibbs entropy $S_{BG}[p]=-\sum_{i=1}^{W}p_i\ln p_i$ is recovered when $F(x)=-x\ln x$. In the last three decades, this family of entropic functions has been widely investigated, due to its relevance in the theory of complex systems.
The second one is the class of non-trace-form entropies. Its best-known representative is the Rényi entropy $R_{\alpha}[p]=\frac{\ln\left(\sum_{i=1}^{W}p_i^{\alpha}\right)}{1-\alpha}$, with $\alpha>0$, $\alpha\neq1$. It was introduced in [5] and has inspired much work in information theory. Its quantum version is crucial in the study of entanglement of quantum complex systems.
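The additivity of the Rényi entropy over statistically independent systems can be checked with a short numerical sketch (Python, with illustrative names; not part of the original text):

```python
import numpy as np

def renyi(p, alpha):
    """Renyi entropy R_alpha[p] = ln(sum_i p_i^alpha) / (1 - alpha)."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

# Two independent systems: the joint distribution is the outer product,
# so sum_ij (p_i r_j)^alpha factorizes and the logarithm splits additively.
pA = np.array([0.2, 0.3, 0.5])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()

alpha = 0.7
assert np.isclose(renyi(pAB, alpha), renyi(pA, alpha) + renyi(pB, alpha))
```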
A huge part of the set of entropies introduced in the last three decades, perhaps because of the influence of the classical Boltzmann entropy, belongs to the trace-form class. Also, these new entropies are supposed to recover Boltzmann’s entropy in some appropriate limit. Nevertheless, the trace-form class has a serious drawback: the generic lack of composability, with the two remarkable exceptions of the Boltzmann entropy and the Tsallis entropy (which in this paper we present in a two-parametric form).
However, many generalized trace-form entropies are composable in a weak sense only, i.e. over the uniform probability distribution. This feature of weak composability is certainly necessary for thermodynamic purposes, but its lack of generality makes it unsatisfactory. However, one of the outcomes of our analysis is that by renouncing the trace-form structure we can define new composable entropies. To summarize, the most relevant result of this work is the following.
(a) Main result
There exists a new family of non-trace-form strictly composable entropies.
Precisely, we shall prove that the celebrated Rényi entropy is the simplest representative of a huge family of entropies that we shall call the Z-entropies. Each of them depends on a real parameter α and possesses a group structure. Indeed, with each entropy there is associated a suitable invertible function G, allowing, under certain hypotheses, a formal group law to be defined [10], i.e. a two-variable series determined by the relation Ψ(x,y)=G(G−1(x)+G−1(y)), which satisfies the axioms of an Abelian group. The group law is nothing but the functional equation obeyed by the considered Z-entropy under the composition of two statistically independent systems. In this framework, the Rényi entropy is associated with the function Ψ(x,y)=x+y, obtained when G=Id, i.e. the additive group law. The corresponding functional equation is Sα(A∪B)=Sα(A)+Sα(B), for any pair of independent subsystems A, B. This construction enables us to place the whole theory of generalized entropies on a firm group-theoretical foundation.
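The relation Ψ(x,y)=G(G−1(x)+G−1(y)) can be made concrete numerically. The sketch below (an illustration, not from the original text) uses the hypothetical choice G(t)=(e^{γt}−1)/γ, for which the group law closes in the familiar multiplicative form x+y+γxy:

```python
import numpy as np

gamma = 0.3  # illustrative deformation parameter

def G(t):
    # group exponential G(t) = (e^(gamma t) - 1)/gamma
    return np.expm1(gamma * t) / gamma

def G_inv(s):
    # compositional inverse G^(-1)(s) = ln(1 + gamma s)/gamma
    return np.log1p(gamma * s) / gamma

def Psi(x, y):
    # formal group law Psi(x, y) = G(G^(-1)(x) + G^(-1)(y))
    return G(G_inv(x) + G_inv(y))

x, y = 0.8, 1.7
assert np.isclose(Psi(x, y), x + y + gamma * x * y)  # multiplicative law
assert np.isclose(Psi(x, 0.0), x)                    # null-composability
```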
A very general expression of the non-trace-form family of group entropies is (see definition 6.2 below)
$$Z_{G,\alpha}[p]=\frac{\ln_G\left(\sum_{i=1}^{W}p_i^{\alpha}\right)}{1-\alpha},\qquad \alpha>0,\ \alpha\neq1, \tag{1.1}$$
where $\ln_G$ is a generalized group logarithm (definition 4.1).
A fundamental property of the Z-entropies (1.1) is that, under suitable hypotheses, they generalize, at the same time, the entropies of Boltzmann and Rényi in a unified expression. In a specific example, the Tsallis entropy (jointly with those of Boltzmann and Rényi) is also recovered.
By adequately choosing the generalized group logarithm in equation (1.1), we can also define strictly composable analogues of well-known entropies, including those of Kaniadakis, Borges–Roditi, etc.; indeed, none of those original entropies is composable.
In the trace-form case, the group-theoretical approach to the concept of entropy has been already formulated in [23]. There, the notion of universal-group entropy, as the entropy related to the Lazard universal formal group, has been introduced.
From the previous considerations, it emerges that there exists a dual construction between the two classes of entropies. Precisely, each group law admits (at least) two realizations in terms of two different entropic forms, belonging to the non-trace-form class $Z_{G,\alpha}[p]$ and to the trace-form class $S_U[p]$, respectively.
The additive group has two representatives: the Boltzmann entropy and the Rényi entropy. The multiplicative group Ψ(x,y)=x+y+γxy, where $\gamma\in\mathbb{R}\setminus\{0\}$, has interesting realizations, such as the Tsallis entropy in the trace-form case, and the entropy (8.1) in the non-trace-form family.
There are several reasons to introduce the class of Z-entropies. A first motivation comes from the need to establish a mathematically unifying approach to the theory of generalized entropies. From this point of view, the group entropies, namely entropies satisfying the first three SK axioms and possessing a group-theoretical structure ensuring the validity of the composition process over all possible states, seem to play a special role. From a mathematical point of view, each group entropy of the Z-class satisfies two crucial properties: strict composability and extensivity. Indeed, one can show that under mild conditions, for a given phase space growth rate W=W(N), there exists a specific entropy of the Z-class such that, over the uniform distribution, $Z_{G,\alpha}\left(\frac{1}{W(N)},\ldots,\frac{1}{W(N)}\right)\sim\lambda N$ as $N\to\infty$. A priori, this allows one, at least formally, to develop (in a micro-canonical picture) a formalism similar to the classical one for ergodic systems. However, the purpose of this work is not to develop thermodynamics. The full clarification of the possible thermodynamic meaning of the class of group entropies (as already pointed out in [23,25,26]) is an open problem.
Another reason comes from the study of quantum entanglement. Indeed, quantum versions of the Z-entropies (briefly discussed in this work) could be very useful in the detection of entanglement for quantum complex systems such as spin chains. An example is proposed where a Z-entropy can be used, instead of the quantum Rényi entropy, as a measure of the ground-state entanglement for a generalized Lipkin–Meshkov–Glick model that was recently introduced [27]. In particular, this new quantum entropy is extensive (i.e. proportional to the block size L) in a context where the Rényi entropy is not.
At the same time, the ZG,α entropies are possibly relevant in information theory. Indeed, they widely generalize both the classical α-divergences and the Rényi divergences. One can also expect that Z-entropies could play an important role in information geometry.
To conclude, a few words concerning the organization of the article. In §2, the group-theoretical apparatus necessary for the subsequent discussion is sketched. In §3, we formulate the composability axiom. In §4, generalized logarithms are derived from group laws. In §5, we discuss properties of the trace-form class and, more specifically, of the Tsallis entropy, which is the most general composable trace-form entropy. The new family of Z-entropies is defined in §6 and some of its properties are proved in §7. In §8, some representatives of the Z-family are discussed in detail. In particular, the relevant example of the Za,b-entropy is presented in §9. A quantum version of the Z-class is proposed in §10. Some open problems are proposed in the final §11.
2. Groups and entropies: a general approach
In the subsequent sections, we shall present a comprehensive theory of generalized entropies based on formal group laws. We will start by recalling some basic facts and definitions of the theory of formal groups (see [10] for a thorough exposition, and [28] for a shorter introduction).
(a) The Lazard formal group
Let R be a commutative associative ring with identity, and let $R[[x_1,x_2,\ldots]]$ be the ring of formal power series in the variables $x_1,x_2,\ldots$ with coefficients in R.
Definition 2.1
A commutative one-dimensional formal group law over R is a formal power series $\Psi(x,y)\in R[[x,y]]$ such that [11]
(i) $\Psi(x,0)=\Psi(0,x)=x$;
(ii) $\Psi(\Psi(x,y),z)=\Psi(x,\Psi(y,z))$;
(iii) $\Psi(x,y)=\Psi(y,x)$.
The existence of an inverse formal series $\omega(x)\in R[[x]]$ such that $\Psi(x,\omega(x))=0$ is a consequence of definition 2.1. Let $B=\mathbb{Z}[b_1,b_2,\ldots]$ be the ring of integral polynomials in infinitely many variables. We shall consider the series $F(s)=s+\sum_{i\geq1}b_i\frac{s^{i+1}}{i+1}$ and its compositional inverse $G(t)$; the Lazard formal group law is defined by $\Psi(s_1,s_2)=G\left(F(s_1)+F(s_2)\right)$.
For any commutative one-dimensional formal group law over any ring R, there exists a unique homomorphism L→R under which the Lazard group law is mapped into the given group law (the universal property of the Lazard group).
Let R be a ring with no torsion.1 Then, for any commutative one-dimensional formal group law Ψ(x,y) over R, there exists a series $G(t)\in(R\otimes\mathbb{Q})[[t]]$ such that $\Psi(x,y)=G\left(G^{-1}(x)+G^{-1}(y)\right)$.
In the subsequent considerations, the universal formal group will play the role of a very general composition law admissible for the construction of the entropies of the Z-family.
3. The composability axiom
In this section, we shall first define the notion of composability, in the strict and weak sense, by following (and slightly generalizing) [23].
Definition 3.1
An entropy S is strictly (or strongly) composable if there exists a continuous function of two real variables Φ(x,y) such that
(C1) Composability: $S(A\cup B)=\Phi(S(A),S(B))$ for any two statistically independent systems A and B, for all possible probability distributions of A and B;
(C2) Symmetry: $\Phi(x,y)=\Phi(y,x)$;
(C3) Associativity: $\Phi(x,\Phi(y,z))=\Phi(\Phi(x,y),z)$;
(C4) Null-composability: $\Phi(x,0)=x$.
This axiom is necessary to ensure that a given entropy may be suitable for thermodynamic purposes. Indeed, the composition law should obviously be symmetric with respect to the exchange of the labels A and B. At the same time, if we compose a given system with another system in a state of zero entropy, the total entropy should coincide with that of the given system. Finally, the composability of more than two independent systems in an associative way is crucial for defining a zeroth law.
Definition 3.2
A group entropy is a function that satisfies the first three SK axioms and is strictly composable.
Remark 3.3
Observe that, in the class of formal power series, the function Φ(x,y) satisfying the conditions (C2)–(C4), according to definition 2.1, is a formal group law over the reals, with a suitable formal series inverse φ(x) such that Φ(x,φ(x))=0 [10]. The general form of this group law is $\Phi(x,y)=x+y+\sum_{i,j\geq1}c_{ij}x^{i}y^{j}$, with suitable coefficients $c_{ij}\in\mathbb{R}$.
Other properties of Φ(x,y) can be found in [23], where the notion of weak composability was also proposed. The weak formulation essentially requires the composability of the generalized logarithm associated with a given entropy. The weak property is satisfied by almost all of the generalized entropies proposed in the literature. However, it is not automatic: indeed, there are entropic forms that satisfy the first three SK axioms but are not weakly composable.
In [25], this notion (called there ‘composability’ tout court) has been formulated for a class of entropic functions coming from difference operators.
4. Generalized logarithms and exponentials from group laws
There is a simple construction allowing a generalized logarithm from a given group law to be defined. Needless to say, many other definitions of a generalized logarithm can be proposed.
Definition 4.1
A generalized group logarithm is a continuous, strictly increasing and strictly concave function $\ln_G:\mathbb{R}^{+}\to\mathbb{R}$, with $\ln_G(1)=0$ (possibly depending on a set of real parameters), satisfying the functional equation
$$\ln_G(xy)=\chi\left(\ln_G(x),\ln_G(y)\right), \tag{4.1}$$
where χ(x,y) is a group law.
Observe that equation (4.1) has a simple realization: given an invertible function G, the choice $\ln_G(x):=G(\ln x)$ satisfies (4.1) with $\chi(x,y)=G\left(G^{-1}(x)+G^{-1}(y)\right)$.
One of our key results is the following simple proposition, which is a restatement of the previous observation.
Proposition 4.2
Let G be a continuous, strictly increasing function, vanishing at zero. The function $F_G(x)$ defined by $F_G(x)=G(\ln x)$ satisfies the functional equation (4.1), with $\chi(x,y)=G\left(G^{-1}(x)+G^{-1}(y)\right)$.
Proof.
It suffices to observe that $F_G(xy)=G(\ln x+\ln y)=G\left(G^{-1}(F_G(x))+G^{-1}(F_G(y))\right)$. ▪
This result can also be formulated in a field of characteristic zero for G in the class of formal power series.
By way of an example, when χ(x,y)=x+y, we have directly that G(t)=t and $\ln_G(x)=\ln x$. If χ(x,y)=x+y+(1−q)xy, an associated function G(t) is provided by $G(t)=\frac{e^{(1-q)t}-1}{1-q}$, and the group logarithm becomes the Tsallis logarithm $\ln_q(x)=\frac{x^{1-q}-1}{1-q}$.
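The multiplicative group law satisfied by the Tsallis logarithm can be verified directly; a minimal numerical check (illustrative, not from the original text):

```python
import numpy as np

q = 0.4  # illustrative entropic parameter

def ln_q(x):
    # Tsallis logarithm ln_q(x) = (x^(1-q) - 1)/(1 - q)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

x, y = 2.0, 5.0
lhs = ln_q(x * y)
rhs = ln_q(x) + ln_q(y) + (1.0 - q) * ln_q(x) * ln_q(y)
assert np.isclose(lhs, rhs)        # chi(x, y) = x + y + (1 - q) x y
assert np.isclose(ln_q(1.0), 0.0)  # a group logarithm vanishes at 1
```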
Remark 4.3
Let G be a strictly increasing real analytic function of the form (2.2). The requirement of concavity of the associated group logarithm is then guaranteed, for instance, by a simple (but quite restrictive) condition on the coefficients of G.
Thus, given a group law, under mild hypotheses we may determine a generalized group logarithm, for instance by means of relation (4.3) and the condition (4.6) (see [25] for a construction of group logarithms from difference operators via the associated group exponential G).
Definition 4.4
The inverse of a generalized group logarithm will be called the associated generalized group exponential.
This function can be represented in the form $\exp_G(x)=e^{G^{-1}(x)}$.
When G(t)=t, we recover the standard exponential; when $G(t)=\frac{e^{(1-q)t}-1}{1-q}$, we recover the q-exponential $e_q(t)=[1+(1-q)t]^{1/(1-q)}$, and so on.
Remark 4.5
From a computational point of view, observe that the formal compositional inverse $G^{-1}(s)$, such that $G(G^{-1}(s))=s$ and $G^{-1}(G(t))=t$, can be obtained by means of the Lagrange inversion theorem. We get the formal power series
$$G^{-1}(s)=\sum_{n=1}^{\infty}\frac{s^{n}}{n!}\left[\frac{\mathrm{d}^{n-1}}{\mathrm{d}t^{n-1}}\left(\frac{t}{G(t)}\right)^{n}\right]_{t=0}.$$
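In practice, the compositional inverse can also be computed order by order with exact rational arithmetic. The self-contained sketch below (illustrative, not from the original text) inverts a power series by solving $F(G(t))=t$ degree by degree, and checks the classical pair $G(t)=e^{t}-1$, $G^{-1}(s)=\ln(1+s)$:

```python
from fractions import Fraction
from math import factorial

def mul(a, b, order):
    """Product of power series (coefficient lists) truncated at t^order."""
    c = [Fraction(0)] * (order + 1)
    for i in range(order + 1):
        if a[i] == 0:
            continue
        for j in range(order + 1 - i):
            c[i + j] += a[i] * b[j]
    return c

def compose(f, g, order):
    """Coefficients of F(G(t)) truncated at t^order (f[0] = g[0] = 0 assumed)."""
    res = [Fraction(0)] * (order + 1)
    gk = [Fraction(0)] * (order + 1)
    gk[0] = Fraction(1)                      # G(t)^0 = 1
    for k in range(1, order + 1):
        gk = mul(gk, g, order)               # gk = G(t)^k
        for n in range(order + 1):
            res[n] += f[k] * gk[n]
    return res

def series_inverse(g, order):
    """Compositional inverse F of G (so F(G(t)) = t), zero constant terms."""
    f = [Fraction(0)] * (order + 1)
    f[1] = 1 / Fraction(g[1])
    for n in range(2, order + 1):
        err = compose(f, g, order)[n]        # residual coefficient of t^n
        f[n] = -err / Fraction(g[1]) ** n    # cancel it
    return f

# Example: G(t) = e^t - 1, whose compositional inverse is ln(1 + s)
order = 6
g = [Fraction(0)] + [Fraction(1, factorial(n)) for n in range(1, order + 1)]
f = series_inverse(g, order)
# ln(1 + s) = s - s^2/2 + s^3/3 - ...
assert all(f[n] == Fraction((-1) ** (n + 1), n) for n in range(1, order + 1))
```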
5. On the trace-form class of entropies
(a) Basic properties
Let us denote by N the number of particles of a complex system.
Definition 5.1
An entropic function S(p1,…,pW) is said to be extensive over the uniform distribution if there exists a function W=W(N) ('growth rate' or 'occupation law' of phase space) such that
$$\lim_{N\to\infty}\frac{S\left(\frac{1}{W(N)},\ldots,\frac{1}{W(N)}\right)}{N}=\lambda,\qquad 0<\lambda<\infty.$$
Here λ may depend on thermodynamic variables but not on N.
Definition 5.2
An entropy S belongs to the trace-form class if it can be written in the form $S[p]=\sum_{i=1}^{W}F(p_i)$, where F is a real continuous function, strictly concave, with F(0)=F(1)=0.
(b) A general logarithm for a composable entropy
The equation for the multiplicative formal group allows us to give a two-parametric presentation of the logarithm (4.5): the associated functional equation admits a two-parametric family of solutions. To this aim, we consider the trace-form entropy $S_{a,q}$ defined by the corresponding two-parametric deformation of the Tsallis logarithm, which yields the entropy (5.1).
Suitable constraints in the space of parameters ensure that the entropy Sa,q satisfies the first three SK axioms for all real values of q and specific values of the parameter a (for a=1, q→1 we recover the Boltzmann entropy). Note that the entropies Sa,q and Sq are related by simple formulae.
Proposition 5.3
The entropy (5.1) is concave in suitable regions of the (a,q) parameter space.
It is easy to show that the entropy Sa,q is extensive (on equal probabilities) for W(N)∼N^ρ. Indeed, one finds the value $q^{*}=1-\frac{1}{a\rho}$ such that $S_{a,q^{*}}$ is extensive, in a suitable range of a, for values of ρ>1.
One also obtains the following interesting property.
Proposition 5.4
Let A and B be two independent systems. The entropy Sa,q is strictly composable, with composition law given by the multiplicative group law.
The entropy Sa,q represents the general solution of the previous functional equation in the trace-form class.
Comment. It is interesting to observe that the Landsberg–Vedral entropy [30] $S_{LV}[p]=\frac{1}{1-q}\left(1-\frac{1}{\sum_{i=1}^{W}p_i^{q}}\right)$, also called the normalized Tsallis entropy, satisfies the law $S_{LV}(A\cup B)=S_{LV}(A)+S_{LV}(B)+(q-1)S_{LV}(A)S_{LV}(B)$. This entropy, therefore, belongs to the family of composable entropies. However, it is not of trace-form type.
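The Landsberg–Vedral composition law can be checked numerically; the sketch below (illustrative, not from the original text) uses the standard form $S_{LV}=S_q/\sum_i p_i^q$ of the normalized Tsallis entropy:

```python
import numpy as np

q = 1.5  # illustrative entropic index

def lv(p):
    # Landsberg-Vedral (normalized Tsallis) entropy: S_q / sum_i p_i^q
    u = np.sum(np.asarray(p, dtype=float) ** q)
    return (1.0 / u - 1.0) / (q - 1.0)

pA = [0.2, 0.8]
pB = [0.5, 0.3, 0.2]
pAB = np.outer(pA, pB).ravel()  # joint distribution of independent systems

a, b = lv(pA), lv(pB)
assert np.isclose(lv(pAB), a + b + (q - 1.0) * a * b)
```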
A fundamental property of the trace-form class is the following.
Remark 5.5
The Tsallis entropy (with the Boltzmann entropy as its reduction) is the only known entropy of the trace-form class which is strictly composable.
6. New strictly composable entropies
(a) The Z-family
If we restrict ourselves to the trace-form class, composability is realized essentially in the case of the Tsallis entropy, with the Boltzmann entropy as its fundamental reduction.
However, if we relax the trace-form requirement, new possibilities arise. Indeed, Rényi's entropy Rα does not have a trace-form structure, and it is strictly composable (additive). Thus, we shall introduce a new family of group entropies, depending on an entropic parameter α, that do not belong to the trace-form class and generalize the Rényi entropy.
Remark 6.1
In the following, we shall often assume for simplicity that the generalized group logarithm has the form $\ln_G(x)=G(\ln x)$, with G a strictly increasing analytic function of the form (2.2), where the $a_i$ (i=1,2,…) are parameters which may vanish; we always assume that they are independent of the entropic parameter α.
Definition 6.2
Let $\{p_i\}_{i=1,\ldots,W}$, W≥1, with $\sum_{i=1}^{W}p_i=1$, be a discrete probability distribution, and let $\ln_G$ be a generalized group logarithm, according to definition 4.1. The associated Z-entropy of order α is defined to be the function
$$Z_{G,\alpha}[p]=\frac{\ln_G\left(\sum_{i=1}^{W}p_i^{\alpha}\right)}{1-\alpha},\qquad \alpha>0,\ \alpha\neq1.$$
The following theorem establishes one of the most relevant properties of the Z-class of entropies: its strict composability.
Theorem 6.3
The ZG,α-entropy is strictly composable: given any two statistically independent systems A, B, each defined on an arbitrary probability distribution, it satisfies the composition rule
$$Z_{G,\alpha}(A\cup B)=\Phi\left(Z_{G,\alpha}(A),Z_{G,\alpha}(B)\right),\qquad \Phi(x,y)=\frac{1}{1-\alpha}\,G\!\left(G^{-1}\big((1-\alpha)x\big)+G^{-1}\big((1-\alpha)y\big)\right).$$
Proof.
Let $\{p_i\}_{i=1,\ldots,W_A}$ and $\{r_j\}_{j=1,\ldots,W_B}$ be two sets of probabilities associated with two statistically independent systems A and B. The joint probability is given by $P_{ij}=p_i r_j$. We have $\sum_{i,j}P_{ij}^{\alpha}=\left(\sum_{i=1}^{W_A}p_i^{\alpha}\right)\left(\sum_{j=1}^{W_B}r_j^{\alpha}\right)$; the statement then follows from the functional equation (4.1) satisfied by $\ln_G$. ▪
(b) Rényi entropy
The celebrated Rényi entropy is the first example in the Z-class. It is obtained by identifying the generalized logarithm with the standard one. Rényi’s entropy is indeed the first representative of an infinite tower of entropies that we shall construct in the following sections. It corresponds to a solution to the functional equation ZG,α(A∪B)=ZG,α(A)+ZG,α(B).
7. Main properties of the class of Z-entropies
(a) Limiting properties
The class of Z-entropies possesses a simple relation to the most celebrated entropies, in particular to Boltzmann’s and Rényi’s entropies, as stated by the following results.
Proposition 7.1
Assuming the hypotheses of remark 6.1, in the limit α→1 the ZG,α-entropy reduces to the Boltzmann–Gibbs entropy.
Proof.
As α→1, $\sum_{i}p_i^{\alpha}\to1$ and $\ln\left(\sum_i p_i^{\alpha}\right)\sim(1-\alpha)\left(-\sum_i p_i\ln p_i\right)$. Since G is of the form (2.2), G′(0)=1, so $\ln_G\left(\sum_i p_i^{\alpha}\right)=G\left(\ln\sum_i p_i^{\alpha}\right)\sim(1-\alpha)\left(-\sum_i p_i\ln p_i\right)$. Dividing by (1−α) yields $Z_{G,\alpha}\to-\sum_i p_i\ln p_i$. ▪
Proposition 7.2
Under the hypotheses of remark 6.1, the ZG,α-entropy generalizes the Rényi entropy.
Proof.
It suffices to observe that the function $\ln_G(x)=G(\ln x)$ reduces to $\ln x$ when ai→0, i=1,2,…, in the expression (2.2) of G(t). ▪
A crucial property of the Z-class is that it is compatible with the SK requirements, as made clear by the following statement.
(b) The Shannon–Khinchin axioms
Theorem 7.3
The ZG,α-entropy satisfies the first three SK axioms, for 0<α<1.
Proof.
We shall check the validity of the axioms in the formulation given in appendix A.
(SK1) By construction, the function (1.1) is continuous with respect to all of its arguments pi.
(SK2) The entropy (1.1) is strictly concave for 0<α<1. Indeed, the map $p\mapsto\sum_{i}p_i^{\alpha}$ is strictly concave for 0<α<1, and $\ln_G$ is increasing and concave; hence their composition, divided by the positive factor (1−α), is strictly concave. By symmetry, the maximum over the probability simplex is then attained at the uniform distribution.
(SK3) By adding an event of zero probability, for α>0, α≠1, we have $\sum_{i=1}^{W+1}p_i^{\alpha}=\sum_{i=1}^{W}p_i^{\alpha}$ (with $p_{W+1}=0$), so that $Z_{G,\alpha}(p_1,\ldots,p_W,0)=Z_{G,\alpha}(p_1,\ldots,p_W)$. ▪
We conclude that the Z-entropies are group entropies.
(c) Extensivity
We shall discuss the extensivity properties of the Z-family of entropies over the uniform probability distribution (i.e. pi=1/W for all i=1,…,W). In other words, we wish to determine the conditions ensuring that a Z-entropy is proportional to the number N of particles of a given system, in the large N limit, when all states are equiprobable. We have, for N large (and focusing on the leading contribution to the asymptotic behaviour),
$$Z_{G,\alpha}\left(\frac{1}{W(N)},\ldots,\frac{1}{W(N)}\right)=\frac{\ln_G\left(W(N)^{1-\alpha}\right)}{1-\alpha}\sim\lambda N.$$
However, the function W(N) must be interpretable as a phase space growth rate. A sufficient condition is that W(N) be defined for all N>0 as a real, increasing function, with W(N)≥1. These requirements usually restrict the space of allowed parameters of the considered entropy.
Consequently, provided the previous condition is satisfied, the entropies of the Z-family are extensive in a specific regime, given by a growth rate W(N).
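The extensivity mechanism can be made concrete with a small numerical sketch (an illustration under the assumptions of remark 6.1, taking the Tsallis-type logarithm ln_q as the generalized group logarithm in (1.1); all names and parameter values are illustrative). Tuning the growth exponent so that ρ(1−α)(1−q)=1 makes the entropy grow linearly with N on the uniform distribution:

```python
q, alpha = 0.5, 0.5                       # illustrative parameters
rho = 1.0 / ((1.0 - alpha) * (1.0 - q))   # tuned growth exponent: here rho = 4

def z_entropy_uniform(N):
    # Z-entropy over the uniform distribution on W(N) = N^rho states,
    # built on the Tsallis-type group logarithm ln_q (assumed form of (1.1))
    W = float(N) ** rho
    u = W ** (1.0 - alpha)                # sum_i p_i^alpha = W^(1 - alpha)
    return (u ** (1.0 - q) - 1.0) / ((1.0 - q) * (1.0 - alpha))

# Linear (extensive) growth: for these parameters Z(N) = 4 (N - 1)
assert abs(z_entropy_uniform(2000) / z_entropy_uniform(1000) - 2.0) < 0.01
```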
The previous properties altogether indicate the potential relevance of the notion of Z-entropy in thermodynamical contexts and, more generally, in the theory of complex systems.
(d) The Schur concavity of Z-entropies
In theorem 7.3, we have assumed 0<α<1 to guarantee concavity of the Z-family. However, if we take α>1, under mild hypotheses this family satisfies the interesting property of Schur concavity. We recall here some basic facts about majorization [31]. A vector $a=(a_1,\ldots,a_n)$ majorizes or dominates another vector $b=(b_1,\ldots,b_n)$ (and we write a≽b) if, denoting by $a^{\downarrow}$, $b^{\downarrow}$ the vectors rearranged in decreasing order, $\sum_{i=1}^{k}a_i^{\downarrow}\geq\sum_{i=1}^{k}b_i^{\downarrow}$ for all k=1,…,n−1, and $\sum_{i=1}^{n}a_i=\sum_{i=1}^{n}b_i$.
Definition 7.4
Given two probability distributions $p$ and $r$ such that $r\succeq p$, we shall say that an entropy S is Schur concave if S[p]≥S[r].
The following result holds.
Theorem 7.5
Assuming the hypotheses of remark 6.1, the ZG,α-entropy is Schur concave for α>1.
Proof.
As, according to definition 6.2 and our hypotheses, any entropy of the class (1.1) is permutation invariant in the variables (p1,…,pW) and its first derivatives exist, we shall use the Schur–Ostrowski criterion for Schur concavity [32]. Precisely, we have to prove that, for any probability distribution p,
$$(p_i-p_j)\left(\frac{\partial Z_{G,\alpha}}{\partial p_i}-\frac{\partial Z_{G,\alpha}}{\partial p_j}\right)\leq0,\qquad i\neq j.$$
Writing $u=\sum_{k}p_k^{\alpha}$, we have $\frac{\partial Z_{G,\alpha}}{\partial p_i}=\frac{\alpha\,\ln_G'(u)}{1-\alpha}\,p_i^{\alpha-1}$, with $\ln_G'(u)>0$. For α>1, the factor $\frac{\alpha\,\ln_G'(u)}{1-\alpha}$ is negative, while $p_i^{\alpha-1}-p_j^{\alpha-1}$ has the same sign as $p_i-p_j$; hence the product above is non-positive. ▪
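For the simplest member of the class (G=Id, i.e. the Rényi entropy), Schur concavity for α>1 is easy to probe numerically; the helper names below are illustrative and not from the original text:

```python
import numpy as np

def renyi(p, alpha):
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def majorizes(a, b):
    # True if a majorizes b (equal totals assumed): every partial sum of the
    # decreasingly sorted entries of a dominates the corresponding sum for b
    a = np.sort(np.asarray(a, dtype=float))[::-1]
    b = np.sort(np.asarray(b, dtype=float))[::-1]
    return bool(np.all(np.cumsum(a) >= np.cumsum(b) - 1e-12))

p = [0.7, 0.2, 0.1]      # more concentrated distribution
r = [0.4, 0.35, 0.25]    # more uniform distribution: p majorizes r
assert majorizes(p, r)

alpha = 2.5
# Schur concavity: the dominated (more uniform) distribution has larger entropy
assert renyi(r, alpha) >= renyi(p, alpha)
```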
8. A tower of new entropic forms
The previous construction allows us to establish a correspondence between trace-form and non-trace-form entropies. Precisely, given a well-defined, weakly composable trace-form entropy, we can associate with it a new, non-trace-form, but strictly composable, generalized entropy. In this way, we can generate a tower of new examples of group entropies that parallel the well-known entropies, with infinitely many more in addition.
As we have clarified, both the entropies of Boltzmann and Rényi correspond to the additive formal group. Let us consider the first example of non-additive entropic function of the Z-class (1.1), naturally associated with the multiplicative group law.
Definition 8.1
We shall call the function
$$Z_{q,\alpha}[p]=\frac{\ln_q\left(\sum_{i=1}^{W}p_i^{\alpha}\right)}{1-\alpha} \tag{8.1}$$
the Z-entropy associated with the multiplicative group law, where $\ln_q$ is the Tsallis logarithm.
The composition law for the entropy (8.1) is
$$Z_{q,\alpha}(A\cup B)=Z_{q,\alpha}(A)+Z_{q,\alpha}(B)+(1-q)(1-\alpha)\,Z_{q,\alpha}(A)\,Z_{q,\alpha}(B).$$
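This composition law can be verified numerically. The sketch below assumes the Z-entropy built on the Tsallis logarithm, $Z=\ln_q\left(\sum_i p_i^{\alpha}\right)/(1-\alpha)$ (our reading of (8.1); parameters and names are illustrative), and checks the multiplicative-type law with coefficient (1−q)(1−α):

```python
import numpy as np

q, alpha = 1.3, 0.6  # illustrative parameters

def ln_q(x):
    # Tsallis logarithm
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def Z(p):
    # assumed form of the entropy (8.1): Z = ln_q(sum_i p_i^alpha)/(1 - alpha)
    u = np.sum(np.asarray(p, dtype=float) ** alpha)
    return ln_q(u) / (1.0 - alpha)

pA = [0.1, 0.9]
pB = [0.25, 0.25, 0.5]
pAB = np.outer(pA, pB).ravel()  # joint distribution of independent systems

x, y = Z(pA), Z(pB)
gamma = (1.0 - q) * (1.0 - alpha)  # coefficient of the multiplicative-type law
assert np.isclose(Z(pAB), x + y + gamma * x * y)
```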
A new example can be obtained from the group law $\Psi(x,y)=x\sqrt{1+k^{2}y^{2}}+y\sqrt{1+k^{2}x^{2}}$, associated with $G(t)=\frac{\sinh(kt)}{k}$.
Definition 8.2
The function
$$Z_{k,\alpha}[p]=\frac{\left(\sum_{i=1}^{W}p_i^{\alpha}\right)^{k}-\left(\sum_{i=1}^{W}p_i^{\alpha}\right)^{-k}}{2k(1-\alpha)} \tag{8.3}$$
will be called the $Z_{k,\alpha}$-entropy; it is the Z-entropy associated with the Kaniadakis logarithm.
The strict composability of the entropy (8.3) can be directly ascertained from the formula
$$Z_{k,\alpha}(A\cup B)=Z_{k,\alpha}(A)\sqrt{1+k^{2}(1-\alpha)^{2}Z_{k,\alpha}(B)^{2}}+Z_{k,\alpha}(B)\sqrt{1+k^{2}(1-\alpha)^{2}Z_{k,\alpha}(A)^{2}}.$$
This procedure can be easily extended to infinitely many group laws; correspondingly, it yields a sequence of entropic forms. Another particularly interesting case will be discussed below.
9. The Za,b-entropy: a generalization of the Boltzmann, Tsallis, Rényi and Sharma–Mittal entropies
(a) Definition
In this section, we present a first example of a composable, three-parametric entropy. Perhaps the most appealing feature of this entropy is that it generalizes some of the most important entropies known in the literature, which have played a prominent role in information theory, thermodynamics and generally speaking in applied mathematics.
Definition 9.1
The Za,b-entropy is defined to be the function
$$Z_{a,b}[p]=\frac{\left(\sum_{i=1}^{W}p_i^{\alpha}\right)^{a}-\left(\sum_{i=1}^{W}p_i^{\alpha}\right)^{b}}{(a-b)(1-\alpha)},\qquad \alpha>0,\ \alpha\neq1,$$
with a, b real parameters, a≠b.
(b) Limiting properties
The Za,b-entropy reduces to the Boltzmann entropy in the limit α→1. We also have the following notable properties.
Proposition 9.2
The Za,b-entropy reduces to the Rényi entropy in the double limit a→0, b→0.
Proposition 9.3
The Za,b-entropy reduces to the Tsallis entropy in the double limit a→1, b→0.
Proof.
In the limit a→1, b→0, the Za,b-entropy becomes $\frac{\sum_{i}p_i^{\alpha}-1}{1-\alpha}$, which is the Tsallis entropy $S_q$ with q=α. ▪
Proposition 9.4
The Za,b-entropy reduces to the Zk,α-entropy in the limit a=−b=k.
Proposition 9.5
The Za,b-entropy reduces to the Sharma–Mittal entropy in the limit b→0.
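These limiting properties can be checked numerically. The sketch below assumes the form $Z_{a,b}[p]=\frac{\left(\sum_i p_i^{\alpha}\right)^{a}-\left(\sum_i p_i^{\alpha}\right)^{b}}{(a-b)(1-\alpha)}$, obtained from the Borges–Roditi logarithm $\ln_{a,b}(x)=\frac{x^{a}-x^{b}}{a-b}$; this explicit form is our reconstruction (consistent with propositions 9.2–9.5), and the parameter values are illustrative:

```python
import numpy as np

def z_ab(p, alpha, a, b):
    # assumed form of the Z_{a,b}-entropy via the Borges-Roditi logarithm:
    # Z = [(sum p^alpha)^a - (sum p^alpha)^b] / [(a - b)(1 - alpha)]
    u = np.sum(np.asarray(p, dtype=float) ** alpha)
    return (u ** a - u ** b) / ((a - b) * (1.0 - alpha))

p = np.array([0.5, 0.3, 0.2])
alpha = 0.8
renyi = np.log(np.sum(p ** alpha)) / (1.0 - alpha)
tsallis = (np.sum(p ** alpha) - 1.0) / (1.0 - alpha)

# a, b -> 0 recovers Renyi; a -> 1, b -> 0 recovers Tsallis (with q = alpha)
assert np.isclose(z_ab(p, alpha, 1e-7, -1e-7), renyi, atol=1e-5)
assert np.isclose(z_ab(p, alpha, 1.0, 1e-9), tsallis, atol=1e-6)
```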
(c) Group-theoretical structure: the Abel formal group laws
The Za,b-entropy is related to the Borges–Roditi logarithm $\ln_{a,b}(x)=\frac{x^{a}-x^{b}}{a-b}$.
10. Quantum Z-entropies
The class of Z-entropies can be quantized by means of a standard procedure. Precisely, let us consider a quantum system whose states belong to a finite-dimensional Hilbert space and are represented by positive semi-definite density matrices ρ (e.g. [39,40]).
Definition 10.1
The family of quantum Z-entropies is defined by the formal relation
$$Z_{G,\alpha}[\rho]=\frac{\ln_G\left(\operatorname{Tr}\rho^{\alpha}\right)}{1-\alpha}. \tag{10.1}$$
For each specific entropy, the set of parameters appearing in equation (10.1) is supposed to satisfy suitable constraints and, if necessary, certain conventions are tacitly adopted (as in the case of the von Neumann entropy); however, we will not treat these issues in detail here.
We shall discuss now the example of the quantum version of the Za,b-entropy. Similar considerations apply to other cases.
Definition 10.2
The quantum Za,b-entropy is defined to be
$$Z_{a,b}[\rho]=\frac{\left(\operatorname{Tr}\rho^{\alpha}\right)^{a}-\left(\operatorname{Tr}\rho^{\alpha}\right)^{b}}{(a-b)(1-\alpha)}.$$
One of the most interesting aspects of the family of quantum entropies (10.1) is that they are directly related to the von Neumann entropy $S[\rho]=-\operatorname{Tr}(\rho\ln\rho)$ and to the quantum Rényi entropy $R_{\alpha}[\rho]=\frac{\ln\operatorname{Tr}\rho^{\alpha}}{1-\alpha}$. Concerning the quantum Za,b-entropy, it possesses other interesting limits: for instance, in the limit b→0 it reduces to a quantum version of the Sharma–Mittal entropy.
A priori, due to the multi-parametric nature of these entropies, they can be particularly useful in the study of entangled systems. In this perspective, we believe that Za,b[ρ] deserves special attention. An analysis of this aspect is outside the scope of this paper and will be discussed elsewhere.
(a) Applications: the generalized isotropic Lipkin–Meshkov–Glick model
As an example of application of the entropic forms presented in the previous sections, we shall consider the generalized isotropic Lipkin–Meshkov–Glick model of N interacting particles introduced in [27]; we refer to [27] for the explicit form of its Hamiltonian.
The ground-state entanglement entropies for this model can be easily computed in the case of our group entropies, starting from the results of [27]. There, the reduced density matrix for a block of L spins was computed, when the system is in its ground state; in the large L limit, this yields closed asymptotic formulae for the group entropies of the block.
The relevance of formula (10.5) for Za,0[ρ] is due to the fact that it solves an open problem proposed in [27]. Indeed, the quantum Rényi entropy is not extensive, i.e. proportional to L. The quantum Tsallis entropy is extensive, but for m≥3 only. Observe that, instead, the entropy Za,0 can be made linear in L. Indeed, we see immediately that the extensivity requirement implies α=1−2/am, which is satisfied for any m, for suitable values of a.
11. Open problems and future perspectives
The present work represents a first exploration of the mathematical properties of a new, large family of non-trace-form group entropies coming from formal group theory via the composability axiom. The underlying group-theoretical structure is responsible for essentially all the relevant properties of this family.
The results obtained above suggest that the Z-class can be a flexible tool, offering new insight into different contexts of the theory of classical and quantum complex systems, as well as in ecology and the social sciences. For instance, we plan to define generalized diversity indices related to this class. Also, the Z-entropies are particularly suitable for defining new divergences, generalizing those of Rényi [41] and Kullback & Leibler [42], and are potentially relevant in the context of information geometry [43].
A directly related expression for Z-entropies is given by the formula
We wish to point out that some of the definitions of the theory can be reformulated in different ways, and some conditions can be relaxed. The present approach has the advantage of providing a large class of group entropies in a simple way; however, the problem of determining the most general form for group entropies is open. We shall discuss in detail progress on these issues elsewhere.
In our opinion, the crucial role played by the group-theoretical approach in the description of compound systems paves the way towards a re-foundation of the theory of generalized entropies in terms of group entropies.
An open problem is to establish whether the Z-entropies are Lesche stable. We conjecture that, in this respect, the entropies of the class (1.1) have essentially the same behaviour as the Rényi entropy. As was proved in [44], in many respects Rényi's entropy can also be considered an observable. This conjecture will be thoroughly analysed in a future work.
We also wish to mention that a generalization of the correspondence among entropies and zeta functions [22,25,26] to the case of multi-parametric entropies and multiple zeta values and polylogarithms is an interesting open problem.
Competing interests
I have no competing interests.
Funding
This work has been partly supported by the research project FIS2015-63966, MINECO, Spain, and by the ICMAT Severo Ochoa project SEV-2015-0554 (MINECO).
Acknowledgements
I thank J. A. Carrasco, A. González López, M. A. Rodríguez and G. Sicuro for useful discussions.
Appendix A. The Shannon–Khinchin axioms
As customary in the literature on generalized entropies, here we re-state the original requirements of Shannon and Khinchin for an entropy to be admissible, in terms of four requirements.
(SK1) (Continuity). The function S(p1,…,pW) is continuous with respect to all its arguments.
(SK2) (Maximum principle). The function S(p1,…,pW) takes its maximum value over the uniform distribution pi=1/W, i=1,…,W.
(SK3) (Expansibility). Adding an event of zero probability to a probability distribution does not change its entropy: S(p1,…,pW,0)=S(p1,…,pW).
(SK4) (Additivity). Given two statistically independent subsystems A, B of a statistical system, S(A∪B)=S(A)+S(B).
Footnotes
1 A ring R is said to be torsion-free if na=0, with n a positive integer and a∈R, implies a=0.
References
[1] Beck C, Schlögl F. 1993 Thermodynamics of chaotic systems: an introduction. Cambridge Nonlinear Science Series. Cambridge, UK: Cambridge University Press.
[2] Callen HB. 1985 Thermodynamics and an introduction to thermostatistics, 2nd edn. New York, NY: John Wiley and Sons.
[3]
[4]
[5] Rényi A. 1961 On measures of information and entropy. In Proc. of the 4th Berkeley Symp. on Mathematics, Statistics and Probability, Berkeley, CA, 20 June–30 July 1960, pp. 547–561. Berkeley, CA: University of California Press.
[6] Shannon CE. 1948 A mathematical theory of communication. Bell Syst. Tech. J. 27, 379–423; 27, 623–653. (doi:10.1002/j.1538-7305.1948.tb01338.x)
[7] Shannon C, Weaver W. 1949 The mathematical theory of communication. Urbana, IL: University of Illinois Press.
[8] Khinchin AI. 1957 Mathematical foundations of information theory. New York, NY: Dover.
[9] Tsallis C. 2009 Introduction to nonextensive statistical mechanics—approaching a complex world. Berlin, Germany: Springer.
[10]
[11] Bochner S. 1946 Formal Lie groups. Ann. Math. 47, 192–201. (doi:10.2307/1969242)
[12] Bukhshtaber VM, Mishchenko AS, Novikov SP. 1971 Formal groups and their role in the apparatus of algebraic topology. Uspehi Mat. Nauk 26, 131–154. [Transl. Russ. Math. Surv. 26, 63–90 (1971).]
[13] Novikov SP. 1967 The methods of algebraic topology from the point of view of cobordism theory. Izv. Akad. Nauk SSSR Ser. Mat. 31, 885–951. [Transl. Math. USSR–Izv. 1, 827–913 (1967).]
[14] Quillen D. 1969 On the formal group laws of unoriented and complex cobordism theory. Bull. Am. Math. Soc. 75, 1293–1298. (doi:10.1090/S0002-9904-1969-12401-8)
[15] Faltings G. 2008 Néron models and formal groups. Milan J. Math. 76, 93–123. (doi:10.1007/s00032-008-0086-z)
[16] Baker A. 1987 Combinatorial and arithmetic identities based on formal group laws. Lecture Notes in Mathematics, vol. 1298, pp. 17–34. Berlin, Germany: Springer.
[17] Serre J-P. 1966 Courbes elliptiques et groupes formels. Annu. Coll. France, 49–58. [Oeuvres, vol. II, 71, 315–324.]
[18]
Serre J-P . 1966Courbes elliptiques et groupes formels. Annu. Coll. France49–58. [Oeuvres, vol. II, 71, 315–324.] Google Scholar - 18
Marmi S, Tempesta P . 2012Hyperfunctions, formal groups and generalized Lipschitz summation formulas. Nonlinear Anal. 75, 1768–1777. (doi:10.1016/j.na.2011.09.013) Crossref, ISI, Google Scholar - 19
Tempesta P . 2007Formal groups, Bernoulli-type polynomials and L-series. C. R. Math. Acad. Sci. Paris I 345, 303–306. (doi:10.1016/j.crma.2007.05.016) Crossref, ISI, Google Scholar - 20
Tempesta P . 2008On Appell sequences of polynomials of Bernoulli and Euler type. J. Math. Anal. Appl. 341, 1295–1310. (doi:10.1016/j.jmaa.2007.07.018) Crossref, ISI, Google Scholar - 21
Tempesta P . 2010L-series and Hurwitz zeta functions associated with the universal formal group. Ann. Scuola Normale Superiore Classe Sci.IX, 1–12. Google Scholar - 22
Tempesta P . 2015The Lazard formal group, universal congruences and special values of zeta functions. Trans. Am. Math. Soc. 367, 7015–7028. (doi:10.1090/tran/6234) Crossref, ISI, Google Scholar - 23
Tempesta P . 2016Beyond the Shannon-Khinchin formulation: the composability axiom and the universal group entropy. Ann. Phys. 365, 180–197. (doi:10.1016/j.aop.2015.08.013) Crossref, ISI, Google Scholar - 24
Gell-Mann M . 1995The quark and the jaguar: adventures in the simple and the complex. New York, NY: Macmillan. Google Scholar - 25
Tempesta P . 2011Group entropies, correlation laws and zeta functions. Phys. Rev. E 84, 021121. (doi:10.1103/PhysRevE.84.021121) Crossref, ISI, Google Scholar - 26
Tempesta P . 2015A theorem on the existence of generalized trace-form entropies. Proc. R. Soc. A 471, 20150165. (doi:10.1098/rspa.2015.0165) Link, Google Scholar - 27
Carrasco JA, Finkel F, González-López A, Rodríguez MA, Tempesta P . 2016Generalized isotropic Lipkin-Meshkov-Glick models: ground state entanglement and quantum entropies. J. Stat. Mech. 2016, 033114. (doi:10.1088/1742-5468/2016/03/033114) Crossref, ISI, Google Scholar - 28
Serre J-P 1992Lie algebras and Lie groups.Lecture Notes in Mathematics, vol. 1500 . Berlin, Germany: Springer. Google Scholar - 29
Aczél J . 2006Lectures on functional equations and their applications. New York, NY: Dover. [Republication of the 1966 book published by Academic Press.] Google Scholar - 30
Landsberg PT, Vedral V . 1998Distributions and channel capacities in generalized statistical mechanics. Phys. Lett. A 247, 211–217. (doi:10.1016/S0375-9601(98)00500-3) Crossref, ISI, Google Scholar - 31
Hardy GH, Littlewood JE, Pólya G . 1952Inequalities, 2nd edn. London, UK: Cambridge University Press. Google Scholar - 32
Pečarić JE, Proschan F, Tong YL . 1992Convex functions, partial orderings and statistical applications. New York, NY: Academic Press. Google Scholar - 33
Sharma BD, Mittal DP . 1975New nonadditive measures of entropy for discrete probability distributions. J. Math. Sci. 10, 28–40. Google Scholar - 34
Kaniadakis G . 2002Statistical mechanics in the context of special relativity. Phys. Rev. E 66, 056125. (doi:10.1103/PhysRevE.66.056125) Crossref, ISI, Google Scholar - 35
Abel NH . 1823Méthode générale pour trouver des fonctions d’une seule quantité variable, lorsqu’une propriété de ces fonctions est exprimée par une équation entre deux variables. Mag. Nat. 1, 216–229. [Reprinted in Oeuvres Complétes (eds L Sylow, S Lie), vol. 1. Christiania (1881).] Google Scholar - 36
Clarke F, Johnson K . 2009On the classifying ring for Abel formal group laws. J. Pure Appl. Algebra 213, 1290–1298. (doi:10.1016/j.jpaa.2008.11.031) Crossref, ISI, Google Scholar - 37
Bukhshtaber VM, Kholodov AN . 1990Formal groups, functional equations and generalized cohomology theories. Math. USSR Sbornik 181, 75–94. [English transl. Math. S. B.69, 77–97 (1991).] Google Scholar - 38
Busato P . 2002Realization of Abel’s universal formal group law. Math. Z. 239, 527–561. (doi:10.1007/s002090100323) Crossref, ISI, Google Scholar - 39
Nielsen MA, Chuang IL . 2010Quantum computation and quantum information. Cambridge, UK: Cambridge University Press. Crossref, Google Scholar - 40
Horodecki R, Horodecki P, Horodecki M, Horodecki K . 2009Quantum entanglement. Rev. Mod. Phys. 81, 865. (doi:10.1103/RevModPhys.81.865) Crossref, ISI, Google Scholar - 41
van Erven T, Harremoës P . 2014Rényi divergence and Kullback-Leibler divergence. IEEE Trans. Inf. Theory 60, 3797–3820. (doi:10.1109/TIT.2014.2320500) Crossref, ISI, Google Scholar - 42
Kullback S, Leibler RA . 1951On information and sufficiency. Ann. Math. Stat. 22, 79–86. (doi:10.1214/aoms/1177729694) Crossref, Google Scholar - 43
Amari S, Nagaoka H . 2000Methods of information geometry. New York, NY: Oxford University Press. Google Scholar - 44
Jizba P, Arimitsu T . 2004Observability of Rényi’s entropy. Phys. Rev. E 69, 026128. (doi:10.1103/PhysRevE.69.026128) Crossref, ISI, Google Scholar


