Statistical mechanical theory of an oscillating isolated system: the relaxation to equilibrium

In this contribution we show that a suitably defined nonequilibrium entropy of an N-body isolated system is not in general a constant of the motion, and that its variation is bounded, the bounds being determined by the thermodynamic entropy, i.e., the equilibrium entropy. We define the nonequilibrium entropy as a convex functional of the set of n-particle reduced distribution functions (n = 0, ..., N), generalizing the Gibbs fine-grained entropy formula. Additionally, as a consequence of our microscopic analysis we find that this nonequilibrium entropy behaves as a free entropic oscillator. In the approach to the equilibrium regime we find relaxation equations of the Fokker-Planck type, in particular for the one-particle distribution function.


Contents

I. Introduction
II. Hamiltonian dynamics
III. Nonequilibrium entropy
IV. Entropy production. The law of increase of entropy
V. Relaxation equations
VI. Approach to equilibrium
VII. Conclusions

I. INTRODUCTION
It is a widely recognized fact that a general mathematical proof of the second law is still lacking. As stated in Ref. [1], and quoted here just for illustration's sake: "To the best of our knowledge no theoretical mathematical derivation of the second law has been given up until now; instead it has been based on Kelvin's or Clausius's principles of the impossibility of perpetual motion of the second kind [2], which are based on experiment [3]". This lack of a definitive theoretical proof has led to reports on the violation of the second law [4] and to tests of its validity in some particular cases [5,6].
The first significant contribution to the interpretation of the second law of Thermodynamics and the explanation of irreversibility goes back to Boltzmann. Nevertheless, it is known that Boltzmann's contribution was criticized on the grounds that it contradicts the predictions based on the microscopic equations of motion. Later, Gibbs and P. Ehrenfest and T. Ehrenfest addressed this problem by introducing coarse-graining. However, those coarse-graining analyses require the introduction of a priori equal-probability principles, which are hard to justify on physical grounds, as was criticized by Einstein [7].
In this scenario, our aim is to elucidate the connection between the microscopic description of an isolated N-body system, given through classical Hamiltonian dynamics, and the description at the macroscopic level expressed by the second law.
It is known that at thermodynamic equilibrium the entropy can be given by the Gibbs formula

    S = -k_B Tr(F ln F),    (1)

where k_B is the Boltzmann constant and F the full phase-space distribution function, which we assume normalized to unity, i.e. Tr(F) = 1. However, this expression is not adequate for representing the entropy of nonequilibrium isolated systems, for which no bath is present [8].
The reason is that even for a time-dependent distribution function out of equilibrium, the entropy S given through Eq. (1) remains constant. This is not difficult to show, given that F evolves according to the Liouville equation

    ∂F/∂t = [H, F]_P,    (2)

where [.. , ..]_P is the Poisson bracket. In fact, by using Eq. (2) the rate of change of the entropy (1) is

    dS/dt = -k_B Tr{ [H, F]_P (ln F + 1) } = 0.    (3)

Therefore, here we will generalize Gibbs's statistics to account for the entropy variations in nonequilibrium systems. This constitutes an application of our previous results [9].
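The constancy of the Gibbs entropy under the Liouville flow can be checked numerically. The following sketch is a toy discretization of phase space (an assumption for illustration, not the paper's construction): since Hamiltonian evolution is measure preserving, on a grid it acts as a permutation of cells, and the fine-grained entropy does not change.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discretization: F lives on M phase-space cells, normalized so Tr(F) = 1.
M = 1000
F = rng.random(M)
F /= F.sum()

def gibbs_entropy(F, kB=1.0):
    """Fine-grained Gibbs entropy S = -k_B Tr(F ln F)."""
    return -kB * np.sum(F * np.log(F))

# Liouville (Hamiltonian) evolution is measure preserving; on the grid it
# only relabels the cells carrying the values of F.
perm = rng.permutation(M)
F_t = F[perm]

S0, St = gibbs_entropy(F), gibbs_entropy(F_t)
assert np.isclose(S0, St)   # S is a constant of the motion
```

Any volume-preserving map would do in place of the permutation; the point is that no rearrangement of the fine-grained distribution can change S.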
Our starting point is the description of the state of an isolated N-body system in terms of the set of n-particle reduced distribution functions, in the framework of the BBGKY (Bogolyubov-Born-Green-Kirkwood-Yvon) description [10]. Unlike in equilibrium, an overall picture in terms of the full phase-space distribution function does not contain the amount of detail necessary to describe a nonequilibrium system. Nonequilibrium systems manifest a random clusterization which makes their distribution in phase space unstable; thus there is a continuous process of creation of n-particle clusters at the expense of the annihilation of p-particle clusters with n ≠ p. This fact is taken into account in the BBGKY hierarchy, making it an appropriate framework for the description of nonequilibrium systems. In this context, since the collisions become explicit through the collision term in the equations of motion, the n-particle reduced distribution functions are not constants of the motion; therefore a natural way of defining an entropy that embodies the approach to equilibrium is to express it in terms of this set of reduced distributions. This is what we do here: we propose a functional of the set of n-particle reduced distribution functions which generalizes the Gibbs entropy as the nonequilibrium entropy of the isolated N-body system. We will show that this entropy is not a constant of the motion and reaches its maximum value at equilibrium.
In the next section, we introduce the Hamiltonian dynamics of the N-body system and obtain the generalized Liouville equation. In Section III, we define the nonequilibrium entropy and analyze its properties. Section IV is devoted to computing the entropy production. In Section V we derive the relaxation equation for the one-particle reduced distribution function, and in Section VI we describe the approach to equilibrium. Finally, in Section VII, we summarize our main conclusions.

II. HAMILTONIAN DYNAMICS
Let us consider an N-body system with a Hamiltonian containing a kinetic energy term plus a two-particle interaction potential,

    H = Σ_{j=1}^{N} p_j²/2m + (1/2) Σ_{j≠k} φ(|q_j − q_k|),    (4)

with m being the mass of a particle and φ(|q_j − q_k|) ≡ φ_{j,k} the interaction potential.
Moreover, the equations of motion are

    dq_j/dt = p_j/m,   dp_j/dt = −Σ_{k≠j} ∇_j φ_{j,k}.    (5)

As said in the introduction, the statistical description of the system can be performed in terms of the full phase-space distribution function F(x^N, t), where x^N = {x_1, ..., x_N} and x_j = (q_j, p_j), or alternatively in terms of the distribution vector [11]

    f = {f_0, f_1(x_1, t), ..., f_N(x^N, t)},    (6)

the set of all the n-particle reduced distribution functions, with x^n = {x_1, ..., x_n} and n = 0, ..., N, where the n-particle reduced distribution functions are obtained by integrating over the remaining N − n particles,

    f_n(x^n, t) = [N!/(N − n)!] ∫ F(x^N, t) dx_{n+1} ... dx_N,    (7)

with f_0 = 1. Both descriptions are completely equivalent; however, the second one is more appropriate for nonequilibrium systems. The dynamics of the reduced distribution vector follows from the Liouville equation (2) by integration according to Eq. (7); thus one obtains [11-13]

    ∂f_n/∂t = [H_n, f_n]_P + Σ_{j=1}^{n} ∫ F_{j,n+1} · (∂f_{n+1}/∂p_j) dx_{n+1},    (8)

where F_{j,n+1} = −∇_j φ_{j,n+1} and

    H_n = Σ_{j=1}^{n} p_j²/2m + (1/2) Σ_{j≠k}^{n} φ_{j,k}    (9)

is the n-particle Hamiltonian.
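A discrete toy model (hypothetical, with a handful of states per particle, standing in for continuous phase-space integration) illustrates the definition of the reduced distribution functions and the normalization f_0 = 1.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)

# Toy model: N = 3 particles, each coordinate x_j taking K discrete values.
N, K = 3, 4
F = rng.random((K, K, K))
F /= F.sum()            # Tr(F) = 1

def reduced(F, n, N):
    """f_n = N!/(N-n)! * sum over the last N-n particles (discrete analogue)."""
    axes = tuple(range(n, N))
    return factorial(N) / factorial(N - n) * F.sum(axis=axes)

f0 = reduced(F, 0, N)   # scalar
f1 = reduced(F, 1, N)
f2 = reduced(F, 2, N)

assert np.isclose(f0, 1.0)                  # f_0 = 1
assert np.isclose(f1.sum(), N)              # f_1 integrates to the particle number
assert np.isclose(f2.sum(), N * (N - 1))    # f_2 counts ordered pairs
```

The combinatorial prefactor N!/(N − n)! makes f_n integrate to the number of ordered n-tuples of particles, which is the convention consistent with f_0 = 1 here.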
In a compact way, and in the language of Hilbert spaces, we can write Eq. (8) as [9,11]

    i ∂f/∂t = Lf,    (10)

constituting the generalized Liouville equation, which succinctly expresses the BBGKY hierarchy of equations. Here L is the generalized Liouvillian, a non-Hermitian operator whose diagonal part PL is defined through [11,14]

    ⟨n| PL |m⟩ = δ_{n,m} L_n,   with L_n = i [H_n, ..]_P,    (11)

where |n⟩ represents the n-particle state. In addition, the nondiagonal part QL is given by

    ⟨n| QL |n+1⟩ = i Σ_{j=1}^{n} ∫ dx_{n+1} F_{j,n+1} · ∂/∂p_j,    (12)

all other matrix elements being zero. Here P, and its complement with respect to the identity Q = 1 − P, are projection operators. From its definition through Eq. (11) one can see that PL is an (N + 1) × (N + 1) block-diagonal Hermitian matrix. On the other hand, from Eq. (12) it is possible to infer that QL is a non-Hermitian (N + 1) × (N + 1) block matrix with nonzero elements only along the diagonal (n, n + 1) with n ≥ 1 [11]. In terms of the projectors just introduced, Eq. (10) can be rewritten as

    i ∂f/∂t = PLf + QLf.    (13)

Hence, the formal solution of Eq. (13) can be written as an integral equation,

    f(t) = e^{−iPLt} f(0) − i ∫_0^t e^{−iPL(t−t_1)} QL f(t_1) dt_1,    (14)

which can be formally solved to give [11]

    f(t) = U(t, 0) f(0),    (15)

where the evolution operator U(t, 0) is given by the perturbative development

    U(t, 0) = e^{−iPLt} + Σ_{j≥1} (−i)^j ∫_0^t dt_1 ∫_0^{t_1} dt_2 ··· ∫_0^{t_{j−1}} dt_j e^{−iPL(t−t_1)} QL e^{−iPL(t_1−t_2)} QL ··· QL e^{−iPL t_j}.    (16)

Here ··· < t_1 < t_0 = t, and the integration proceeds from right to left. Differentiating Eq. (16), one finds the evolution equation for U(t, 0),

    i ∂U(t, 0)/∂t = (PL + QL) U(t, 0).    (17)

If we now make a time translation and change the origin of the time scale so that the time series begins at time t, with t′_l = t_l − t (1 ≤ l ≤ j), then under time reversal, interchanging the integration limits and relabeling the dummy integration variables [15], we obtain the backward propagator

    U(0, t) = e^{iPLt} + Σ_{j≥1} i^j ∫_0^t dt_1 ∫_0^{t_1} dt_2 ··· ∫_0^{t_{j−1}} dt_j e^{iPL t_j} QL e^{iPL(t_{j−1}−t_j)} QL ··· QL e^{iPL(t−t_1)}.    (18)

In addition, by differentiating Eq. (18) one gets the evolution equation for U(0, t),

    i ∂U(0, t)/∂t = −U(0, t)(PL + QL).    (19)

The propagator U(0, t) given through Eq. (18) propagates backwards in time from t to 0; hence it must coincide with the inverse U(t, 0)^{−1} of U(t, 0), so that

    U(0, t) U(t, 0) = 1.    (20)

It can indeed be verified that U(0, t) is the inverse of U(t, 0) [15]. To begin with, U(0, t) U(t, 0) = 1 for t = 0. Now, by differentiating and taking into account Eqs. (17) and (19), we reach

    (d/dt)[U(0, t) U(t, 0)] = i U(0, t)(PL + QL) U(t, 0) − i U(0, t)(PL + QL) U(t, 0) = 0,    (21)

so U(0, t) U(t, 0) = 1 for all t.
Since PL is Hermitian, all its eigenvalues are real [16], which means that f(t), as given through Eqs. (15) and (16), will have an oscillatory behavior.
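This can be illustrated with a finite-dimensional stand-in for PL (an assumption made purely for illustration): a Hermitian generator has a real spectrum, so the evolution f(t) = e^{−iPLt} f(0) only oscillates and conserves the norm, with no decay.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy finite-dimensional Hermitian generator standing in for PL.
n = 5
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
PL = (A + A.conj().T) / 2            # Hermitian by construction
w, V = np.linalg.eigh(PL)            # real eigenvalues w, unitary V

f0 = rng.standard_normal(n) + 0j

def f(t):
    """f(t) = exp(-i PL t) f(0), via the spectral decomposition."""
    return V @ (np.exp(-1j * w * t) * (V.conj().T @ f0))

# The norm is conserved at all times: purely oscillatory dynamics.
for t in (0.0, 0.7, 3.1, 10.0):
    assert np.isclose(np.linalg.norm(f(t)), np.linalg.norm(f0))
```

Each component of f(t) in the eigenbasis rotates with a real frequency; nothing relaxes, which is exactly why the full hierarchy admits no time arrow at this level.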

III. NONEQUILIBRIUM ENTROPY
Here, as the nonequilibrium entropy for the N-body system, we propose [9,14] a convex functional of the distribution vector which generalizes the Gibbs formula (1),

    S = −k_B Tr( f ln(f/f_eq) ) + S_eq,    (24)

where the trace now implies integration over phase space and summation over n. In Eq. (24), S_eq is the thermodynamic entropy (i.e. the equilibrium entropy) and f_eq is the equilibrium distribution vector satisfying Lf_eq = 0, the Yvon-Born-Green (YBG) equilibrium hierarchy [11]; therefore f_eq is an eigenfunction of L with eigenvalue 0. Moreover,

    S − S_eq = −k_B Tr( f ln(f/f_eq) )    (25)

is zero at equilibrium and negative otherwise, which shows that S is maximum at equilibrium, where its value is S_eq.
Note that the BBGKY scenario describes an interacting mixture of fluids made up of particle clusters in phase space. Two such fluids differ in the size of the clusters they contain, and each fluid contributes its own entropy, the n-particle entropy, to the total nonequilibrium entropy of our system. Likewise, the interaction between different fluids leads to the creation of n-particle clusters at the expense of the annihilation of p-particle clusters with n ≠ p.
More interestingly, the most important property of the entropy we propose is its direction of change in a natural process. To elucidate this, we must establish the entropy bounds, if any. Hence, let us define the n-particle entropies

    S_n = −k_B Tr( f_n ln(f_n/f_{eq,n}) ),   n = 1, ..., N − 1,    (26)

together with the full-distribution entropy

    S_N = −k_B Tr( F ln(F/F_eq) ).    (27)

Since the full distribution function F contains more information than f_n, one might expect that S_n ≥ S_N. This can be proved from the convexity of the logarithmic function, ln x ≤ x − 1, which can be rewritten as [17]

    f ln(f/g) ≥ f − g,    (28)

with strict inequality unless f = g. Hence, applying Eq. (28) with the appropriate choices of f and g (essentially f = F and g = f_n, together with their equilibrium counterparts), one derives

    S_n ≥ S_N.    (29)

Analogously, it can be proved that

    S − S_eq ≥ S_N.    (30)

In light of this, and since Eq. (25) is non-positive, we find that S is bounded,

    S_eq + S_N ≤ S ≤ S_eq.    (31)

This result, together with our comments at the end of the previous section, leads us to conclude that the nonequilibrium entropy S behaves as a free oscillator with an amplitude of oscillation ΔS = −S_N/2. Hence there is no possibility of a time arrow, and the questions arise of how equilibrium could be reached and, more deeply, of how to re-read the second law for isolated systems. We will try to answer these questions in the following sections.
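The inequality chain rests on the pointwise bound f ln(f/g) ≥ f − g. A quick numerical check with two arbitrary normalized discrete distributions (hypothetical data) confirms the non-negativity of the resulting relative entropy, with equality only when the distributions coincide.

```python
import numpy as np

rng = np.random.default_rng(2)

# Two normalized discrete distributions f and g.
f = rng.random(50); f /= f.sum()
g = rng.random(50); g /= g.sum()

# Gibbs inequality ln x <= x - 1 gives, summed over states:
#   sum f ln(f/g) >= sum (f - g) = 1 - 1 = 0, with equality iff f = g.
D = np.sum(f * np.log(f / g))
assert D >= 0.0
assert np.isclose(np.sum(f * np.log(f / f)), 0.0)  # equality case f = g
```

Applied with the full distribution and its marginals, this is the mechanism that orders the entropies S_n above S_N.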
To end this section, in view of our previous conclusion, the entropy obeys the dynamics of a free oscillator,

    d²S/dt² = −k (S − S*),    (32)

with amplitude

    ΔS = −S_N/2    (33)

and midpoint

    S* = S_eq + S_N/2.    (34)

Hence we assume the existence of a potential associated with the harmonic entropic oscillator,

    V(S) = (1/2) k (S − S*)².    (35)

Therefore, the potential bounds satisfy

    V(S_eq) = V(S_eq + S_N) = (1/2) k (ΔS)²,    (36)

and the effective elastic constant is given through

    k = ω².    (37)

Consequently,

    S(t) = S* + ΔS cos(ωt + ϕ),    (38)

where ϕ stands for the initial conditions. This system has a first integral of the motion, its 'energy', given by [18]

    E = (1/2) (dS/dt)² + (1/2) ω² (S − S*)² = (1/2) ω² (ΔS)²,    (39)

which is constant. Moreover, the period of the oscillations τ satisfies

    τ = 2π/ω,    (40)

and should be coherent with the recurrence period of the Poincaré cycles.
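The free harmonic entropic oscillator described above can be sketched numerically. The values of S_eq and S_N below are hypothetical, chosen only to show that the 'energy' is conserved and that S(t) stays within the thermodynamic bounds.

```python
import numpy as np

# Hypothetical values (assumed units) with S_N <= 0, as required by the bounds.
S_eq, S_N = 1.0, -0.4
S_star = S_eq + S_N / 2.0      # midpoint of the oscillation
dS = -S_N / 2.0                # amplitude Delta S
omega = 2.0 * np.pi            # frequency; period tau = 2*pi/omega = 1
phi = 0.3                      # initial condition

t = np.linspace(0.0, 3.0, 2001)
S = S_star + dS * np.cos(omega * t + phi)
Sdot = -dS * omega * np.sin(omega * t + phi)

# The 'energy' first integral is conserved along the motion.
E = 0.5 * Sdot**2 + 0.5 * omega**2 * (S - S_star)**2
assert np.allclose(E, E[0])

# The entropy never leaves the band [S_eq + S_N, S_eq].
assert np.all(S <= S_eq + 1e-12)
assert np.all(S >= S_eq + S_N - 1e-12)
```

The conserved E plays the role of the oscillator Hamiltonian; damping it (as suggested for non-isolated systems in the conclusions) would shrink the band until S settles at S_eq.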

IV. ENTROPY PRODUCTION. THE LAW OF INCREASE OF ENTROPY
The rate of change of the nonequilibrium entropy, or entropy production, is obtained by taking the time derivative of Eq. (24), giving

    dS/dt = −k_B Tr( (∂f/∂t) ln(f/f_eq) ).    (41)

In a more explicit way, after using Eqs. (8) and (10)-(12), Eq. (41) can be rewritten as

    dS/dt = (1/mT) Σ_n Σ_{j=1}^{n} Tr( f_n p_j · (F_j − F_j^eq) ),    (42)

where F_j is the mean force acting on particle j, F_j^eq its equilibrium counterpart, and T the kinetic temperature, taking into account that the dependence of f_{eq,n} on the velocities is given through a local Maxwellian. The entropy production given in Eq. (42) vanishes at equilibrium, and in any other case it need not be zero. In addition, because p_j is arbitrary,

    F_j^eq = k_B T ∇_j ln f_{eq,n}    (43)

is sufficient to satisfy the extremum condition δṠ/δf_n|_eq = 0, with Ṡ = ∂S/∂t. Precisely, Eq. (43) gives rise to the YBG hierarchy [9,19].
On the other hand, by using Eq. (43) we can rewrite the entropy production given through Eq. (42) as

    dS/dt = (1/mT) Σ_n Σ_{j=1}^{n} Tr( (f_n − f_{eq,n}) p_j · (F_j − F_j^eq) ),    (44)

where we have used that the contribution of f_{eq,n} vanishes by parity of the local Maxwellian; this is the starting equation to analyze the relaxation to equilibrium. To this end, as in Nonequilibrium Thermodynamics [20], from Eq. (44) we can establish the phenomenological relation

    f_n p_j − f_{eq,n} p_j = Σ_i L_{j,i} (F_i − F_i^eq),    (45)

where L_{j,i} is a phenomenological matrix which in general might depend on the nonequilibrium thermodynamic force (F_i − F_i^eq). In terms of the mobility M_{j,i} = L_{j,i}/(T f_n) we can rewrite Eq. (45) as

    J_j = T f_n Σ_i M_{j,i} (F_i − F_i^eq),    (46)

where we have defined the current J_j = f_n p_j − f_{eq,n} p_j, so that J_j and (F_i − F_i^eq) constitute a pair of conjugated current and thermodynamic force, respectively.
Finally, introducing the phenomenological relation (45) into Eq. (44) we obtain

    dS/dt = (1/mT) Σ_n Σ_{j,i} Tr( (F_j − F_j^eq) · L_{j,i} (F_i − F_i^eq) ) ≥ 0,    (47)

which is non-negative for a positive-definite phenomenological matrix L_{j,i} (and hence a positive friction), constituting the law of increase of entropy.
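With linear phenomenological laws and a positive-definite matrix L (an assumption of this sketch, as in the text), the quadratic form of the entropy production is manifestly non-negative and vanishes only at equilibrium.

```python
import numpy as np

rng = np.random.default_rng(3)

# A symmetric positive-definite phenomenological matrix L (assumption).
n = 6
A = rng.standard_normal((n, n))
L = A @ A.T + n * np.eye(n)
T = 300.0

# Entropy production Sdot ~ (F - F_eq) . L (F - F_eq) / T for arbitrary forces.
for _ in range(100):
    x = rng.standard_normal(n)      # thermodynamic force F - F_eq
    Sdot = x @ L @ x / T
    assert Sdot >= 0.0

# At equilibrium the force vanishes and so does the entropy production.
assert np.isclose(np.zeros(n) @ L @ np.zeros(n) / T, 0.0)
```

The sign of the entropy production is thus inherited entirely from the positivity of L, not from the microscopic dynamics, which is where the macroscopic time arrow enters.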

V. RELAXATION EQUATIONS
In this section we will analyze the relaxation to equilibrium by deriving the relaxation equation for the one-particle reduced distribution function. To obtain such an equation we introduce the inverse mobility or friction matrix ζ^eq_{j,i}, defined through Σ_{i=1}^{n} ζ^eq_{j,i} M^eq_{i,l} = δ_{j,l}, which allows us to invert the near-equilibrium version of Eq. (46),

    F_j − F_j^eq = (1/(T f_{eq,n})) Σ_i ζ^eq_{j,i} J_i.    (48)

At this point it will be useful to introduce the physical volume V of the system as a scale factor; thus we will redefine the reduced distribution functions [13],

    f̂_n ≡ V^n f_n.    (49)
Additionally, we must also redefine the forces, writing F̂_i/V and F̂_i^eq/V instead of F_i and F_i^eq. Hence, for n = 1, we obtain from Eq. (8)

    ∂f̂_1/∂t + (p_1/m) · ∂f̂_1/∂q_1 = (1/V) ∫ F̂_{1,2} · (∂f̂_2/∂p_1) dx_2.    (50)

Thus, by using Eqs. (43), (48) and (50) we obtain the kinetic equation

    ∂f̂_1/∂t + (p_1/m) · ∂f̂_1/∂q_1 + (1/V) F̂_1^eq · ∂f̂_1/∂p_1 = −(N/V) ζ (f̂_1 − f̂_{eq,1}),    (51)

where ζ ≡ ζ^eq_{2,1}. In the thermodynamic limit (N → ∞, V → ∞ with N/V fixed), this becomes

    ∂f_1/∂t + (p/m) · ∂f_1/∂q + F^eq · ∂f_1/∂p = −ρζ (f_1 − f_{eq,1}),    (52)

with ρ = N/V being the density. This equation constitutes a generalization of the Bhatnagar-Gross-Krook (BGK) relaxation model [9].
To illustrate the approach to equilibrium, let us write

    f_1(q, p, t) = φ(q, t) ψ_q(p).    (53)

By introducing the factorization given through Eq. (53) into Eq. (52), after integration in p we obtain

    ∂φ/∂t + ∂J/∂q = 0,    (54)

where

    J(q, t) = ∫ (p/m) f_1(q, p, t) dp    (55)

is the current of the probability density φ(q, t), or first moment of the density ψ_q, which satisfies the equation

    ∂J/∂t + (1/m) (∂/∂q) ∫ (p²/m) f_1 dp − (1/m) F^eq φ = −ρζ J.    (56)

Here, for times t ≫ (ρζ)^{-1}, Eq. (56) leads to

    J = −(1/ρζ) [ (1/m) (∂/∂q) ∫ (p²/m) f_1 dp − (1/m) F^eq φ ],    (57)

which, substituted into Eq. (54) and assuming that ψ_q is a local Maxwellian such that ∫ (p²/m²) ψ_q dp = k_B T/m, gives

    ∂φ/∂t = L_q φ,    (58)

where the linear differential operator L_q is given through

    L_q φ = (k_B T/(mρζ)) (∂/∂q) [ ∂φ/∂q − φ (∂/∂q) ln φ_eq ].    (59)

This operator contains a term k_B T (∂/∂q) ln φ_eq that plays the role of the thermal force usually introduced in polymer dynamics [21].
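A minimal numerical sketch (assumed units, a Gaussian equilibrium density, and a conservative finite-difference discretization, none of which come from the paper) integrates a relaxation equation of this Fokker-Planck type and shows the approach of φ(q, t) to φ_eq(q).

```python
import numpy as np

# Smoluchowski-type relaxation: d(phi)/dt = D d/dq [ phi_eq d/dq (phi/phi_eq) ].
q = np.linspace(-4.0, 4.0, 201)
dq = q[1] - q[0]
phi_eq = np.exp(-q**2 / 2.0)
phi_eq /= phi_eq.sum() * dq            # normalized equilibrium density

phi = np.exp(-(q - 1.5)**2 / 0.5)      # out-of-equilibrium initial condition
phi /= phi.sum() * dq

D = 1.0
dt = dq**2 / (4.0 * D)                 # stable explicit Euler step
d0 = np.abs(phi - phi_eq).sum() * dq   # initial distance to equilibrium

for _ in range(20000):
    r = phi / phi_eq
    eq_mid = 0.5 * (phi_eq[1:] + phi_eq[:-1])
    flux = D * eq_mid * np.diff(r) / dq    # flux between cells; zero at walls
    phi[1:-1] += dt * np.diff(flux) / dq
    phi[0] += dt * flux[0] / dq
    phi[-1] -= dt * flux[-1] / dq

assert np.isclose(phi.sum() * dq, 1.0)              # probability conserved
assert np.abs(phi - phi_eq).sum() * dq < 0.01 * d0  # relaxed to phi_eq
```

The conservative discretization has φ_eq as its exact stationary state, so the scheme relaxes all the way to equilibrium rather than to a discretization-shifted profile.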
In the next section, starting from Eq. (58) and using the properties of L_q defined through Eq. (59), we will study the approach to equilibrium.

VI. APPROACH TO EQUILIBRIUM
The differential operator L_q introduced in the previous section is a non-Hermitian operator whose Hermitian conjugate L_q† is defined through

    ∫ dq g(q) L_q h(q) = ∫ dq h(q) L_q† g(q).    (60)

Its right- and left-hand eigenfunctions, ω_p(q) and Ω_p(q), are defined through

    L_q ω_p = −λ_p ω_p,   L_q† Ω_p = −λ_p Ω_p,    (61)

and we choose them to be orthonormal, ∫ dq Ω_p(q) ω_r(q) = δ_{p,r}. As has been said in Section III, the equilibrium distribution is an eigenfunction with eigenvalue 0 of the evolution operator; here L_q φ_eq = 0 and, correspondingly, Ω_0 = 1. Thus it is possible to write

    ω_p(q) = φ_eq(q) Ω_p(q),    (62)

and, multiplying the eigenvalue equation by Ω_p and integrating, one obtains

    λ_p = (k_B T/(mρζ)) ∫ dq φ_eq(q) (∂Ω_p/∂q)² ≥ 0,    (63)

where the right-hand side in the last equality has been obtained by using Eq. (59) and integrating by parts. Also, the positivity of ζ, discussed in the context of Eq. (47), has been taken into account. Now any distribution φ(q, t) can be expanded in terms of the eigenfunctions,

    φ(q, t) = Σ_p α_p ω_p(q) e^{−λ_p t},    (64)

where, taking into account the orthonormality condition,

    α_p = ∫ dq Ω_p(q) φ(q, 0).    (65)

Since Ω_0 = 1 and φ(q, t) should be normalized, α_0 = 1. Therefore,

    φ(q, t) → φ_eq(q)   for t → ∞,    (66)

showing that after a long period of time equilibrium is eventually reached.
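The spectral picture can be checked on a discretized stand-in for L_q (same Smoluchowski-type form as in the previous section, an assumption of this sketch): the matrix has a single zero eigenvalue whose right eigenvector is φ_eq and whose left eigenvector is constant, while all other eigenvalues are strictly negative.

```python
import numpy as np

# Conservative discretization of L_q into a matrix M with d(phi)/dt = M phi.
q = np.linspace(-4.0, 4.0, 121)
dq = q[1] - q[0]
phi_eq = np.exp(-q**2 / 2.0)
phi_eq /= phi_eq.sum() * dq

n = len(q)
M = np.zeros((n, n))
for i in range(n - 1):              # flux between cells i and i+1
    w = 0.5 * (phi_eq[i] + phi_eq[i + 1]) / dq**2
    M[i, i] -= w / phi_eq[i];         M[i, i + 1] += w / phi_eq[i + 1]
    M[i + 1, i + 1] -= w / phi_eq[i + 1]; M[i + 1, i] += w / phi_eq[i]

vals, vecs = np.linalg.eig(M)
order = np.argsort(-vals.real)

assert abs(vals[order[0]]) < 1e-8          # one zero eigenvalue
assert vals.real[order[1]] < -1e-3         # the rest decay

# Right eigenfunction of eigenvalue 0 is phi_eq; left one is constant.
v0 = vecs[:, order[0]].real
v0 /= v0.sum() * dq
assert np.allclose(v0, phi_eq, atol=1e-6)
assert np.allclose(M.sum(axis=0), 0.0, atol=1e-9)   # columns sum to zero
```

The zero column sums encode probability conservation (the constant left eigenfunction Ω_0 = 1), and the spectral gap sets the slowest relaxation rate toward φ_eq.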

VII. CONCLUSIONS
We find that the description of an N-body isolated system in the framework of the BBGKY hierarchy enables us to prove that the nonequilibrium entropy is not a constant of the motion. We emphasize that the nonequilibrium entropy should be defined as a convex functional of the distribution vector. Our contention is that the adequate functional is the one given in Eq. (24). Moreover, this description reconciles the reversibility of the Hamiltonian dynamics with the approach to equilibrium.
Due to the periodic character of the solutions of the microscopic equations given through Eqs. (15) and (16), we find that the nonequilibrium entropy corresponds to a dynamical system that behaves as a free oscillator, an 'entropic oscillator', with well-established bounds determined by the equilibrium entropy, which is the maximum entropy. We have also managed to construct the Hamiltonian for this entropic oscillator. Hence, the approach to equilibrium occurs when the entropy production is positive, i.e., when the dynamical system is climbing the walls of the elastic potential defined in Eq. (35). In other words, the entropy production is positive when the balance of forces appearing in the integrand of Eq. (42) is opposed to the velocity, thus preventing the expansion of the N-body system in phase space.
The natural extension of our theory to the study of non-isolated systems, i.e. dissipative N-body systems, would be to consider a damped entropic oscillator instead of a free oscillator. In the case of the damped entropic oscillator the system collapses onto the equilibrium state, which is the attractor of the dynamics.
Performing a nonequilibrium thermodynamic analysis we are able to derive relaxation equations of the Fokker-Planck type, particularly for the one-particle distribution function.
Finally, through a spectral analysis we show how these equations describe the approach to equilibrium.