
Computational methods inspired by Tsallis statistics: Monte Carlo and molecular dynamics algorithms for the simulation of classical and quantum systems

Abstract

Tsallis's generalization of statistical mechanics is summarized. A modification of this formalism which employs a normalized expression of the q-expectation value for the computation of equilibrium averages is reviewed for the cases of pure Tsallis statistics and Maxwell-Tsallis statistics. Monte Carlo and Molecular Dynamics algorithms which sample the Tsallis statistical distributions are presented. These methods have been found to be effective in the computation of equilibrium averages and isolation of low-lying energy minima for low temperature atomic clusters, spin systems, and biomolecules. A phase space coordinate transformation is proposed which connects the standard Cartesian positions and momenta with a set of positions and momenta which depend on the potential energy. It is shown that pure Tsallis statistical averages in this transformed phase space result in the q-expectation averages of Maxwell-Tsallis statistics. Finally, an alternative novel derivation of the Tsallis statistical distribution is presented. The derivation begins with the classical density matrix, rather than the Gibbs entropy formula, but arrives at the standard distributions of Tsallis statistics. The result suggests a new formulation of imaginary time path integrals which may lead to an improvement in the simulation of equilibrium quantum statistical averages.



John E. Straub1,2 and Ioan Andricioaei1,3

1 Department of Chemistry, Boston University, Boston, MA 02215

2 Institute for Advanced Studies, The Hebrew University of Jerusalem, Givat Ram 91904 Israel

3 Present address: Department of Chemistry and Chemical Biology,

Harvard University, 12 Oxford Street, Cambridge, MA 02138

Received 07 December, 1998


I Introduction

Ten years ago, Tsallis published his seminal work on a possible generalization of the standard Gibbs-Boltzmann statistical mechanics [1]. His intriguing theory began with a reexpression of the Gibbs-Shannon entropy formula

where dΓ = dr^N dp^N is a phase-space increment. On the right of this expression, the identity ln x = lim_{n→0} (x^n - 1)/n has been used to transform the logarithm, and q is a real number [1,2]. A similar result was previously presented in the context of generalized information entropies but had apparently not been applied to describe the physical world [3]. Tsallis's bold move was to do just that.
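Stripping away the limit at finite n = q - 1 yields the generalized entropy in its standard form; the two expressions, restated here in notation consistent with Ref. [1], are

```latex
S \;=\; -k \int d\Gamma\, p(\Gamma)\ln p(\Gamma)
  \;=\; -k \int d\Gamma\, p(\Gamma)\,\lim_{n\to 0}\frac{p^{\,n}(\Gamma)-1}{n},
\qquad
S_q \;=\; k\,\frac{1-\int d\Gamma\, p^{\,q}(\Gamma)}{q-1}.
```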

Tsallis noted a number of properties of Sq, which he referred to as a "generalized entropy,'' and the associated statistical distributions. He found that much of the standard mathematical structure of Gibbs-Boltzmann statistical mechanics is preserved. This is interesting in itself. However, even more interesting is what is not preserved. Thermodynamic state functions such as the entropy and energy are no longer extensive functions of the system. This prompted the use of a generalized formalism based on the non-additive function Sq to derive, for non-extensive systems, a variety of results of the standard statistical mechanics (see [4] and references therein).

II "Pure'' Tsallis statistics

Tsallis derived the probability of finding the system at a given point in phase space Γ = (r^N, p^N) by extremizing Sq subject to the constraints

where H(G) is the system Hamiltonian for N distinguishable particles in d dimensions. The result is

where

plays the role of the canonical ensemble partition function. Using the identity lim_{n→0} (1 + an)^{1/n} = exp(a), in the limit q → 1 the standard probability distribution of classical Gibbs-Boltzmann statistical mechanics

is recovered. However, there is a problem. For certain values of q and a harmonic potential, the distribution p_q(Γ) has infinite variance and divergent higher moments.
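The q → 1 limit can be checked numerically. The sketch below evaluates the unnormalized Tsallis weight in its standard form, [1 - (1 - q)βE]^{1/(1-q)}, and confirms that it approaches the Boltzmann factor exp(-βE) as q → 1; the sample values of β and E are arbitrary choices for illustration.

```python
import math

def tsallis_factor(energy, beta, q):
    """Unnormalized Tsallis weight [1 - (1-q)*beta*E]**(1/(1-q)).
    Via (1 + a*n)**(1/n) -> exp(a) as n -> 0, it reduces to the
    Boltzmann factor exp(-beta*E) as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(-beta * energy)
    base = 1.0 - (1.0 - q) * beta * energy
    if base <= 0.0:          # outside the allowed region the weight vanishes
        return 0.0
    return base ** (1.0 / (1.0 - q))

beta, E = 1.0, 2.0
for q in (2.0, 1.5, 1.1, 1.001):
    print(q, tsallis_factor(E, beta, q))
print("Boltzmann:", math.exp(-beta * E))
```

For q > 1 the weight decays as a power law in the energy rather than exponentially, which is the origin of the delocalized distributions exploited below.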

II.1 Introduction of the "q-expectation'' value

To address this problem, Tsallis derived the statistical probability by extremizing Sq subject to the modified constraints

where the averaged energy is defined using a "q-expectation'' value. The result is

with the "partition function''

In the limit that q = 1 the classical canonical density distribution is recovered.

To be consistent, the q-expectation value is also used to compute the average of an observable A

However, since the averaging operator is not normalized, the q-expectation value of unity is not equal to one in general for q ≠ 1. Moreover, it is necessary to compute Zq to determine the average. To avoid this difficulty, a different generalization of the canonical ensemble average was proposed [5]

It is obviously normalized and convenient to apply.
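For a discrete toy distribution the difference between the two conventions is easy to exhibit. The sketch below (the three-state distribution is an arbitrary example) shows that the q-expectation of unity equals one only in the normalized form:

```python
def q_expectation(a_vals, p_vals, q, normalized=True):
    """q-expectation of observable A over distribution p:
    unnormalized form is sum_i p_i**q * A_i; the normalized
    form divides by sum_i p_i**q."""
    weights = [p ** q for p in p_vals]
    num = sum(w * a for w, a in zip(weights, a_vals))
    return num / sum(weights) if normalized else num

p = [0.5, 0.3, 0.2]           # a toy normalized three-state distribution
ones = [1.0, 1.0, 1.0]
unnorm = q_expectation(ones, p, q=1.5, normalized=False)
norm = q_expectation(ones, p, q=1.5, normalized=True)
print(unnorm, norm)
```

The unnormalized average of unity falls below one for q > 1, while the normalized form returns exactly one for any q.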

II.2 Monte Carlo estimates of Tsallis statistical averages

The question arises, how might we simulate systems that are well described by the Tsallis statistical distributions when q ≠ 1? A point in phase space or configuration space can be said to have the probability

where the effective potential energy is given by [1]

From this expression it is possible to state a simple Monte Carlo algorithm which samples the equilibrium Tsallis statistical distribution. (1) A new configuration is chosen uniformly at random within a trial region. (2) The point in configuration space is accepted or rejected according to the criterion

where the change in the effective potential energy is evaluated between the trial and current configurations. (3) Repeat the previous steps. This Monte Carlo algorithm satisfies detailed balance and eventually samples the equilibrium Tsallis distribution.
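The algorithm can be sketched in one dimension, assuming the effective-potential form \bar{U} = (q/β(q-1)) ln[1 - (1 - q)βU] used in Refs. [5,6]; the double-well potential, step size, and parameter values below are illustrative assumptions, not taken from the text.

```python
import math, random

def u(x):
    """Illustrative 1-d double-well potential (an assumption for the sketch)."""
    return (x * x - 1.0) ** 2

def u_eff(x, beta, q):
    """Effective potential whose Boltzmann factor is the Tsallis weight (q > 1)."""
    return (q / (beta * (q - 1.0))) * math.log(1.0 - (1.0 - q) * beta * u(x))

def tsallis_mc(nsteps, beta=5.0, q=1.2, delta=0.5, seed=0):
    """Metropolis walk accepting on the change in the effective potential."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(nsteps):
        x_new = x + rng.uniform(-delta, delta)          # uniform local trial move
        d = u_eff(x_new, beta, q) - u_eff(x, beta, q)   # effective-energy change
        if d <= 0.0 or rng.random() < math.exp(-beta * d):
            x = x_new                                   # Metropolis acceptance
        samples.append(x)
    return samples

samples = tsallis_mc(20000)
```

Because the effective potential flattens the barrier at x = 0 for q > 1, the walk hops between the two wells far more readily than a standard Boltzmann walk at the same temperature.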

II.3 Monte Carlo estimates of Gibbs-Boltzmann statistical averages

While this Monte Carlo algorithm will sample the generalized statistical distributions, it can also be used to effectively sample the configuration space that is statistically important to the standard canonical ensemble probability density [6]. The first method of this kind was the "q-jumping Monte Carlo'' method. (1) At random a choice is made to make a uniformly distributed "local'' move, with probability 1-PJ, or a global "jump'' move, with probability PJ. (2a) The local trial move is sampled from a uniform distribution, for example from a cube of side Δ. (2b) The jump trial move is sampled from the generalized statistical distribution

at q > 1. (3a) The local move is accepted or rejected by the standard Metropolis acceptance criterion with probability

(3b) The jump trial move is accepted or rejected according to the probability

where the bias due to the non-symmetric trial move has been removed. This algorithm satisfies detailed balance. The delocalized Tsallis statistical distributions are sampled to provide long range moves between well-separated but thermodynamically significant basins of configuration space. Such long range moves are randomly shuffled with short range moves which provide good local sampling within the basins. However, the Monte Carlo walk samples the equilibrium Gibbs-Boltzmann distribution.
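The q-jumping scheme can be sketched on a discrete state space, where the Tsallis trial distribution can be sampled exactly; the four-state energy ladder and all parameter values below are illustrative assumptions.

```python
import math, random

def q_jump_mc(energies, nsteps, beta=3.0, q=2.0, p_jump=0.1, seed=1):
    """q-jumping Metropolis walk on a discrete set of states (a sketch).
    Local moves are nearest-neighbor on a ring; jump moves are drawn from
    the Tsallis weights w_q(i) = [1 - (1-q)*beta*E_i]**(1/(1-q)), and the
    non-symmetric trial bias is removed in the acceptance ratio so the
    chain samples the Boltzmann distribution exp(-beta*E_i)."""
    n = len(energies)
    wq = [(1.0 - (1.0 - q) * beta * e) ** (1.0 / (1.0 - q)) for e in energies]
    rng = random.Random(seed)
    i = 0
    visits = [0] * n
    for _ in range(nsteps):
        if rng.random() < p_jump:
            j = rng.choices(range(n), weights=wq)[0]   # global Tsallis jump
            ratio = math.exp(-beta * (energies[j] - energies[i])) * wq[i] / wq[j]
        else:
            j = (i + rng.choice([-1, 1])) % n          # symmetric local move
            ratio = math.exp(-beta * (energies[j] - energies[i]))
        if rng.random() < min(1.0, ratio):
            i = j
        visits[i] += 1
    return visits

# Two low-energy "basins" (states 0 and 2) separated by high-energy states.
visits = q_jump_mc([0.0, 2.0, 0.1, 2.0], nsteps=50000)
```

The jump moves carry the walk directly between the two basins without climbing through the intervening high-energy states, while the recorded visit counts remain Boltzmann distributed.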

When q > 1, it has been shown that the generalized q-jumping Monte Carlo trajectories will cross barriers more frequently and explore phase space more efficiently than standard Monte Carlo (without jump moves). (For a review of recent methods for enhanced phase-space sampling see [7].) We have shown how this property can be exploited to derive effective Monte Carlo methods which provide significantly enhanced sampling relative to standard methods.

II.4 Problems for many-body systems

Consider a system of N particles in d dimensions. Using the standard procedure of integrating over the momenta in Cartesian coordinates, we can write the average of a mechanical property A(rN) using Eq. (11) as [5]

This definition of the normalized statistical average is based on and proportional to the q-expectation value. However, it is more useful since it is not necessary to evaluate the partition function to compute the average. Nevertheless, this result is unsatisfactory. It is poorly behaved in the N → ∞ thermodynamic limit. We will address this problem in the next section in the context of Maxwell-Tsallis statistics.

III Maxwell-Tsallis statistics

For many systems for which it is difficult to define effective Monte Carlo trial moves, Molecular Dynamics methods can be used to compute equilibrium averages. The desire to create such a molecular dynamics algorithm to sample the Tsallis statistical distribution led to the use of Maxwell-Tsallis statistics [6,5]. The equilibrium distribution is taken to be a hybrid product of (1) a Tsallis statistical distribution (for the configurations) and (2) a Maxwell distribution (for the momenta) as

where

is the transformed potential and

is the kinetic energy of the N-body system defined in the usual way.

III.1 Ensemble averages

Consider a system of N particles in d dimensions. Using the standard procedure of integrating over the momenta in Cartesian coordinates, we can write the average of a mechanical property A(rN) using Eq. (11) as

This definition is based on and proportional to the q-expectation value. This result lacks the odd N-dependence in the exponent of the configurational weighting function found for the case of pure Tsallis statistics in Eq. (18). While it is not clear which result is "right,'' this expression is certainly more satisfying in the N → ∞ thermodynamic limit.

III.2 Molecular Dynamics estimates of Maxwell-Tsallis statistical averages

Standard Molecular Dynamics for an ergodic system generates time averages which agree with averages taken over a microcanonical distribution. To compute averages such as Eq. (22) for the generalized canonical ensemble probability density using MD, we will employ a trick. We define a molecular dynamics algorithm such that the trajectory samples the distribution Pq(rN) by having the trajectory move on a temperature dependent, but static, effective potential [6]. The equation of motion takes on a simple and suggestive form

for a particle of mass mk at position rk, with the effective potential defined by Eq. (13). It is known that in the canonical ensemble a constant-temperature molecular dynamics algorithm generates samples from the configuration space according to the Boltzmann probability. As a result, this generalized molecular dynamics will sample the Tsallis statistical distribution Pq(rN).

The effective force employed is the "exact'' force for standard molecular dynamics, -∇rk U, scaled by

which is a function of the potential energy. This scaling function is unity when q = 1 but can otherwise have a strong influence on the dynamics. Assume that the potential is defined to be a positive function. In the regime q > 1, the scaling function αq(rN, β) is largest near low-lying minima of the potential. In barrier regions, where the potential energy is large, the scaling function αq(rN, β) is small. This has the effect of reducing the magnitude of the force in the barrier regions. A particle attempting to pass over a potential energy barrier will meet with less resistance when q > 1 than when q = 1. At equilibrium, this leads to more delocalized probability distributions with an increased probability of sampling barrier regions.
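Taking \bar{U} = (q/β(q-1)) ln[1 - (1 - q)βU] as in Ref. [6], the chain rule gives the scaling factor explicitly as αq = q/[1 + (q - 1)βU]; a small numerical check of the limits described above (the sample energies are arbitrary):

```python
def alpha_q(u, beta, q):
    """Force-scaling factor: with U_eff = (q/(beta*(q-1))) * ln[1 + (q-1)*beta*U],
    the chain rule gives grad(U_eff) = alpha_q * grad(U), where
    alpha_q = q / (1 + (q-1)*beta*U)."""
    return q / (1.0 + (q - 1.0) * beta * u)

beta, q = 1.0, 1.5
print(alpha_q(0.0, beta, q))    # near a minimum (U ~ 0): factor ~ q > 1
print(alpha_q(50.0, beta, q))   # high-energy barrier region: force damped
```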

III.3 Molecular dynamics estimates of Gibbs-Boltzmann statistical averages

The generalized molecular dynamics described above generates trajectories which, averaged over time, sample the Tsallis statistical distribution. To compute averages over the Gibbs-Boltzmann distribution we reweight each measurement as

Using this expression, the standard (q = 1) Gibbs-Boltzmann equilibrium average properties may be calculated over a trajectory which samples the generalized statistical distribution for q ≠ 1, with the advantage of enhanced sampling for q > 1.
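A sketch of the reweighting estimator, assuming the trajectory samples the configurational weight [1 + (q - 1)βU]^{-q/(q-1)} (the Maxwell-Tsallis form used above); the per-frame energies and observable values passed in are placeholders supplied by the simulation.

```python
import math

def boltzmann_reweight(energies, observables, beta, q):
    """Reweight frames from a Tsallis-sampling (q != 1) trajectory to a
    Gibbs-Boltzmann (q = 1) average.  Each frame with potential energy U
    carries weight exp(-beta*U) * [1 + (q-1)*beta*U]**(q/(q-1)), i.e. the
    Boltzmann factor divided by the sampled configurational weight."""
    num = den = 0.0
    for u_val, a_val in zip(energies, observables):
        w = math.exp(-beta * u_val) * (1.0 + (q - 1.0) * beta * u_val) ** (q / (q - 1.0))
        num += w * a_val
        den += w
    return num / den
```

A quick sanity check: reweighting a constant observable returns that constant, and frames with equal energies are weighted equally.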

This method leads to an enhanced sampling of conformation space. However, it suffers a bit from the ease with which the trajectory moves through high energy regions in the q > 1 regimes [6]. Most of those regions of high potential energy are not thermodynamically important. It is good to visit them, as the q-jumping Monte Carlo does, but only on the way to another thermodynamically important region. The q-jumping Monte Carlo method has the advantage that the trajectory samples the Gibbs-Boltzmann distribution (no reweighting is necessary). The walk is compelled to spend time in thermodynamically significant regions.

This equation of motion is consistent with Tsallis statistics in as much as a long time dynamical average for an ergodic system will provide results identical to the average over the Tsallis statistical distribution. However, it cannot be said to tell us about the true dynamics of the system. In the next section, we present an alternative interpretation of the origin of Maxwell-Tsallis statistics.

III.4 From Newton's equation to Tsallis statistics through a deformation of space

The Tsallis statistical distribution is typically derived from an extremization of the reformulated Gibbs entropy. As we have shown, it is possible to derive a microscopic dynamics that generates time averages which are in agreement with statistical averages over the Tsallis distribution. However, there is no unique way to do this. We have presented one method based on the Maxwell-Tsallis model. The momenta are treated in the standard way and the dynamics is a normal Hamiltonian flow over an effective potential energy function.

A second possibility is to begin with the Hamiltonian

in the phase space GT = (rT, pT). We couple this expression with two definitions. First, the modified potential energy UT is defined through the equality

For example, if U(r) = r^2 and rT = r^2 then UT(rT) = rT. Second, the transformation between the Cartesian coordinate xk and its transformed counterpart is defined

where δjk is the Kronecker delta, and similarly

As these definitions indicate, the Jacobian transformation matrix is diagonal and in fact can be written

The determinant of this Jacobian matrix is simply

where d is the dimension of the space in which the N-particle system is found. This determinant describes how the incremental volume of configuration space in the standard Cartesian coordinates, (dV)^N = dx1 dy1 … dyN dzN, is transformed to the increment (dVT)^N in the Tsallisian space.

How does this metric deform the ordinary Cartesian space? When q = 1, we recover the standard metric. When q > 1, in regions of high potential energy, the distance between two points will be effectively dilated. This leads to a dilation of the barrier and saddle regions relative to regions of lower potential energy and a reduction in the associated force. Note that the usual caveats apply. The degree of both the absolute and relative scaling will depend on the zero of energy, and we must ignore those cases where the scaling factor is negative.

With these definitions it is possible to rewrite the Hamiltonian as

The equations of motion for the kth atom are

which results in

Rewriting this expression using the metric transformation defined above we obtain

which we can recognize as

This is precisely Newton's equation that we found using Maxwell-Tsallis statistics - the standard momentum for a particle moving over an effective potential. So this coordinate transformation (r, p) → (rT, pT) maps the Hamiltonian dynamics of r on the effective potential onto a Hamiltonian dynamics of rT on UT(rT).

If we insert this Hamiltonian in our definition of the expectation value of a property A(r) we find

which, after a change of variables for the integration, becomes

Performing the integral over the momenta we find

which we found earlier using Maxwell-Tsallis statistics. However, the Maxwell-Tsallis statistics was an "impure'' application of the formalism since the kinetic energy was assumed to have a Gaussian Maxwell distribution.

Recall that in the application of "pure'' Tsallis statistics we found

This result was deemed to be unsatisfactory due to the unwelcome dependence on N in the thermodynamic limit (N ® ¥). Our new result allows us to follow a pure application of the Tsallis statistics using an effective Hamiltonian that both satisfies Newton's equations of motion on the ordinary potential and leads to an intuitively satisfying definition of a statistical average.

IV Another path to the Tsallis statistical distributions

We begin with the classical density matrix exp(-βH). Rather than writing

as is commonly done in discussions of path integral representations of the density matrix, suppose that we express the exponential as a limit

Now suppose that we remove the kernel of the limit

and consider it for arbitrary P. If we substitute

P = 1/(q - 1), we find that the sequence P = 1, 2, 3, 4, … ∞ becomes q = 2, 3/2, 4/3, 5/4, … 1, and

The right hand side of this expression is the Tsallis statistical distribution which was originally derived by extremizing the "entropy'' Sq subject to the constraints that the distribution is normalized and that the average energy is computed in the standard way.

Now suppose that we instead define

so that P = 1, 2, 3, 4, … ∞ again becomes q = 2, 3/2, 4/3, 5/4, … 1. The resulting distribution is

The right hand side of the expression is precisely the Tsallis statistical distribution pq(G) derived by extremizing Sq subject to the constraints that the distribution is normalized and the average energy is defined in terms of the "q-expectation'' value.

How can we interpret these results? Tsallis showed how a generalized statistics can originate from the Gibbs entropy formula and the identity ln x = lim_{n→0} (x^n - 1)/n. He then stripped away the limit and interpreted the kernel at finite n, where n = q - 1,

However, it is possible to reach the same distribution from a different starting point - the equilibrium density matrix. We rewrite the density matrix using the identity exp(-h) = lim_{P→∞} (1 + h/P)^{-P}. We then strip away the limit and interpret the kernel at finite P, where P = 1/(q - 1),

in the spirit of Tsallis statistics.

For the latter expression, derived using the constraint based on the q-expectation value, when P = 1 we have the interesting case of q = 2. In the limit P → ∞, we recover the Gibbs-Boltzmann statistical distribution. Intermediate values of P provide cases in between these limits.
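The convergence of the kernel to the Boltzmann factor, and the q = 2 case at P = 1, can be verified directly; a short numerical sketch with an arbitrary value of βH:

```python
import math

def kernel(beta_h, P):
    """Finite-P kernel (1 + beta*H/P)**(-P); exp(-beta*H) is recovered
    as P -> infinity.  At P = 1 it is the q = 2 Tsallis factor 1/(1 + beta*H)."""
    return (1.0 + beta_h / P) ** (-P)

bh = 1.5                       # an arbitrary value of beta*H
for P in (1, 2, 10, 100, 10000):
    print(P, kernel(bh, P))
print("exp:", math.exp(-bh))
```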

IV.1 Recovering Maxwell-Tsallis statistics

Think of the classical density matrix

where we have separated the kinetic energy and potential energy contributions. Now suppose that we carry out the approximation described above for the Boltzmann factor alone. We find

This is precisely the expression for the Maxwell-Tsallis statistical distribution considered earlier where P = 1/(q-1).

IV.2 Tsallis statistics and Feynman path integral quantum mechanics

The quantum mechanical density matrix can be written

If we restrict ourselves to finite P, the kernel is only approximate since K and U do not, in general, commute. In the path integral formalism, P is interpreted as the number of pseudoparticles in the isomorphic classical system. When P = 1, one recovers the fully classical system. When P = ∞, one obtains an exact representation of the quantum mechanical system. For intermediate values of P we have a semi-classical representation of the system.

Suppose that we write

This result provides us with a path integral representation akin to the classical Maxwell-Tsallis statistics where the pseudoparticle "necklace'' of beads, each harmonically restrained to its nearest neighbors, interact in imaginary time through the logarithmic potential

Since

this is an exact expression for the density matrix. It is possible to employ this expression in path integral simulations of condensed phase systems. Ordinarily, in systems where quantum effects are most important, one must approach large values of P before there is convergence to the exact quantum mechanical average. As we have shown, if the necklace samples the Tsallis statistical distribution it should visit regions of higher potential energy more frequently, and the distribution should be significantly more delocalized than the standard representation for the same number of necklace beads P. This implies that this representation might provide faster convergence to the quantum mechanical limit than the standard form.

We gratefully acknowledge the National Science Foundation for support (CHE-9632236), the Center for Scientific Computing and Visualization at Boston University for computational resources, Troy Whitfield for helpful discussions, and the Institute for Advanced Studies at The Hebrew University of Jerusalem for support and hospitality.

References

1. C. Tsallis, J. Stat. Phys. 52, 479 (1988).
2. E.M.F. Curado and C. Tsallis, J. Phys. A: Math. Gen. 24, L69 (1991).
3. A. Wehrl, Rev. Mod. Phys. 50, 221 (1978); Z. Daróczy, Inf. Control 16, 36 (1970).
4. C. Tsallis, Phys. Rev. Lett. 75, 3589 (1995). See also http://tsallis.cat.cbpf.br .
5. J.E. Straub and I. Andricioaei, "Exploiting Tsallis statistics," in Algorithms for Macromolecular Modeling (Lecture Notes in Computational Science and Engineering, vol. 4), edited by P. Deuflhard, J. Hermans, B. Leimkuhler, A. Mark, S. Reich, and R.D. Skeel (Springer Verlag, 1998), p. 189.
6. I. Andricioaei and J.E. Straub, J. Chem. Phys. 107, 9117 (1997).
7. B.J. Berne and J.E. Straub, Curr. Opin. Struct. Biol. 7, 181 (1997).

Publication Dates

  • Publication in this collection
    17 Sept 1999
  • Date of issue
    Mar 1999

Sociedade Brasileira de Física, São Paulo, SP, Brazil