# The Ising Model

Published 02/24/2016

## Introduction

The Ising model is a mathematical model of ferromagnetism in statistical mechanics, and is one of the simplest systems that exhibits a phase transition. In this post, we will explain the quantum origins of ferromagnetism and the Ising model. We will also demonstrate a Python program that simulates the Ising model by means of the Metropolis algorithm, a Monte Carlo method. Some useful properties, including order parameters and heat capacity, can be obtained from the data collected from the simulation.

Ferromagnetism is responsible for the behavior of magnets encountered in everyday life. For example, ferromagnetism is used in electromagnets, electric motors, generators, transformers, and magnetic storage media, such as tape recorders and hard disks.

Ferromagnetism arises from two sources: electron magnetic moments and exchange interactions. These effects are purely quantum mechanical and cannot be explained through classical physics (a result known as the Bohr-van Leeuwen theorem).

## Electron magnetic moments

The electron magnetic moment is the magnetic moment of an electron caused by its angular momentum and its electric charge. There are two contributions to the angular momentum of an electron: the spin angular momentum and the orbital angular momentum.

[Figure: Electron]

The spin angular momentum is intrinsic to the electron and arises from a property of elementary particles called spin. The spin of an electron can be measured as being in one of two states: spin up or spin down.

[Figure: Electron momentum]

The second contribution to the angular momentum of an electron is called the orbital angular momentum. The orbital angular momentum is a result of the electron orbiting the nucleus of the atom. The orbital angular momentum of an electron is similar to the angular momentum encountered in classical mechanics.

## The exchange interaction and external fields

The following formula expresses the magnetic moment, here denoted by the Greek letter $\mu$, of an electron:

$$\mu = g\frac{eJ}{2m_e}$$

Here, $e$ is the electric charge of the electron, $J$ is its total angular momentum, and $m_e$ is its mass. The letter $g$ stands for a correction term called the Landé g-factor. As expected, the magnetic moment is proportional both to the electric charge and to the angular momentum.
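As a quick sanity check of this formula, we can plug in the constants for a single electron spin, taking $J = \hbar/2$ and $g \approx 2.002$ (these particular values are our assumption for the check, not something used later in the post); the result should come out close to the Bohr magneton, about $9.27 \times 10^{-24}$ J/T:

```python
# Rough numerical check of mu = g * e * J / (2 * m_e) for a single electron spin.
# Constants are CODATA values; J is taken to be hbar / 2 for a spin-1/2 particle.
e = 1.602176634e-19      # elementary charge (C)
m_e = 9.1093837015e-31   # electron mass (kg)
hbar = 1.054571817e-34   # reduced Planck constant (J*s)
g = 2.00231930436        # electron spin g-factor

J = hbar / 2             # spin angular momentum of the electron
mu = g * e * J / (2 * m_e)
print(mu)                # ~9.28e-24 J/T, close to the Bohr magneton
```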

The presence of unpaired electrons and the alignment of adjacent spins (whether they are parallel or antiparallel) affect the electrons' locations and therefore their electrostatic repulsion. The resulting interaction between electron moments is called the exchange interaction. The exchange interaction creates an energy difference between configurations in which neighboring dipoles are parallel and those in which they are antiparallel.

[Figure: Spin alignment]

In ferromagnetic materials, nearby spins tend to align in the same direction. Having the same spin state reduces the electrostatic repulsion and is therefore more stable. Similarly, spins tend to align with an external magnetic field:

[Figure: Magnetic spin alignment]

$$U = -\mu \cdot B$$

Here $U$ is the potential energy of a dipole with magnetic moment $\mu$ in an external field $B$; the energy is lowest when the moment is aligned with the field.

## Magnetization

When the magnetic dipoles in a piece of matter are aligned, pointing in the same direction, their individually tiny magnetic fields add up together to create a much larger, stronger field that we can detect. This is called magnetization.

[Figure: Magnetization]

In a ferromagnetic material, the dipoles tend to align spontaneously, even in the absence of an external field. This is called spontaneous magnetization.

Here we have a graph depicting the magnetization curve for nickel nanowires at room temperature. On the horizontal axis is the external applied field (measured in milliteslas). On the vertical axis is the mass magnetization, expressed in ampere square meters per kilogram (A·m²/kg). As you can see, the magnetization increases when the applied field increases and decreases when it decreases, until the magnetization of the material cannot be increased or decreased further (saturation):

[Figure: Saturation]

## The Curie temperature

As the temperature of a ferromagnetic material increases, the thermal motion of the dipoles competes with their tendency to align. When the temperature rises beyond a certain point, called the Curie temperature, the material can no longer maintain a spontaneous magnetization, so its ability to be magnetized or attracted to a magnet disappears. This is an example of a second-order phase transition, one in which the discontinuity appears in the second derivative of the free energy.

[Figure: Curie temperature]

This graph shows the saturation magnetization (i.e. the magnetization obtained in a high magnetic field) of nickel (which is ferromagnetic) as a function of temperature. The saturation magnetization decreases with increasing temperature until it falls to zero at the Curie temperature, where the material becomes paramagnetic.

## The Ising model

The study of ferromagnetic phase transitions in the Ising model had a significant impact on the development of statistical physics. The 2D square-lattice Ising model in particular is one of the simplest statistical models to show a phase transition.

The Ising model consists of a graph (usually a lattice or grid) of spins that can be in one of two states: $1$ or $-1$. Each spin can interact with its neighbors. This picture shows an example of a two-dimensional spin lattice:

[Figure: Spin lattice]

Consider a set of lattice sites, here denoted by the Greek letter $\Lambda$. The spin state of a lattice site $i$ is denoted by $\sigma_i$ and can be either $1$ or $-1$:

$$\sigma_i \in \{1, -1\}$$

A spin configuration, here denoted by $\sigma$, is an assignment of a spin value to each and every lattice site:

$$\sigma = (\sigma_i \mid i \in \Lambda)$$
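Concretely, one convenient (though by no means unique) way to represent a configuration on an $L \times L$ square lattice in Python is a NumPy array of $\pm 1$ entries; the variable names here are just for illustration:

```python
import numpy as np

L = 20                                   # lattice has L x L sites
rng = np.random.default_rng(0)

# Random configuration: each site independently +1 or -1 (a high-temperature-like start)
sigma = rng.choice([-1, 1], size=(L, L))

# Fully ordered configuration: all spins aligned (a low-temperature-like start)
sigma_ordered = np.ones((L, L), dtype=int)
```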

## Spin interactions

For any two adjacent sites $i$ and $j$, there is an interaction term between them, denoted by $J_{ij}$. This interaction term determines how strongly the spins tend to align in the same direction (or point in opposite directions):

$$J_{ij} \mid i, j \in \Lambda$$

If the interaction term is positive, as in a ferromagnetic material, spins tend to be aligned (meaning that configurations in which adjacent spins are of the same sign have higher probability). If the interaction term is negative, as in an antiferromagnetic material, adjacent spins tend to have opposite signs. If the interaction term is zero, there is no interaction between neighboring spins:

- $J_{ij} > 0$: the interaction is ferromagnetic
- $J_{ij} < 0$: the interaction is antiferromagnetic
- $J_{ij} = 0$: the spins are noninteracting

Every site also has an external magnetic field interacting with it. The external field at a site $i$ is denoted by $h_i$ as follows:

$$h_i \mid i \in \Lambda$$

- $h_i > 0$: the spin at site $i$ tends to line up in the positive direction
- $h_i < 0$: the spin at site $i$ tends to line up in the negative direction
- $h_i = 0$: there is no external influence on the spin at site $i$

## Energy of a spin configuration

The energy, here denoted by $H$, of a spin configuration $\sigma$ is given by

$$H(\sigma) = -\sum_{(i,j) \in P} J_{ij} \sigma_i \sigma_j - \mu \sum_{i \in \Lambda} h_i \sigma_i$$

Here, $P$ denotes the set of pairs of adjacent spin sites, and $\mu$ again denotes the magnetic moment. The first sum is over pairs of adjacent spins (where every pair is counted once). Notice that, for positive $J_{ij}$, adjacent spins with the same sign result in lower energy than adjacent spins with opposite signs. Hence aligned spins have lower energy and are therefore more stable. The second sum is over each spin site. Here, again, if the spin direction and the direction of the external magnetic field are aligned, the energy is reduced.
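As a rough sketch of how this energy could be evaluated for a configuration stored as a 2D array, the snippet below assumes a uniform coupling $J_{ij} = J$, a uniform field with $\mu$ absorbed into $h$, and periodic boundary conditions; these simplifications (and the function name) are ours, not part of the general model:

```python
import numpy as np

def ising_energy(sigma, J=1.0, h=0.0):
    """Energy of a 2D spin configuration with uniform coupling J and uniform field h.

    Periodic boundary conditions are assumed, and each nearest-neighbor pair is
    counted exactly once (only the right and down neighbors of each site).
    """
    right = np.roll(sigma, -1, axis=1)   # right neighbor of every site
    down = np.roll(sigma, -1, axis=0)    # neighbor below every site
    pair_sum = np.sum(sigma * right) + np.sum(sigma * down)
    return -J * pair_sum - h * np.sum(sigma)
```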

The probability of the system being in an equilibrium state with configuration $\sigma$ at some temperature $\tau$ (expressed in energy units, i.e. $\tau = k_B T$) is given by

$$P(\sigma) = \frac{e^{-H(\sigma)/\tau}}{Z_\tau}$$

where $H(\sigma)$ is, again, the energy of the configuration $\sigma$. This is called the Boltzmann distribution. $Z_\tau$ is called the partition function. The partition function, in statistical mechanics, describes the statistical properties of a system in thermodynamic equilibrium. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives.

$$Z_\tau = \sum_{\sigma} e^{-H(\sigma)/\tau}$$

Here the partition function plays the role of a normalizing constant, ensuring that the probabilities sum up to one. In particular, the ratio of the probabilities of two states depends only on their difference in energy:

$$\frac{P(\sigma')}{P(\sigma)} = e^{\frac{H(\sigma) - H(\sigma')}{\tau}}$$

Using this model, in the limit of large numbers of spin sites, we can answer a number of statistical questions that are physically significant, such as the probability that neighboring sites have the same spin or at what temperature a phase transition occurs.
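For a lattice small enough to enumerate, the partition function and the Boltzmann probabilities can be computed exactly by brute force. The sketch below does this for a tiny periodic lattice, reusing the simplified conventions from the energy snippet above (uniform coupling, field absorbed into $h$), and checks that the probabilities sum to one; the function names are again ours:

```python
import itertools
import numpy as np

def ising_energy(sigma, J=1.0, h=0.0):
    right = np.roll(sigma, -1, axis=1)
    down = np.roll(sigma, -1, axis=0)
    return -J * (np.sum(sigma * right) + np.sum(sigma * down)) - h * np.sum(sigma)

def exact_probabilities(L=3, J=1.0, h=0.0, tau=2.0):
    """Enumerate all 2**(L*L) configurations and return their Boltzmann probabilities."""
    energies = []
    for spins in itertools.product([-1, 1], repeat=L * L):
        sigma = np.array(spins).reshape(L, L)
        energies.append(ising_energy(sigma, J, h))
    energies = np.array(energies)
    weights = np.exp(-energies / tau)    # unnormalized Boltzmann weights
    Z = weights.sum()                    # the partition function
    return weights / Z

print(exact_probabilities().sum())       # 1.0 up to floating-point error
```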

## Simulating the Ising model

The Ising model can be difficult to simulate directly because the number of possible configurations grows enormously with the number of sites.

To see this, suppose there are $N$ spin sites. Since every spin site has two spin states, there are $2^N$ different possible spin configurations of the system. This is the reason why the Ising model is simulated using Monte Carlo methods. Monte Carlo methods are algorithms that rely on repeated random sampling to obtain numerical results. They are often used in physical and mathematical problems when it is difficult or impossible to use other techniques, particularly those based on analytical solutions.

The Metropolis–Hastings algorithm is the most commonly used Monte Carlo algorithm to simulate the Ising model. This algorithm relies on a concept called single-spin-flip dynamics, which means that, at each step of the simulation, the algorithm will randomly pick a site and calculate what the overall change in energy would be if the spin at that site were flipped.

[Figure: Spin flip]

The algorithm must then determine, based on the energies of the current and candidate configurations, whether or not to accept the candidate configuration as the next configuration of the system:

[Figure: Spin flip configuration change]

First, the algorithm must determine whether the new energy of the system would be higher or lower than its current energy. Suppose the new energy is lower than the current energy:

$$H(\sigma') < H(\sigma)$$

If the new energy is lower than the current energy, always accept the transition. On the other hand, if the new energy is higher,

$$H(\sigma') > H(\sigma)$$

only accept the transition with a certain probability:

$$P = e^{\frac{H(\sigma) - H(\sigma')}{\tau}}$$

Notice that the probability of transitioning into a configuration with higher energy decreases as the energy difference increases. This means higher-energy configurations are less accessible than lower-energy ones.

On the other hand, the probability of accessing higher energy configurations increases as the temperature increases. This temperature dependence is crucial to determining whether spontaneous magnetization can occur.

In summary, the simulation always accepts a move to a lower-energy state, and it accepts moves to higher-energy states with a probability that shrinks as the energy difference grows (and, again, depends on the temperature).
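As a concrete illustration, here is a minimal sketch of one Metropolis sweep with single-spin-flip dynamics. It uses the same simplified conventions as the energy snippet above (uniform coupling $J$, uniform field $h$ with $\mu$ absorbed, periodic boundaries, temperature $\tau$ in energy units), and it is not the exact code from our repository:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_sweep(sigma, J=1.0, h=0.0, tau=2.0):
    """One sweep (L*L single-spin-flip attempts) of the Metropolis algorithm."""
    L = sigma.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)            # pick a random site
        # Sum of the four nearest neighbors (periodic boundaries)
        neighbors = (sigma[(i + 1) % L, j] + sigma[(i - 1) % L, j] +
                     sigma[i, (j + 1) % L] + sigma[i, (j - 1) % L])
        # Energy change if the spin at (i, j) were flipped
        dE = 2.0 * sigma[i, j] * (J * neighbors + h)
        # Accept if the energy decreases, otherwise with probability exp(-dE / tau)
        if dE <= 0 or rng.random() < np.exp(-dE / tau):
            sigma[i, j] *= -1
    return sigma
```

Repeatedly applying such sweeps to a random initial configuration at a temperature below the critical one should reproduce the growing aligned clusters described in the next section.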

## Example results from a simulation

Example code for a simulation of the Ising model based on the Metropolis algorithm can be found on our Github page. Here is an example of spontaneous magnetization occurring at a low temperature when the simulation is left running for a period of time:

[Figure: Spontaneous magnetization]

Notice that, as time progresses, larger and larger clusters of spin sites sharing the same spin value form. In this case, spontaneous magnetization does occur.

On the other hand, if we increase the temperature of the simulation above the Curie temperature, no spontaneous magnetization occurs due to large thermal fluctuations. In this case, the last frame would look just as noisy as the first one.

Here we create a plot of the energy of the system at equilibrium for different temperature values, based on data collected from the simulation. Notice that the energy increases as the temperature of the system increases, with a sharp rise in the region where the phase transition occurs:

[Figure: Phase transition]

This is a plot of the entropy of the system versus its temperature:

[Figure: Entropy]

## The order parameter

The order parameter (here denoted by $\xi$) is a measure of the order or disorder of a state. In a fully disordered state, the average magnetization is $0$, while in a fully ordered state, the average magnetization is at a maximum. Hence the order parameter can be defined as

$$\xi = \frac{|N_\uparrow - N_\downarrow|}{N}$$

where $N_\uparrow$ and $N_\downarrow$ are the numbers of up and down spins and $N$ is the total number of sites. This quantity is $0$ in the completely disordered state and $1$ in the fully ordered state, and it is closely related to the average magnetization of the system, which is a measurable quantity in a real system. The state becomes fully disordered at high temperatures and fully ordered (i.e. all spins are aligned in the same direction) at low temperatures. Notice again the steep decrease in order in the phase transition region:

[Figure: Phase transition 2]
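In terms of the lattice-array representation sketched earlier, the order parameter is simply the absolute value of the mean spin. A minimal helper (assuming `sigma` is a NumPy array of $\pm 1$ values; the function name is ours) might look like:

```python
import numpy as np

def order_parameter(sigma):
    """|N_up - N_down| / N, i.e. the absolute value of the average spin."""
    return abs(np.mean(sigma))
```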

## The heat capacity

Another property of the system we can measure from our Ising model simulation is its heat capacity. The heat capacity is defined as the ratio of the amount of heat energy transferred to an object to the resulting increase in its temperature:

$$C = \frac{dQ}{dT}$$

Here we create a plot of the heat capacity versus the temperature of the system based on the energy values we collected (taking finite differences):

[Figure: Heat capacity]

Note that the heat capacity peaks at the critical temperature where the phase transition occurs.
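As a sketch of how this finite-difference step might be implemented (the function name and the array layout are our illustration, not the original code), one could write:

```python
import numpy as np

def heat_capacity(temperatures, energies):
    """Approximate C = dE/dT by central finite differences of the measured energies.

    `temperatures` and `energies` are equal-length 1D arrays collected from the
    simulation: the mean equilibrium energy measured at each temperature.
    """
    return np.gradient(energies, temperatures)
```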

## Effect of grid size on physical accuracy

These figures show the magnetization and heat capacity versus temperature curves for various lattice sizes in the absence of an external magnetic field. The two-dimensional array of spins is initialized in a fully aligned state for each different value of the temperature:

[Figure: Magnetization and heat capacity]

Notice that the height of the peak in the heat capacity curve increases with increasing array size. Furthermore, the peak becomes narrower and narrower and the phase transition becomes steeper and steeper as the array size increases.

The thermodynamic limit is approached as the array size approaches infinity. In this limit, the peak in the heat capacity sharpens into the local quasi-singularity that characterizes second-order phase transitions such as the one reproduced by the Ising model.

## Concluding remarks

The field of scientific computing, also called computational science, is a rapidly growing multidisciplinary field that uses algorithms to understand and solve complex problems in the sciences. Computer simulations are immensely useful in modeling and understanding real-world physical systems. Examples of these include lattice-based quantum chromodynamics (lattice QCD), smoothed particle hydrodynamics (SPH), numerical relativity, and the finite-difference time-domain (FDTD) method for electromagnetics. We hope to post some explorations of these methods on this blog in the near future.

## Citation

Cited as:

Martin, Carlos G. and Schuermann, Lucas V. (Feb 2016). The Ising Model. Writing. https://lucasschuermann.com/writing/ising-model

Or

@misc{martin2016ising,
  title   = {The Ising Model},
  author  = {Martin, Carlos G. and Schuermann, Lucas V.},
  year    = {2016},
  month   = {Feb},
  url     = {https://lucasschuermann.com/writing/ising-model}
}