Version: 1.0

Appendix I: Features Wiki

This section provides a brief introduction to the features used in PWMLFF. The related literature is also listed for the reader's reference.

What are features?

Features (or descriptors) are quantities that describe the local atomic environment of an atom. They are required to preserve the translational, rotational, and permutational symmetries. Features usually serve as the input of various regressors (linear models, neural networks, etc.), which output atomic energies and forces.

Features are differentiable functions of the spatial coordinates, so that the force on atom $i$ can be calculated as

$$
\mathbf{F}_i = - \frac{d E_{tot}}{d \mathbf{R}_i} = - \sum_{j,\alpha} \frac{\partial E_j}{\partial G_{j,\alpha}} \frac{\partial G_{j,\alpha}}{\partial \mathbf{R}_i}
$$

where $j$ is the index of a neighbor atom within the cutoff radius, and $\alpha$ is the index of the feature.
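The chain rule above can be sketched with plain array operations. The function name, array layout, and shapes below are illustrative assumptions for this wiki, not PWMLFF's actual data structures:

```python
import numpy as np

def force_on_atom(dE_dG, dG_dRi):
    """Force on one atom i via the chain rule.

    dE_dG[j, a]   : partial E_j / partial G_{j,a}          shape (n_atoms, n_features)
    dG_dRi[j, a]  : partial G_{j,a} / partial R_i (3-vec)  shape (n_atoms, n_features, 3)
    Returns F_i = - sum_{j,a} dE_dG[j, a] * dG_dRi[j, a]   shape (3,)
    """
    return -np.einsum('ja,jax->x', dE_dG, dG_dRi)
```

In a trained model, `dE_dG` comes from backpropagation through the regressor and `dG_dRi` from the analytic derivatives of the features.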


2-body and 3-body features with piecewise cosine functions (features 1 & 2)

Given a center atom, piecewise cosine functions are used as the basis to describe its local environment. The graph below gives an idea of what they look like.

[Figure: piecewise cosine basis functions]

We now define the piecewise cosine functions used in both the 2-body and 3-body features. Given the inner and outer cutoffs $R_{inner}$ and $R_{outer}$, the number of basis functions $M$, the width of the piecewise function $h$, and the interatomic distance $R_{ij}$ between the center atom $i$ and the neighbor $j$, one defines the basis function as

$$
\phi_\alpha (R_{ij}) = \begin{cases} \frac{1}{2}\cos\left(\frac{R_{ij}-R_{\alpha}}{h}\pi\right) + \frac{1}{2}, & |R_{ij} - R_{\alpha}| < h \\ 0, & \text{otherwise} \end{cases}
$$

with

$$
R_{\alpha} = R_{inner} + (\alpha - 1) h, \quad \alpha = 1,2,\dots,M
$$
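A minimal sketch of this basis in Python (the function and argument names are illustrative, not the PWMLFF API):

```python
import math

def phi(alpha, r_ij, r_inner, h):
    """Piecewise cosine basis phi_alpha, alpha = 1..M.

    Each basis function is centered at R_alpha = R_inner + (alpha - 1) * h
    and is nonzero only within a window of half-width h around that center.
    """
    r_alpha = r_inner + (alpha - 1) * h
    if abs(r_ij - r_alpha) < h:
        return 0.5 * math.cos((r_ij - r_alpha) / h * math.pi) + 0.5
    return 0.0
```

Each $\phi_\alpha$ peaks at 1 when $R_{ij} = R_\alpha$ and decays smoothly to 0 at the edges of its window, so neighboring basis functions overlap and partition the radial range.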

The expression of the 2-body feature with center atom $i$ is thus

$$
G_{\alpha,i} = \sum_{j} \phi_{\alpha}(R_{ij})
$$

and the 3-body feature

$$
G_{\alpha\beta\gamma,i} = \sum_{j,k} \phi_{\alpha}(R_{ij}) \phi_{\beta}(R_{ik}) \phi_{\gamma}(R_{jk})
$$

where $\sum_{j}$ and $\sum_{j,k}$ run over all neighbor atoms within the cutoff $R_{outer}$ of atom $i$.
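The two sums can be sketched as below. `two_body` and `three_body` are hypothetical helper names, and the neighbor distances are assumed to be precomputed:

```python
import math

def phi(alpha, r_ij, r_inner, h):
    # Piecewise cosine basis centered at R_alpha = R_inner + (alpha - 1) * h
    r_alpha = r_inner + (alpha - 1) * h
    if abs(r_ij - r_alpha) < h:
        return 0.5 * math.cos((r_ij - r_alpha) / h * math.pi) + 0.5
    return 0.0

def two_body(dists, r_inner, h, M):
    """G_{alpha,i} = sum_j phi_alpha(R_ij): one component per basis function."""
    return [sum(phi(a, r, r_inner, h) for r in dists) for a in range(1, M + 1)]

def three_body(triples, r_inner, h, M):
    """G_{alpha beta gamma,i}; triples holds one (R_ij, R_ik, R_jk) per pair (j, k)."""
    return {
        (a, b, c): sum(phi(a, rij, r_inner, h) * phi(b, rik, r_inner, h)
                       * phi(c, rjk, r_inner, h)
                       for rij, rik, rjk in triples)
        for a in range(1, M + 1)
        for b in range(1, M + 1)
        for c in range(1, M + 1)
    }
```

Note that the 3-body feature grows as $M^3$ components per atom, which is why $M$ is typically kept small.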

In practice, these two features are usually used together.

Reference:

Huang, Y., Kang, J., Goddard, W. A. & Wang, L.-W. Density functional theory based neural network force fields from energy decompositions. Phys. Rev. B 99, 064103 (2019)

2-body and 3-body Gaussian features (features 3 & 4)

These are the features first used in the Behler-Parrinello neural network. Given the cutoff radius $R_c$ and the interatomic distance $R_{ij}$ between the center atom $i$ and a neighbor $j$, define the cutoff function $f_c$ as

$$
f_c(R_{ij}) = \begin{cases} \frac{1}{2}\cos\left(\frac{\pi R_{ij}}{R_c}\right) + \frac{1}{2}, & R_{ij} < R_c \\ 0, & \text{otherwise} \end{cases}
$$

The 2-body Gaussian feature of atom $i$ is defined as

$$
G_i = \sum_{j \neq i} e^{-\eta(R_{ij} - R_s)^2} f_c (R_{ij})
$$

where $\eta$ and $R_s$ are user-defined parameters.
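A sketch of the cutoff function and the 2-body Gaussian feature (the names are illustrative, not the PWMLFF API):

```python
import math

def f_c(r, r_c):
    """Behler-Parrinello cutoff: smoothly decays to 0 at r = r_c."""
    return 0.5 * math.cos(math.pi * r / r_c) + 0.5 if r < r_c else 0.0

def g2(dists, eta, r_s, r_c):
    """2-body Gaussian feature: Gaussian shell at R_s with width 1/sqrt(eta)."""
    return sum(math.exp(-eta * (r - r_s) ** 2) * f_c(r, r_c) for r in dists)
```

In practice one evaluates `g2` for several $(\eta, R_s)$ pairs, giving a set of radial components that probe different shells of the neighborhood.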

The 3-body Gaussian feature of atom $i$ is defined as

$$
G_i = 2^{1-\zeta} \sum_{j,k \neq i} (1+\lambda \cos \theta_{ijk})^\zeta\, e^{-\eta(R_{ij}^2 + R_{ik}^2 + R_{jk}^2)} f_c (R_{ij}) f_c (R_{ik}) f_c (R_{jk})
$$

where

$$
\cos \theta_{ijk} = \frac{\mathbf{R}_{ij} \cdot \mathbf{R}_{ik}}{|\mathbf{R}_{ij}||\mathbf{R}_{ik}|}
$$

and $\eta$, $\zeta$, and $\lambda = \pm 1$ are user-defined parameters.
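A sketch of the 3-body Gaussian feature computed from relative neighbor positions. Here the double sum is read as running over ordered pairs of distinct neighbors, one possible interpretation of $\sum_{j,k \neq i}$, and the names are illustrative:

```python
import math

def g3(vecs, eta, zeta, lam, r_c):
    """3-body Gaussian feature; vecs holds the relative positions r_ij of atom i's neighbors."""
    def f_c(r):
        return 0.5 * math.cos(math.pi * r / r_c) + 0.5 if r < r_c else 0.0

    def norm(v):
        return math.sqrt(sum(x * x for x in v))

    total = 0.0
    for j, rij in enumerate(vecs):
        for k, rik in enumerate(vecs):
            if j == k:
                continue  # ordered pairs of distinct neighbors
            rjk = [a - b for a, b in zip(rik, rij)]  # vector from neighbor j to k
            dij, dik, djk = norm(rij), norm(rik), norm(rjk)
            cos_t = sum(a * b for a, b in zip(rij, rik)) / (dij * dik)
            total += ((1 + lam * cos_t) ** zeta
                      * math.exp(-eta * (dij ** 2 + dik ** 2 + djk ** 2))
                      * f_c(dij) * f_c(dik) * f_c(djk))
    return 2 ** (1 - zeta) * total
```

Because it depends only on distances and angles, the result is invariant under rigid rotations of the whole neighborhood.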

In practice, these two features are usually used together.

Reference:

J. Behler and M. Parrinello, Generalized Neural-Network Representation of High-Dimensional Potential-Energy Surfaces. Phys. Rev. Lett. 98, 146401 (2007)

Moment Tensor Potential (feature 5)

In MTP, the local environment of the center atom $i$ is characterized by

$$
\mathbf{n}_i = (z_i, z_j, \mathbf{r}_{ij})
$$

where $z_i$ is the atom type of the center atom, $z_j$ the atom type of the neighbor $j$, and $\mathbf{r}_{ij}$ the relative coordinates of the neighbors. Next, the energy contribution of each atom is expanded as

$$
E_i(\mathbf{n}_i) = \sum_\alpha c_\alpha B_\alpha(\mathbf{n}_i)
$$

where $B_\alpha$ are the basis functions of choice and $c_\alpha$ the parameters to be fitted.

We now introduce the moment tensors $M_{\mu\nu}$ to define the basis functions:

$$
M_{\mu\nu} (\mathbf{n}_i) = \sum_j f_\mu (|\mathbf{r}_{ij}|, z_i, z_j) \bigotimes_\nu \mathbf{r}_{ij}
$$

These moments contain both radial and angular parts. The radial part can be expanded as

$$
f_\mu (|\mathbf{r}_{ij}|, z_i, z_j) = \sum_\beta c^{(\beta)}_{\mu, z_i, z_j} Q^{(\beta)}(|\mathbf{r}_{ij}|)
$$

where $Q^{(\beta)}(|\mathbf{r}_{ij}|)$ are the radial basis functions. Specifically,

$$
Q^{(\beta)}(|\mathbf{r}_{ij}|) = \begin{cases} \phi^{(\beta)}(|\mathbf{r}_{ij}|) \, (R_{cut} - |\mathbf{r}_{ij}|)^2, & |\mathbf{r}_{ij}| < R_{cut} \\ 0, & \text{otherwise} \end{cases}
$$

where $\phi^{(\beta)}$ are polynomials (e.g. Chebyshev polynomials) defined on the interval $[R_{min}, R_{cut}]$.
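As an illustration, assume Chebyshev polynomials for $\phi^{(\beta)}$ and map $[R_{min}, R_{cut}]$ linearly onto $[-1, 1]$, the polynomials' natural domain (a common convention, assumed here rather than taken from the MLIP source):

```python
import math

def chebyshev(beta, x):
    """Chebyshev polynomial of the first kind, T_beta(x) = cos(beta * arccos(x))."""
    return math.cos(beta * math.acos(x))

def q_beta(beta, r, r_min, r_cut):
    """Radial basis Q^(beta): polynomial times the (R_cut - r)^2 smooth-cutoff term."""
    if r >= r_cut:
        return 0.0
    x = 2.0 * (r - r_min) / (r_cut - r_min) - 1.0  # map [r_min, r_cut] -> [-1, 1]
    return chebyshev(beta, x) * (r_cut - r) ** 2
```

The $(R_{cut} - r)^2$ factor guarantees that both the basis and its first derivative vanish at the cutoff, so energies and forces stay continuous as neighbors cross $R_{cut}$.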

The angular part $\bigotimes_\nu \mathbf{r}_{ij}$, i.e. the tensor product of $\mathbf{r}_{ij}$ with itself $\nu$ times, contains the angular information of the neighborhood $\mathbf{n}_i$. $\nu$ determines the rank of the moment tensor: $\nu=0$ gives a constant scalar, $\nu=1$ a vector (rank-1 tensor), $\nu=2$ a matrix (rank-2 tensor), etc.
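A sketch of a single moment $M_{\mu\nu}$ for small $\nu$, with the radial function passed in as a callable and the chemical-type arguments omitted (illustrative, not the MLIP implementation):

```python
import numpy as np

def moment(vecs, f_mu, nu):
    """M_{mu,nu} = sum_j f_mu(|r_ij|) * (r_ij ⊗ ... ⊗ r_ij), with nu tensor factors.

    vecs : list of relative neighbor positions r_ij
    f_mu : radial function of the distance |r_ij|
    nu   : rank of the angular tensor (0 -> scalar, 1 -> vector, 2 -> matrix, ...)
    """
    total = None
    for v in vecs:
        r = np.asarray(v, dtype=float)
        t = np.array(1.0)               # nu = 0 contributes a scalar 1
        for _ in range(nu):
            t = np.multiply.outer(t, r)  # one more tensor factor of r_ij
        term = f_mu(np.linalg.norm(r)) * t
        total = term if total is None else total + term
    return total
```

Basis functions $B_\alpha$ are then built as scalar contractions of such moments, e.g. $M_{\mu 1} \cdot M_{\mu' 1}$ for two vectors.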

Further define the level of a moment as

$$
\mathrm{lev}(M_{\mu \nu}) = 2 + 4\mu + \nu
$$

This is an empirical formula that measures the complexity of a moment; for example, $\mathrm{lev}(M_{0,0}) = 2$ and $\mathrm{lev}(M_{0,1}) = 3$.

Reference

I.S. Novikov et al., The MLIP package: moment tensor potentials with MPI and active learning. Mach. Learn.: Sci. Technol. 2, 025002 (2021)

Spectral Neighbor Analysis Potential (feature 6)

DP-Chebyshev (feature 7)

This feature attempts to mimic the behavior of DP's embedding network, using Chebyshev polynomials as the basis.

DP-Gaussian (feature 8)

This feature attempts to mimic the behavior of DP's embedding network.