MEGNet framework - a universal graph network machine learning framework for molecules and crystals


Table of Contents

Introduction
MEGNet framework
Installation
Usage
Datasets
Implementation details
Computing requirements
Known limitations
Contributors
References

Introduction

This repository represents the efforts of the Materials Virtual Lab in developing graph networks for machine learning in materials science. It is a work in progress, and the models we have developed thus far are only based on our best efforts. We welcome efforts by anyone to build and test models using our code and data, all of which are publicly available. Any comments or suggestions are also welcome (please post on the GitHub Issues page).

A web app using our pre-trained MEGNet models for property prediction in crystals is available at http://megnet.crystals.ai.

MEGNet framework

The MatErials Graph Network (MEGNet) is an implementation of DeepMind's graph networks[1] for universal machine learning in materials science. We have demonstrated its success in achieving very low prediction errors in a broad array of properties in both molecules and crystals (see "Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals"[2]).

Briefly, Figure 1 shows the sequential update steps of the graph network, whereby bonds, atoms, and global state attributes are updated using information from each other, generating an output graph.

Figure 1. The graph network update function.

Figure 2 shows the overall schematic of MEGNet. Each graph network module is preceded by two multi-layer perceptrons (known as Dense layers in Keras terminology), constituting a MEGNet block. Multiple MEGNet blocks can be stacked, allowing for information flow across greater spatial distances. The number of blocks required depends on the range of interactions necessary to predict a target property. In the final step, a set2set layer is used to map the output to a scalar/vector property.

Figure 2. Schematic of MatErials Graph Network.

Installation

MEGNet can be installed via pip for the latest stable version:

pip install megnet

For the latest dev version, please clone this repo and install using:

python setup.py develop

Usage

Our current implementation supports a variety of use cases for users with different requirements and experience with deep learning. Please also visit the notebooks directory for Jupyter notebooks with more detailed code examples.

Using pre-built models

In our work, we have already built MEGNet models for the QM9 and Materials Project datasets. These models are provided as serialized HDF5+JSON files. Users who are purely interested in using these models for prediction can quickly load and use them via the convenient MEGNetModel.from_file method. These models are available in the mvl_models folder of this repo. The following models are available:

QM9 molecule data:
HOMO: highest occupied molecular orbital energy
LUMO: lowest unoccupied molecular orbital energy
Gap: energy gap
ZPVE: zero point vibrational energy
µ: dipole moment
α: isotropic polarizability
<R2>: electronic spatial extent
U0: internal energy at 0 K
U: internal energy at 298 K
H: enthalpy at 298 K
G: Gibbs free energy at 298 K
Cv: heat capacity at 298 K
ω1: highest vibrational frequency

Materials Project data:
Formation energy from the elements
Band gap
Log 10 of Bulk Modulus (K)
Log 10 of Shear Modulus (G)

The MAEs of the various models are given below:

Performance of QM9 MEGNet-Simple models

Property   Units        MAE
HOMO       eV           0.043
LUMO       eV           0.044
Gap        eV           0.066
ZPVE       meV          1.43
µ          Debye        0.05
α          Bohr^3       0.081
<R2>       Bohr^2       0.302
U0         eV           0.012
U          eV           0.013
H          eV           0.012
G          eV           0.012
Cv         cal/(mol·K)  0.029
ω1         cm^-1        1.18

Performance of MP-2018.6.1

Property   Units        MAE
Ef         eV/atom      0.028
Eg         eV           0.33
K_VRH      log10(GPa)   0.050
G_VRH      log10(GPa)   0.079

Performance of MP-2019.4.1

Property   Units        MAE
Ef         eV/atom      0.026
Efermi     eV           0.288

New models will be added to the mvl_models folder as they are developed. Each folder contains a summary of model details and benchmarks. For the initial models and benchmark comparisons to previous models, please refer to "Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals"[2].

Below is an example of crystal model usage:

from megnet.utils.models import load_model
from pymatgen import Structure, Lattice

# load a model in megnet.utils.models.AVAILABLE_MODELS
model = load_model("logK_MP_2018")

# We can construct a structure using pymatgen
structure = Structure(Lattice.cubic(3.167), ['Mo', 'Mo'], [[0, 0, 0], [0.5, 0.5, 0.5]])

# Use the model to predict the bulk modulus K. Note that the model is trained
# on log10 K, so a conversion is necessary.
predicted_K = 10 ** model.predict_structure(structure).ravel()
print('The predicted K for {} is {} GPa'.format(structure.formula, predicted_K[0]))

A full example is in notebooks/crystal_example.ipynb.

For molecular models, we have an example in notebooks/qm9_pretrained.ipynb. We support prediction directly from a pymatgen molecule object. With a few more lines of code, the model can predict from the SMILES representation of a molecule, as shown in the example. It is also straightforward to load an XYZ molecule file with pymatgen and predict its properties using the models. However, users are generally advised not to use the QM9 molecule models for molecules outside the QM9 dataset, since the training data coverage is limited.

Below is an example of predicting the HOMO from a SMILES representation:

from megnet.utils.molecule import get_pmg_mol_from_smiles
from megnet.models import MEGNetModel

# The model API is the same for molecules and crystals; you can also use the
# load_model method as in the previous example
model = MEGNetModel.from_file('mvl_models/qm9-2018.6.1/HOMO.hdf5')

# Need to convert the SMILES string into a pymatgen Molecule
mol = get_pmg_mol_from_smiles("C")
model.predict_structure(mol)

Training a new MEGNetModel from structures

For users who wish to build a new model from a set of crystal structures with corresponding properties, there is a convenient MEGNetModel class for setting up and training the model. By default, the number of MEGNet blocks is 3 and the atomic number Z is used as the only node feature (with embedding).

from megnet.models import MEGNetModel
from megnet.data.crystal import CrystalGraph
import numpy as np

nfeat_bond = 10
r_cutoff = 5
gaussian_centers = np.linspace(0, r_cutoff + 1, nfeat_bond)
gaussian_width = 0.5
graph_converter = CrystalGraph(cutoff=r_cutoff)
model = MEGNetModel(graph_converter=graph_converter, centers=gaussian_centers, width=gaussian_width)

# Model training
# Here, `structures` is a list of pymatgen Structure objects and
# `targets` is a corresponding list of properties.
model.train(structures, targets, epochs=10)

# Predict the property of a new structure
pred_target = model.predict_structure(new_structure)

Note that for realistic models, nfeat_bond can be set to 100 and epochs to 1000. In some cases, some structures within the training pool may not be valid (e.g., they contain isolated atoms); one then needs to use the train_from_graphs method to train on the valid graphs only.

Following the previous example,

model = MEGNetModel(graph_converter=graph_converter, centers=gaussian_centers, width=gaussian_width)

graphs_valid = []
targets_valid = []
structures_invalid = []
for s, p in zip(structures, targets):
    try:
        graph = model.graph_converter.convert(s)
        graphs_valid.append(graph)
        targets_valid.append(p)
    except Exception:
        structures_invalid.append(s)

# train the model using only the valid graphs and targets
model.train_from_graphs(graphs_valid, targets_valid)

For model details and benchmarks, please refer to "Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals"[2].

Pre-trained elemental embeddings

A key finding of our work is that element embeddings from trained formation energy models encode useful chemical information that can be transferred to models built on smaller datasets (e.g., elastic constants, band gaps), with better convergence and lower errors. These embeddings are also potentially useful in developing other ML models and applications. They can be obtained via the following code:

from megnet.data.crystal import get_elemental_embeddings

el_embeddings = get_elemental_embeddings()

For an example of transfer learning using the elemental embeddings from the formation energy model in other models, please check notebooks/transfer_learning.ipynb. A rough sketch of the idea follows.
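The sketch below is an illustration only, not the notebook's exact code. It assumes a pre-trained formation energy model named "Eform_MP_2018" (analogous to "logK_MP_2018" above), that the underlying Keras Embedding layer can be located by its default "embedding" name prefix, and that the new model's default embedding dimensions match the pre-trained ones:

import numpy as np
from megnet.utils.models import load_model
from megnet.models import MEGNetModel
from megnet.data.crystal import CrystalGraph

# Load a trained formation energy model (model name assumed) and extract
# its element embedding weights
model_form = load_model("Eform_MP_2018")
embedding_weights = [layer for layer in model_form.layers
                     if layer.name.startswith("embedding")][0].get_weights()

# Build a fresh model for a new property and seed it with the transferred embedding
model_new = MEGNetModel(graph_converter=CrystalGraph(cutoff=5),
                        centers=np.linspace(0, 6, 100), width=0.5)
new_embedding = [layer for layer in model_new.layers
                 if layer.name.startswith("embedding")][0]
new_embedding.set_weights(embedding_weights)
new_embedding.trainable = False  # optionally freeze; recompile for this to take effect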

Customized Graph Network Models

For users who are familiar with deep learning and Keras and wish to build customized graph network based models, the following example outlines how a custom model can be constructed from MEGNetLayer, which is essentially our implementation of the graph network using neural networks:

from tensorflow.keras.layers import Input, Dense
from tensorflow.keras.models import Model
from megnet.layers import MEGNetLayer, Set2Set

n_atom_feature = 20
n_bond_feature = 10
n_global_feature = 2

# Define model inputs
int32 = 'int32'
x1 = Input(shape=(None, n_atom_feature))    # atom feature placeholder
x2 = Input(shape=(None, n_bond_feature))    # bond feature placeholder
x3 = Input(shape=(None, n_global_feature))  # global feature placeholder
x4 = Input(shape=(None,), dtype=int32)      # bond index1 placeholder
x5 = Input(shape=(None,), dtype=int32)      # bond index2 placeholder
x6 = Input(shape=(None,), dtype=int32)      # atom_ind placeholder
x7 = Input(shape=(None,), dtype=int32)      # bond_ind placeholder
xs = [x1, x2, x3, x4, x5, x6, x7]

# Pass the inputs to the MEGNetLayer layer.
# Here each list is the hidden units + the output unit;
# you can use others such as [n1] or [n1, n2, n3, ...] if you want.
out = MEGNetLayer([32, 16], [32, 16], [32, 16], pool_method='mean', activation='relu')(xs)

# The output is a tuple of new graphs V, E and u.
# Since u is a per-structure quantity,
# we can use it directly to predict per-structure properties.
out = Dense(1)(out[2])

# Set up the model and compile it!
model = Model(inputs=xs, outputs=out)
model.compile(loss='mse', optimizer='adam')

With less than 20 lines of code, you have built a graph network model that is ready for materials property prediction!
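If the target is better predicted from the atom features rather than the global state u, the Set2Set layer imported above can provide a permutation-invariant readout over atoms. Below is a minimal sketch of this variant; it assumes (as in MEGNet's internal usage) that Set2Set is called on a feature tensor together with its assignment indices, and the T and n_hidden values are illustrative:

# Continuing from the example above, but reading out from atom features
graph_out = MEGNetLayer([32, 16], [32, 16], [32, 16],
                        pool_method='mean', activation='relu')(xs)
node_vec = Set2Set(T=3, n_hidden=16)([graph_out[0], x6])  # per-structure readout over atoms
out = Dense(1)(node_vec)
model = Model(inputs=xs, outputs=out)
model.compile(loss='mse', optimizer='adam')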

Implementation details

Graph networks[1] are a superclass of graph-based neural networks. There are a few innovations compared to conventional graph-based neural networks:

1. Global state attributes are added to the node/edge graph representation. These features serve as a portal for structure-independent features such as temperature and pressure, and also act as an information exchange placeholder that facilitates information passing across longer spatial domains.
2. The update function involves message interchange among all three levels of information, i.e., the node, bond, and state information. It is therefore a highly general model.

The MEGNet model implements two major components: (a) the graph network layer and (b) the set2set layer.[3] The layers are based on the Keras API and are thus compatible with other Keras modules.

Different crystals/molecules have different numbers of atoms, so it is impossible to use data batches without padding the structures to make them uniform in atom count. MEGNet takes a different approach: instead of making structure batches, we assemble many structures into one giant structure, which has a vector output with each entry being the target value for the corresponding structure. The batch size is therefore always 1.

Assuming a structure has N atoms and M bonds, a structure graph is represented as V (nodes/vertices, representing atoms), E (edges, representing bonds) and u (the global state vector). V is an N*Nv matrix. E comprises an M*Nm matrix for the bond attributes and index pairs (rk, sk) for the atoms connected by each bond. u is a vector of length Nu. We vectorize rk and sk to form index1 and index2, both vectors of length M. In summary, the graph is a data structure with V (N*Nv), E (M*Nm), u (Nu, ), index1 (M, ) and index2 (M, ).

We then assemble several structures together. For V, we directly append the atomic attributes from all structures, forming a matrix (1*N'*Nv), where N' > N. To indicate which structure each atom attribute vector belongs to, we use an atom_ind vector. For example, if N' = 5 and the first 3 atoms belong to the first structure and the remaining 2 to the second, our atom_ind vector would be [0, 0, 0, 1, 1]. For the bond attributes, we perform the same appending, and use a bond_ind vector to indicate bond membership. For index1 and index2, we need to shift the integer values. For example, if index1 and index2 are [0, 0, 1, 1] and [1, 1, 0, 0] for structure 1 and [0, 0, 1, 1] and [1, 1, 0, 0] for structure 2, the assembled indices are [0, 0, 1, 1, 2, 2, 3, 3] and [1, 1, 0, 0, 3, 3, 2, 2]. Finally, u gains a new dimension to account for the number of structures, becoming a 1*Ng*Nu tensor, where Ng is the number of structures. A leading dimension of 1 is added to all inputs because we fix the batch size to 1 (one giant graph) to comply with the Keras input requirements.

In summary, the inputs for the model are V (1*N'*Nv), E (1*M'*Nm), u (1*Ng*Nu), index1 (1*M'), index2 (1*M'), atom_ind (1*N'), and bond_ind (1*M'). For Z-only atomic features, V is a (1*N') vector. A minimal NumPy sketch of this assembly is given below.
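The following sketch reproduces the index bookkeeping described above for the two-structure example; it is an illustration only, as megnet's data generators handle this assembly internally:

import numpy as np

# Bond indices for two structures, each with 2 atoms and 4 directed bonds,
# as in the example above
index1_s1, index2_s1 = np.array([0, 0, 1, 1]), np.array([1, 1, 0, 0])
index1_s2, index2_s2 = np.array([0, 0, 1, 1]), np.array([1, 1, 0, 0])

# Shift structure 2's atom indices by the number of atoms in structure 1
offset = 2
index1 = np.concatenate([index1_s1, index1_s2 + offset])  # [0 0 1 1 2 2 3 3]
index2 = np.concatenate([index2_s1, index2_s2 + offset])  # [1 1 0 0 3 3 2 2]

# atom_ind marks the structure each atom belongs to, e.g. N' = 5 with
# 3 atoms in structure 1 and 2 atoms in structure 2 gives [0 0 0 1 1]
atom_ind = np.repeat([0, 1], [3, 2])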

Datasets

To aid others in reproducing (and improving on) our results, we have provided our MP-crystals-2018.6.1 crystal data set via figshare[4]. The MP-crystals-2018.6.1 data set comprises the DFT-computed energies and band gaps of 69,640 crystals from the Materials Project, obtained via the Python Materials Genomics (pymatgen) interface to the Materials Application Programming Interface (API)[5] on June 1, 2018. The crystal graphs were constructed using a radius cutoff of 4 angstroms. Using this cutoff, 69,239 crystals do not form isolated atoms and are used in the models. A subset of 5,830 structures has elasticity data free of calculation warnings and is used for the elasticity models.
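For reference, below is a minimal sketch of how such data can be queried through the Materials Project API with pymatgen's (legacy) MPRester; the criteria, property names, and API key placeholder are illustrative:

from pymatgen import MPRester

# Query DFT-computed properties from the Materials Project API
# (replace YOUR_API_KEY with a valid Materials Project API key)
with MPRester("YOUR_API_KEY") as mpr:
    data = mpr.query(criteria={"nelements": {"$lte": 2}},
                     properties=["material_id", "structure",
                                 "formation_energy_per_atom", "band_gap"])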

The molecule data set used in this work is the QM9 data set processed by Faber et al.[6] It contains the B3LYP/6-31G(2df,p)-level DFT calculation results for 130,462 small organic molecules containing up to 9 heavy atoms.

Computing requirements

Training: It should be noted that training MEGNet models, like other deep learning models, is fairly computationally intensive with large datasets. In our work, we use dedicated GPU resources to train MEGNet models with 100,000 crystals/molecules. It is recommended that you do the same.

Prediction: Once trained, prediction using MEGNet models is fairly cheap. For example, the http://megnet.crystals.ai web app runs on a single hobby dyno on Heroku and provides predictions for any crystal within seconds.

Known limitations

Isolated atoms error. This error occurs when, at the cutoff used by the model (4 Å for the 2018 models and 5 Å for the 2019 models), the crystal structure contains isolated atoms, i.e., atoms with no neighbors within the cutoff distance. Most of the time, such structures can simply be discarded, since we found that they tend to have a high energy above hull (i.e., be less stable). If you think this error is an essential issue for a particular problem, please feel free to email us and we will consider releasing a new model with an increased cutoff.
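A simple way to screen for such structures before prediction is to check each site's neighbor list with pymatgen. The has_isolated_atoms function below is a hypothetical helper, not part of megnet:

# Hypothetical helper: flag structures with isolated atoms at a given cutoff
def has_isolated_atoms(structure, cutoff=4.0):
    """Return True if any site has no neighbors within `cutoff` angstroms."""
    return any(len(neighbors) == 0
               for neighbors in structure.get_all_neighbors(cutoff))

# Keep only structures that form connected graphs at the model's cutoff
valid_structures = [s for s in structures if not has_isolated_atoms(s, cutoff=4.0)]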

Contributors

Chi Chen from the Materials Virtual Lab is the lead developer of MEGNet.
Shyue Ping Ong and other members of the Materials Virtual Lab contribute to general improvements of MEGNet and its applications.
Logan Ward has made extensive contributions, especially to the development of the molecular graph portions of MEGNet.

References

1. Battaglia, P. W.; Hamrick, J. B.; Bapst, V.; Sanchez-Gonzalez, A.; Zambaldi, V.; Malinowski, M.; Tacchetti, A.; Raposo, D.; Santoro, A.; Faulkner, R.; et al. Relational Inductive Biases, Deep Learning, and Graph Networks. 2018, arXiv preprint. arXiv:1806.01261.
2. Chen, C.; Ye, W.; Zuo, Y.; Zheng, C.; Ong, S. P. Graph Networks as a Universal Machine Learning Framework for Molecules and Crystals. Chem. Mater. 2019, 31 (9), 3564-3572. DOI: 10.1021/acs.chemmater.9b01294.
3. Vinyals, O.; Bengio, S.; Kudlur, M. Order Matters: Sequence to Sequence for Sets. 2015, arXiv preprint. arXiv:1511.06391.
4. https://figshare.com/articles/Graphs_of_materials_project/7451351
5. Ong, S. P.; Cholia, S.; Jain, A.; Brafman, M.; Gunter, D.; Ceder, G.; Persson, K. A. The Materials Application Programming Interface (API): A Simple, Flexible and Efficient API for Materials Data Based on REpresentational State Transfer (REST) Principles. Comput. Mater. Sci. 2015, 97, 209-215. DOI: 10.1016/j.commatsci.2014.10.037.
6. Faber, F. A.; Hutchison, L.; Huang, B.; Gilmer, J.; Schoenholz, S. S.; Dahl, G. E.; Vinyals, O.; Kearnes, S.; Riley, P. F.; von Lilienfeld, O. A. Prediction Errors of Molecular Machine Learning Models Lower than Hybrid DFT Error. J. Chem. Theory Comput. 2017, 13, 5255-5264. DOI: 10.1021/acs.jctc.7b00577.
