Neural Networks for Computational Chemistry: Pitfalls and Recommendations

  • Grégoire Montavon and Klaus-Robert Müller
Abstract

There is a long history of using neural networks for function approximation in computational physics and chemistry. Despite their conceptual simplicity, practitioners may face difficulties when putting them to work. This short guide pinpoints some common neural network pitfalls, along with corresponding solutions, for successfully realizing function approximation tasks in physics, chemistry, and other fields.
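To make the function-approximation setting concrete, the following is a minimal, hypothetical sketch (not taken from the paper): a single-hidden-layer network trained by plain gradient descent to fit a 1-D toy function, using only NumPy. All hyperparameters (hidden width, learning rate, step count) are illustrative choices, not recommendations from the guide.

```python
# Hypothetical sketch: fit y = sin(x) on [-3, 3] with a one-hidden-layer
# tanh network trained by full-batch gradient descent on the MSE.
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset for the function-approximation task.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X)

# One hidden layer of 20 tanh units, small random initialization.
W1 = rng.normal(scale=0.5, size=(1, 20))
b1 = np.zeros(20)
W2 = rng.normal(scale=0.5, size=(20, 1))
b2 = np.zeros(1)

lr = 0.1  # illustrative learning rate
for step in range(5000):
    # Forward pass.
    H = np.tanh(X @ W1 + b1)       # hidden activations
    pred = H @ W2 + b2             # network output
    err = pred - y                 # residual

    # Backward pass: gradients of the mean squared error.
    gW2 = H.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1 - H**2)  # backprop through tanh
    gW1 = X.T @ dH / len(X)
    gb1 = dH.mean(axis=0)

    # Plain gradient-descent update.
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"final training MSE: {mse:.4f}")
```

Even this small example touches several of the pitfalls the guide is concerned with: initialization scale, learning-rate choice, and the batch vs. stochastic update trade-off all affect whether such a fit converges at all.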

MRS Online Proceedings Library (OPL)
  • EISSN: 1946-4274