Prediction of body mass index in mice using dense molecular markers and a regularized neural network

  • HAYRETTIN OKUT, DANIEL GIANOLA, GUILHERME J. M. ROSA and KENT A. WEIGEL
Summary

Bayesian regularization of artificial neural networks (BRANN) was used to predict body mass index (BMI) in mice from single nucleotide polymorphism (SNP) markers. Data from 1896 animals with both phenotypic and genotypic (12 320 loci) information were used for the analysis. Missing genotypes were imputed based on estimated allelic frequencies, with no attempt to reconstruct haplotypes from family information or linkage disequilibrium between markers. A feed-forward multilayer perceptron network consisting of a single output layer and one hidden layer was used, and it was trained with the Bayesian regularized backpropagation algorithm. As the number of neurons in the hidden layer was increased, the number of effective parameters, γ, increased up to a point and stabilized thereafter. A model with five neurons in the hidden layer produced a value of γ that saturated the data. In terms of predictive ability, the network with five neurons in the hidden layer attained the smallest error and the highest correlation in the test data, although differences among networks were negligible. Using the inherent weight information of BRANNs with different numbers of neurons in the hidden layer, 17 SNPs were observed to have a larger impact on the network, indicating their possible relevance for prediction of BMI. It is concluded that BRANN may be at least as useful as other methods for high-dimensional genome-enabled prediction, with the advantage of potentially capturing non-linear relationships, which may be useful in the study of quantitative traits under complex gene action.
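
Bayesian regularization of this kind typically minimizes a penalized objective F = βE_D + αE_W, where E_D is the sum of squared prediction errors, E_W is the sum of squared network weights, and the precision-like hyperparameters α and β are re-estimated from the data; the effective number of parameters is γ = k − α·tr(A⁻¹), with A the (Gauss-Newton) Hessian of F and k the total number of weights. The sketch below, in Python/NumPy, illustrates this evidence-framework style of training for a one-hidden-layer perceptron. It is not the authors' implementation (the analysis used MATLAB's Neural Network Toolbox); the tanh hidden activation, the fixed damping term and the simulated SNP data are assumptions made purely for illustration.

```python
# Minimal sketch of Bayesian-regularized training of a one-hidden-layer MLP,
# following MacKay's evidence framework with a Gauss-Newton Hessian.
# Illustration only; the original study used MATLAB's 'trainbr'.
import numpy as np

rng = np.random.default_rng(0)

def unpack(w, n_inputs, n_hidden):
    """Split the packed weight vector into layer matrices/biases."""
    k = n_hidden * n_inputs
    W1 = w[:k].reshape(n_hidden, n_inputs)   # input -> hidden weights
    b1 = w[k:k + n_hidden]                   # hidden biases
    W2 = w[k + n_hidden:k + 2 * n_hidden]    # hidden -> output weights
    b2 = w[-1]                               # output bias
    return W1, b1, W2, b2

def forward(w, X, n_hidden):
    W1, b1, W2, b2 = unpack(w, X.shape[1], n_hidden)
    H = np.tanh(X @ W1.T + b1)               # tanh hidden layer (assumed)
    return H @ W2 + b2, H

def jacobian(w, X, n_hidden):
    """Jacobian of the network output w.r.t. every weight (one row per animal)."""
    W1, b1, W2, b2 = unpack(w, X.shape[1], n_hidden)
    H = np.tanh(X @ W1.T + b1)
    dH = (1.0 - H ** 2) * W2                 # tanh'(z) times outgoing weight
    J_W1 = dH[:, :, None] * X[:, None, :]    # shape (N, n_hidden, n_inputs)
    return np.hstack([J_W1.reshape(X.shape[0], -1), dH, H,
                      np.ones((X.shape[0], 1))])

def train_brann(X, y, n_hidden=5, n_iter=200, mu=1.0):
    n, p = X.shape
    w = rng.normal(0.0, 0.1, size=n_hidden * (p + 1) + n_hidden + 1)
    alpha, beta = 0.01, 1.0                  # initial prior / noise precisions
    for _ in range(n_iter):
        yhat, _ = forward(w, X, n_hidden)
        J = jacobian(w, X, n_hidden)
        e = y - yhat
        A = beta * (J.T @ J) + alpha * np.eye(w.size)   # Gauss-Newton Hessian of F
        g = beta * (J.T @ e) - alpha * w                # negative gradient of F
        # Damped (Levenberg-Marquardt style) step; 'trainbr' adapts mu, here it is fixed.
        w = w + np.linalg.solve(A + mu * np.eye(w.size), g)
        # Evidence-framework re-estimation of alpha, beta and gamma.
        yhat, _ = forward(w, X, n_hidden)
        E_D = 0.5 * np.sum((y - yhat) ** 2)             # data misfit
        E_W = 0.5 * np.sum(w ** 2)                      # weight penalty
        gamma = w.size - alpha * np.trace(np.linalg.inv(A))  # effective no. of parameters
        alpha = gamma / (2.0 * E_W)
        beta = (n - gamma) / (2.0 * E_D)
    return w, gamma

# Toy usage: 200 "animals", 50 centred SNP codes (-1/0/1), simulated phenotype.
X = rng.integers(0, 3, size=(200, 50)).astype(float) - 1.0
y = X[:, :5] @ rng.normal(size=5) + 0.5 * rng.normal(size=200)
w, gamma = train_brann(X, y, n_hidden=5)
print(f"gamma = {gamma:.1f} effective parameters out of {w.size} weights")
```

On the real marker data, the input-to-hidden weights of such a fitted network can then be ranked (for example by absolute magnitude) to flag SNPs with a disproportionate influence on the prediction, which is the spirit of the 17-SNP screen described in the summary.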

Corresponding author
*Corresponding author: University of Wisconsin, 1675 Observatory Drive, Madison, WI 53703, USA. Tel: +1 608 772 4922. E-mail: okut@wisc.edu
