Image-based food portion size estimation using a smartphone without a fiducial marker

  • Yifan Yang (a1) (a2), Wenyan Jia (a1), Tamara Bucher (a3), Hong Zhang (a2) and Mingui Sun (a1)...
Abstract
Objective

Current approaches to food volume estimation require the person to carry a fiducial marker (e.g. a checkerboard card) and place it next to the food before taking a picture. This procedure is inconvenient, and post-processing of the food picture is time-consuming and sometimes inaccurate. These problems discourage people from using the smartphone for self-administered dietary assessment. The present bioengineering study describes a novel smartphone-based imaging approach to table-side estimation of food volume that overcomes these limitations.

Design

We present a new method for food volume estimation without a fiducial marker. Our mathematical model indicates that, using a special picture-taking strategy, the smartphone-based imaging system can be calibrated adequately provided that the physical length of the smartphone and the output of its built-in motion sensor are known. We also present and test a new virtual reality method for food volume estimation using the International Food Unit™, together with a training process for error control.
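The abstract does not give the calibration details, but one ingredient such a marker-free calibration needs is the camera's orientation relative to the table, which can be recovered from the motion sensor's gravity reading. As a minimal illustrative sketch (function names are our own, not the authors'), device pitch and roll follow from a static accelerometer sample:

```python
import math

def pitch_roll_from_gravity(ax, ay, az):
    """Estimate device pitch and roll (radians) from a static
    accelerometer reading (ax, ay, az) in m/s^2, assuming the
    only acceleration sensed is gravity."""
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    return pitch, roll

# A phone lying flat on the table senses gravity on its z-axis only,
# so both angles are zero:
p, r = pitch_roll_from_gravity(0.0, 0.0, 9.81)
print(p, r)  # → 0.0 0.0
```

Combined with the known physical length of the phone, such orientation estimates fix the scale of the imaging geometry that a fiducial marker would otherwise provide.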

Results

Our pilot study, with sixty-nine participants and fifteen foods, indicates that the fiducial-marker-free approach is valid and that the training improves estimation accuracy significantly (P<0·05) for all but one food (egg, P>0·05).

Conclusions

Elimination of the fiducial marker, combined with virtual reality, the International Food Unit™ and an automated training process, allowed rapid food volume estimation with controlled estimation error. The estimated volume can be used to search a nutrient database and determine the energy and nutrients in the diet.
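The last step described above, converting an estimated volume to energy via a nutrient database, amounts to a density lookup followed by a per-mass energy lookup. A minimal sketch with illustrative values only (a real system would query a curated database such as FNDDS; the table below is hypothetical):

```python
# Hypothetical per-food values for illustration only.
FOOD_DB = {
    "rice, cooked": {"density_g_per_ml": 0.80, "kcal_per_100g": 130.0},
    "apple, raw":   {"density_g_per_ml": 0.60, "kcal_per_100g": 52.0},
}

def energy_kcal(food: str, volume_ml: float) -> float:
    """Convert an estimated food volume to energy:
    volume -> mass via density, then mass -> kcal per 100 g."""
    entry = FOOD_DB[food]
    mass_g = volume_ml * entry["density_g_per_ml"]
    return mass_g * entry["kcal_per_100g"] / 100.0

# 250 ml of cooked rice: 250 * 0.80 = 200 g -> 260 kcal
print(energy_kcal("rice, cooked", 250.0))  # → 260.0
```

The same lookup generalises to any nutrient reported per unit mass (protein, fat, carbohydrate) by swapping the per-100-g field.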

Corresponding author
* Corresponding author: Email drsun@pitt.edu
Public Health Nutrition
  • ISSN: 1368-9800
  • EISSN: 1475-2727