
Chapter 11 - Computer Vision for the Study of Older (and Younger) Adult Faces

Approaches, Advances, and Applications

from Part V - Methodological Approaches to the Study of the Effects of Aging on Emotion Communication

Published online by Cambridge University Press: 07 December 2023

Ursula Hess, Humboldt-Universität zu Berlin
Reginald B. Adams, Jr., Pennsylvania State University
Robert E. Kleck, Dartmouth College, New Hampshire

Summary

Computer vision and machine learning are rapidly advancing fields of study. For better or worse, these tools have already permeated our everyday lives and are used for everything from auto-tagging social media images to curating what appears in our news feeds. In this chapter, we discuss historical and contemporary approaches to face recognition, detection, manipulation, and generation. We frame our discussion within the context of how this work has been applied to the study of older adults, while acknowledging that more work is needed both within this domain and at its intersection with, for example, race and gender. Throughout the chapter we review a number of resources that researchers can start using in their research now, and at the end we offer links to them (Table 11.1). We also discuss ongoing concerns related to the ethics of artificial intelligence and to using this emerging technology responsibly.
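To give a flavor of the historical face-recognition approaches the chapter surveys, the classic "eigenfaces" method of Turk and Pentland represents each face as a weighted combination of the principal components of a face dataset; recognition then amounts to comparing faces by their low-dimensional weight vectors. The sketch below is illustrative only and uses random synthetic vectors in place of real face images; the dataset size, image dimensions, and number of components are arbitrary choices, not values from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a face dataset: 50 "images" of 32x32 pixels,
# flattened to 1024-dimensional vectors (real data would be aligned
# grayscale face photographs).
faces = rng.normal(size=(50, 32 * 32))

# 1. Center the data on the mean face.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# 2. The eigenfaces are the top principal components of the centered
#    data, obtained here via singular value decomposition.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
k = 10
eigenfaces = vt[:k]  # each row is one eigenface

# 3. Each face is encoded by its coordinates in eigenface space, and can
#    be approximately reconstructed from those k numbers alone.
weights = centered @ eigenfaces.T            # (50, k) low-dimensional codes
reconstructed = weights @ eigenfaces + mean_face

# Recognition compares codes: the stored face whose code is nearest to a
# probe's code is the best match (here the probe is image 0 itself).
probe = weights[0]
nearest = int(np.argmin(np.linalg.norm(weights - probe, axis=1)))
```

The appeal of the method, historically, was exactly this compression: matching in a k-dimensional code space is far cheaper than comparing raw pixels, at the cost of sensitivity to lighting and pose that later deep-learning methods largely overcame.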

Type: Chapter
Information: Emotion Communication by the Aging Face and Body: A Multidisciplinary View, pp. 265–285
Publisher: Cambridge University Press
Print publication year: 2023


