
6 - Five Common but Questionable Principles of Multimedia Learning

Published online by Cambridge University Press: 05 June 2012

Richard E. Clark (University of Southern California)
David F. Feldon (University of California at Los Angeles)
Richard Mayer (University of California, Santa Barbara)

Summary

Principle: A basic generalization that is accepted as true and that can be used as a basis for reasoning or conduct.

OneLook.com Dictionary

Abstract

This chapter describes five commonly held principles about multimedia learning that are not supported by research and suggests alternative generalizations that are more firmly based on existing studies. The questionable beliefs include the expectations that multimedia instruction: (1) yields more learning than live instruction or older media; (2) is more motivating than other instructional delivery options; (3) provides animated pedagogical agents that aid learning; (4) accommodates different learning styles and so maximizes learning for more students; and (5) facilitates student-managed constructivist and discovery approaches that are beneficial to learning.

Introduction

Multimedia instruction is a current example of a new area of instructional research and practice that has generated considerable excitement. As in other new areas, its early advocates began with a set of assumptions about the learning and access problems it would solve and the opportunities it would afford (see, e.g., a report by the American Society for Training and Development, 2001). The goal of this chapter is to examine the early expectations about multimedia benefits that seem so intuitively correct that advocates may not have carefully examined the research evidence for them. If these implicit assumptions are incorrect, we may unintentionally be using them as the basis for designing multimedia instruction that does not support learning or enhance motivation.

Chapter information
Publisher: Cambridge University Press
Print publication year: 2005


References

Abrahamson, C. E. (1998). Issues in interactive communication in distance education. College Student Journal, 32(1), 33–42.
Ackerman, P. L. (1987). Individual differences in skill learning: An integration of psychometric and information processing perspectives. Psychological Bulletin, 102, 3–27.
Ackerman, P. L. (1988). Determinants of individual differences during skill acquisition: Cognitive abilities and information processing. Journal of Experimental Psychology: General, 117, 288–318.
Ackerman, P. L. (1990). A correlational analysis of skill specificity: Learning, abilities, and individual differences. Journal of Experimental Psychology: Learning, Memory, and Cognition, 16, 883–901.
Ackerman, P. L. (1992). Predicting individual differences in complex skill acquisition: Dynamics of ability determinants. Journal of Applied Psychology, 77, 598–614.
American Society for Training and Development (2001). E-learning: If you build it, will they come? Alexandria, VA: ASTD.
Anderson, J. R., & Gluck, K. (2001). What role do cognitive architectures play in intelligent tutoring systems? In Klahr, D., & Carver, S. (Eds.), Cognition and instruction: 25 years of progress (pp. 227–262). Mahwah, NJ: Lawrence Erlbaum.
André, E., Rist, T., & Müller, J. (1999). Employing AI methods to control the behavior of animated interface agents. Applied Artificial Intelligence, 13, 415–448.
Atkinson, R. K. (2002). Optimizing learning from examples using animated pedagogical agents. Journal of Educational Psychology, 94(2), 416–427.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Freeman.
Barron, K. E., & Harackiewicz, J. M. (2001). Achievement goals and optimal motivation: Testing multiple goal models. Journal of Personality and Social Psychology, 80(5), 706–722.
Baylor, A. L. (2002). Expanding preservice teachers' metacognitive awareness of instructional planning through pedagogical agents. Educational Technology Research and Development, 50(2), 5–22.
Bernard, R., Abrami, P., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., et al. (in press). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research.
Bloom, B. (1984). The 2-sigma problem: The search for methods of group instruction as effective as one-to-one tutoring. Educational Researcher, 13, 4–16.
Bosseler, A., & Massaro, D. (2003). Development and evaluation of a computer-animated tutor for vocabulary and language learning in children with autism. Journal of Autism and Developmental Disorders, 33(6), 653–672.
Cassidy, S. (2004). Learning styles: An overview of theories, models, and measures. Educational Psychology, 24(4), 419–444.
Cattell, R. B. (1987). Intelligence: Its structure, growth and action. Amsterdam: North Holland.
Ceci, S. J., & Liker, J. K. (1986). A day at the races: A study of IQ, expertise, and cognitive complexity. Journal of Experimental Psychology, 115, 255–266.
Chandler, P., & Sweller, J. (1991). Cognitive load theory and the format of instruction. Cognition and Instruction, 8, 293–332.
Choi, S., & Clark, R. E. (2004). Five suggestions for the design of experiments on the effects of animated pedagogical agents. Symposium paper presented at the Annual Meeting of the American Educational Research Association, San Diego, CA.
Clark, R. E. (1982). Antagonism between achievement and enjoyment in ATI studies. Educational Psychologist, 17(2), 92–101.
Clark, R. E. (1989). When teaching kills learning: Research on mathemathantics. In Mandl, H. N., Bennett, N., de Corte, E., & Friedrich, H. F. (Eds.), Learning and instruction: European research in an international context (Vol. II). London: Pergamon Press.
Clark, R. E. (2000). Evaluating distance education: Strategies and cautions. The Quarterly Journal of Distance Education, 1(1), 5–18.
Clark, R. E. (Ed.). (2001). Learning from media: Arguments, analysis and evidence. Greenwich, CT: Information Age Publishers.
Clark, R. E., & Salomon, G. (1986). Media in teaching. In Wittrock, M. (Ed.), Handbook of research on teaching (3rd ed.). New York: Macmillan.
Clarke, T., Ayres, P., & Sweller, J. (in press). The impact of sequencing and prior knowledge on learning mathematics through spreadsheet applications. Educational Technology Research and Development.
Corbett, A. (2001). Cognitive computer tutors: Solving the two-sigma problem. In Bauer, M., Gmytrasiewicz, P., & Vassileva, J. (Eds.), User modeling 2001: Proceedings of the 8th International Conference, UM 2001 (pp. 137–146). New York: Springer.
Craig, S., Driscoll, D. M., & Gholson, B. (2004). Constructing knowledge from dialog in an intelligent tutoring system: Interactive learning, vicarious learning, and pedagogical agents. Journal of Educational Multimedia and Hypermedia, 13(12), 163–183.
Craig, S. D., Gholson, B., & Driscoll, D. M. (2002). Animated pedagogical agents in multimedia educational environments: Effects of agent properties, picture features and redundancy. Journal of Educational Psychology, 94(2), 428–434.
Cronbach, L., & Snow, R. E. (1977). Aptitudes and instructional methods. New York: Irvington.
Dehn, D. M., & van Mulken, S. (2000). The impact of animated interface agents: A review of empirical research. International Journal of Human-Computer Studies, 52, 1–22.
de Jong, T., & van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179–201.
Doll, J., & Mayr, U. (1987). Intelligenz und Schachleistung – eine Untersuchung an Schachexperten [Intelligence and achievement in chess – a study of chess masters]. Psychologische Beiträge, 29, 270–289.
Duff, A., & Duffy, T. (2002). Psychometric properties of Honey and Mumford's Learning Styles Questionnaire. Learning and Individual Differences, 33, 147–163.
Eccles, J. S., & Wigfield, A. (2000). Schooling's influences on motivation and achievement. In Danziger, S., & Waldfogel, J. (Eds.), Securing the future: Investing in children from birth to college (pp. 153–181). New York: Russell Sage Foundation.
Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of Psychology, 53, 109–132.
Erickson, T. (1997). Designing agents as if people mattered. In Bradshaw, J. M. (Ed.), Software agents (pp. 79–96). Menlo Park, CA: MIT Press.
Ericsson, K. A., & Lehmann, A. C. (1996). Expert and exceptional performance: Maximal adaptation to task constraints. Annual Review of Psychology, 47, 273–305.
Frankola, K. (2001). Why online learners drop out. Workforce, 80(10), 52–60.
Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Student satisfaction and perceived learning with on-line courses: Principles and examples from the SUNY learning network. Retrieved June 30, 2002, from http://www.aln.org/alnweb/journal/Vol4_issue2/le/Fredericksen/LE-fredericksen.htm
Funke, J. (1991). Solving complex problems: Exploration and control of complex systems. In Sternberg, R. J., & Frensch, P. A. (Eds.), Complex problem solving: Principles and mechanisms (pp. 185–223). Hillsdale, NJ: Erlbaum.
Gimino, A. (2000). Factors that influence students' investment of mental effort in academic tasks: A validation and exploratory study. Unpublished doctoral dissertation, University of Southern California, Los Angeles, CA.
Goodyear, P., Njoo, M., Hijne, H., & van Berkum, J. J. A. (1991). Learning processes, learner attributes and simulations. Education and Computing, 6, 263–304.
Gruber, H., Graf, M., Mandl, H., Renkl, A., & Stark, R. (1995). Fostering applicable knowledge by multiple perspectives and guided problem solving. Proceedings of the Annual Conference of the European Association for Research on Learning and Instruction, Nijmegen, The Netherlands.
Henson, R. K., & Hwang, D. (2002). Variability and prediction of measurement error in Kolb's Learning Style Inventory scores: A reliability generalization study. Educational and Psychological Measurement, 62(4), 712–727.
Hulin, C. L., Henry, R. A., & Noon, S. L. (1990). Adding a dimension: Time as a factor in the generalizability of predictive relationships. Psychological Bulletin, 107, 328–340.
Kalyuga, S., Ayres, P., Chandler, P., & Sweller, J. (2003). Expertise reversal effect. Educational Psychologist, 38, 23–31.
Kalyuga, S., Chandler, P., Tuovinen, J., & Sweller, J. (2001). When problem solving is superior to studying worked examples. Journal of Educational Psychology, 93, 579–588.
Kalyuga, S., & Sweller, J. (in press-a). Measuring knowledge to optimize cognitive load factors during instruction. Journal of Educational Psychology.
Kalyuga, S., & Sweller, J. (in press-b). Rapid dynamic assessment of expertise to improve the efficiency of adaptive e-learning. Educational Technology Research and Development.
Kavale, K. A., & Forness, S. R. (1987). Substance over style: Assessing the efficacy of modality testing and teaching. Exceptional Children, 54(3), 228–239.
Kennedy, C. (2000). Quick online survey summary. Retrieved June 30, 2002, from http://www.smccd.net/kennedyc/rsch/qcksrv.htm
Kester, L., Kirschner, P., van Merriënboer, J., & Baumer, A. (2001). Just-in-time information presentation and the acquisition of complex cognitive skills. Computers in Human Behavior, 17, 373–391.
Kozma, R. B. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7–19.
Kozma, R. (2000). Reflections on the state of educational technology. Educational Technology Research and Development, 48, 5–15.
Land, S., & Hannafin, M. J. (1996). A conceptual framework for the development of theories in action with open learning environments. Educational Technology Research and Development, 44(3), 37–53.
Levin, H. M., Glass, G., & Meister, G. R. (1987). Cost-effectiveness of computer assisted instruction. Evaluation Review, 11(1), 50–72.
Levin, H. M., & McEwan, P. J. (2001). Cost-effectiveness analysis: Methods and applications (2nd ed.). Thousand Oaks, CA: Sage.
Lewis, M., Bishay, M., & McArthur, D. (1993). The macrostructure and microstructure of inquiry activities: Evidence from students using a microworld for mathematical discovery. Proceedings of the World Conference on Artificial Intelligence and Education, Edinburgh, Scotland.
Li, L., O'Neil, H. F., & Feldon, D. F. (in press). The effects of effort and worry on distance learning with text and video. The American Journal of Distance Education.
Linn, M. C., & Songer, N. B. (1991). Teaching thermodynamics to middle school students: What are appropriate cognitive demands? Journal of Research in Science Teaching, 28, 885–918.
Loo, R. (1997). Evaluating change and stability in learning styles: A methodological concern. Educational Psychology, 17, 95–100.
Masunaga, H., & Horn, J. (2001). Expertise and age-related changes in components of intelligence. Psychology and Aging, 16(2), 293–311.
Mayer, R. E. (2001). Multimedia learning. New York: Cambridge University Press.
Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14–19.
Mayer, R. E., Dow, G. T., & Mayer, S. (2002). Multimedia learning in an interactive self-explaining environment: What works in the design of agent-based microworlds? Journal of Educational Psychology, 95(4), 806–813.
Mayer, R. E., & Massa, L. J. (2003). Three facets of visual and verbal learners: Cognitive ability, cognitive style, and learning preference. Journal of Educational Psychology, 95(4), 833–846.
Mayer, R. E., Sobko, K., & Mautone, P. D. (2003). Social cues in multimedia learning: Role of speaker's voice. Journal of Educational Psychology, 94, 419–425.
Mielke, K. W. (1968). Questioning the questions of ETV research. Educational Broadcasting, 2, 6–15.
Mitrovic, A., & Suraweera, P. (2000). Evaluating an animated pedagogical agent. Lecture Notes in Computer Science, No. 1839, 73–82.
Moreno, R., & Mayer, R. (2000). A learner-centered approach to multimedia explanations: Deriving instructional design principles from cognitive theory. Retrieved from http://imej.wfu.edu/articles/2000/2/05/printver.asp
Moreno, R., Mayer, R. E., Spires, H. A., & Lester, J. C. (2001). The case for social agency in computer-based teaching: Do students learn more deeply when they interact with animated pedagogical agents? Cognition and Instruction, 19(2), 177–213.
Morrison, G. R. (1994). The media effects question: "Unresolvable" or asking the right question. Educational Technology Research and Development, 42(2), 41–44.
Moundridou, M., & Virvou, M. (2002). Evaluating the persona effect of an interface agent in a tutoring system. Journal of Computer Assisted Learning, 18(3), 253–261.
Mousavi, S. Y., Low, R., & Sweller, J. (1995). Reducing cognitive load by mixing auditory and visual presentation modes. Journal of Educational Psychology, 87, 319–334.
Nass, C., & Steuer, J. (1993). Anthropomorphism, agency, and ethopoeia: Computers as social actors. Human Communication Research, 19(4), 504–527.
Office of Technology Assessment. (1988, September). Power on: New tools for teaching and learning. Retrieved June 30, 2002, from http://www.wws.princeton.edu/~ota/disk2/1988/8831_n.html
Perkins, D. N., & Grotzer, T. A. (1997). Teaching intelligence. American Psychologist, 52(10), 1125–1133.
Pintrich, P. R., & Schunk, D. H. (2002). Motivation in education: Theory, research, and applications (2nd ed.). Englewood Cliffs, NJ: Prentice Hall.
Richardson, J. A., & Turner, T. E. (2000). Field dependence revisited I: Intelligence. Educational Psychology, 20(3), 255–270.
Richardson, J. T. (2000). Researching students' learning: Approaches to studying in campus-based and distance learning. Buckingham, UK: Society for Research into Higher Education and Open University Press.
Riding, R. J., & Cheema, I. (1991). Cognitive styles: An overview and integration. Educational Psychology, 11, 193–215.
Ryokai, K., Vaucelle, C., & Cassell, J. (2003). Virtual peers as partners in storytelling and literacy learning. Journal of Computer Assisted Learning, 19, 195–208.
Salomon, G. (1984). Television is "easy" and print is "tough": The differential investment of mental effort in learning as a function of perceptions and attributions. Journal of Educational Psychology, 76(4), 647–658.
Sampson, D., Karagiannidis, C., & Kinshuk (2002). Personalised learning: Educational, technological and standardisation perspective. Interactive Educational Multimedia, 4, 24–39.
Schramm, W. (1977). Big media, little media. Beverly Hills, CA: Sage.
Shute, V. J., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive Learning Environments, 1, 51–77.
Stahl, S. A. (1999). Different strokes for different folks? A critique of learning styles. American Educator, 23(3), 27–31.
Sweller, J. (1988). Cognitive load during problem solving: Effects on learning. Cognitive Science, 12, 257–285.
Sweller, J. (1989). Cognitive technology: Some procedures for facilitating learning and problem solving in mathematics and science. Journal of Educational Psychology, 81(4), 457–466.
Sweller, J. (1999). Instructional design in technical areas. Camberwell, Australia: ACER.
Sweller, J., & Chandler, P. (1994). Why some material is difficult to learn. Cognition and Instruction, 12, 185–233.
Sweller, J., Chandler, P., Tierney, P., & Cooper, M. (1990). Cognitive load as a factor in the structuring of technical material. Journal of Experimental Psychology: General, 119(2), 176–192.
Sweller, J., van Merriënboer, J. J. G., & Paas, F. G. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10, 251–296.
Tuovinen, J. E., & Sweller, J. (1999). A comparison of cognitive load associated with discovery learning and worked examples. Journal of Educational Psychology, 91(2), 334–341.
van Merriënboer, J. J. G., Kirschner, P. A., & Kester, L. (2003). Taking the load off a learner's mind: Instructional design for complex learning. Educational Psychologist, 38(1), 5–13.
van Merriënboer, J. J. G. (1997). Training complex cognitive skills: A four-component instructional design model for technical training. Englewood Cliffs, NJ: Educational Technology Publications.
Veenman, M. V. J. (1993). Intellectual ability and metacognitive skill: Determinants of discovery learning in computerized learning environments. Unpublished doctoral dissertation, University of Amsterdam.
Veenman, M. V. J., & Elshout, J. J. (1995). Differential effects of instructional support on learning in simulation environments. Instructional Science, 22, 363–383.
