Chapter 14 offers a look at the trajectories of brain imaging technology and research while acknowledging the field’s unpredictable evolution. It examines how existing tools are being refined, with functional MRI achieving submillimeter resolution and EEG sampling rates reaching 100,000 Hz, while highlighting the growing influence of private industry through initiatives like Neuralink, Facebook’s Building 8, and Google Brain. The chapter analyzes the scientific value of multimodal imaging approaches that combine complementary techniques such as EEG-fMRI to leverage both high temporal and spatial resolution. It discusses how large-scale collaborative efforts including the Human Connectome Project and the BRAIN Initiative are reshaping our understanding of neural connectivity despite the challenges of modeling the brain’s extraordinary complexity. The emergence of biomarkers receives particular attention, emphasizing how machine learning algorithms are enhancing our ability to detect neurological and psychiatric conditions through brain imaging data. Recent technological innovations are surveyed, including miniaturized MRI scanners, real-time imaging analysis, optically pumped magnetometry, and functional ultrasound imaging, all pointing toward more accessible and sophisticated brain measurement capabilities. The chapter concludes with practical guidance for newcomers to the field and consideration of ethical dimensions, emphasizing that brain imaging technologies should advance human wellbeing rather than enable control or manipulation. Throughout, the chapter maintains that while specific trajectories remain uncertain, the overall direction is toward increasingly precise, accessible, and clinically valuable brain imaging technologies.
Chapter 12 examines the methodological foundations for conducting effective brain imaging research, positioning experimental design as the cornerstone of meaningful neuroscientific inquiry. It outlines a systematic approach to developing experiments, beginning with the essential groundwork of literature review and theoretical development before proceeding to stimulus creation and experimental implementation. The chapter emphasizes the critical balance between simplicity and complexity in design, advocating for well-controlled paradigms that isolate specific cognitive processes while acknowledging the brain’s inherent complexity. Particular attention is given to the technical considerations unique to different imaging modalities, addressing how fMRI’s hemodynamic response requires different design considerations than EEG’s direct measurement of neural activity. The chapter explores the philosophical challenges of constructing appropriate control conditions that effectively isolate the cognitive processes of interest, comparing cognitive subtraction approaches with factorial designs that reveal interaction effects. It emphasizes the importance of piloting experiments to identify potential confounds like expectancy bias and the role of jittered intertrial intervals in minimizing such effects. Throughout, the chapter underscores that experimental design in neuroimaging requires interdisciplinary expertise: understanding of brain anatomy and physiology, mastery of imaging technology, and sophisticated experimental psychology skills to translate abstract cognitive concepts into operationalizable experimental paradigms.
Chapter 6 explores magnetoencephalography (MEG), a neuroimaging technique that measures magnetic fields generated by neural activity with millisecond temporal precision. Starting with MEG’s development by David Cohen in 1967 and the crucial introduction of SQUID sensors, the chapter examines how MEG differs from EEG while measuring activity from the same neural sources. While EEG predominantly detects signals from gyri parallel to the skull, MEG captures perpendicular signals from sulci with superior spatial resolution as magnetic fields pass unimpeded through tissue. The practical aspects of MEG acquisition are covered, including participant preparation, artifact removal, and the importance of structural MRI for anatomical coregistration. The chapter addresses source localization challenges, such as the inverse problem of determining which neuronal sources created the detected signals, and explores solutions ranging from single dipole models to distributed approaches using anatomical constraints. Clinical applications in epilepsy and presurgical mapping are discussed, as is the complementary nature of combining MEG with other imaging modalities, particularly fMRI, to leverage their respective spatial and temporal strengths for comprehensive brain activity visualization.
Chapter 13 discusses the analysis processes that transform raw brain imaging data into meaningful neuroscientific insights. It explains the methodical progression from preprocessing to advanced analytical techniques, emphasizing that analysis is not merely a technical afterthought but a fundamental component of neuroimaging research. The chapter begins by addressing preprocessing steps – quality control, artifact correction, normalization, and smoothing – that prepare data for subsequent analysis while preserving signal integrity. It then explores single-subject processing approaches that aggregate experimental conditions and trials to establish individual response patterns before proceeding to group-level analyses that enable population-level inferences. Statistical considerations receive particular attention, with the chapter explaining how techniques like statistical parametric mapping function as the interpretive lens through which brain activity becomes visible. The problematic issue of multiple comparisons is thoroughly examined, illustrating how whole-brain analyses necessitate statistical correction to prevent false positives in the tens of thousands of simultaneous tests typical in neuroimaging. The chapter extends beyond traditional univariate approaches to cover network analysis methodologies that reveal functional connectivity patterns between brain regions. It concludes by addressing emerging analytical frontiers: real-time analysis for brain–computer interfaces, closed-loop brain stimulation paradigms, and the methodological limitations that necessitate careful interpretation of neuroimaging results. Throughout, the chapter emphasizes that analytical expertise is as essential as technical proficiency with imaging hardware, and that understanding analytical limitations is crucial for responsible interpretation of the neural basis of cognition and behavior.
Chapter 3 explores event-related potentials (ERPs), one of electroencephalography’s most powerful analytical techniques for investigating cognitive processing. The chapter traces ERPs’ evolution from Pauline and Hallowell Davis’s pioneering work in 1939 through their exponential growth as a research methodology. It explains how ERPs extract meaningful neural signals by time-locking and averaging EEG segments surrounding stimulus presentations, thereby revealing characteristic voltage deflections that correspond to specific cognitive processes. The text examines key ERP components, including C1, P1, N1, P2, N2, and P300, detailing their temporal progression, neuroanatomical origins, and functional significance in the processing hierarchy. It evaluates ERPs’ exceptional capacity to discriminate between processing stages occurring within milliseconds of each other, from early sensory encoding through attention allocation to semantic processing. The chapter addresses methodological considerations essential for robust ERP research, including experimental design principles, artifact reduction techniques, and the interpretation of scalp topographies. By analyzing ERPs’ comparative advantages, including millisecond-precise temporal resolution, the ability to track covert processing without behavioral responses, and sensitivity to differences between processing stages, alongside their limitations in spatial localization and in certain experimental contexts, the chapter positions ERPs as a vital methodology for understanding the sequential unfolding of perceptual and cognitive processes in the human brain.
Chapter 10 discusses functional near-infrared spectroscopy (fNIRS), a noninvasive brain imaging technique that utilizes light to measure hemodynamic responses. It traces the evolution of spectroscopy from Newton’s prism experiments to modern neuroimaging applications, explaining how near-infrared light penetrates tissue to detect changes in oxygenated and deoxygenated hemoglobin. The chapter details the physical principles underlying fNIRS, comparing continuous wave, frequency domain, and time domain approaches while examining the instrumentation of modern systems. It addresses practical considerations including optode placement, signal quality optimization, and noise reduction techniques. The relationship between fNIRS signals and neural activity is discussed, highlighting similarities to the BOLD response in fMRI while acknowledging limitations in depth penetration. The chapter covers analytical approaches for fNIRS data processing and emphasizes its unique advantages: portability, relative affordability, and functionality in environments hostile to electromagnetic recordings. Case studies demonstrate fNIRS applications in specialized contexts like underwater environments and space exploration, illustrating why this technique has become an essential tool for specific research questions despite its spatial limitations.
Reading Biblical Greek is aimed at students who are studying New Testament Greek for the first time, or refreshing what they once learned. Designed to supplement and reinforce The Elements of New Testament Greek, by Jeremy Duff, each chapter of this textbook provides lengthy, plot-driven texts that will be accessible as students study each chapter of The Elements. Each text is accompanied by detailed questions, which test comprehension of content from recent lessons and review challenging topics from previous chapters. The graded nature of the texts, together with the copious notes and comprehension questions, makes this an ideal resource for learning, reviewing or re-entering Greek. The focus of this resource is on reading with understanding, and the exercises highlight how Greek texts convey meaning. Finally, this book moves on from first-year Greek, with sections that cover the most important advanced topics thoroughly.
Master the principles of flight dynamics, performance, stability, and control with this comprehensive and self-contained textbook. A strong focus on analytical rigor, balancing theoretical derivations and case studies, equips students with a firm understanding of the links between formulae and results. Over 130 step-by-step examples and 130 end-of-chapter problems cement student understanding, with solutions available to instructors. Computational Matlab code is provided for all examples, enabling students to acquire hands-on understanding, and over 200 ground-up diagrams, from simple “paper plane” models through to real-world examples, draw from leading commercial aircraft. Introducing fundamental principles and advanced concepts within the same conceptual framework, and drawing on the author's over 20 years of teaching in the field, this textbook is ideal for senior undergraduate and graduate-level students across aerospace engineering.
Making Sense of Mass Education gives a comprehensive overview of the cultural contexts of education, addressing and debunking important myths in the field. This book is an approachable text for undergraduate and postgraduate readers studying the Sociology and Philosophy of Education. The text covers the rise of mass schooling as a disciplinary institution, including the governance of subjectivity and the regulation of childhood and youth. It examines cultural forces on the field of education and addresses the influence of philosophical thought. In the landscape of mass education, change is constant. New topics covered in the fifth edition include education policy, teachers' work, place, online spaces and artificial intelligence. Each chapter features margin definitions and boxes exploring a range of myths, encouraging teachers to think critically. Making Sense of Mass Education continues to be pertinent for pre-service and practising teachers in Australian contexts.
Rigorously revised, with brand new chapters on additional private sources of funding, due diligence, sustainable finance, and deep tech investing, the second edition of this successful textbook provides a cutting-edge, practical, and comprehensive review of the financing of entrepreneurial ventures. From sourcing and obtaining funds, to financial tools for growing and managing the financial challenges and opportunities of the startup, this engaging text will help entrepreneurs, students, and early-stage investors to make sound financial decisions at every stage of a business' life. The text is grounded in sound theoretical foundations with a strong European perspective and reference to the Middle East and Africa. New case studies and success stories, and up-to-date perspectives from experts and the media, provide real-world applications, while a wealth of activities give students abundant opportunities to apply what they have learned. A must-have text for graduate and undergraduate students in entrepreneurship, finance, and management programmes, as well as aspiring entrepreneurs and early-stage investors in any field.
Retirement benefits are a critical aspect of employee compensation. This chapter discusses pension plans, 401(k) matching programs, and financial security incentives for long-term employees. It covers the impact of retirement planning on workforce management and examines best practices in designing sustainable retirement benefits.
External labor market forces and wage regulations impose constraints on compensation strategies. This chapter examines how industry norms, government policies, and economic conditions influence pay decisions. Topics include fair pay laws, external benchmarking, and compliance with regulatory standards. The chapter also discusses global compensation challenges, such as currency fluctuations and international labor laws. Readers will gain insights into how organizations navigate external constraints while maintaining competitive and equitable pay structures.
This chapter covers the legal and regulatory environment surrounding compensation. It discusses key employment laws, such as minimum-wage requirements, pay equity regulations, and collective bargaining agreements. The chapter highlights the challenges businesses face in complying with compensation laws while maintaining flexibility in pay structures. Case examples illustrate the consequences of noncompliance and best practices for ensuring fair and lawful compensation policies. Readers will gain a strong understanding of how regulatory frameworks shape compensation decisions and the role of HR in compliance management.
Chapter 9 covers the remaining aspects of the visual atmospherics: colours and signage. Colours are often said to comprise three dimensions: hue, brightness contrast, and saturation. The dimension that has been studied the most is hue. Hue is often described on a scale from warm colours (red) to cool colours (blue). Research has shown that warmer colours tend to take over the visual scene and force their way into the consciousness of shoppers. A red colour therefore makes shoppers more aroused or even confused and may interfere with their ability to notice other stimuli. A store with too many red objects would overload the senses of shoppers, and it therefore makes sense to instead work with brightness contrast. The eye's ability to detect brightness contrast primarily resides in the rods in the retina, while the cones are primarily responsible for colour vision. Research has shown that, independent of hue, a contrast in brightness level can create an even stronger visual pop-out effect. Regarding signage, it is found that a sign’s primary task is to attract attention; the attention-grabbing aspect is often more important than the communication itself. The optimal way to write prices is covered in Chapter 13.