This study tracked the referential production of 25 Japanese-English returnee children for 5 years upon their return to Japan from an English-dominant environment (mean age = 9.72 at time of return) and compared their referential strategies to those of 27 Japanese monolinguals and 27 English monolinguals, age-matched to the returnees' age at time of return. Returnees used more redundant noun phrases (NPs) in both languages to maintain references compared to monolingual peers. In English, no changes in NP use were noted over time, but increased exposure to English led to fewer redundant NPs when maintaining references. In their native Japanese (L1), returnees used fewer NPs for maintaining references and more NPs for reintroducing references, indicating improved reference tracking longitudinally. In sum, returnees' referential production is more sensitive to L1 re-exposure effects than to second language (L2) attrition and, crucially, increased L2 exposure minimizes redundant referent production among bilingual returnee children.
The current study examined the comprehension and production of classifiers, case marking, and morphological passive structures among 414 child Japanese heritage speakers (mean age = 10.01 years; range = 4.02–18.18). Focusing on individual differences, we extracted latent experiential factors via the Q-BEx questionnaire (De Cat, Kašćelan, Prévost, Serratrice, Tuller, Unsworth, & the Q-BEx Consortium, 2022), which were then used to predict knowledge and use of these grammatical structures. The findings reveal that: (i) experiential factors such as heritage language (HL) engagement at home and within the community modulate grammatical performance differentially from childhood through adolescence, and (ii) HL proficiency, immersion experiences, and literacy systematically predict HL grammatical outcomes. These results indicate that particular language background factors hold differential significance at distinct developmental stages and that higher proficiency, richer immersion experiences, and literacy engagement in the HL are crucial for the development of core grammatical structures.
The COVID-19 pandemic has transformed healthcare significantly, and telepsychiatry is now the primary means of treatment in some countries.
Aims
To compare the efficacy of telepsychiatry and face-to-face treatment.
Method
A comprehensive meta-analysis comparing telepsychiatry with face-to-face treatment for psychiatric disorders. The primary outcome was the mean change in the standard symptom scale scores used for each psychiatric disorder. Secondary outcomes included all meta-analysable outcomes, such as all-cause discontinuation and safety/tolerability.
Results
We identified 32 studies (n = 3592 participants) across 11 mental illnesses. Disease-specific analyses showed that telepsychiatry was superior to face-to-face treatment regarding symptom improvement for depressive disorders (k = 6 studies, n = 561; standardised mean difference s.m.d. = −0.325, 95% CI −0.640 to −0.011, P = 0.043), whereas face-to-face treatment was superior to telepsychiatry for eating disorders (k = 1, n = 128; s.m.d. = 0.368, 95% CI 0.018–0.717, P = 0.039). No significant difference was seen between telepsychiatry and face-to-face treatment when all the studies/diagnoses were combined (k = 26, n = 2290; P = 0.248). Telepsychiatry had significantly fewer all-cause discontinuations than face-to-face treatment for mild cognitive impairment (k = 1, n = 61; risk ratio RR = 0.552, 95% CI 0.312–0.975, P = 0.040), whereas the opposite was seen for substance misuse (k = 1, n = 85; RR = 37.41, 95% CI 2.356–594.1, P = 0.010). No significant difference regarding all-cause discontinuation was seen between telepsychiatry and face-to-face treatment when all the studies/diagnoses were combined (k = 27, n = 3341; P = 0.564).
Conclusions
Telepsychiatry achieved a symptom improvement effect for various psychiatric disorders similar to that of face-to-face treatment. However, some superiorities/inferiorities were seen across a few specific psychiatric disorders, suggesting that its efficacy may vary according to disease type.
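Pooled standardised mean differences of the kind reported above are conventionally produced with a random-effects model. The sketch below is not the authors' analysis code; it is a minimal illustration of DerSimonian-Laird random-effects pooling, and the per-study effect sizes and variances it is fed are hypothetical.

```python
import math

def pool_random_effects(effects, variances):
    """Pool per-study effect sizes with a DerSimonian-Laird random-effects model.

    effects: per-study standardised mean differences
    variances: per-study sampling variances
    Returns (pooled effect, lower 95% CI bound, upper 95% CI bound).
    """
    w = [1.0 / v for v in variances]                    # inverse-variance (fixed-effect) weights
    fixed = sum(wi * ei for wi, ei in zip(w, effects)) / sum(w)
    # Cochran's Q measures between-study heterogeneity around the fixed-effect estimate
    q = sum(wi * (ei - fixed) ** 2 for wi, ei in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                       # between-study variance estimate
    w_star = [1.0 / (v + tau2) for v in variances]      # random-effects weights
    pooled = sum(wi * ei for wi, ei in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se

# Hypothetical inputs: three studies favouring the intervention
pooled, ci_lo, ci_hi = pool_random_effects([-0.4, -0.2, -0.35], [0.02, 0.03, 0.025])
```

When the heterogeneity estimate tau2 is zero, the random-effects result reduces to the fixed-effect (inverse-variance) estimate; larger heterogeneity widens the confidence interval.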
Schools may serve as shelters in the event of a disaster, but little is known about the requirements of children with disabilities in such situations. This study, therefore, aimed to investigate disaster preparedness in Japanese special needs schools in relation to their designation as welfare shelters.
Methods:
A questionnaire was distributed to schools nationwide. The respondents (authorities from 531 schools) answered questions about their jobs, disaster experiences, the school type, its students’ disabilities, its designation as a welfare shelter, its evacuation readiness, and the items of a disaster prevention awareness scale. Differences in preparedness among schools and the relationship between preparedness and designation as a welfare shelter were determined.
Results:
Most respondents had never experienced a natural disaster. Schools had insufficient resources to cope with disasters. While the majority (68.2%) had resources for children to stay overnight, a substantial minority of schools (31.8%) did not. No differences were found in preparedness among schools with different types of children with disabilities. Schools designated as welfare shelters were significantly better prepared than others.
Conclusions:
Special needs schools in Japan have limited disaster preparedness. The designation of schools as welfare shelters may increase their preparedness for disasters.
At the heart of modern Japan there remains an intractable and divisive social problem with its roots in pre-history, namely the ongoing social discrimination against the Dowa communities, otherwise known as Buraku. Their marginalization and isolation within society, as a whole, remains a veiled yet contested issue. Buraku studies, once largely ignored within Japan's academia and by scholarly publishers, have developed considerably in the first decades of the twenty-first century, as the extensive bibliographies of both Japanese and English sources provided here clearly demonstrate.
The authors of the present study, published in Japanese in 2016 and translated here by the Oxford scholar Ian Neary, have been able to incorporate this most recent data. Because of its importance as the first Buraku history based on this new research, a wider readership was always the authors' principal focus. Yet, it also provides a valuable source book for further study by those wishing to develop their knowledge about the subject from an informed base. This history of the Buraku communities and their antecedents is the first such study to be published in English.
Antipsychotics are widely used in the treatment of major depressive disorder (MDD), but there has been no comprehensive meta-analytic assessment that examined their use as monotherapy and adjunctive therapy.
Methods
A systematic review and a meta-analysis were conducted on randomized placebo-controlled trials (RCTs) that reported on the efficacy and safety/tolerability of antipsychotics for the treatment of adults with MDD. Data on both monotherapy and adjunctive antipsychotic use were extracted but analyzed separately using a random-effects model. Co-primary outcomes were study-defined treatment response and intolerability-related discontinuation. We also illustrated the risk/benefit balance of antipsychotics for MDD, using two-dimensional graphs representing the primary efficacy and safety/tolerability outcomes. Secondary outcomes included psychopathology, remission, all-cause discontinuation, inefficacy-related discontinuation, and adverse events.
Results
Forty-five RCTs with 12 724 patients were included in the analysis. In monotherapy (studies = 13, n = 4375), amisulpride [1.99 (1.55–2.55)], sulpiride [1.50 (1.03–2.17)], and quetiapine [1.48 (1.23–1.78)] were significantly superior to placebo regarding treatment response. However, intolerability-related discontinuations were significantly higher compared to placebo with amisulpride and quetiapine. In adjunctive therapy (studies = 32, n = 8349), ziprasidone [1.80 (1.07–3.04)], risperidone [1.59 (1.19–2.14)], aripiprazole [1.54 (1.35–1.76)], brexpiprazole [1.41 (1.21–1.66)], cariprazine [1.27 (1.07–1.52)], and quetiapine [1.23 (1.08–1.41)] were significantly superior to placebo regarding treatment response. However, of these antipsychotics that were superior to placebo, only risperidone was equivalent to placebo regarding discontinuation due to intolerability, while the other antipsychotics were inferior.
Conclusion
Results suggest that there are significant differences regarding the risk/benefit ratio among antipsychotics for MDD, which should inform clinical care.
No co-productive narrative synthesis of system-level facilitators and barriers to personal recovery in mental illness has been undertaken.
Aims
To clarify system-level facilitators and barriers to personal recovery of people with mental illness.
Method
Qualitative study guided by thematic analysis. Data were collected through one focus group, which involved seven service users and three professionals. This group held 11 meetings, each lasting 2 h, at a local research institute between July 2016 and January 2018.
Results
The analysis yielded three themes: barriers inhibiting positive interaction within personal relationship networks, roots of barriers in mental health systems and the social-cultural context, and possible solutions to address those roots. Barriers were acknowledged as those related to a sense of safety, a locus of control within oneself and reunion with the self. The roots of barriers were recognised within mental health services, including systems without trauma sensitivity, lack of advocacy support and limited access to psychosocial approaches. Roots in the social-cultural context were also found. There were no narratives relating to facilitators. A possible solution was to address the roots within systems. A social-cultural change was called for that makes personalised goals most valued, with an inclusive design that overcomes stigma, to achieve an open and accepting community.
Conclusions
The analysis yielded system-level barriers specific to each recovery process. Roots of barriers that need transformation to facilitate personal recovery were identified within mental health services. Social interventions should be further explored to translate the suggested social cultural changes into action.
Virtual reality exposure therapy (VRET) is currently being used to treat social anxiety disorder (SAD); however, VRET's magnitude of efficacy, duration of efficacy, and impact on treatment discontinuation are still unclear.
Methods
We conducted a meta-analysis of studies that investigated the efficacy of VRET for SAD. The search strategy and analysis method are registered at PROSPERO (#CRD42019121097). Inclusion criteria were: (1) studies that targeted patients with SAD or related phobias; (2) studies where VRET was conducted for at least three sessions; (3) studies that included at least 10 participants. The primary outcome was social anxiety evaluation score change. Hedges' g and its 95% confidence intervals were calculated using random-effect models. The secondary outcome was the risk ratio for treatment discontinuation.
Results
Twenty-two studies (n = 703) met the inclusion criteria and were analyzed. The efficacy of VRET for SAD was significant and continued over a long-term follow-up period: Hedges' g for effect size at post-intervention, −0.86 (−1.04 to −0.68); three months post-intervention, −1.03 (−1.35 to −0.72); six months post-intervention, −1.14 (−1.39 to −0.89); and 12 months post-intervention, −0.74 (−1.05 to −0.43). When compared to in vivo exposure, the efficacy of VRET was similar at post-intervention but became inferior at later follow-up points. Participant dropout rates showed no significant difference compared to in vivo exposure.
Conclusion
VRET is an acceptable treatment for SAD patients that has significant, long-lasting efficacy, although it is possible that during long-term follow-up, VRET efficacy lessens as compared to in vivo exposure.
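Hedges' g, the effect-size metric used in the meta-analysis above, is Cohen's d rescaled by a small-sample correction factor. The sketch below illustrates the standard computation from group summary statistics; the means, standard deviations, and sample sizes shown are hypothetical, not values from the included studies.

```python
import math

def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
    """Hedges' g with an approximate 95% confidence interval.

    Returns (g, lower 95% CI bound, upper 95% CI bound).
    """
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp                     # Cohen's d
    j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)     # small-sample correction factor
    g = j * d
    # Common large-sample approximation to the variance of g
    var_g = (n1 + n2) / (n1 * n2) + g ** 2 / (2.0 * (n1 + n2))
    se = math.sqrt(var_g)
    return g, g - 1.96 * se, g + 1.96 * se

# Hypothetical example: treatment group scores 2 points lower (better) than control
g, ci_lo, ci_hi = hedges_g(10.0, 2.0, 20, 12.0, 2.0, 20)
```

The correction factor j shrinks the estimate slightly for small samples, which is why meta-analyses of modest trials, like many of those pooled here, report g rather than raw Cohen's d.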
Electroconvulsive therapy (ECT) is the most effective antidepressant treatment for severe depression. Although recent structural magnetic resonance imaging (MRI) studies have consistently reported ECT-induced hippocampal volume increases, most studies did not find an association between hippocampal volume changes and clinical improvement. To understand the underlying mechanisms of ECT action, we aimed to identify the longitudinal effects of ECT on hippocampal functional connectivity (FC) and their associations with clinical improvement.
Methods
Resting-state functional MRI was acquired before and after bilateral ECT in 27 depressed individuals. A priori hippocampal seed-based FC analysis and a data-driven multivoxel pattern analysis (MVPA) were conducted to investigate FC changes associated with clinical improvement. The statistical threshold was set at cluster-level false discovery rate-corrected p < 0.05.
Results
Depressive symptom improvement after ECT was positively associated with the change in the right hippocampus-ventromedial prefrontal cortex FC, and negatively associated with the right hippocampus-superior frontal gyrus FC. MVPA confirmed the results of hippocampal seed-based analyses and identified the following additional clusters associated with clinical improvement following ECT: the thalamus, the sensorimotor cortex, and the precuneus.
Conclusions
ECT-induced changes in right frontotemporal and thalamocortical connectivity, and in nodes of the default mode network, were associated with clinical improvement. Modulation of these networks may explain the underlying mechanisms by which ECT exerts its potent and rapid antidepressant effect.
The corpus callosum (CC) is the largest interhemispheric white matter commissure connecting the cerebral hemispheres and plays a crucial role in interhemispheric communication and cognitive processes. Attempts have been made to define subdivisions of the CC according to the cortical areas from which their fibers originate. Previous neuroanatomic studies of the CC provide impetus for investigating its role in obsessive-compulsive disorder (OCD).
Methods:
In this study, diffusion tensor imaging (DTI) was employed to investigate microstructural abnormalities of the white matter of the CC in OCD patients. Nine patients with OCD and matched control subjects underwent DTI. Fractional anisotropy (FA), an index of the integrity of white matter tracts, was determined in the seven subdivisions of the CC.
Results:
A significant reduction in FA was found in the rostrum of the CC of patients with OCD compared with that of controls. FA of the subdivisions other than the rostrum did not differ between OCD patients and control subjects. Higher FA in the rostrum correlated with lower Y-BOCS scores (r = -0.852, p = 0.004).
Conclusions:
The rostrum contains fibers from the inferior premotor as well as the medial and caudate/orbital prefrontal regions. These results support the theory of prefrontal cortex and striatal circuit dysfunction in OCD and suggest that the orbitofrontal circuit is implicated in symptom severity in OCD patients.
Attempts have been made to define subdivisions of the corpus callosum (CC) according to the cortical areas from which their fibers originate. Previous neuroanatomic studies of the CC provide impetus for investigating its role in obsessive-compulsive disorder (OCD).
Methods:
In this study, diffusion tensor imaging (DTI) was employed to investigate microstructural abnormalities of the white matter of the CC in OCD patients. Nine patients with OCD and matched control subjects underwent DTI. Fractional anisotropy (FA), an index of the integrity of white matter tracts, was determined in the seven subdivisions of the CC. Each region of interest (ROI) was placed on the sagittal plane, and all subdivisions were measured.
Results:
A significant reduction in FA was found in the rostrum of the CC of patients with OCD compared with that of controls. FA of the subdivisions other than the rostrum did not differ significantly between OCD patients and control subjects. Higher FA in the rostrum correlated significantly with lower Y-BOCS scores (r = -0.803, p = 0.009).
Conclusions:
The rostrum contains fibers from the inferior premotor as well as the medial and caudate/orbital prefrontal regions. These results support the theory of prefrontal cortex and striatal circuit dysfunction in OCD and suggest that the orbitofrontal circuit is implicated in symptom severity in OCD patients.
THE WORLD ECONOMIC crisis began in 1929 in the USA, spread to Japan the following year and would last until around 1934. This long-lasting, severe economic crisis is referred to in Japan as the Shōwa Economic Crisis. A movement for the economic revival of agricultural and fishing villages developed nationally and even residents of discriminated Buraku communities were mobilized to take part in activities aimed at overcoming the impact of this economic crisis.
The impact of the Shōwa Economic Crisis was particularly serious within discriminated Buraku communities, whose economic foundations were already very vulnerable, and it exacerbated the difficulties that they were encountering. According to a survey carried out by the Yūwa Association in November 1929, the average amount of tax paid by households in Buraku communities was between ¥3 and ¥8, whereas the average in farming villages as a whole was around ¥20, which suggests a huge gap existed between Buraku and non-Buraku household incomes at this time. Footwear manufacture had been the main industry within Buraku communities but it was in decline. Moreover, since it was not possible for former shoe makers to acquire land to take up farming in its place, the only alternative for many was emigration to Hokkaido or one of the colonies. Dependence on highly unstable seasonal labour was widespread (Yamamoto Masao, ‘Buraku Keizai Mondai no Sobyō’, Yūwa Jigyō Kenkyū Dai 11go July 1930).
In these circumstances, the argument outlined earlier within the Yūwa Association, that a moral movement was all that was needed to eliminate discrimination and that special policies were not required, inevitably fell temporarily into the background. Already there was general agreement that without some kind of unusual and special policy it would not be possible to bridge the economic gap that was emerging between the Buraku and non-Buraku communities. It had been agreed within the Yūwa Association to address the new situation with the ‘new awakening’ policy mentioned in chapter 14, but even chairman Hiranuma, who hitherto had insisted on addressing the Buraku problem only as a moral issue, had started to say things like, ‘I think that if we do not address this situation within which their industries and economic situation are at a standstill, the accomplishment of all the other aims of our project will be extremely difficult’ (Yūwa Jigyō Kenkyū Vol 11).
WHEN I WAS a student and started to take an interest in Buraku history the first text that I absorbed was the 1975 edition of ‘Buraku History and the Liberation Movement’ edited by the Buraku Mondai Kenkyūjo. Later I would read many other histories of the issue but as a substantial general history this has always been my basic reference point.
More than ten years have passed since Professors Teraki and Akisada produced their Buraku histories – pre-modern in 2002, and modern in 2004, respectively. We have now produced another new history to send out to the world. It must be left to the reader's judgement as to whether this text reflects changes in the circumstances in the Buraku problem or changes in the environment of the research. For me it is not just a question of degree of sensitivity to those questions but it is important to understand theoretically the contextual social structure and for this reason it is essential to comprehend the historical background. This is the crucial point for understanding, for example, the problems that surround the ‘comfort women’ issue. We must focus on what the majority did, study it and come to grips with it head on. If this volume can help in that way I will be very pleased.
I am extremely grateful for the encouragement I have received from Katagi Mariko, of the Development and Planning section of the BLHRRI who helped me edit the first draft of this text when it was published serially in the monthly journal Human Rights. Later Matsumoto Shinji of the BLHRRI research office guided us through the process from planning the serial publication to the creation of this volume. Only thanks to his patient efforts did we manage to accomplish this. In addition, Kobashi Kazushi of the Kaihō Shuppansha and Miyatake Toshimasa of Ichimojo Kobo were extremely helpful.
I have also benefited from the support and advice of many others across the years. I am grateful to them all.
THE FORMATION OF HININ STATUS GROUPS AND THEIR RESPONSIBILITIES
HININ WERE THE other typical discriminated status group of the Edo period. Apart from the Hideninkaito in Osaka, the Hokujūman of Sakai and the Hidenin of Kyoto, it is thought that most residents of hinin communities in the early modern period were unrelated to the hinin of the middle ages and were formed from people who had migrated into cities because of disturbances to their lives caused by war or poverty.
In Edo they were governed from the so-called ‘Four Places’: Kuruma Zenshichi in Asakusa, Matsuzaemon in Shinagawa, Zentarō in Fukagawa and Kyūbei in Yoyogi – the four hinin leaders. Another, Kyūbei of Kinegawa, appears briefly in the records as the name of a hinin leader, but not necessarily at the same time as those in the other ‘Four Places’. Moreover, not all of the four other hinin leaders were together at the start of the early modern period. The oldest of these leadership positions is Kuruma Zenshichi. According to a statement that Kuruma Sendaimatsu presented in 1839, his ancestor came from Mikawa Atsumi village and was appointed as a hinin leader by the city shogunal administrator. Matsuzaemon of Shinagawa, according to a petition submitted in 1854, is said to have been appointed in 1660 by the shogunal authorities in the town and ordered to deal with the influx of ‘field hinin’ from the countryside. These four leaders first appear together in the historical records in 1721. From then on, the organization of the hinin gradually developed to create a hierarchy:
The control of Kuruma Zenshichi by Danzaemon grew stronger from around 1652, but Zenshichi continued to resist it, submitting a written complaint in the seventh month of 1719. In the second month of the following year Danzaemon filed a suit against Zenshichi. Danzaemon was successful in that law suit in the eleventh month of 1721 and thereafter was able to exercise control over Zenshichi. In other words, Danzaemon's authority over the hinin under Kuruma was officially recognized at this time.
THE FIRST WORLD War lasted from 1914 to 1918. It stimulated an increase in overseas orders for Japanese manufactured goods and this produced rapid economic growth. It produced a new class of nouveau riche, while at the same time the increase in prices caused by the resulting inflation was not matched by an increase in wages, producing a decline in working-class living standards. Meanwhile, in August 1917 the government decided to send an expeditionary force to Siberia to try to keep the Russian revolution in check. Rice merchants, anticipating an increase in the price of rice, began to buy up rice stocks while restricting sales. This spurred an increase in rice prices, but the government took no effective action apart from issuing an anti-profiteering ordinance. Prices continued to increase dramatically. This caused difficulties for the workers, who had no option but to buy rice to eat, and for small-scale tenant farmers, who paid rent for their land to landowners and had difficulty in getting rice for their own consumption.
The first people to take action were the wives of fishermen in Toyama in July 1918. Protests and rioting then spread across the country reaching a peak in early to mid-August but continuing until October. Disturbances were recorded in every part of Japan apart from Aomori, Iwate, Akita, Tochigi and Okinawa, and in some areas the army had to be mobilized to suppress them (Rekishi Kyōikusha Kyōgikai 2004).
Residents of discriminated Buraku communities typically had been prevented from accessing stable employment by discrimination and had little alternative but to rely on unstable jobs such as day labouring. Most were either unemployed or only semi-employed. For example, in one Buraku community in Okayama prefecture only five out of 110 households could make a living as tenant farmers with 1–2 tan of land (between a quarter to half an acre). The rest were just about able to make ends meet selling meat, working in slaughterhouses or making zōri (Sanin Shimpō 11 August 1918).
For this reason, women and children from the Buraku had to go out to work in match factories, find work as child nurses, or plait zōri at home to supplement their household income.
FOLLOWING THE RICE riots and the end of the First World War, statements from people in Buraku communities started to appear in local newspapers. For example, in a letter to the Ehime Shimbun on 20 August 1919 the correspondent, after expressing his doubts about the effectiveness of ‘improvements to the special buraku’, declares that ‘Every time we hear about social this and special that and the need for special treatment we feel as if our hearts are about to break…’ and ‘I do not think that we are in any way inferior to normal people in level of education, standards of hygiene or general moral values.’ Moreover, rather than ‘sympathetic conciliation’ or ‘Buraku improvements’, what they seek is a recognition that the Burakumin are improving themselves.
The movement seeking the self-awareness of Buraku people burst out all over the place in various forms. One central theme of discussion in Keishō (‘Alarm Bell’), the journal of the Sankyoshō, a group formed in the Buraku district of Ōfuku, Chiki-gun, Nara prefecture, was this topic of self-awareness. Taking pride in contributing to state and society as ‘subjects’, as one person put it, ‘Young men, my dear friends, as well as being young men of the village we should plan on reforming our ideas as young men’ (Marubashi Ryuka, The Sin of Non-Awareness, November 1920). They appealed to the younger generation to stand up for themselves and urged them to overcome the improvement movement ideas which had been dominant until then (Matsuo 1974).
THE SWALLOW ASSOCIATION (TSUBAMEKAI) – SEEKING A DISCRIMINATION FREE SOCIETY
The young men who came together to form the Tsubamekai (Swallow Association), which was to play a central role in the process of the formation of the Suiheisha, started out with similar ideas. It was formed in 1920 in a Buraku community called Kashiwara, in Wakigami village, Katsuragi-gun, Nara prefecture, by a group of young men that included Saikō Mankichi (the pen name of Kiyohara Kazutaka), Sakamoto Seiichirō, Komai Kisaku and Ikeda Kichisaku.
Saikō had moved to Tokyo planning to become an artist but his encounter with discrimination there destroyed his dreams and he returned home.