Eleven: Inappropriate content
Edited by Sonia Livingstone and Leslie Haddon, London School of Economics and Political Science
Book: Kids Online. Bristol University Press. Published online 15 July 2022; print publication 30 September 2009, pp 135-146.
Summary
Introduction
Use of the internet and its associated services is an increasingly popular pastime, particularly among children and young people, but despite the many benefits offered there are also risks of which they must be made aware. The possibility that children could encounter inappropriate content online receives less public attention than the risk of dangerous contact with people met online, yet the range of content of potential concern is vast, including pornography, racist material, violent and gruesome content, self-harm sites (including pro-anorexia and pro-suicide sites), commercially exploitative material and more. The European Commission (EC) supports Safer Internet hotlines throughout Europe where people can anonymously report content they perceive as illegal or disturbing (EC, 2009). Thirty-four hotlines across the globe are members of the International Association of Internet Hotlines (see www.inhope.org).
This chapter focuses on children and young people's access to inappropriate content online. ‘Inappropriate content’ is not a well-defined term, and its meaning varies across generations, countries and cultures. Content that seems inappropriate from an adult's perspective may not be perceived in the same way by children and young people, and cultural differences may influence how different types of content are understood and categorised. This blurry middle ground can contain sexual content, for example, as it is hard to achieve consensus on what is pornography and what is sexual information or portrayal. On the other hand, certain content is universally classified as inappropriate for children in all cultures – for example, the depiction of graphic violence or sexual abuse, and the encouragement to harm oneself or others. Furthermore, some content can be classified as illegal (and thus inappropriate), such as violent or sexual acts against children, and the promotion of racism and xenophobia.
The EU Kids Online network categorised the different types of inappropriate content and risks that children can encounter online (Hasebrink et al, 2009), as presented in Chapter One (this volume). The classification is based on the role of the child (as recipient, as participant or as actor) and the motives of the provider (commercial, aggressive, sexual and values-related). The aim of this chapter is to describe the empirical evidence available within the EU Kids Online network, and where appropriate within the wider literature, regarding inappropriate material encountered by children online.
Ten: Risky contacts
Book: Kids Online. Bristol University Press, pp 123-134.
Summary
Introduction
One of the anxieties regarding children's internet use relates to the potential for risky contacts (see, for example, EC, 2008). This chapter critically reviews the latest findings and theories on children's risky contacts with adults and other children – grooming, harassment and meetings – in order to identify who is really at risk from what. Two primary types of risk will be discussed: children and young people as victims of aggressive communication, and as victims of sexually oriented communication. This discussion will expand our current understanding of both the media and the social psychological dimensions of the misuse and abuse of the internet, which in turn can enable industry, policy makers and future generations to develop an online environment that poses fewer dangers.
The specific characteristics of online communication that appear to lower the thresholds for finding, contacting and interacting with others have led to perceptions that dangerous encounters are likely. Media panics similarly emphasise the dangers posed by adults who intend to deceive children and young people online (Weathers, 2008). While this is indeed a risk, with potentially grave and tragic outcomes, research shows that children and young people are quick to emphasise that they are aware of ill-intentioned adults online, and that they take action to prevent such contact (Dunkels, 2008). However, research on which counter-strategies are actually helpful is conspicuous by its absence. Children may be regarded as native users of contemporary technology, but there is little research on the effectiveness of their often self-taught strategies for avoiding online victimisation. Young people who are at risk of meeting adults typically already face serious problems offline (Shannon, 2007), and often they are not being deceived (Ybarra et al, 2007). The most likely scenario is that a sexual predator takes advantage of a child's disclosure of vulnerability online. The predator offers to be an understanding and supportive adult and starts building a manipulative relationship with the child. When this process, called grooming, is complete, the potential victim often readily travels to meet the predator, even while aware of the adult's sexual intentions.