This research has sought to explore what respectful relationships educators’ observations about how children and young people talk about social media, gaming, online safety and image sharing can tell us about the prevention of image-based abuse, including AI-generated image-based abuse. It has also sought to understand how social media algorithms might be understood as a contributing factor to gendered violence enacted by children and young people. Because the data are limited, collected from a small group of specialist practitioners, we propose the following implications as areas where future thought leadership and research might focus.
Implications of these findings
Drawing on the research findings, we discuss three implications:
- Image autonomy may be a protective factor against image-based harms.
- Social media algorithms may drive children’s and young people’s resistance to prevention of gendered violence by serving polarising and misogynistic content that reinforces gender norms and stereotypes.
- Adults appear to have a vital role in the prevention of TA-HSBs enacted by children and young people using images.
Image autonomy may be a protective factor against image-based harms
In line with existing gendered violence prevention frameworks, it is critical to identify the social norms that underpin and reinforce violent behaviour enacted through images, in order to understand what drives children and young people to create harmful images of their peers using AI, or to enact or condone other forms of image-based harms. For image autonomy, this may mean examining whether children’s and young people’s attitudes towards, and knowledge of, image autonomy are linked to their understanding of image-based harms. The findings of this research suggest that building awareness of image autonomy among children, young people and adults may offer new insights for the prevention of image-based harms, including TA-HSBs enacted using generative AI.
Image autonomy was a consistent theme throughout the focus group discussions and represents a key element of Body Safety Australia’s practice that is under-represented in research. These discussions made clear that RRE educators were observing that, from early childhood, children are taught that their image does not belong to them and that they have no agency over if, when, how and by whom their image is taken and shared with others. These messages are often implicitly reinforced by content children encounter online, or more explicitly at home when their image is shared on their parents’ social media.
Participants reflected that many children and young people believe it is normal for adults they trust to not ask for consent before capturing photos or videos of them, or even to share those images without their consent. The vast majority of these images likely contain content that is not harmful in and of itself. However, the normalisation of image sharing without inviting and obtaining permission by the adults around them may be foundational to how children and young people approach their own image-sharing practices with peers as they grow older.
Research participants agreed that the established social norms around taking and sharing images of children appear to drive children’s resistance to conversations about consensual image practices. Recognising that image-taking and image-sharing practices have evolved with the emergence of social media and related technologies, it is important to consider how these social norms are changing alongside technologies, and where there are opportunities for prevention efforts to reshape norms that may be doing harm.
Social media algorithms may drive children’s and young people’s resistance to prevention of gendered violence by serving polarising and misogynistic content that reinforces gender norms and stereotypes
Taken together with the extant literature, analysis of the focus group discussions suggests that, while not causing gendered violence on their own, social media algorithms may contribute to gendered violence by driving and reinforcing children’s and young people’s resistance to prevention efforts such as RRE programs in schools. This appears to be due to the proliferation of misogynistic content, as well as content that reinforces gender norms and stereotypes.
As the educators reflected, the ages of 8–10 were frequently observed to be a pivotal point of change in the RRE classroom, both in children’s understanding of and adherence to gender norms and in their encounters with gendered content online. This highlights middle primary as a potentially vital stage in young people’s lives for embedding gendered violence prevention activities. It may also be a critical point at which to understand and map children’s concepts of image ownership and to implement prevention education that supports respect and agency in relation to images.
Forms of resistance to gendered violence prevention work were observed more frequently among upper primary and secondary school children and young people. Citing examples such as the Depp v Heard trial, the educators in the focus groups reflected on how misogynistic online discourse mobilised some young people to actively resist and challenge conversations about consent and respectful relationships. In some cases, this resistance was reinforced by teachers observing the educators’ facilitation, demonstrating how the online messages that children hear can be unknowingly reinforced by adults in their offline interactions.
The role of gaming ecosystems was a significant focus of the focus group discussions and indicates emerging considerations for prevention. Particularly for primary-school-aged children, who were less consistently on social media platforms such as TikTok and Instagram, gaming served as a primary way children engaged in play, communicated with friends and had fun online. YouTube was similarly identified as highly popular, which is consistent with research showing that YouTube is the most popular service for children under 13 (40). Yet these spaces were also where children were observed encountering, and becoming normalised to, manosphere language through memes and humour.

This also has implications for the implementation of the Social Media Minimum Age Act. Platforms such as Discord, unless they are also included in the ban, may serve as a way that young people still encounter algorithmically driven content, even if they are not actively on social media. Children will also still be able to access YouTube content that does not require a user to be logged in, and therefore may be less likely to be redirected to the algorithmically moderated YouTube Kids platform. As a result, they may still encounter harmful content, including misogynistic or otherwise discriminatory material, that may not fall under the platform’s definition of age-restricted content (92). These findings highlight the fundamental importance of digital media literacy as a protective factor against TA-HSBs (15). Education that addresses both the content and the design of social media platforms will help children and young people critically engage with the harmful or unrealistic representations presented to them online.
This signals the need for a multifaceted approach to preventing the harmful content children and young people may encounter online. At a structural level, the prevention sector must work alongside social media platforms on regulation and reform. Policy and legislative efforts that affect children, including the Social Media Minimum Age Act, must be informed by the rights of the child, safety by design principles, and gendered violence prevention principles. Legislative and regulatory action must prioritise safeguarding while still allowing children and young people to participate in online spaces, supporting them through education to build their capability to critically engage with content online.
Adults appear to have a vital role in the prevention of TA-HSBs enacted by children and young people using images
The focus group participants reflected on the vital role that the broader school community plays in shaping children’s perspectives on image autonomy, online safety, and gender norms and stereotypes. Consistent with recommendations from Our Watch’s RRE blueprint (21), engaging with parents, caregivers and school staff is a critical component of the long-term success of RRE and other primary prevention efforts.
The findings in this report highlight the need for increased access to education and training for adults, including parents, caregivers, teachers and adults in the broader community. This guidance should take a strengths-based and rights-based approach to children’s sexual development, building adults’ confidence both to model safe practices and to have age-appropriate discussions with children about sexual behaviour, digital literacy and online harms (15).
Building this confidence also includes building adults’ understanding of the contexts and environments in which children enact TA-HSBs towards other children using images. The focus group participants identified that, in their experience, many parents’ understandings of online risk and harm were not grounded in technology-facilitated abuse and violence. Participants also observed a lack of knowledge of, or apprehension about discussing, peer-enacted harms with children and young people when the harm involves sexual or nude imagery. Further education, resources and support for parents and caregivers may enable them to more confidently navigate discussions of online safety, consent and TA-HSBs with the children in their care.
Future research directions
Engaging with children and young people directly
This study was exploratory in nature and has presented implications for prevention that indicate future directions for research. Firstly, it is critical that the implications discussed are investigated further through engaging with children and young people directly.
The Australian National Research Agenda to End Violence against Women and Children 2023–2028 (ANRA) identifies children and young people as a priority population for research into preventing domestic, family and sexual violence, as people who have experienced violence in their own right (93). The ANRA highlights the importance of centring the voices of children and young people in research while they are still children, in order to design systems and services that are appropriately tailored to their needs (93). More research is needed on children’s experiences of technologies, social media and gendered violence, and on how these experiences affect their attitudes and behaviours. It is also critical that future research explores how the intersections of race, gender and disability shape how children enact TA-HSBs using images.
It may be valuable to further investigate younger children’s attitudes towards, and experiences of, TA-HSBs. As identified in the focus group discussions in this report, children aged 8–12 were observed to hold gendered attitudes, have established online lives, and possibly already be resistant to messaging around consent for image taking or sharing. This may be a critical period of children’s development for prevention research to investigate further. The literature shows that young people aged 12–15 are the most common age group to display harmful sexual behaviours and that children aged 10–14 most commonly experience TA-HSBs (15); effective prevention therefore likely needs to start at an earlier stage to address these harms.
With this in mind, it may be useful to consider adapting existing Victorian primary prevention frameworks into a child-specific model. Harmful sexual behaviours, including TA-HSBs, occur within the context of children’s cognitive, social, sexual and physical development; adult-perpetrated violence occurs within different contexts (15). As part of the ongoing development and improvement of new and existing frameworks, the role of technologies – including the use of generative AI and online spaces – should be continuously and rapidly revisited to ensure prevention remains effective and responsive to children’s needs and experiences.
In addition, efforts to work directly with children and young people to build on the current understanding of effective prevention should be complemented by research into the role of parents and caregivers in effective prevention, particularly when it comes to image autonomy.
Further research into image autonomy
Image autonomy is a novel concept in research, practice and policymaking. Further research is needed to examine the relationship between children’s and adults’ understanding of image autonomy, their attitudes towards TA-HSBs and image-based abuse, and their enactment or experience of these behaviours. This must be led by awareness raising and practice-based exploration of image autonomy and its importance beyond the existing scope of Body Safety Australia’s practice.
As identified in the Implications section, ‘Your image belongs to you’ finds promising indicators that image autonomy may prove a protective factor against TA-HSBs in children and young people. This signals the need for robust empirical research to evaluate if and how image autonomy may serve as a factor in violence prevention frameworks, taking into account the social and environmental factors that influence its effectiveness as part of broader prevention efforts. This may include developing an understanding of, and mapping, developmentally appropriate image autonomy practices for children by age and stage.
In addition to researching children’s and young people’s understanding of image autonomy, it is important that there be further research into adult understandings of image autonomy. As identified in the focus groups, parents do not model image autonomy to children and young people. Further research may seek to understand how social norms and technological contexts inform adults’ ideas about image sharing and online safety, how image autonomy is or is not modelled by adults, and how these factors contribute to children’s and young people’s attitudes towards consent for taking, editing or sharing images.
This is consistent with the National Strategy to Prevent and Respond to Child Sexual Abuse (2021–2030), which indicates that efforts to prevent and respond to child sexual abuse must engage the whole community (25). In addition, the National Strategy and Change the story identify parents as a main influence on children’s gendered socialisation (12, 25).
To begin addressing this research gap, Body Safety Australia is undertaking a research project working with young people aged between 13 and 17 to better understand how they perceive and experience image taking and sharing with their peers, with the aim of developing a prevention program to address TA-HSBs enacted using images, including those created using generative AI. The project aims to create a better understanding of young people’s experiences and the social norms that young people are observing and experiencing with their peers, as well as to inform the development of a tool for adults to respond to a young person’s disclosure of image-based abuse.
Further work to expand how we think about primary prevention and online social lives and influences
This research has highlighted the need to reconsider the framing of ‘online’ in primary prevention spaces and frameworks. There is an important and growing body of research literature that focuses on understanding how gendered violence and related harms are perpetrated and experienced online, and on ways to respond, both by supporting people who have experienced violence and by deterring or punishing people who have used violence. However, to date, there has been relatively little research exploring the multifaceted dimensions of action that might be taken to prevent these harms before they occur, particularly in relation to children and young people.
This report has started to explore the role of online spaces as potential sites for preventative action and highlights the need for more research on the role of algorithms as a potential contributing factor to gendered violence. Further, our findings suggest that research into how children view and understand algorithms may support the ongoing development and improvement of age-appropriate media literacy programs.
The focus group discussions highlight that gaming spaces and their surrounding ecosystems also require further investigation. Platforms adjacent to gaming spaces, such as Discord and YouTube, may act as spaces where children and young people access and experience content that typically circulates on social media platforms. This is of note given that some gaming and chat platforms may be exempt from the Social Media Minimum Age Act and, if so, may effectively circumvent some of the protections for minors that the age restrictions are intended to provide. This highlights that legislative and policy action must be complemented by effective critical media literacy education for children, young people, parents, families and school communities, as well as by well-resourced access to help-seeking when children and young people experience harms online.
Parents play a crucial role alongside their children in supporting the prevention of online harms. Future research should explore the barriers to online safety education for adults in order to strengthen community-level prevention education for both adults and children.
Further work to understand intersecting forms of structural discrimination in technology-assisted harm
This research provides limited insight into the intersections between sexist and misogynistic drivers of technology-assisted harm and those rooted in racism, ableism, classism, colonialism, heteronormativity and cisnormativity. The links between these forms of discrimination are well established in other research and practice literature (94-96). In particular, the overlap between misogynistic online content and white supremacist, anti-transgender and religious extremist radicalisation is well established (97, 98), as is the explicit ableism described by people who have experienced online hate (99).
More research is required to explore how, and to what degree, children and young people are exposed to these various types of discriminatory content and discourse alongside the misogynistic content that is the focus of this report. There is also a need for evidence that builds more nuanced policy and practice understanding of how children’s and young people’s exposure to racist, ableist, transphobic and homophobic material affects their behaviour towards peers, including TA-HSBs enacted using images.