Kids’ Access to Inappropriate Content in Virtual Worlds – Not So Much, Really
The Federal Trade Commission (FTC) issued a report in December, entitled “Virtual Worlds and Kids: Mapping the Risks,” about the availability of explicit content in virtual worlds. Despite a slightly alarming header in the press release, the report’s results are not as negative as they may first appear. Little explicit content appeared in child-oriented virtual worlds, and the explicit content found in some of the teen and adult sites could not be accessed by children if they accurately self-reported their age.
It is important to note that the report states that “because the Commission’s researchers examined these worlds with the express purpose of uncovering sexually and violently explicit content, it is unlikely that a typical user would stumble upon such content unintentionally”.
Compare this to a child spending time in a bookstore or library: the amount of potentially questionable or explicit material differs based on the section of the store where the child spends time. If a child sticks to sections that have books and other media designed for their age group, the odds of encountering explicit material remain low. Should a child head out of the children’s section and into a section of teen and adult material, the chances of seeing explicit material, whether pictorial or printed, go up. But again, one would really need to be looking for such material in order to find it.
The report also finds that many virtual worlds are taking successful steps to screen content and to keep children from viewing content that may not be appropriate for their age. Researchers found little or no explicit content in 18 of the 27 virtual worlds they visited. In the others, such content was found only when researchers registered as a teen or as an adult, not when visiting the site as a child user.
The FTC makes the following five sagacious recommendations for all virtual worlds:
- Better age screening to ensure that users are, in fact, the age they claim to be and within the virtual world’s target audience.
- Age segregation in worlds that appeal to a wider demographic, such as Second Life.
- Stronger word filters to ensure that user-generated content is appropriately screened (a rough sketch of what age screening and word filtering can look like follows this list).
- Community moderators to help ensure adherence to codes of conduct, as well as clear codes of conduct so that users can better adhere to community standards on their own.
- Parental and youth education about the complexities of this particular type of social media.
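To make the first and third recommendations a little more concrete, here is a minimal, hypothetical sketch of what age screening and word filtering could look like in practice. The function names, age threshold, and blocked-word list are illustrative assumptions, not drawn from the FTC report or from any particular virtual world.

```python
# A minimal, hypothetical sketch of two of the FTC's recommendations:
# age screening at registration and word filtering of user-generated chat.
# The threshold and blocked-word list below are placeholders, not a real filter vocabulary.

BLOCKED_WORDS = {"exampleslur", "explicitterm"}  # illustrative placeholders only

def screen_age(reported_birth_year: int, current_year: int, minimum_age: int = 13) -> bool:
    """Return True if the self-reported age meets the world's minimum age."""
    return (current_year - reported_birth_year) >= minimum_age

def filter_chat(message: str) -> str:
    """Replace blocked words in user-generated chat with asterisks."""
    cleaned = []
    for word in message.split():
        cleaned.append("*" * len(word) if word.lower() in BLOCKED_WORDS else word)
    return " ".join(cleaned)

# Example usage:
# screen_age(2015, 2025)            -> False (too young for a hypothetical teen-and-up world)
# filter_chat("hello exampleslur")  -> "hello ***********"
```

Real virtual worlds layer far more sophistication on top of checks like these, of course; the sketch only shows the basic shape of the screening the FTC is recommending.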
The final takeaway from this report should be that, while it is important for creators of virtual worlds to spend time and effort thoughtfully crafting a safe virtual world, the worlds themselves are not inherently fraught with negative content. It is important to reemphasize that instances of explicit content, as defined by the FTC, were rare in the children’s virtual worlds, and researchers had to dig really hard to find them. Virtual worlds contain an amazing wealth of creative, thoughtful and positive content, and with the help of electronic tools and sharp, well-trained community moderators, the benefits of these worlds can far outweigh any drawbacks.