Child-friendly transparency of data processing in the EU: from legal requirements to platform policies
Keywords
Children
data protection
GDPR
transparency
privacy policy
Instagram
Snapchat
TikTok
co-design
co-creation
European Union
Publication details
Year: 2020
DOI: 10.1080/17482798.2019.1701055
Issued: 2019
Language: English
Volume: 14
Issue: 1
Start Page: 5
End Page: 21
Authors: Milkaite, I.; Lievens, E.
Type: Journal article
Journal: Journal of Children and Media
Publisher: Informa UK Limited
Topics: Online safety and policy regulation
Sample: The article carries out an in-depth theoretical analysis of Articles 12, 13 and 14 GDPR, the relevant recitals and guidelines by data protection authorities (DPAs) in the section entitled "Theoretical analysis of articles 12, 13 and 14 GDPR: the child's right to transparent information about data processing". It also reports on the results of an evaluation of the privacy policies of Instagram, Snapchat and TikTok.
Implications For Policy Makers About: Creating a safe environment for children online; Stepping up awareness and empowerment
Abstract
Vast amounts of personal data of children are collected and processed in today's increasingly digital, connected society by public and private actors. Children do have a right to the protection of their personal data and, according to the General Data Protection Regulation (GDPR), even merit specific protection. Children should be clearly informed of and understand what happens with their data when it is collected, processed, stored and transferred. To that end, specific transparency standards require the provision of information in clear and plain language that the child can easily understand. In this article, after having mapped these existing requirements, the privacy policies of Instagram, Snapchat and TikTok — services which are very popular with children — are evaluated. The findings suggest that such policies are still complex, long and primarily text based. In order to improve this, possible practical ways of enhancing transparency for children, such as legal visualization, co-design, co-creation techniques and participatory design methods which focus on presenting legal information in a transparent and clear manner, are explored.
Outcome
"Both children and adults use online services and often accept their terms and conditions without actually reading them or feel they do not have a genuine choice (not) to agree with data policies. It might not be fair to expect data subjects – adults, but especially (young) children – to understand the often very complex and opaque data collection and processing practices. Whereas this does not mean that no information should be provided to them. Especifically with respect to children and the special protection they deserve with regard to their personal data. The responsibility for undertanding cannot be placed solely on their shoulders, nor on those of their parents.
Efforts by online service providers and data protection authorities should be enhanced in order to ensure fair and transparent processing of children’s personal information. It is time for service providers to invest in innovative ways to offer information, which may in the end lead to enhanced trust relationships with their users. An important guarantee for reaching this goal is actually including children in information design and evaluation processes. Also, improvement for moving from text based, long policies to more engaging formats and understandable language adapted to different ages of users and other – more inventive – formats of information should be considered as well." (Milkaite & Lievens, 2020, pp. 17-18)