Jean-Yves Gilg

Editor, Solicitors Journal

By Emma Sheldon

Should social media users have the right to 'edit' their online histories, asks Emma Sheldon

A new campaign, iRights, is calling for social media sites to incorporate a simple and accessible method to enable users under 18 to edit and/or delete any content they have uploaded. But are these rights already available to individuals under existing data protection legislation?

Outdated operating system

The existing Data Protection Act 1998 (DPA) was enacted before social media and user-generated content were contemplated.

The EU acknowledges the current data protection regime is not sufficient or appropriate and is seeking to introduce a new regulation, including a 'right to be forgotten'.

Meanwhile, the Information Commissioner's Office (ICO) is obliged to find a sensible way to interpret the DPA appropriately in a social media context.

Mixing business and pleasure

Section 36 of the DPA provides that the processing of personal data for an individual's 'personal, family or household affairs' is exempt from the Act's usual requirements.

The ICO acknowledges the line between 'personal' and 'commercial' processing on social media platforms may be blurred. For instance, the ICO states an employee sharing a comment about the industry in which they work without prompting and via their personal account would be 'personal use'. By contrast, if an employee is asked by their employer to post a comment on the employer's dedicated page, this is likely to be 'commercial' use.

The ICO recommends that individuals keep separate accounts for personal and professional use, but acknowledges many may not want to do this.

Big Brother

Unlike individual users, social media platforms are operating 'commercially'. Their obligation to comply with the DPA depends on the extent to which they are 'processing' personal data.

The ICO believes it would be disproportionate to require high-traffic sites which do not select the content uploaded to review every post. Such sites should be compliant if they have clear policies about acceptable use, provide easily accessible procedures for data subjects to dispute the accuracy of posts and request their removal, and deal with any complaints about accuracy quickly.

To ensure personal data is processed lawfully, most platforms require user consent to the uploading of content. However, the ICO makes it clear that consent 'will not necessarily last forever'. If consent is required, it is difficult to argue that a user should not be able to withdraw that consent and remove their content.

While the DPA does not impose different rules for children, the ICO stipulates that 'consent must be appropriate to the age and capacity of the individual'. Parental consent should be obtained to process data relating to any child aged 12 or under. Above this age, parental consent may or may not be necessary depending on how 'intrusive' the data processing will be. For example, the ICO recommends parental consent should be required before making a child's image publicly available. It is debatable how much content uploaded by children is subject to adequate consent.

Ask Google

In 2014, the Court of Justice of the European Union (CJEU) held that Google Spain and Google Inc had an obligation to remove search results and links containing personal data. Making such data so widely available contravened the individual's right to data protection under article 8 of the Charter of Fundamental Rights of the European Union (the Charter), even though it had been justifiable for the original website to publish the information. The judgment stated that article 8 of the Charter encompasses a 'right to be forgotten'.

Although untested, this concept could logically be extended to social media. It would be consistent with the ruling to find that, while an individual uploading a post may fall within the section 36 exemption, a social media platform could still breach that individual's article 8 rights by continuing to make the post widely available for an indefinite period.

Recommendations

Despite the lack of legal clarity, there are steps individuals can take to reduce the risk of, and the damage caused by, unwanted content:

  • Research the terms and conditions: Most social media platforms offer the option to edit or remove most content, although the method for doing this may not be obvious;

  • Check privacy settings: Users should think carefully about who can see their content and tailor the content they upload depending on how 'public' their profile is;

  • Maintain a separate 'commercial' identity: Keeping personal and corporate use of social media apart clarifies whether data protection legislation applies; and

  • Review and edit content regularly: Often embarrassing or damaging social media content could have been deleted but remains accessible because the relevant user did not think to remove it. Users should review their accounts regularly and remove or edit content which is no longer relevant or appropriate.

Emma Sheldon is a solicitor in the commercial and intellectual property team at Hamlins