Introduction
As well as being an important tool for individuals in their professional roles at the BBC, social media plays a key role for content teams in reaching and engaging with different audiences.
The Editorial Guidelines govern all branded activity in these spaces just as they do for activity on BBC platforms. This Guidance explains how and where they should be applied.
Use of platforms
Social media platforms differ widely in the functionality they offer, the audiences they appeal to and the risks inherent in the BBC's activity on them.
Before deciding to start any BBC activity on a third party platform, teams should consider the editorial purpose, the appropriateness of the platform for the audience and whether they have the resources to manage the account appropriately. Any choice of third party sites must not bring the BBC into disrepute. Advice should be sought from Editorial Policy and divisional social media leads.
Any proposal to use a new platform should also take account of the platform's terms and conditions and any contractual, legal, data protection or information security issues, taking advice from Editorial Policy and Legal.
The closure of any account should be clearly communicated to users, informing them that the account will no longer be updated or moderated and pointing them to an alternative source. Any content on our social channels is still part of the BBC archive, so the presumption should be that the account is not deleted and remains accessible.
Errors, corrections and apologies on all BBC branded social media accounts should be transparent. Any correction or apology should be speedily made and connected clearly with the original error.
Ensuring Impartiality on social media
Maintaining due impartiality in content and activity on social media is as important as on BBC platforms. The specific challenges posed by each platform in achieving due impartiality should be carefully considered.
The use of hashtags to aggregate content and join in social conversations should not support, or allow the appearance of supporting, a cause, brand or point of view, no matter how apparently worthy the cause or how widely accepted or uncontroversial its message appears to be.
While it is possible to publish personal view content or reflect a range of opinion on controversial subjects, the serendipitous nature of social feeds and the algorithms that drive them mean that audiences may not always come across the full range of content we have produced on these subjects.
Connected content, or content that might reflect a range of views on the same subject, should be linked or signposted where appropriate. Just as with BBC platforms, each social account should take responsibility for ensuring due impartiality across its output.
Social media posts – whether text or video – should contain appropriate context. Each piece of content should be judged on its own merits, unless the posts are threaded or similarly linked. Be aware of the risks that short headlines, tweets or the pressure to create shareable content pose to publishing impartial or accurate content.
Impartiality in all aspects of social media activity during election campaigns is essential, in line with BBC election guidelines.
Similarly, due impartiality is required when sharing or liking content published by others, as well as in any choice of which accounts to follow.
Remember that social media platforms, and individual accounts, reflect particular demographics. Opinion gathered from these accounts, whether through functionality offered by each platform or anecdotally, can never be statistically robust or genuinely representative and should only be used as vox pops.
Similarly, functionality that platforms describe as 'polling' – where users can choose from a range of options – has no statistical credibility and should only ever be used as a tool for audience engagement. Results should not normally generate any significant outcomes or be used beyond the platform itself without referring to ITACU or Editorial Policy.
Calls to action or appeals for contributors through social media are subject to the same issues: they will depend on the particular demographic of the platforms and accounts used. Any calls to action must be neutrally phrased to avoid accusations of bias and posted on BBC branded accounts. Where possible there should be alternative ways of contacting the BBC rather than just via social media – for example by email.
Audience Expectations
Audiences expect BBC-run spaces on third party platforms to reflect similar values to on-platform brand activity, within the context of the tone and style of the particular social media site in question.
BBC brands will have their own identity, familiar to their audiences, which they should be able to express through the content they share and the conversations they have. They should use the same 'G for Guidance' warnings as content used on BBC sites.
Be mindful of the potentially harmful impact of graphic images used as thumbnails or hero images on posts that can appear in users' timelines without warning or context. Similarly, consider carefully the impact of offensive language or images in the opening moments of a video posted to social media.
Although each platform has its own terms and conditions governing user behaviour, teams should not rely entirely on the platforms themselves to manage communities on BBC spaces. They should take overall responsibility and ensure user behaviour is in line with audience expectations for individual BBC brands.
Moderation of comment threads should generally be light touch but abusive behaviour will not be tolerated, particularly when it takes the form of personal attacks or offensive language.
Teams should be aware of any potential legal risks posed by comments sparked by content, including hate speech, defamation or contempt of court – including internationally, where laws about liability for comments may differ.
Publishing teams are responsible for all the social media conversations they start. Where there could be a risk, they should consult Editorial Policy, Legal or their divisional social media leads before posting any content, and similarly when agreeing if and when to turn comments off, to ensure a consistent and justified approach to limiting discussion across the BBC.
The widest possible range of opinions consistent with our duty of care, appropriate language and behaviour, and the law should be accommodated. Where it is offered, comment that is critical of the BBC, talent, programmes or policies should be included, provided it is not abusive. Blocking users from accounts should be a last resort and advice can be sought from Editorial Policy.
Duty of Care
Content should only be posted on social media with the informed consent of contributors.
Teams should consider whether to post particular content where it could put contributors at risk of significant harm – particularly when they are young or vulnerable.
They should take account of the potential impact of harmful comment, of content being widely shared or of the possibility of individuals being identified even where steps have been taken to anonymise them.
When teams decide to post such content they should provide contributors with the necessary support and ensure the use of appropriate key word filters and moderation to minimise the potential harm. Always seek advice from the Moderation Services Team and Editorial Policy.
Escalation strategies should be in place for cases of suspected child grooming, threat to life, serious sexual assault or to avoid serious harm. Teams should also be prepared to respond to continued harassment of individuals, including people who work for the BBC. Advice can be sought from Editorial Policy and the Safety, Security and Resilience team.
Whenever BBC content requires pointing the audience to support lines, they should be included on each piece of content, even if it is part of an extended series. Links to helplines can be included on any video or in the supporting text. Editorial Policy advice may be sought before applying support lines to content.
Children and young people
Children and young people have a right to a voice on the BBC's social media channels, but teams should ensure that they are able to operate in a safe and appropriate environment. If teams cannot be sure that publication is in the best interests of such contributors, even with the consent of parents or guardians, they may choose not to publish that content on social media.
The BBC should normally abide by the terms and conditions of third party platforms – especially in relation to the minimum age for use. Any proposal not to do so must be referred to Editorial Policy. If teams create spaces aimed at a young teenage audience, for example 13-16, they should take particular care to ensure that the platform used and the behaviour in the BBC space remain appropriate, so that it stays a safe environment.
BBC networks, programmes and channels have a distinct tone of voice and teams should only ever communicate with children and young people on social media using these BBC channels, never in an individual capacity.