
Definition

This section covers the policy that must be followed on websites that allow users to publish content.

Some website owners allow users to add content. These users are members of the public, e.g. citizens, stakeholders or other organisations.

Users on a website fall into different categories depending on the level of identification and trust associated with them. These categories are defined as follows:

  • Unknown users are users who have visited a website but have not logged in or provided any information that would allow them to be identified.
  • Registered/untrusted users are individuals who have self-registered by creating an EU Login account to log in to the website. They have gone through the sign-up process to establish their identity on the website, but they have not been verified by the site owner.
  • Verified/trusted users are a subset of registered users who have taken additional steps to confirm their identity, resulting in increased trust and additional benefits on the website.

The actions that users can and cannot perform on the website must be clearly defined in the website policy.

The type of content that users may add falls into the following categories:

  • comments on existing content published by the owner
  • contributions to the EU decision-making process
  • feedback and/or suggestions on specific topics/actions
  • views on particular matters
  • platforms providing citizens the opportunity to debate on EU policy, challenges and priorities, e.g. online communities
  • specific content-sharing platforms that allow, for example, file uploads.

Website moderation is the process of identifying and deleting inappropriate content as quickly as possible to protect the European Commission’s reputation.

What is inappropriate content?

The following content is always considered inappropriate:

  • abusive, obscene, vulgar, slanderous, hateful, xenophobic, threatening or sexually oriented comments
  • spam, advertising for a website or product
  • duplicate content, where the same content has been posted more than once by the same user
  • off-topic comments
  • links to illegal or pirated software.

Rules

Unmoderated user-generated content on websites in the europa.eu domain is not permitted. Site owners must monitor what appears on their sites and must detect and delete unsuitable content as quickly as possible.

Websites allowing user-generated content must clearly state this in a disclaimer (see example below) and must define and document the moderation procedure they follow so that users are informed. The disclaimer should be added to the home page and to every page’s footer.

Website owners must establish an acceptable use policy on their websites, clearly informing users about what can and cannot be done and making them aware that unsuitable content will be removed.

If user-generated content is mixed with official content published by the owner of the site, then a label should be used to distinguish the user-generated content.

In addition, website owners must implement a verification process for newly registered, unverified users to establish trust, such as central validation or "sponsorship" by existing users. Users should not be granted any privileges before their identity is verified. (Reference: Access Control and Authentication Standard)

Verified trusted users must be granted only the minimum necessary rights for uploading and publishing content. Access to system functionalities and data should be prohibited. (Reference: Access Control and Authentication Standard)

User-generated content on Commission websites must undergo pre- or post-moderation by the service that owns the website. Specific content contribution rules should be made available to users and accepted by them.

  • Pre-moderation: the content is verified and authorised before publication. Pre-moderation is the preferred option.
  • Post-moderation: the content is verified by a moderator after publication and is deleted if unsuitable. Post-moderation should be limited to exceptional cases justified by business needs (see the sketch after this list).
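Purely as an illustration, the difference between the two workflows can be expressed as a minimal content-status model. The sketch below is an assumption about how a site might represent it; the status names and the submit helper are hypothetical and are not part of any Commission standard.

  from enum import Enum, auto

  class ContentStatus(Enum):
      PENDING_REVIEW = auto()  # held back until a moderator authorises it (pre-moderation)
      PUBLISHED = auto()       # visible to the public
      REJECTED = auto()        # refused before publication, never shown
      REMOVED = auto()         # published first, then deleted after post-moderation

  def submit(content_id: str, pre_moderation: bool = True) -> ContentStatus:
      # Pre-moderation holds the contribution for review before it becomes visible;
      # post-moderation publishes it immediately and relies on timely review afterwards.
      return ContentStatus.PENDING_REVIEW if pre_moderation else ContentStatus.PUBLISHED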

The moderation process

Moderation may be performed by webmasters, web editors or contractors responsible for support and maintenance, and can be implemented in two ways:

Pre-moderation

The content is checked before it is published based on the content contribution rules published on the website.

An appropriate privacy statement must appear on the website, indicating where content is user-generated and subject to pre-moderation. Users must agree to this statement.

Post-moderation

Post-moderation is limited to cases justified by exceptional business needs and must respect the following conditions:

  • only verified trusted users can publish content on the website
  • resources are available for sufficient and timely post-moderation (within a maximum of 8 hours after a post is published) to ensure the Terms of Use are applied.

How to limit and verify published content

  • File size and allowed types of uploaded files should be limited to only those that are necessary for business purposes. Whenever possible, the use of web forms should be preferred over uploading files
  • scan uploaded files with antivirus/antimalware software and store them securely outside any publicly accessible web directory of the webserver. (Reference: Web Application Security Standard)
  • automatically rename uploaded files to prevent attackers from guessing the names of uploaded files and accessing them directly
  • enable verified users to report suspicious or spam content within uploaded files (see the sketch after this list).
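A minimal sketch of how these upload controls could be combined is given below, assuming a Python/Flask application. The size limit, allowed file extensions and storage path are illustrative choices, and the antivirus scan is left as a placeholder because the scanning tool depends on the hosting environment.

  import secrets
  from pathlib import Path
  from flask import Flask, request, abort

  app = Flask(__name__)
  app.config["MAX_CONTENT_LENGTH"] = 5 * 1024 * 1024   # reject uploads above 5 MB (illustrative limit)

  ALLOWED_EXTENSIONS = {".pdf", ".png", ".jpg"}         # only the types needed for the business purpose
  UPLOAD_DIR = Path("/srv/uploads")                     # assumed path outside the public web root
  UPLOAD_DIR.mkdir(parents=True, exist_ok=True)

  @app.route("/upload", methods=["POST"])
  def upload():
      file = request.files.get("file")
      if file is None or not file.filename:
          abort(400)                                    # no file supplied
      extension = Path(file.filename).suffix.lower()
      if extension not in ALLOWED_EXTENSIONS:
          abort(415)                                    # unsupported file type
      # Rename with a random token so attackers cannot guess the stored name.
      stored_name = secrets.token_urlsafe(16) + extension
      file.save(UPLOAD_DIR / stored_name)
      # Placeholder: scan the stored file with the antivirus/antimalware tool used in
      # your environment and delete it if it is flagged.
      return {"stored_as": stored_name}, 201

Because the files are stored outside the web root under unguessable names, they can only be served back through application code that re-checks the requester’s rights.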

How to limit bot operations and propagation to search engines

  • Pages containing sensitive or restricted content should include a "noindex" robots tag to prevent their indexing by most search engines, ensuring they do not appear in search results
  • implement bot mitigation and blocking mechanisms, such as bot challenges involving a CAPTCHA or behavioural methods compliant with the accessibility standards, to distinguish and block automated bots. Throttling/rate-limiting strategies should also be employed to limit excessive bot access after a certain threshold (see the sketch after this list).
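Under the same Flask assumption, the sketch below shows one way to emit a noindex directive for a restricted page and a very basic per-IP throttling check. The threshold, time window and route name are hypothetical, and a production site would normally rely on its platform’s own bot-mitigation and CAPTCHA services rather than this simplified counter.

  import time
  from collections import defaultdict, deque
  from flask import Flask, request, abort

  app = Flask(__name__)

  REQUEST_LOG = defaultdict(deque)   # recent request timestamps per client IP
  MAX_REQUESTS = 30                  # illustrative threshold
  WINDOW_SECONDS = 60                # illustrative time window

  @app.before_request
  def throttle():
      now = time.monotonic()
      history = REQUEST_LOG[request.remote_addr]
      while history and now - history[0] > WINDOW_SECONDS:
          history.popleft()          # forget requests older than the window
      history.append(now)
      if len(history) > MAX_REQUESTS:
          abort(429)                 # too many requests: likely automated traffic

  @app.route("/restricted-report")
  def restricted_report():
      # The equivalent in-page directive is <meta name="robots" content="noindex">.
      return "Restricted content", 200, {"X-Robots-Tag": "noindex"}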

Moderator’s tasks

The moderator is responsible for the following tasks:

  1. performing regular audits of pages with public content, verifying that access controls function correctly and looking for unsuitable content
  2. removing unsuitable content as soon as detected
  3. ensuring the continuity of the moderation procedure in his/her absence

NB: The moderator may invite users to report any contribution they find inappropriate, for example by allowing them to fill in and send a feedback form created for this purpose. If users are given the opportunity to report unsuitable content, the moderator is responsible for receiving the feedback and removing the content that is deemed unsuitable.

Statement and identification

 

  • Add a visible disclaimer to the home page and every page’s footer.

    Example disclaimer

    “This website contains user-generated content. The European Commission does not endorse any views, opinions or advice expressed by visitors to this website.”

     
  • Add a paragraph to the site’s terms of use. 

    Example paragraph

    “Although the use of the website requires all users to comply with these Terms of Use, some inappropriate user-generated content may still be submitted and displayed. We reserve the right to delete any content that violates the Terms of Use or the Community Guidelines, without notice.”

Contact and support

Need further assistance on this topic? Please contact Europa Domain Management (EU Login required)