
14 January 2025

Transparency Report

This is the first edition of our Transparency Report, which provides insight into our content moderation efforts over a six-month period (January - June 2024) and reaffirms our commitment to safety, transparency, and compliance with the Digital Services Act. These reports will be published biannually. The data in this report was collected between July and August 2024, so current figures may differ slightly due to ongoing appeals and decisions.

Linktree’s mission is to empower anyone to curate, grow, and monetise their digital universe. Our highest priority is ensuring a safe and reliable experience for the visitors, creators and businesses worldwide who trust our platform. To this end, we have invested significantly in both people and technology to swiftly identify and remove harmful content.

Our Trust & Safety team accomplishes this with both proactive and reactive reporting workflows.

  • Our internal workflows identify and flag high-risk content, which is then automatically removed or sent to our moderation team for manual review.
  • Our website forms (Violation Report Form and IP Infringement Report Form) and mobile app enable visitors to easily report any content they come across that they believe is in breach of our Terms & Conditions or Community Standards.
  • Any requests we receive from law enforcement or government bodies are reviewed and prompt action is taken when required.

Our Approach to Content Moderation

Creative expression is central to our community, and while members have the freedom to curate their profiles, we have safeguards and limits in place to ensure a trusted and safe experience on our platform. 

Legal Requirements

The simplest yet most important reason for these safeguards is to ensure legal compliance.

A large cohort of our user base is composed of creatives and content creators, so we take claims of intellectual property infringement very seriously. We take action on the misuse or unauthorised use of any copyrighted or trademarked material.

Ensuring a Safe and Trusted Experience

At Linktree, we prioritise individual privacy and safety. As part of these efforts, we don’t allow profiles that expose others’ personal information (e.g., addresses, IDs). In line with our intellectual property policies, we also prohibit profiles that impersonate individuals or organisations.

Users cannot use Linktree to share content intended to intimidate, harass, threaten, or bully another person. Exposing private media and engaging in image-based sexual abuse (including the unauthorised sharing of personal media from subscription-based online platforms, DMCA-protected adult content, or non-consensual pornography) are also prohibited.

Furthermore, any content featuring hate speech (discrimination against an individual or a group based on their religion, ethnicity, nationality, race, colour, descent, gender, or other identity factors) will be removed from our platform.

Protection Against Physical Harm

We prohibit the promotion of behaviours or actions that could cause physical harm, illness, or even death. This includes medical misinformation, e.g. dangerous “alternative” COVID-19 remedies.

Our Policies

When creating a Linktree account, users must confirm their adherence to our policies, including our Terms & Conditions and Community Standards. Any violations of these policies will result in warnings, content removal and/or account termination.

Our Community Standards outline guidelines for what content can and can’t be shared on Linktree. In most cases, our Community Standards cover content that is considered illegal in the EU. However, in some categories we may also remove content that is legal but deemed violative on our platform. 

Our Community Standards may limit or prohibit the following content if it is found on a Linktree profile:

  • Adult Content
  • Child Harm
  • Copyright & Trademark Infringement
  • Electoral Fraud
  • Extremism/Terrorism
  • Harassment
  • Hate Speech
  • Illegal Goods and Services
  • Misinformation
  • Invasion of Privacy & Impersonation
  • Self-harm
  • Shocking or Violent Content
  • Spam & Fraud
  • URL Abuse

Team Structure

Linktree's Trust & Safety team operates around the clock, with members strategically located across Australia, the Philippines, and the United States. This global team ensures 24/7 support for reviewing content and responding to community violation reports, enabling us to address issues efficiently and remove harmful content as quickly as possible. The team is structured to include multiple escalation pathways both within the moderation team and across full-time Trust & Safety managers at Linktree.

All team members have access to regular and comprehensive training, monthly quality assurance checks, and regular meetings with their team leaders. Additionally, due to the nature of the content reviewed, we provide access to mental health support to ensure the well-being of our Trust & Safety team.

How We Remove Content

Linktree employs a combination of automated, hybrid, and manual moderation methods to flag, review, and manage content on the platform. We have two primary approaches to address content that violates our Community Standards: content removal and profile suspension.

Content Removals

Linktree prioritises removing violative content from a profile before considering profile suspension whenever possible. This process may involve removing links that direct to harmful sites, deleting images containing inappropriate content, or applying a sensitive content warning to specific links to alert visitors before they proceed to third-party websites.

Users are notified whenever content is removed from their account and are provided with instructions on how to appeal the moderation action if they believe it was made in error.

Profile Suspensions

In cases where a violation is particularly severe or there are multiple infractions associated with a profile, the entire profile may be suspended. Users are notified when their profile is suspended and are given instructions on how to appeal the decision, unless there is a business or legal reason that prevents us from doing so.

Notifications

Community Violation Reports

If users or visitors believe they have found content that violates our policies, they can use Linktree’s in-profile reporting tool to access our Community Violation Report form. Once the form has been completed, it is manually reviewed by our moderation team and the reporting party is notified once their report has been processed.

Intellectual Property Reports

If intellectual property infringement is identified on a Linktree profile, visitors or rights holders can submit an intellectual property report, which will be reviewed manually, usually within 1-2 business days.

Anyone who submits an intellectual property report will be notified of the outcome, including whether their report was accepted or rejected.
 
Users will also be notified if any of their content is moderated due to an intellectual property report and will receive instructions on how to submit a counter-notice if they believe their content was removed in error.

Appeals

As mentioned previously, if a person believes that their content was wrongly removed or their profile unjustly suspended, they can submit an appeal for a manual review.

Our content moderation team carefully reviews each appeal, and in some cases appeals may be escalated internally for further evaluation.

There are three possible outcomes for an appeal:

  • The decision may be overturned, and the profile or content will be restored.
  • The decision may be modified, resulting in a warning being applied to (or removed from) a specific link.
  • The decision may be upheld, and the profile or content will not be restored.

Once the final decision is made, the person who submitted the appeal will be promptly notified of the outcome.

Methodology:

Automation:

Linktree uses various automated techniques to detect and remove violative content from the platform. Our automated tools use a combination of text, URL, and image models to detect potentially violative content and remove it or prevent it from being added at all. All automated models include some level of human oversight to address biases and maximise accuracy.
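As a rough illustration of how this kind of routing can work (a minimal sketch only; the model, block list, thresholds, and categories below are hypothetical and do not describe Linktree’s actual systems):

```python
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    REMOVE = "remove"          # high-confidence violation, removed automatically
    MANUAL_REVIEW = "review"   # uncertain, queued for the moderation team
    ALLOW = "allow"            # no signal of a violation

@dataclass
class LinkSubmission:
    url: str
    title: str

# Placeholder block list standing in for URL-based detection.
KNOWN_BAD_DOMAINS = {"malware.example", "phishing.example"}

def text_risk_score(title: str) -> float:
    """Stand-in for a text classification model returning a 0-1 risk score."""
    flagged_terms = {"counterfeit", "weapons"}
    hits = sum(term in title.lower() for term in flagged_terms)
    return min(1.0, 0.5 * hits)

def route(link: LinkSubmission) -> Action:
    domain = link.url.split("/")[2] if "//" in link.url else link.url
    if domain in KNOWN_BAD_DOMAINS:
        return Action.REMOVE
    score = text_risk_score(link.title)
    if score >= 0.9:
        return Action.REMOVE         # confident enough to act automatically
    if score >= 0.4:
        return Action.MANUAL_REVIEW  # borderline cases go to human review
    return Action.ALLOW

print(route(LinkSubmission("https://phishing.example/login", "Free gift cards")))
# Action.REMOVE
```

The key design point reflected in the text above is that fully automated removal is reserved for high-confidence cases, while borderline items are routed to human reviewers.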

Hybrid:

Much of our content moderation includes an element of manual review, which classifies these processes as hybrid actions. This can include items that are flagged by our automated content detection systems and then require manual review, which also informs model retraining.

Linktree also uses third-party tools and services to detect violative content, which is then reviewed by our content moderation team. 

Manual:

Linktree also utilises manual moderation for violative categories that require additional context and judgement.

Some examples of manual moderation include our community violation reports, policy/ban appeals, and intellectual property reports. This manual effort ensures cases are handled within an appropriate timeframe and that the correct moderation action is applied.

Content Moderation Actions (Global)

Automated Link Removals

The table below displays the percentage of links that were removed and banned from Linktree profiles. It also highlights the percentage of decisions that were reversed, providing insight into the accuracy of the automated processes.

Hybrid Content Moderation

The table below shows the breakdown of manual link removals actioned by the content moderation team during the reporting period versus how many were eventually reversed. These removals are associated with flagged items that were reviewed by the team and determined to be unsafe, illegal, or in violation of our Community Standards.

Manual/Hybrid Link Removals

Profile Suspensions:

Total Profile Suspensions:

Global: 68,199

The graph below displays the total number of profile suspensions from January to June 2024.

Banned Accounts: Top Violative Category Breakdown

This chart shows a breakdown of the most common reasons for an account suspension on a global basis. ‘Other Ban’ refers to bans not covered by any of the existing categories.

Community Violation Reports

Number of Violation Reports Received

In addition to proactively detecting violative content on our platform, we welcome Linktree users and visitors to report any content or accounts they believe to be in breach of our Community Standards.

During this period, we received 16,276 submissions in our Violation Report form, averaging roughly 2,700 per month.

Below is a breakdown of the Community Violation reports that Linktree has received from EU Member States. As noted above, please be aware that the category reflects what the reporter selected when submitting the report and does not necessarily correspond to or guarantee any moderation action.

Number of Violation Reports, Rejected vs Accepted

This graph shows the ratio of reports where there was no confirmed violation versus reports that did result in a moderation action.

Overall, only 14% of reports submitted during this time concerned content found to be in breach of our Community Standards.

Median Handling Time for Violation Reports

This data reflects the difference in time (days) between when the violation report was submitted and when it was closed out by the moderator (including review time and moderation actions).
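For reference, a median handling time of this kind can be computed along the following lines (a sketch only; the sample timestamps and record structure are invented for illustration and are not Linktree data):

```python
from datetime import datetime
from statistics import median

# Hypothetical report records: (submitted_at, closed_at) timestamps.
reports = [
    (datetime(2024, 3, 1, 9, 0), datetime(2024, 3, 2, 15, 0)),
    (datetime(2024, 3, 5, 12, 0), datetime(2024, 3, 5, 20, 0)),
    (datetime(2024, 3, 10, 8, 0), datetime(2024, 3, 14, 8, 0)),
]

# Handling time in days = time from submission to closure by the moderator,
# which spans both review time and any moderation actions taken.
handling_days = [
    (closed - submitted).total_seconds() / 86400
    for submitted, closed in reports
]

print(f"Median handling time: {median(handling_days):.2f} days")  # 1.25 days
```

The same calculation applies to the appeal handling times reported further below, with appeal submission and decision timestamps in place of report timestamps.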

Number of Appeals Received

Appeals may be submitted for almost any moderation action taken on an account, such as an account ban. We received a total of 6,975 appeals during this period, and this figure typically correlates with the number of profile suspensions during each month.

The chart below provides insight into the appeals that have been submitted by individuals within the EU.

Number of Appeals Received, Rejected vs. Accepted

This is the ratio of appeals that were rejected (i.e. where the ban stayed in place) vs. appeals that were accepted (i.e. decision overturned and account/content restored). Overall, 44% of appeals during this time were accepted.

Median Handling Time for Appeals

This calculates the difference in time (days) between the appeal being submitted and the decision being made by the moderator (including review time and any reversal of moderation actions).

Intellectual Property Reports

Linktree respects intellectual property rights and we expect our users to do the same.

When we receive an intellectual property report, it is manually reviewed to determine whether the report is valid and if the reported profile contains infringing content. If the Linktree profile contains infringing content, then the content is removed and, if appropriate, the profile is suspended.

More information on our intellectual property policy can be found here.

The graph below displays a breakdown for the global volume of intellectual property reports we received during the data period.

Global Total:

The two pie charts below highlight the ratio of copyright and trademark reports that were accepted, denied, or resulted in no action globally.

Trademark:

Copyright:

The chart below shows the median time in hours to respond to an intellectual property report. During the data period, the average median first response time was 13.99 hours.

The following chart shows the volume of intellectual property reports that we received from within the European Union.

Below is a table that shows the complete breakdown for the intellectual property reports we’ve received from each EU member state as well as the total actions taken with each report.

Legal Requests

At Linktree, we take legal information requests seriously; they are reviewed manually and actioned quickly. Any legal requests can be submitted to legal@linktr.ee.

Below you will find a table that breaks down each legal information request we received during the data period. Subpoenas are only actioned if accompanied by the appropriate, official legal documentation.

Trusted Flaggers:

Linktree has not received any reports from Trusted Flaggers, as defined by the Digital Services Act, during the data period.

We remain committed to closely monitoring activity and maintaining an open channel of communication with our Trusted Flaggers to ensure swift action when necessary.

Monthly Active Users

Article 24(2) of the DSA requires online platforms like Linktree to publish information on the average number of monthly active recipients of the service every six months. This figure covers recipients geographically located in the EU. The primary purpose of publishing this number is to identify whether an online platform qualifies as a ‘very large online platform’ (VLOP), i.e. one with at least 45 million monthly users in the EU.

For the period from 01 July 2024 to 31 December 2024, the average number of monthly active recipients was below the 45 million user threshold for designation as a VLOP.

We define a monthly active recipient as a Linker or visitor who visits our platform and interacts with a profile at least once (for example, by clicking a link) during the calculation period. We have also attempted to limit this number to unique visits only, i.e. counting multiple visits by the same user only once in each month.
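A rough sketch of this monthly deduplication logic is shown below (the event structure, identifiers, and figures are purely illustrative and do not reflect Linktree’s internal analytics):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical interaction events: (month, visitor_id) pairs, one per
# profile interaction (e.g. a link click) within the calculation period.
events = [
    ("2024-07", "v1"), ("2024-07", "v1"), ("2024-07", "v2"),
    ("2024-08", "v1"), ("2024-08", "v3"), ("2024-08", "v3"),
]

# Count each recipient at most once per month ("unique" visits only).
unique_per_month = defaultdict(set)
for month, visitor_id in events:
    unique_per_month[month].add(visitor_id)

monthly_actives = {month: len(ids) for month, ids in unique_per_month.items()}
average_mar = mean(monthly_actives.values())

print(monthly_actives)                                       # {'2024-07': 2, '2024-08': 2}
print(f"Average monthly active recipients: {average_mar:.1f}")  # 2.0
```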

We will continue to monitor the number of average monthly active recipients and publish updated information in accordance with Article 24(2) of the DSA.

Conclusion

At Linktree, our mission is to empower anyone to curate, grow, and monetise their digital universe. We are committed to creating a safe and welcoming experience for everyone who uses our product, ensuring that each interaction on the platform reflects our dedication to inclusivity and trust.

Our Trust & Safety team is deeply committed to continuous improvement of our efforts to swiftly identify and remove unsafe content. We are constantly advancing the tools and systems we use to keep our platform secure. In addition, we are focused on refining our reporting process to enhance transparency around community violation reports, offering detailed breakdowns of violation categories and clearer visibility into the appeals process. We continually evaluate and update our policies and Community Standards to uphold a safe and fair environment for everyone on the platform. Our commitment to these principles is at the core of everything we do, ensuring that Linktree remains a trusted space.

Note: the data reflected in this report was accurate at the time of generation. Data is subject to change based on day-to-day moderation decisions, such as appeals.