This is the first edition of our Transparency Report, which provides insight into our content moderation efforts over a six-month period (January to June 2024) and reaffirms our commitment to safety, transparency, and compliance with the Digital Services Act. These reports will be published biannually. The data in this report was collected between July and August 2024; as a result, there may be slight discrepancies between these figures and current numbers because of ongoing appeals and moderation decisions.
Linktree’s mission is to empower anyone to curate, grow, and monetise their digital universe. Our highest priority is ensuring a safe and reliable experience for the visitors, creators and businesses worldwide who trust our platform. To this end, we have invested significantly in both people and technology to swiftly identify and remove harmful content.
Our Trust & Safety team accomplishes this with both proactive and reactive reporting workflows.
Linktree uses various automated techniques to detect and remove violative content from the platform. Our automated tools use a combination of text, URL, and image models to detect potentially violative content and either remove it or prevent it from being added at all. All automated models include some level of human oversight to address biases and maximise accuracy.
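As an illustration only, and not a description of Linktree's actual systems, the sketch below shows one common pattern for combining scores from separate text, URL, and image classifiers and routing borderline items to a human review queue. All function names, thresholds, and fields here are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical thresholds -- illustrative values, not Linktree's real configuration.
REMOVE_THRESHOLD = 0.95   # block automatically above this combined score
REVIEW_THRESHOLD = 0.60   # queue for human review between the two thresholds


@dataclass
class LinkSubmission:
    url: str
    title: str
    thumbnail: Optional[bytes] = None


def score_text(title: str) -> float:
    """Stand-in for a text classifier returning a 0-1 violation probability."""
    return 0.0  # stub


def score_url(url: str) -> float:
    """Stand-in for a URL/domain reputation model."""
    return 0.0  # stub


def score_image(image: Optional[bytes]) -> float:
    """Stand-in for an image classifier; scores 0 when no image is present."""
    return 0.0  # stub


def moderate(link: LinkSubmission) -> str:
    """Combine model scores and decide: block, queue for human review, or allow."""
    combined = max(score_text(link.title),
                   score_url(link.url),
                   score_image(link.thumbnail))
    if combined >= REMOVE_THRESHOLD:
        return "blocked"        # removed or prevented from being added
    if combined >= REVIEW_THRESHOLD:
        return "human_review"   # hybrid action: sent to a moderator
    return "allowed"
```

In a pattern like this, items that land in the review band are the cases where human oversight matters most, and those reviews would typically feed back into bias correction and model retraining.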
Much of our content moderation includes an element of manual review; we classify these processes as hybrid actions. They typically involve items flagged by our automated content detection systems that then require manual review, either to confirm the correct moderation action or to help retrain the models.
Linktree also uses third-party tools and services to detect violative content, which is then reviewed by our content moderation team.
Linktree also uses manual moderation for violation categories that require additional context and judgement.
Some examples of manual moderation include our community violation reports, policy/ban appeals, and intellectual property reports. This manual effort ensures that cases are handled within an appropriate timeframe and that the correct moderation action is applied.
The table below displays the percentage of links that were removed and banned from Linktree profiles. It also highlights the percentage of decisions that were reversed, providing insight into the accuracy of the automated processes.
The table below shows the breakdown of manual link removals submitted by the content moderation team during the reporting period versus how many were eventually reversed. These removals are associated with flagged items that were reviewed by the team and determined to be unsafe, illegal, or in violation of our Community Standards.
Global: 68,199
The graph below displays the total number of profile suspensions from January to June.
This chart shows a breakdown of the most common reasons for an account suspension on a global basis. ‘Other Ban’ refers to bans not covered by any existing categories.
In addition to proactively detecting violative content on our platform, we encourage Linktree users and visitors to report any content or accounts they believe to be in breach of our Community Standards.
During this period, we received 16,276 submissions in our Violation Report form, averaging roughly 2,700 per month.
Below is a breakdown of the Community Violation reports that Linktree has received from EU Member States. As mentioned above, the category indicates what the reporter selected when submitting the report and does not necessarily correlate with or guarantee any moderation action.
This graph shows the ratio of reports where there was no confirmed violation versus reports that did result in a moderation action.
Overall, this shows that only 14% of submissions reported during this time were in breach of our Community Standards.
This data reflects the time, in days, between when a violation report was submitted and when it was closed out by a moderator (including review time and any moderation actions).
Appeals may be submitted for almost any moderation action taken on an account, such as an account ban. We received a total of 6,975 appeals during this period, and this figure typically correlates with the number of profile suspensions during each month.
The chart below provides insight into the appeals that have been submitted by individuals within the EU.
This is the ratio of appeals that were rejected (i.e. where the ban stayed in place) vs. appeals that were accepted (i.e. decision overturned and account/content restored). Overall, 44% of appeals during this time were accepted.
This data reflects the time, in days, between when an appeal was submitted and when a decision was made by a moderator (including review time and any reversal of moderation actions).
Linktree respects intellectual property rights and we expect our users to do the same.
When we receive an intellectual property report, it is manually reviewed to determine whether the report is valid and if the reported profile contains infringing content. If the Linktree profile contains infringing content, then the content is removed and, if appropriate, the profile is suspended.
More information on our intellectual property policy can be found here.
The graph below displays a breakdown for the global volume of intellectual property reports we received during the data period.
The two pie charts below highlight the ratio of copyright and trademark reports that were accepted, denied, or resulted in no action globally.
Trademark:
Copyright:
The chart below shows the median time (in hours) to respond to an intellectual property report. During the data period, the average median first response time was 13.99 hours.
The following chart shows the volume of intellectual property reports that we received from within the European Union.
Below is a table that shows the complete breakdown for the intellectual property reports we’ve received from each EU member state as well as the total actions taken with each report.
At Linktree, we take legal information requests seriously; they are reviewed manually and actioned quickly. Any legal requests can be submitted to legal@linktr.ee.
Below you will find a table that breaks down each legal information request we received during the data period. Subpoenas are only actioned if accompanied by the appropriate, official legal documentation.
Linktree has not received any reports from Trusted Flaggers, as defined by the Digital Services Act, during the data period.
We remain committed to closely monitoring activity and maintaining an open channel of communication with our Trusted Flaggers to ensure swift action when necessary.
Article 24(2) of the DSA requires online platforms like Linktree to publish information on the average monthly active recipients of the service every six months. These figures cover recipients geographically located in the EU. The primary purpose of publishing this number is to determine whether an online platform qualifies as a ‘very large online platform’ (VLOP), defined as one with at least 45 million average monthly active recipients in the EU.
For the period from 01 July 2024 to 31 December 2024, the average number of monthly active recipients was below the 45 million user threshold for designation as a VLOP.
We define a monthly active recipient as a Linker or visitor who visits our platform and interacts with a profile at least once during the calculation period, for example by clicking a link. We have also attempted to limit this figure to unique visits only, i.e. multiple visits by the same user within a month are counted once.
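For illustration only (the field names and log format below are assumptions, not a description of Linktree's analytics pipeline), a minimal sketch of this counting approach deduplicates interaction events by user within each calendar month; the reported average would then be the mean of these monthly counts over the period.

```python
from collections import defaultdict
from datetime import datetime

def monthly_active_recipients(events):
    """Count unique users per calendar month from (user_id, timestamp) events.

    `events` is an iterable of (user_id, datetime) pairs, e.g. one entry per
    link click or profile interaction. Repeat visits by the same user within
    a month are counted only once.
    """
    unique_per_month = defaultdict(set)
    for user_id, ts in events:
        unique_per_month[(ts.year, ts.month)].add(user_id)
    return {month: len(users) for month, users in unique_per_month.items()}

# Example: three events, two from the same user in the same month -> that user counts once.
events = [
    ("u1", datetime(2024, 1, 3)),
    ("u1", datetime(2024, 1, 21)),
    ("u2", datetime(2024, 1, 9)),
]
print(monthly_active_recipients(events))  # {(2024, 1): 2}
```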
We will continue to monitor the number of average monthly active recipients and publish updated information in accordance with Article 24(2) of the DSA.
At Linktree, our mission is to empower anyone to curate, grow, and monetise their digital universe. We are committed to creating a safe and welcoming experience for everyone who uses our product, ensuring that each interaction on the platform reflects our dedication to inclusivity and trust.
Our Trust & Safety team is deeply committed to continuous improvement of our efforts to swiftly identify and remove unsafe content. We are constantly advancing the tools and systems we use to keep our platform secure. In addition, we are focused on refining our reporting process to enhance transparency around community violation reports, offering detailed breakdowns of violation categories and clearer visibility into the appeals process. We continually evaluate and update our policies and Community Standards to uphold a safe and fair environment for everyone on the platform. Our commitment to these principles is at the core of everything we do, ensuring that Linktree remains a trusted space.
Note: the data reflected in this report was accurate at the time of generation. Data is subject to change based on day-to-day moderation decisions, such as appeals.