Half of the world’s largest digital platforms fail to meet minimum standards of transparency regarding the advertising they carry. This is one of the main findings of the study Data Not Found, carried out by our research group at NetLab – the Internet and Social Media Research Laboratory at the School of Communication, Federal University of Rio de Janeiro, Brazil – in collaboration with researchers from the Minderoo Centre for Technology & Democracy (MCTD), an independent team of researchers at the University of Cambridge, in the United Kingdom.

The study investigated, in an unprecedented and transnational manner, the availability and quality of data on user-generated content and advertising across 15 major digital platforms operating in Brazil, the European Union (EU) and the United Kingdom. The analysis includes some of the main social media platforms in use, such as TikTok, Instagram, Facebook, YouTube, Kwai and Telegram, allowing for a comparison of how these companies provide information within different regulatory contexts.

These regions were selected for specific reasons. The European Union stands out for having the most advanced platform regulation efforts in the world, having consolidated laws such as the Digital Services Act (DSA), whose transparency standards frequently serve as a benchmark in international debate. The United Kingdom does not adopt universal transparency rules, opting instead for a model in which the supervision of platforms is determined through case-by-case assessments by regulatory authorities.
Brazil, meanwhile, represents the Global South, where the regulatory debate on digital platforms remains in its infancy and transparency practices depend, to a large extent, on the goodwill of the platforms themselves.

To evaluate the selected platforms, we applied the Social Media Transparency Index, developed jointly by NetLab and MCTD to measure the extent to which it is possible to access, understand and verify data on content and advertising on these platforms. To this end, we analysed criteria such as the availability, completeness and standardisation of data, as well as ease of access and the ability to track advertisements – including information on funding, amounts invested and audience targeting.

Systemic opacity

The survey results, finalised in early April 2026, revealed low transparency in virtually all aspects assessed: we identified incomplete or missing data, gaps in ad libraries, a lack of clarity regarding funding and targeting, and an absence of standards that would allow for consistent comparisons between platforms.

In Brazil, although the platforms operate fully within the country, it is striking that transparency mechanisms are even more limited and inconsistent than in the other contexts analysed, such as the European Union. This translates into less data availability, more gaps in ad libraries and greater difficulty of access for independent researchers. In some cases, tools that exist in other countries are simply not available, or operate in a more restricted manner, in Brazil.

Our survey also indicates that this opacity is systemic: even where transparency mechanisms exist, they are limited, inconsistent and unreliable.
In the Brazilian context, this scenario is particularly worrying, as it hinders the monitoring of the information ecosystem in an environment highly dependent on social media for the circulation of information and public debate.

Asymmetrical relationships

Whilst platforms know us better than anyone else, it is almost impossible to understand how they work and what goes on inside them. Despite playing a central role in the circulation of information, these companies act as if they were detached from the public interest and did not need to be accountable to society, regulators and the community of experts and researchers for the impact of their services.

Thus, by providing limited mechanisms for accessing data, platforms come to define the very conditions under which knowledge about them is produced. These mechanisms often function more as public relations strategies than as genuine instruments of transparency, maintaining an appearance of openness which, in practice, does not translate into effective access.

All of this significantly compromises the possibilities for analysing how the platforms operate. Researchers are unable to audit their social impacts or independently validate the knowledge they produce, or that is produced by third parties. Regulators, in turn, lack the inputs needed to conduct investigations or launch formal inquiries into platforms' practices and effects. Consequently, it is impossible to map, in a systematic and verifiable manner, risks such as disinformation campaigns disguised as organic content, fraudulent advertising campaigns that disproportionately target the most vulnerable users, and the exposure of children and adolescents to inappropriate content and online harassment.

More tools and higher-quality data

The progress of the debate on how to curb the power of digital platforms has reinforced the central importance of transparency and access to data for regulators, researchers and policymakers.
The United Nations (UN), for example, recognises transparency for research purposes as a prerequisite for restoring so-called “information integrity”, since addressing disruptions to it requires accurate diagnostics. In this sense, it is not enough for data to be accessible: it must be of sufficient quality to be usable and representative.

Given this, it is essential to ask what data is, in fact, available, and whether the quality of the access mechanisms meets the minimum requirements for conducting research in the public interest.

In general, the transparency and data access tools available on platforms have limited search capabilities and offer data with little granularity. This hinders more robust analyses of targeting practices, campaign reach and the broader social impacts of online advertising. With regard to user-generated content data, access mechanisms are even more limited. Even in cases where European regulations provide for free access for researchers, their implementation is inconsistent: access requests are frequently denied without justification, as has happened to us and to several other researchers.

This scenario highlights a key limitation of the Digital Services Act (DSA). It entails a significant transfer of power to platforms, obliging them, on the one hand, to make data available for research, whilst allowing them to retain broad discretion over who can access it and how. This contrasts with one of our key recommendations on transparency: access to public data must be universal and designed to enable genuinely independent analysis.

Transparency as a requirement, not a choice

The introduction of new regulations is also recommended, as they constitute a fundamental step towards increasing transparency, provided they are accompanied by robust mechanisms that ensure adequate conditions of access. In the case of Brazil, as in much of the world, this agenda is even more urgent.
Until this happens, transparency concentrated in a few regions deepens inequalities, as some researchers gain access to data whilst others – particularly in the Global South – remain excluded, despite often facing greater risks.

Given this urgency, it is up to the platforms themselves to align their data access practices with the highest available standards, ensuring that users and researchers, wherever they are, benefit from consistent levels of transparency.

Much of the world still needs to make progress in regulatory terms, but this process cannot take place in a haphazard manner, nor without the input of the researchers who depend on this data. Platform transparency cannot be treated as a corporate choice, but rather as an indispensable condition for the protection of the public interest.

R. Marie Santini receives funding from CNPq and Faperj, and NetLab is funded by national and international charitable foundations.

Adriano Belisário, Bruno Mattos, Danielle Pinho, Debora Salles and Hugo Leal do not consult for, work for, own shares in or receive funding from any company or organisation that would benefit from the publication of this article, and have disclosed no relevant affiliations beyond their academic positions.