A scoping review of ethical aspects of public-private partnerships in digital health

Introduction

Public-private partnerships (PPPs) exemplify ‘open innovation’ networks where data, expertise, and risk are shared for the mutual benefit of public and private partners1,2. Such partnerships are grounded in common interests and joint decision-making, and the World Bank defines a PPP as a “long-term contract between a private party and a government agency, for providing a public asset or service, in which the private party bears significant risk and management responsibility” (2012, p.11)3. PPPs are encouraged in the developing field of ‘digital health’, predicated on the idea that both public and private partners are needed to develop accessible, affordable and high-quality health technology. The European Commission unites public organizations and private commercial entities in the Horizon Europe research funding framework4, and the government of the Netherlands has, for example, established the Health Holland programme to facilitate and finance PPPs for healthcare innovation5. The curation of large patient cohorts and disease registries by governments and academic hospitals makes them desirable collaboration partners for technology companies developing data-driven tools6. While some digital health PPPs are based merely on sharing health data between public and private organisations out of a common interest (i.e. ‘data partnerships’7), others truly pool their resources or expertise for the purpose of jointly providing care or developing innovations in more long-term collaborations (i.e. ‘strategic partnerships’8).

Although the rise of digital health PPPs provides opportunities for societal impact, it also raises important ethical issues for the people whose data inform digital health technologies (mainly patients and citizens), for future users (including clinicians), and for society as a whole. Many people are reluctant to share health data with private companies, and PPPs can therefore affect solidarity and trust in medical science9,10,11. This reluctance stems from growing power imbalances between tech companies and data subjects in the age of ‘surveillance capitalism’12, which refers to an economic system in which financial profit is obtained from the systematic collection of personal data to predict and influence consumer behaviour. Commercial interests can also cause a lack of transparency about data protection issues, and the obscurity of commercial artificial intelligence (AI) models limits their governance and the assessment of bias13. Whenever commercial companies participate in medical research and innovation, there is the threat of conflicts of interest related to financial gains and publications14, and of access to publicly funded datasets and applications being restricted to serve business interests15,16. In healthcare, these trends were exemplified by Google DeepMind’s acquisition of NHS patient records without the consent of those affected17. This case is described in Box 1, along with two other key illustrations of controversial digital health PPPs. Of note is that the ethics discourse focuses mainly on companies’ commercial interests, while public institutions and researchers may also have interests that conflict with those of patients and citizens, as we discuss later in this article.

With the rise of public-private collaboration, it becomes imperative that appropriate policies are created to “avoid socialization of risk and privatization of rewards”18 in digital health research and innovation.
However, data protection laws are necessarily general, and there are currently no (inter-)national ethics guidelines specifically for digital health PPPs. Although there has previously been much focus on partnerships with the pharmaceutical sector, such partnerships differ in important ways from those with digital technology companies: big pharma has long-standing experience with human subjects research and oversight by Institutional Review Boards, while big tech has operated largely outside the regulatory constraints of medicine19. Thus, we cannot simply copy lessons from pharmaceutical PPPs one-to-one. The aim of this review, therefore, is to systematically collect and review the ethical issues that are raised in digital health PPPs. This will provide a starting point for further research and practical ethical guidelines. In our analysis, we used the World Bank definition of PPPs but expanded the focus from government agencies to the entire public sector, including (academic) hospitals, and in principle also included within our scope any papers discussing tripartite partnerships wherein civil society functions as an additional third partner in a PPP (Fig. 1). Regarding the latter, we find such partnerships to be within the scope of PPPs but note that in reality patients and citizens have few formal opportunities to have their voices heard, as this wider range of stakeholders is often excluded from digital health partnerships.

Fig. 1: Partnering spaces in digital health. This review focuses on the intersection between the public sector and industry and, in principle, this also includes partnerships where patient organisations or civil society are involved as additional partner. Based on the theoretical framework by van Tulder and Pfisterer77.

Box 1 Key examples of controversial PPPs in digital health

Care.data
The care.data programme was proposed in the UK in 2013 to collect and share anonymised patient data from general practitioner (GP) records with the National Health Service (NHS) and other organizations for research and planning purposes. However, concerns arose quickly, initially focusing on the information leaflets that looked like junk mail and did not include opt-out options. Critics argued that the scheme lacked transparency, proper consent procedures, and safeguards to protect sensitive data from being accessed or exploited by third parties, including private companies. Amidst public outcry and criticism from medical professionals, the care.data programme faced significant delays and was ultimately abandoned in 2016.

NHS / DeepMind
In 2015, the NHS partnered with DeepMind, a Google-owned AI firm, to develop "Streams", an app that would help healthcare professionals detect and manage acute kidney injury. DeepMind accessed 1.6 million patient records from the Royal Free London hospital without patient consent, raising concerns about privacy, oversight, and the transparency of the collaboration between a tech giant and a public organisation. The UK’s Information Commissioner’s Office found that the data-sharing agreement violated data protection laws. In the aftermath of the controversy, DeepMind and the NHS revised their agreements.

State of Iceland / deCODE
In 1998, Iceland granted deCODE Genetics exclusive rights to create a national database combining medical and genetic records.
The project lacked transparency and proper consent procedures, and critics feared that the genetic information could be exploited for profit by deCODE Genetics or shared with third parties without consent. In response to international controversy, the legislation was withdrawn and the national database failed, but deCODE Genetics revised its consent procedures to improve information provision and to give Icelandic citizens the option to opt out of the adapted project, now led by deCODE itself.

Results

Included studies
A total of 46 studies were included in this review (Supplementary Table 1: screening and inclusion flowchart; Supplementary Table 2: characteristics of included studies). Of these, 82.6% (38/46) were published in the last five years. The majority of papers (56.5%; 26/46) described European PPPs: in Denmark, Finland, Iceland, Italy, Switzerland, the United Kingdom (UK), or Europe generally. Other papers (17.4%; 8/46) referred to the United States (n = 4), Canada (n = 1), Australia (n = 1), Israel (n = 1), and Singapore (n = 1). The remaining 12 papers (26.1%; 12/46) contained general ethical discussions without reference to a specific country context. Five specific cases of digital health PPPs were analysed in detail in more than one paper: the care.data scheme in the UK (n = 6), the partnership between the UK’s National Health Service (NHS) and Google’s DeepMind (n = 6), the collaboration between the State of Iceland and deCODE Genetics (n = 4), the Findata programme in Finland (n = 3), and the Google/Apple collaboration with national governments during the COVID-19 pandemic (n = 3). Most included studies (n = 30) discussed ‘data partnerships’, whereas fewer studies (n = 16) analysed ‘strategic partnerships’. No article discussed tripartite partnerships in which patient organizations or citizen groups join the public-private collaboration as a third partner.

Three overarching ethical issues throughout the lifecycle of digital health PPPs were identified: data privacy and consent, public benefit and access, and trust and governance (Fig. 2). In Table 1, we provide an overview of key recommendations for each theme, derived from the literature, to support the creation of ethically informed PPPs in digital health. These recommendations should be viewed in conjunction with each other because, for instance, individual consent is not by itself a sufficient safeguard for privacy and does not protect against broader societal harm.

Fig. 2: Key ethical themes in digital health PPPs. Overview of the main ethical themes distributed over the two general phases of digital health technology development, i.e. the data collection and analysis phase, and the implementation phase.

Table 1 Recommendations for ethically informed digital health PPPs, synthesized from the reviewed literature

Privacy and consent for data processing in PPPs
The literature describes privacy concerns and uncertainties around the scope of regulations in relation to digital health PPPs. Although PPPs are started from some common interest, private and public institutions may have different secondary goals that conflict with privacy20,21. Various authors frame the private sector as ‘tyrannical’ and argue that digital health PPPs are facilitating sphere transgressions of internet companies into the sphere of health, which violates the public’s expectations about privacy22,23,24.
However, privacy concerns can be targeted at both the private and the public sector, focusing respectively on risks of commercialization (e.g. unwanted profiling, or an increase in inequalities) and state surveillance (e.g. misuse by law enforcement agencies, or the normalization of pandemic surveillance)25,26. Data sharing in PPPs can give rise to ‘group privacy’ concerns related to broader harms around stigmatisation or increasing inequality, as well as to concerns about individual re-identification, which may be particularly easy in smaller countries like Finland or Iceland that are ideal ‘population laboratories’ for health and genetic research27. While data protection regulations are aimed at safeguarding data privacy, they do not always cover data processing in PPPs. Foreign companies may be based outside the jurisdiction of the data subjects; moreover, many regulations only apply to medical records in public organisations and not to commercial health data collected in consumer apps or wearables28,29,30. In addition, most data protection laws only cover data from identifiable subjects, but some people have principled objections to secondary commercial uses even when their data is successfully de-identified, so current data protection regulation may not suffice for digital health PPPs30,31.

Another key topic in the literature was informed consent for data sharing in PPPs. In cases of sharing publicly stored health data with companies, the process of giving informed consent is increasingly framed as an individual trade-off between exchanging ‘your’ personal health data for ‘public benefit’. Indeed, according to most data protection and health laws, individual consent might not be strictly required if there is a clear and important public interest. However, Vezyridis and Timmons discuss how this binary view disregards that public health innovation is only possible because of long-standing individual rights24. Moreover, various authors note that making such a trade-off or cost-benefit calculus is not truly possible, because of patients’ inability to adequately control their data and because consent is an insufficient safeguard against broader societal harms32,33. Still, consent is seen as valuable for purposes of transparency, for giving people the option to veto certain uses of their data, and for maintaining trust in health research. Various major collaborations with industry were unsuccessful because data subjects’ consent was not adequately obtained and trust was harmed as a result. In addition to the care.data24,30,34,35,36,37, DeepMind30,38,39,40,41,42 and deCODE30,43,44,45 controversies presented in Box 1, the literature describes criticism of Finland’s data infrastructure, which does not allow opting out of data sales to commercial entities46, while in Italy, the unconsented transfer of health data from the government to IBM Watson Health was blocked due to legal uncertainties41.
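A recurring remedy in these cases is a genuine and easily exercised opt-out. Purely as an illustration (this sketch is ours, not a mechanism described in the reviewed literature), an opt-out flag could gate the release of records to a commercial partner as follows; the field names are hypothetical, and the default is to withhold data whenever consent status is unknown:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Record:
    subject_id: str
    opted_out_commercial: Optional[bool]  # hypothetical consent flag; None = status unknown
    data: dict

def releasable_to_commercial_partner(records: list[Record]) -> list[Record]:
    """Release only records whose subjects have explicitly NOT opted out of
    commercial use; unknown status defaults to withholding (default-deny)."""
    return [r for r in records if r.opted_out_commercial is False]

cohort = [
    Record("A", opted_out_commercial=False, data={"age": 61}),  # did not opt out
    Record("B", opted_out_commercial=True, data={"age": 47}),   # opted out
    Record("C", opted_out_commercial=None, data={"age": 55}),   # status never recorded
]
assert [r.subject_id for r in releasable_to_commercial_partner(cohort)] == ["A"]
```

The substantive point carried by the default-deny rule is that, absent a recorded choice, data are not shared for commercial use.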
In the US and UK, identifiable data sharing without consent is allowed when companies provide services for patient care, and there are debates about whether this exception has been misused in data-sharing PPPs between healthcare organisations and Google, where a ‘direct care’ relationship did not truly exist, or at least not for all data subjects28,38,39,42.

A number of scholars have argued that some form of consent should be required for sharing health data with commercial companies, even though the risk of selection bias inherent to the consent process may be exacerbated when data subjects have to specifically consent to commercial use20,28,47. Whether via an opt-in or opt-out process, these authors find that data subjects should be informed of how and for how long their data will be used in the PPP. Kaplan notes that, at a minimum, data subjects should be able to opt out of commercial data use48. According to the reviewed literature, such opting out should be easy, and good reasons (i.e. those generally found relevant and fair) for accepting data sharing in PPPs must be specified in consent documents31; the provided information should be kept up to date, e.g. by announcing new uses on a public website or through dynamic online consent processes49,50. Moreover, examples of bankruptcy of the original data holder, such as in the SharDNA case in Italy50, show that notice or consent might need to be renewed when data are transferred to another organisation or company.

Public benefit and access to the results of PPPs
A key ethical requirement for establishing digital health PPPs is that the eventual results or tools benefit the public interest, especially when the data come from the public sector20,30,51. In various legal frameworks, such as that of the EU, the ‘public interest criterion’ is a potential legal ground for not obtaining consent. We should note here that something which is in the public interest is not necessarily of direct public benefit, but in the reviewed literature the terms ‘benefit’ and ‘interest’ were used interchangeably. In any case, the potential value contribution of digital health PPPs is widely recognised, and these initiatives are welcomed by researchers because they address limitations such as a lack of technological infrastructure or expertise in big data analytics22,35,36,41,49,52. In other words, “avoiding PPPs altogether would not serve the public interest”49. When public benefit is achieved, secondary commercial profit is generally accepted, given the obligations that companies have to shareholders, employees and customers51,53. The ethics literature highlights different types of public benefits: improvements in direct patient care, advancements in scientific knowledge, cost savings to the healthcare system, and even benefits to the national economy. Various authors argue that the important distinction is not just between public and private benefit, but also between a plurality of arguments (‘normative repertoires’23) that provide legitimacy to a PPP, ranging from driving research and improving healthcare to achieving financial and economic gain, and perhaps even monetising the data as such25,35,41.
Due to the ambiguity around the public interest requirement, it may not restrict data sharing but actually pave the way for expanded uses in the broader public interest, like promoting economic growth through PPPs32,54. The shift from welfare state data regimes to data-driven health economies can go hand in hand with a process of national ‘solidarization’46, whereby citizens are called upon to contribute their data to PPPs in order to benefit the public interest, despite uncertainties about what this interest is and who the actual beneficiaries are, and with the risk that eventually only those who share their data can use health services27. The main problem with this solidarization is that long-term strategic partnerships, especially, are often not aimed at a specific health benefit but rather at improving the research and business climate and at making the country in question a “famous and a leading country in the digital social services and healthcare field”54. The increased involvement of private companies can be related to a history of budget cuts and worries about the sustainability of healthcare systems23. In the example of deCODE, the partnership tapped into citizens’ economic worries: at a time when fish stocks declined, a new source of income could be found in health technology that profited from the genetic homogeneity of the Icelandic population45. As such, solidarization is grounded in ‘population branding’, i.e. employing marketing strategies to position a country’s population as special in order to attract investments in data as a national resource44, thereby excluding those people who are not part of the branded population; it is thus incompatible with a more universal solidarity46. Moreover, when the rhetoric of solidarity is used to promote a neoliberal view of health and research, this harms the trust of citizens who wish to “further the common good without being manipulated into doing it”36.

In the literature, we found no consensus on which types of benefits are acceptable – i.e. whether PPPs should only be started when they bring improvements to future patients’ health or the healthcare system, or whether PPPs that strengthen economic and research environments are also acceptable29,51,53. Views about this vary across cultural (country) contexts and across the different social roles that one individual may have: “what patients care about as patients cannot be equated with what patients care about as citizens who are part of a much wider social endeavour”34. Whether specific benefits are seen as acceptable also depends on the level of certainty that these benefits will truly be brought about. Solidarization may also be inappropriate because there is healthy scepticism about the actual benefits of tools developed in digital health PPPs, as these depend on the quality of the data and research, which should be subject to scientific review21. Especially when PPPs use data generated or collected by the commercial partner, the datasets are often patchy and biased because they are self-reported or passively collected by users who use the service (e.g. a smartwatch) inconsistently22,44,55. Moreover, the absence of quality metadata may complicate the linkage of data between different partners31. Finally, there may be a conflict of interest in the sense of a felt obligation by public researchers towards the company that co-funds their studies, potentially leading to distortions in the design or analysis process21.
Various scholars therefore argue that, as a general rule, overriding safeguards like individual consent by appealing to public benefits may be acceptable only when this is proportional and transparent25,28,32. Partners should define early on their vision for how their PPP creates public benefit: this vision should be based on performance indicators that are actually tracked, to avoid ‘lip service’ or ‘ethics washing’ when companies or governments merely have an interest in accountability in order to protect their image49,51. This would also help avoid strategic (mis)use of the concept of public interest for retaining credibility when expectations based on one particular definition are not met54.

Other issues discussed in the literature relate to access and intellectual property. In the most common terms, a public benefit may refer to one that is widely available56, which, the literature notes, is not always the case if private interests lead to digital health products being overpriced or otherwise inaccessible51. Various authors raise concerns about the exacerbation of inequalities when benefits are not fairly shared due to commercial interests28,32,38. Ideally, according to various authors based on their case studies, the healthcare institution providing health data should have free access to the resulting goods and services25,36,39,57 or at least recover the costs of providing the data and knowledge through data access fees56,58, but this is not always the case, because the ideal of benefiting public healthcare, research and education may clash with the profit-driven aims of companies. It is known that the accessibility of digital health applications depends in large part on intellectual property (IP) arrangements designed to protect industrial partners’ competitive advantage. Ballantyne and Stewart advise that public and private sector collaborations take the time to identify shared interests and clearly articulate how benefits will be distributed38. In the example of the DeepMind controversy, the NHS might have secured joint IP rights with a longer negotiation process involving a broader range of stakeholders38. These discussions should cover fair pricing, according to Winkler et al.51. Moreover, Landers et al.49 argue that the data generated in a PPP should be made open after the project is finished, according to the principles of findable, accessible, interoperable and reusable (FAIR) data, and that IP rights may be sought only on outcomes mostly achieved by the private partner.

The specific ways in which IP and other benefits are shared need to be determined case by case, but there seems to be consensus in the reviewed literature that public health data itself should always remain a public good, given the public investments in collecting it23,39,49,51. Digital health data coming from the public sector’s ‘gold mine’ is increasingly viewed as an ‘asset’ or ‘public good’ that can create both social and economic benefit32,46,59. However, assigning financial value to data is said to be difficult, as this value is often generated at later steps in the development process of digital health tools, so perhaps public benefits are best understood as indirect societal value rather than direct financial value49. Finally, in refs. 39,51 it is argued that public organisations should remain gatekeepers and stewards of publicly collected data, even if they do not ‘own’ the data in the proprietary sense.
However, fulfilling this role might be difficult due to power dynamics and companies’ monopolies. The need for public datasets and medical expertise in PPPs may give some negotiating power to the public partner as data steward25. Yet ‘big tech’ companies are increasingly bringing expertise in-house by recruiting from top public organisations, leading to a ‘brain drain’ whereby the public healthcare sector becomes more and more dependent on industry25,32. With her frame of the ‘Googlization of health’, Sharon23 highlights the risks associated with corporate encroachment into the spheres of health, politics, and public policy, potentially leading to a crowding out of expertise, growing dependencies and concentration of power, and calls for more bioethics research on these power asymmetries. Large tech companies often promote a neoliberal, techno-optimist narrative centred on the individual as autonomous decision-maker, while ignoring societal concerns like designing inequality into digital health tools60. When the wider public is not in a position to benefit from the value of their data, this can be seen as exploitation or even ‘tyranny’23,32. Lanzing et al.61 note that it may be more appropriate to speak of a ‘push and pull’ power play between industry and the public sector regulating it, and that values like digital sovereignty are important for the public sector to retain some of this power.

All in all, the reviewed literature shows that while privacy protection is important, a focus on privacy should not draw attention away from safeguarding the public benefit of PPPs by being attentive to larger societal risks of diminished power, expertise and access. While power issues play a role in any collaboration with industry (e.g. with pharmaceutical companies funding clinical trials), the impact might be greater in the newer, less developed and less regulated field of digital health21. There also exists a large digital divide, because usually only large tech companies have the funds and infrastructure to move into the sphere of digital health32, but a fair playing field should be created by providing companies non-exclusive access to data51. It is in the public interest to avoid monopolization by large tech companies and to promote competition, so that no single actor can control scientific outputs and the pricing of digital health tools22,40,41,44.

Good governance and demonstrating trustworthiness of PPPs
Instances where the consent process is inadequate and people discover commercial involvement after the fact (Box 1) cause an erosion of trust in both the public and the private partner. Carter et al.34 have analysed this as the failure to obtain a “social license to operate”, meaning that PPPs should not only fulfil legal requirements but also be sensitive to societal expectations about their conduct. The literature describes how citizens may be unaware of the secondary goal of many PPPs to contribute not only to public health but also to national or regional wealth: trust in public organisations may come under more pressure as governments increasingly have economic, wealth-related and reputational reasons for engaging in PPPs for digital health. With the rise of PPPs it also becomes unclear whom data subjects should trust: the public or private organisation, the actors within these organisations, or the collaboration as such?
Existing relations are changed when healthcare organisations enter into (non-transparent) collaborations with digital technology companies, which can be seen as context transgressions resulting in a trust gap22,49,57. Eroded trust in healthcare institutions can directly impact the patient-physician relationship. In the Google-Ascension collaboration, for instance, clinicians were not informed and thus could not tell their patients about this data-sharing partnership, which jeopardises patient trust in the confidentiality of the information shared with their doctor28. Even data controllers themselves may have trust issues and reputational concerns when they need to handle data access requests from the ‘other’ sector, so demonstrating trustworthiness is important between partners as well31,52.

Some authors state that newly set up digital health PPPs should put in the work to earn and maintain public trust20,31, while others think that public sector relations with companies should not be based on trust, which is grounded in uncertainty, but rather on the confidence that they are indeed processing health data in acceptable ways62. In order to merit the public’s and each other’s trust, both private and public organisations should first ensure that they are trustworthy when it comes to handling patient data and reaping benefits from this37. In particular, several papers highlight the concern that the rapid pace of deal-making, driven by Big Tech’s ‘move fast and break things’ approach, may ultimately break patient and public trust by compromising governance29,30,37,39,55.

Promoting the trustworthiness of PPPs can be done by basing their governance on the concept of ‘stewardship’. The idea of stewardship is less complex to apply than ownership, and more appropriate due to its focus on responsibilities related to data protection, appropriate use, and the promotion of public benefit27,31,38,58. While a ‘data controller’ already has a legal responsibility to ensure compliance with relevant laws on data protection, the concept of a ‘data steward’ is more encompassing, entailing not merely the protection of participants but also the promotion of research when it fulfils a public need, and the balancing between these two31. Despite worries that some governments may themselves facilitate the sale of data for secondary use as an economically driven data broker rather than a data steward safeguarding the public interest44, various authors state that the public partner should always remain the steward of health data, determining what constitutes appropriate access and use and how privacy should be protected and trust promoted32,38.

The literature suggests that an internal PPP committee should govern day-to-day access and management issues (such internal committees might be financed through data access fees20), based on publicly accessible policies, while external ethical and scientific boards are responsible for oversight38,50. Good governance or stewardship requires consulting with these oversight bodies, such as data protection authorities, data protection officers, and institutional ethics committees56,58,63. Yet ethics committees for overseeing digital health research are less established than those for clinical trials and may lack specific epistemic and ethical expertise, ranging from negotiation skills to knowledge of programming, which highlights the need for guidelines and training of governance and ethics committee members on the ethics of digital health in general and of PPPs specifically21,25,43,47.
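Purely as an illustration of what ‘publicly accessible policies’ could look like in operation (this sketch is ours, not a mechanism described in the reviewed literature), an internal committee’s minimal access criteria might be encoded as an explicit, auditable checklist; the purposes and criteria below are hypothetical:

```python
# Hypothetical published access policy of an internal PPP committee.
APPROVED_PURPOSES = {"direct care", "public health research"}

def review_access_request(purpose: str, public_benefit_statement: str,
                          external_ethics_approval: bool) -> tuple[bool, str]:
    """Apply the committee's minimal criteria; every refusal carries a
    recorded reason, supporting transparency and accountability."""
    if purpose not in APPROVED_PURPOSES:
        return False, f"purpose '{purpose}' is not in the published policy"
    if not public_benefit_statement.strip():
        return False, "no public benefit statement was provided"
    if not external_ethics_approval:
        return False, "approval by the external ethics board is missing"
    return True, "approved under the published policy"

# A request for marketing analytics is refused with an auditable reason.
print(review_access_request("marketing analytics", "", False))
```

Encoding criteria this explicitly is no substitute for deliberation, but it forces the policy to be stated publicly and makes every decision traceable.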
Drafting clear policies and guidelines for specific PPPs is helpful when existing (national) regulation does not suffice, e.g. when it does not cover de-identified data64, and such guidance can be further refined through multi-stakeholder consultation including both private companies and public entities49.

However, there seems to be no consensus on how strictly PPPs should be regulated and whether there is room for self-regulation by the private partner. Some authors advocate strict penalties for misuse of data, e.g. fines or exclusion from future data access, which may serve as an incentive for corporate responsibility35,58, and the enforcement of sanctions was seen by citizens’ juries as promoting trust in digital health PPPs30,56. In the context of fair pricing of digital health tools, however, Winkler et al.51 advise against such penalties, as they find that “public moral pressure or fear of reputational damage” would be sufficiently motivating for industry to self-regulate to ensure equitable access. Whether self-regulation is sufficient remains to be seen. For instance, Powles and Hodson39 criticized DeepMind’s self-appointed (and later abolished) ‘independent board’ for hindering true oversight, a concern echoed about the company’s ‘Ethics and Society Fellows’42.

The reviewed articles highlight that good stewardship requires particular attention to the transparency and accountability of researchers and data stewards, not merely as procedural formalities but as fundamental principles that should underpin digital health PPPs, especially when these collaborations are (indirectly) funded by public taxes and use publicly collected data. The consensus in the literature is that transparency constitutes a minimal requirement: people should at the very least be made aware that, and how, their data are processed in PPPs30,49,56,64,65. The public generally has less trust in industry than in the public sector32,38; yet in a study of two Australian citizens’ juries that discussed the sharing of data with private partners, the juries did not reject all involvement of commercial companies and, after explanation, understood the need for PPPs in specific cases56. Even though transparency and other demonstrations of trustworthiness are no guarantee of eliciting trust, there is an intrinsic link between transparency and the social license, because transparency towards patients, employees and the public helps facilitate public debate38. Transparency is important at different levels: making datasets and algorithms transparent, disclosing potential conflicts of interest (COIs), and being open about the (commercial) purposes and practices of data sharing as laid out in agreements21,58. The literature also suggests several accountability requirements: key performance indicators should be described and publicly monitored49, and clear lines of responsibility and effective accountability mechanisms are needed, coupled with proactive stakeholder engagement, and possibly with compensation measures for demonstrable harm caused by data sharing in PPPs31,51.

Yet the commercial and reputational concerns of private and public partners can impede these efforts, especially since private companies are not bound to public transparency and accountability in the way that the public sector is, which can lead to power asymmetries and create conflict between commercial secrecy and public oversight mechanisms26,49.
Various case studies underscore the challenges in balancing transparency with commercial interests, with a lack of clarity in contractual arrangements and data governance frameworks raising concerns about data sovereignty and public trust36,37,39,49. In the DeepMind controversy, one accountability issue was that the company was labelled only as a data processor rather than a joint data controller, despite having significant control over data use39. Similarly, Google’s partnership with Duke University positioned the university as “assisting” Google, instead of the other way around, showing the limited decision-making power of the university22.

Finally, taking transparency and accountability a step further, the reviewed articles underscore the critical role of public engagement in navigating the complexities of PPPs. This entails organising an ongoing dialogue to address concerns and garner support from stakeholders, especially the communities whose data are shared. In addition to controversial PPPs that did not consult stakeholders, there are various ‘best practice’ cases of achieving community buy-in by setting up patient and public involvement (PPI) strategies early on. For instance, Piciocchi et al.50 describe two cases where early engagement with local communities through town hall meetings and constant communication fostered trust and high participation rates. The literature highlights the importance of inclusivity and responsiveness to diverse perspectives, dissenting voices and vulnerable groups who have the least capacity to exert influence31,38. Engagement should transcend mere awareness-raising and give meaningful control to those engaged34, while still being “proportional to the nature and size of the PPP”38. New developments, such as those around AI-based analysis of data, should prompt renewed PPI efforts30,42. Baric-Parker and Anderson28 suggest that decisions regarding the use of data by third parties should always involve input from patient representatives, and Cohen and Mello63 even argue that at least half of the members of hospital data access committees should be patients of that hospital. Such proposals can be seen as a response to the difficulties with individual informed consent. While inclusive public and patient engagement may be costly and time-consuming, it promotes public agency and trust in PPPs, improves the quality of stewardship and oversight, and ultimately helps achieve greater societal benefit by promoting the acceptability of the tools developed in PPPs26,49,51.

Discussion

With this review, we have provided a broad overview of the ethical considerations involved in digital health research and development through public-private partnerships, and we hope this will help promote responsible innovation in PPPs. By addressing the themes of data privacy and consent, public benefit and access, and good governance and trustworthiness, PPPs can better align with public expectations and safeguard societal values. From the literature, we synthesized fifteen key ethics recommendations across these themes for PPPs to adopt (Table 1). In addition to this overview of specific recommendations, our review has yielded two general key findings that are relevant across all recommendations: (1) there is a lack of clarity on the meaning of ‘public benefit’ or ‘public interest’; (2) the discourse on ethics in PPPs has been mostly about the responsibilities of private entities, while public sector organizations should also (and even more so) be trustworthy and accountable.
We elaborate on these two points in this Discussion. However, before reflecting on these issues further, we should note the limitations of our review method and of the literature more generally. One issue is that most included articles discussed cases of PPPs in Europe, and the more general articles were also mostly written by authors from European and English-speaking countries, so we call for further research assessing the ethical aspects of PPPs in other parts of the world. Another important limitation is that several included articles, which we deemed to be within our scope because they discussed ethical aspects of extensive exchanges between the public and private sector, did not actually use the term ‘PPP’. Sterckx et al.37 describe these as “concealed Public Private Initiative[s]” in which data collected by public organisations are shared with private partners without this being called a PPP. It seems that the term is mostly used explicitly in cases of highly formalised partnerships. Critical study of the concept of ‘PPP’ is needed and should include normative analysis of what constitutes a desirable form of partnership: we hypothesize that increased formalization is correlated with higher awareness of ethics, which may result in a more equitable distribution of responsibilities and in increased public transparency. The nature and meaning of a PPP should also be challenged by exploring new forms of partnership that have genuine involvement of patients and publics, as we discuss hereafter.

The first general point for discussion is the criterion that digital health PPPs should benefit the public. While the governance of digital health PPPs should be based on questions of societal value and not just on data protection aspects, the reviewed literature shows that there is a lack of clarity and consensus on the meaning of ‘public benefit’ or the ‘public interest’, with different meanings used to justify PPPs. We find that PPPs need to be clear about what is meant by ‘public interest’, and patients and citizens should have a say in whether they want to contribute to this interest. Clarity on what constitutes public benefit or interest (including differentiation between these two overlapping terms) is important given the increasingly political aims of digital health research, as evidenced in the current ‘AI races’ between the US, China and the European Union, with the latter establishing a European Health Data Space to promote medical AI development. It should first be clear who ‘the public’ is, and whether this includes merely patients or also the broader public. We think the latter would be suitable at least for large-scale PPPs with a large economic impact, or those that include data of healthy individuals. Wickson et al. argue that it is neither productive nor democratic to think of ‘the public’ as a static entity, and that engagement should instead focus on the public as citizens: i.e., empowered individuals who are also members of communities66. They state that engagement should encourage those citizens to be active beyond the short, delineated time period of a project: “what is required is an informed citizenry empowered to actively engage in the democratic shaping of science and technology to meet social needs and accommodate plural values”. Similarly, Coeckelbergh argues for a “more active role of citizens and (end-)users: not only as participants in deliberation but also in ensuring, creatively and communicatively, that AI contributes to the common good”67.
Thus, digital health PPPs should engage with various publics to find out which aims are acceptable and to achieve clarity about what they understand to be a public benefit. In this sense, we see a need for the exploration of alternative partnership models such as ‘tripartite partnerships’, which add civil society or patient organisations as equal and formal partners, as a way to meaningfully incorporate public and patient values in PPPs. Tripartite partnerships were not described in the reviewed literature, although we feel the promotion of such partnerships might help balance out power somewhat in favour of patients and the general public. While the collaboration between the State of Iceland and deCODE Genetics has been called a tripartite partnership45, in that case citizens were not meaningful partners but merely shareholders if they bought stock in deCODE. We are not aware of published research on tripartite partnerships in digital health, but lessons may be learned from examples in the broader areas of healthcare68 and politics69. The World Health Organization has also recommended supporting mechanisms for community oversight of data that might be used in ‘data collectives’ or other partnerships70. Further work is needed on the practical, political and ethical questions related to setting up tripartite partnerships. In doing so, it is important not to simply ‘sync fast’ to keep up with innovation71, but rather to shape more sustainable innovation practices by taking the time to work on and demonstrate trustworthiness and to reflect on concerns together with stakeholders. Doing this successfully requires further work to explore effective incentives for stakeholders, what level of engagement is fair to expect from citizens, and how to avoid ‘participation fatigue’. It must be noted that even if appropriate engagement processes are in place, patients and citizens should not be pressured into ‘data solidarity’ for the common good; we find that they should remain able to (at least) opt out of the use of their data in PPPs.

Our second discussion point is about the responsibilities of public sector organizations participating in PPPs. Our literature review showed that focusing only on the power of the private sector is too simplistic, as the interests of individual patients do not always align with governments’ or public sector organisations’ interests either (see the Results section on ‘solidarization’). Important work has been done by authors like Shoshana Zuboff and Tamar Sharon, who warn about ‘surveillance capitalism’12 and the ‘Googlization’ of health22, and we second these calls for more critical study of power asymmetries in PPPs. However, we find that framing the problem as Googlization has downsides. In particular, the rhetoric of big tech companies as a (necessary) evil, which we saw in several articles in this review, may harm public trust in digital health PPPs. Especially when the private partner is a smaller company that collaborates on an equal footing with a public organisation, this framing can be unfair and potentially harmful. Moreover, this frame might distract from the roles and responsibilities of public sector organizations, whose role as data steward can conflict with the (implicit or explicit) aim of some digital health PPPs to contribute to broader economic development rather than merely benefiting patients or the healthcare system. Instead of ‘Googlization’, we think the broader concept of ‘economization of (digital) health’ is more fitting.
Economization refers to a “gradual but sweeping takeover of all policy areas by concerns for the economy”72. Focusing on this broader trend as a frame for studying the meaning of public benefit, and how to achieve it in PPPs, would still allow the study of other ‘-zations’ like Googlization and monopolization, while finally putting the spotlight on the motives of the public sector too. The discourse has been mostly about sharing publicly collected data with private entities, but, vice versa, users of commercial digital devices might be reluctant to share their health data with governments. Rather than hide behind a frame of ‘Googlization’, government and academia should themselves be trustworthy and accountable, and they should take responsibility for governance and oversight in PPPs. When collaborating with private partners, stricter enforcement measures may be needed to counter some companies’ tendencies towards ‘ethics washing’73,74. While it is beyond our scope to discuss here, lessons about accountability and trustworthiness can be learned from the existing literature on trust in health research. Of note is that when we speak of trust in digital health PPPs, there is the question of not only who but also what data subjects are being asked to trust, e.g. the standards, training and codes regulating the field of digital health (or the lack thereof). We find that (inter-)national public sector organisations involved in the ethical, legal and social implications of medical research should develop general ethics guidelines for PPPs in the field of digital health, to help foster more successful partnerships that genuinely serve the public interest. The early and joint operationalisation of ethics would help digital health researchers to promote the public interest and to find a balance between respect for fundamental values and the pursuit of impactful innovation.

Development of general guidance for digital health PPPs could build on the study of existing (inter-)national biomedical ethics guidelines for recommendations about collaboration with industry, e.g. in the field of ‘Big Pharma’, to ascertain how these could apply to partnerships with ‘Big Tech’. Empirical research with diverse stakeholders could furthermore shed light on what constitutes the public interest, what expectations patients have, and how ethics committees are currently evaluating and overseeing PPPs. Such empirically informed and practice-oriented ethics guidance for digital health PPPs should be translated for each specific context of partnering; this translation might best be done by forming tripartite partnerships, as discussed in the previous section. In addition to anticipating the financial and legal viability of a PPP, co-developing ethics guidelines can “help reduce harm and prevent unjustified PPPs: defining an aligned vision, for instance, will likely reveal conflicts of interests – aligning on PPP guidelines early may thus serve as a litmus test for potential partners” (p. 657)49. If PPPs become more acceptable and successful as a result of the development of ethics guidelines, this will also reduce the research waste of failed collaborations.

Methods

Search strategy and inclusion criteria
Studies were identified by searching three databases: PubMed, EMBASE and Web of Science (Supplementary Table 3: search queries). Our systematic search included records published in English in the last decade leading up to 1 December 2023. Free search terms and subject headings (MeSH and Emtree) were selected that were related or synonymous to ‘public-private partnerships’, ‘ethics’, and ‘digital health’. Theoretical and empirical studies that critically reviewed the ethical aspects of PPPs within digital health were included in this scoping review. We included studies that discussed ‘data partnerships’, ‘strategic partnerships’, or both. Articles that merely mentioned ethical aspects of PPPs in digital health and provided no argumentation were excluded.
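As a minimal illustration of how such a search can be reproduced (the exact query strings are given in Supplementary Table 3; the term blocks below are simplified assumptions, not the queries actually used), the three concept blocks can be combined with AND against PubMed’s E-utilities interface:

```python
import requests  # third-party HTTP client: pip install requests

# Simplified stand-ins for the three concept blocks; the queries actually
# used in the review are reported in Supplementary Table 3.
blocks = [
    '("public-private partnership*" OR "public private sector")',
    '(ethic* OR "informed consent" OR privacy)',
    '("digital health" OR mhealth OR "artificial intelligence")',
]

response = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={
        "db": "pubmed",
        "term": " AND ".join(blocks),
        "datetype": "pdat",       # restrict on publication date
        "mindate": "2013/12/01",  # the decade leading up to 1 December 2023
        "maxdate": "2023/12/01",
        "retmode": "json",
        "retmax": 0,              # only the hit count is needed here
    },
    timeout=30,
)
print("PubMed hits:", response.json()["esearchresult"]["count"])
```

The sketch reproduces only the date window and the AND-combination of concept blocks; restrictions such as the English-language filter can be added as further query filters or applied at screening.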
Study screening and data extraction
Titles and abstracts were first screened by one of the authors (DH). Articles that did not contain the key concepts of this review were excluded. If the author was unsure whether an article aligned with the criteria, the article was discussed with another author (MB). Second, full-text records were independently evaluated by both authors (DH and MB) using the online Covidence tool (www.covidence.org) for literature screening, after which disagreements were resolved by discussion. Reference lists of the included articles were searched manually to identify additional relevant publications, which were added to the pool of selected studies (backward reference searching). Citations of the included records by other articles were also used to find additional records (forward reference searching).

Analysis and reporting
After screening was concluded, DH and MB carefully read all included papers and highlighted (‘coded’) important findings. The analysis followed the method of qualitative content analysis75 and was an iterative process of moving between inductive coding to identify key themes and deductive coding to categorise findings according to these themes. The resulting code structure was discussed among all authors and noted down in an extraction table, which served as the basis for the narrative description in this paper. Finally, the main themes were linked to the two major phases of digital health research: the collection and sharing of data to develop digital health tools, followed by the implementation and use of these tools (cf. ref. 28). In reporting our findings, this review followed the PRISMA scoping review guidelines (PRISMA-ScR)76 (Supplementary Table 4: PRISMA-ScR checklist).

Data availability
No datasets were generated or analysed during the current study.

References
1. Yildirim, O., Gottwald, M., Schüler, P. & Michel, M. C. Opportunities and challenges for drug development: public–private partnerships, adaptive designs and big data. Front. Pharmacol. 7, 461 (2016).
2. Schuhmacher, A., Gassmann, O., McCracken, N. & Hinder, M. Open innovation and external sources of innovation. An opportunity to fuel the R&D pipeline and enhance decision making? J. Transl. Med. 16, 1–14 (2018).
3. World Bank Institute. Public–Private Partnerships – Reference Guide Version 1.0 (International Bank for Reconstruction and Development/International Development Association or The World Bank, Washington, D.C., 2012).
4. European Commission. European Partnerships in Horizon Europe. https://research-and-innovation.ec.europa.eu/funding/funding-opportunities/funding-programmes-and-open-calls/horizon-europe/european-partnerships-horizon-europe_en (2022).
5. Health Holland. Flyer Publiek Private Partnership (PPP) ‘Gezondheid: ertoe doen en meedoen’ [Flyer Public Private Partnership (PPP) ‘Health: to matter and to participate’] (2020).
6. Silva, P. J., Schaibley, V. M. & Ramos, K. S. Academic medical centers as innovation ecosystems to address population–omics challenges in precision medicine. J. Transl. Med. 16, 1–12 (2018).
7. NHS. A guide to effective NHS data partnerships. https://transform.england.nhs.uk/key-tools-and-info/centre-improving-data-collaboration/guide-to-effective-nhs-data-partnerships/ (2023).
8. OECD. Strategic public/private partnerships in science, technology and innovation. In OECD Science, Technology and Innovation Outlook 2016. https://www.oecd-ilibrary.org/docserver/sti_in_outlook-2016-10-en.pdf (2016).
9. Aitken, M. et al. Public responses to the sharing and linkage of health data for research purposes: A systematic review and thematic synthesis of qualitative studies. BMC Med. Ethics 17, 73 (2016).
10. Bak, M. A. R., Veeken, R., Blom, M. T., Tan, H. L. & Willems, D. L. Health data research on sudden cardiac arrest: perspectives of survivors and their next-of-kin. BMC Med. Ethics 22, 1–15 (2021).
11. Prainsack, B. & Buyx, A. A solidarity-based approach to the governance of research biobanks. Med. Law Rev. 21, 71–91 (2013).
12. Zuboff, S. The age of surveillance capitalism: The fight for a human future at the new frontier of power (Profile Books, London, 2019).
13. Campolo, A. & Crawford, K. Enchanted determinism: Power without responsibility in artificial intelligence. Engag. Sci. Technol. Soc. 6, 1–19 (2020).
14. Moynihan, R. et al. Pathways to independence: Towards producing and using trustworthy evidence. BMJ 367, l6576 (2019).
15. Olivieri, N. F. Patients’ health or company profits? The commercialization of academic research. Sci. Eng. Ethics 9, 29–41 (2003).
16. Marr, K. & Phan, P. The valorization of non-patent intellectual property in academic medical centers. J. Technol. Transf. 45, 1823–1841 (2020).
17. Adjekum, A., Ienca, M. & Vayena, E. What is trust? Ethics and risk governance in precision medicine and predictive analytics. Omics 21, 704–710 (2017).
18. Mazzucato, M. The Entrepreneurial State: Debunking public vs. private sector myths (Penguin, London, 2011).
19. Crawford, K. The Atlas of AI: Power, politics, and the planetary costs of artificial intelligence (Yale University Press, New Haven, 2021).
20. de Lecuona, I. & Villalobos-Quesada, M. European perspectives on big data applied to health: The case of biobanks and human databases. Dev. World Bioeth. 18, 291–298 (2018).
21. Lipworth, W. Real-world data to generate evidence about healthcare interventions: The application of an ethics framework for big data in health and research. Asian Bioeth. Rev. 11, 289–298 (2019).
22. Sharon, T. The Googlization of health research: from disruptive innovation to disruptive ethics. Pers. Med. 13, 563–574 (2016).
23. Sharon, T. Blind-sided by privacy? Digital contact tracing, the Apple/Google API and big tech’s newfound role as global health policy makers. Ethics Inf. Technol. 23, 45–57 (2020).
24. Vezyridis, P. & Timmons, S. Understanding the care.data conundrum: New information flows for economic growth. Big Data Soc. 4, 1–12 (2017).
25. Marelli, L., Testa, G. & Van Hoyweghen, I. Big Tech platforms in health research: Re-purposing big data governance in light of the General Data Protection Regulation’s research exemption. Big Data Soc. 8, 20539517211018783 (2021).
26. Ferretti, A. & Vayena, E. In the shadow of privacy: Overlooked ethical concerns in COVID-19 digital epidemiology. Epidemics 41, 100652 (2022).
27. Rantanen, M. M. & Heimo, O. I. Citizens’ information for sale? Secondary use of information in Finland. ORBIT J. 1–16 (2018).
28. Baric-Parker, J. & Anderson, E. E. Patient data-sharing for AI: Ethical challenges, Catholic solutions. Linacre Q. 87, 471–481 (2020).
29. Ballantyne, A. et al. Sharing precision medicine data with private industry: Outcomes of a citizens’ jury in Singapore. Big Data Soc. 9, 20539517221108988 (2022).
30. Shah, N. et al. Governing health data across changing contexts: A focus group study of citizen’s views in England, Iceland, and Sweden. Int. J. Med. Inform. 156, 104623 (2021).
31. Laurie, G. T. Cross-sectoral big data: The application of an ethics framework for big data in health and research. Asian Bioeth. Rev. 11, 327–339 (2019).
32. Cheung, S. Disambiguating the benefits and risks from public health data in the digital economy. Big Data Soc. 7, 2053951720933924 (2020).
33. Spithoff, S., Stockdale, J., Rowe, R., McPhail, B. & Persaud, N. The commercialization of patient data in Canada: Ethics, privacy and policy. CMAJ 194, E95–E97 (2022).
34. Carter, P., Laurie, G. T. & Dixon-Woods, M. The social licence for research: Why care.data ran into trouble. J. Med. Ethics 41, 404–409 (2015).
35. Martin, P. & Hollin, G. A new model of innovation in biomedicine? 1–34. http://nuffieldbioethics.org/wp-content/uploads/A-New-Model-of-Innovation_web.pdf (2014).
36. Sterckx, S., Rakic, V., Cockbain, J. & Borry, P. “You hoped we would sleep walk into accepting the collection of our data”: controversies surrounding the UK care.data scheme and their wider relevance for biomedical research. Med. Health Care Philos. 19, 177–190 (2016).
37. Sterckx, S., Dheensa, S. & Cockbain, J. Presuming the promotion of the common good by large-scale health research: the cases of care.data 2.0 and the 100,000 Genomes Project in the UK. In Personalised Medicine, Individual Choice and the Common Good (eds Sterckx, S., Dheensa, S. & Cockbain, J.) 155–182 (Cambridge Univ. Press, 2018).
38. Ballantyne, A. & Stewart, C. Big data and public-private partnerships in healthcare and research: the application of an ethics framework for big data in health and research. Asian Bioeth. Rev. 11, 315–326 (2019).
39. Powles, J. & Hodson, H. Google DeepMind and healthcare in an age of algorithms. Health Technol. 7, 351–367 (2017).
40. Powles, J. & Hodson, H. Response to DeepMind. Health Technol. 8, 15–29 (2018).
41. Schneider, G. Health Data Pools: Case-Studies and Involved Interests. https://doi.org/10.1007/978-3-030-95427-7_3 (2022).
42. Winter, J. S. & Davidson, E. Big data governance of personal health information and challenges to contextual integrity. Inf. Soc. 35, 36–51 (2019).
43. Árnason, E. & Andersen, B. deCODE and Iceland: A critique. eLS 1–16 (2013).
In Populations as Brands. https://doi.org/10.1007/978-3-030-78578-9 (2021).Winickoff, D. A bold experiment: Iceland’s genomic venture. In Ethics, Law and Governance of Biobanking (ed. Mascalzoni, D.) 157-178 (Springer, 2015).Snell, K., Tarkkala, H. & Tupasela, A. A solidarity paradox – welfare state data in global health data economy. Health 136, 345–932 (2021).Google Scholar Holm, S. & Ploug, T. Big data and health research—the governance challenges in a mixed data economy. J. Bioeth. Inq. 14, 515–525 (2017).Article  PubMed  Google Scholar Kaplan, B. Selling health data: de-identification, privacy, and speech. Camb. Q. Healthc. Ethics 24, 256–271 (2015).Article  PubMed  Google Scholar Landers, C., Ormond, K. E., Blasimme, A., Brall, C. & Vayena, E. Talking Ethics Early in Health Data Public Private Partnerships. J. Bus. Ethics 190, 649–659 (2023).Piciocchi, C. et al. Legal issues in governing genetic biobanks: the Italian framework as a case study for the implications for citizen’s health through public-private initiatives. J. Community Genet. 9, 177–190 (2018).Article  PubMed  Google Scholar Winkler, E. C. et al. Patient data for commercial companies? An ethical framework for sharing patients’ data with for-profit companies for research. J. Med. Ethics 1–8 (2023).Witjas-Paalberends, E. R. et al. Challenges and best practices for big data-driven healthcare innovations conducted by profit–non-profit partnerships–a quantitative prioritization. Int. J. Healthc. Manag. 11, 171–181 (2018).Article  Google Scholar Tully, M. P., Hassan, L., Oswald, M. & Ainsworth, J. Commercial use of health data—A public “trial” by citizens’ jury. Learn. Health Syst. 3, e10200 (2019).Article  PubMed  PubMed Central  Google Scholar Grön, K. Common good in the era of data-intensive healthcare. Humanit. Soc. Sci. Commun. 8, 1–10 (2021).Article  Google Scholar Sharon, T. When digital health meets digital capitalism, how many common goods are at stake? Big Data Soc. 5, 2053951718819032 (2018).Article  Google Scholar Street, J. et al. Sharing administrative health data with private industry: A report on two citizens’ juries. Health Expect. 24, 1337–1348 (2021).Article  PubMed  PubMed Central  Google Scholar Horn, R. & Kerasidou, A. Sharing whilst caring: Solidarity and public trust in a data-driven healthcare system. BMC Med. Ethics 21, 1–7 (2020).Article  Google Scholar Cole et al. Ten principles for data sharing and commercialization. J. Am. Med. Inform. Assoc. 28, 646–649 (2021).Article  PubMed  Google Scholar Fisher, E. & Rosenhek, Z. Engendering assemblages: the constitution of digital health data as an epistemic consumption object. J. Cult. Econ. 15, 599–616 (2022).Article  Google Scholar Howard, M. Wearables, the marketplace and efficiency in healthcare: How will I know that you’re thinking of me? Philos. Technol. 34, 1545–1568 (2021).Article  Google Scholar Lanzing, M., Lievevrouw, E. & Siffels, L. It Takes Two to Techno-Tango: An Analysis of a Close Embrace Between Google/Apple and the EU in Fighting the Pandemic Through Contact Tracing Apps. Sci. Cult. 31, 136–148 (2022).Article  Google Scholar Graham, M. Data for sale: trust, confidence and sharing health data with commercial companies. J. Med. Ethics 49, 515–522 (2023).Cohen, I. G. & Mello, M. M. Big data, big tech, and protecting patient privacy. JAMA 322, 1141–1142 (2019).Article  PubMed  Google Scholar Spector-Bagdady, K., Hutchinson, R., Kaleba, E. O. B. & Kheterpal, S. 
Spector-Bagdady, K., Hutchinson, R., Kaleba, E. O. B. & Kheterpal, S. Sharing health data and biospecimens with industry—a principle-driven, practical approach. N. Engl. J. Med. 382, 2072–2075 (2020).
Kaplan, B. How should health data be used? Privacy, secondary use, and big data sales. Camb. Q. Healthc. Ethics 25, 312–329 (2016).
Wickson, F., Delgado, A. & Kjølberg, K. L. Who or what is 'the public'? Nat. Nanotechnol. 5, 757–758 (2010).
Coeckelbergh, M. Artificial intelligence, the common good, and the democratic deficit in AI governance. AI Ethics 1–7 (2024).
Kennedy, M. A. Tripartite governance: Enabling successful implementations with vulnerable populations. Nurs. Inform. 225, 158–162 (2016).
Drayton, J. & Hart, V. Promoting good governance as an anti-corruption tool. ResearchGate (2010).
World Health Organization. Ethics and Governance of Artificial Intelligence for Health: WHO Guidance: Executive Summary (World Health Organization, 2021).
Landers, C., Blasimme, A. & Vayena, E. Sync fast and solve things—best practices for responsible digital health. npj Digit. Med. 7, 113 (2024).
Hallonsten, O. Economization. In Empty Innovation: Causes and Consequences of Society's Obsession with Entrepreneurship and Growth (Springer Int. Publ., 2023).
McLennan, S. et al. Embedded ethics: A proposal for integrating ethics into the development of medical AI. BMC Med. Ethics 23, 6 (2022).
Wagner, B. Ethics as an escape from regulation: From "ethics washing" to ethics shopping? In Being Profiled: Cogitas Ergo Sum (eds Bayamlioğlu, E., Baraliuc, I., Janssens, L. & Hildebrandt, M.) 84–88 (Amsterdam Univ. Press, 2018).
Elo, S. & Kyngäs, H. The qualitative content analysis process. J. Adv. Nurs. 62, 107–115 (2008).
Tricco, A. C. et al. PRISMA extension for scoping reviews (PRISMA-ScR): Checklist and explanation. Ann. Intern. Med. 169, 467–473 (2018).
Van Tulder, R. & Pfisterer, S. Creating partnering space. In Social Partnerships and Responsible Business: A Research Handbook 105 (2013).

Acknowledgements
This study received no funding. For feedback on earlier versions of this article, we thank our colleagues at the Institute of History and Ethics in Medicine, Technical University of Munich, and at the Department of Ethics, Law and Humanities, Amsterdam UMC.

Funding
Open Access funding enabled and organized by Projekt DEAL.

Author information
Authors and affiliations:
Institute of History and Ethics in Medicine, Department of Preclinical Medicine, TUM School of Medicine and Health, Technical University of Munich, Munich, Germany: Marieke A. R. Bak, Alena Buyx & Stuart McLennan
Department of Ethics, Law and Humanities, Amsterdam UMC, University of Amsterdam, Amsterdam, The Netherlands: Marieke A. R. Bak & Daan Horbach
Institute for Biomedical Ethics, University of Basel, Basel, Switzerland: Stuart McLennan

Contributions
M.B. conceptualized the paper, led the analysis, supervised D.H., and wrote the original draft. D.H. developed the search strategy, was involved in analysis and data curation, created materials (figures and tables) for the original draft, and reviewed the manuscript. A.B. was involved in conceptualization of the paper and reviewed the draft. S.M. was involved in conceptualization and analysis, and reviewed and substantially edited the paper.

Corresponding author
Correspondence to Marieke A. R. Bak.

Ethics declarations
Competing interests: The authors declare no competing interests.

Additional information
Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information
Supplementary Information is available for this article.

Rights and permissions
Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.