There may be further divergence in the future, including with regard to administrative burdens. For example, the Data (Use and Access) Act 2025, or the UK Act, which supplements the UK GDPR, recently came into force and introduces certain provisions that diverge from the EU GDPR. Although the UK’s data protection framework is still considered to provide “essentially equivalent” safeguards to the EU’s GDPR, future divergence remains a possibility. Further divergence between the EU GDPR and UK GDPR could add legal risk, uncertainty, complexity, and cost to our handling of European personal data and our privacy and security compliance programs. We may no longer be able to take a unified approach across the European Union and the United Kingdom, and we will need to amend our processes and procedures to align with such divergence. In addition, EEA Member States have adopted national laws to implement the EU GDPR that may partially deviate from the EU GDPR and competent authorities in the Member States may interpret the EU GDPR obligations slightly differently from country to country. Therefore, we do not expect to operate in a uniform legal landscape in the EEA.
Similar data protection laws are either in place or underway in the United States. A broad variety of privacy and data security laws and regulations may be applicable to our activities governing the collection, use, disclosure, and protection of health-related and other personal information (including state data breach notification laws, health information and/or genetic privacy laws, and federal and state consumer protection laws such as Section 5 of the FTC Act, HIPAA, and the California Consumer Privacy Act, or CCPA). For example, the CCPA, as amended by the California Privacy Rights Act, has created certain requirements for data use, sharing and transparency, and provides California residents certain rights concerning their personal information, such as the rights to access, correct and delete such data and to opt out of its sale or sharing. A number of other states have implemented privacy legislation similar to the CCPA or are preparing to implement their own regulatory frameworks. Certain states have also passed laws that protect biometric information or that are specifically focused on consumer health data (e.g., Washington, Connecticut and Nevada), which further increases compliance risk. These laws and regulations, including their interpretation by governmental agencies, are subject to frequent change and could have a negative impact on our business. Further, these varying interpretations could create complex compliance issues for us and our partners, potentially expose us to additional expense, liability and penalties, negatively impact our client relationships, and lead to adverse publicity, all of which could negatively affect our business in the short and long term.
A wide range of enforcement agencies at both the state and federal levels, such as the Federal Trade Commission and state Attorneys General have been increasingly aggressive in reviewing and enforcing privacy and data security-related consumer protection laws.
Regulators and legislators in the U.S. are increasingly scrutinizing and restricting certain personal data transfers and transactions involving foreign countries. For example, a January 2025 DOJ rule prohibits transfers of data to countries of concern, including China, as well as certain agreements absent specified cybersecurity controls. Actual or alleged violations of these regulations may be punishable by criminal and/or civil sanctions, may result in exclusion from participation in federal and state programs, and could restrict our ability to use certain vendors, sites, investigators, or service providers in global clinical trials. See Item 1 "Business—Government Regulation and Product Approvals" in the 2025 Annual Report for additional information.
Given the breadth and depth of changes in privacy, data protection and consumer protection obligations, preparing for and complying with these requirements is rigorous and time intensive and requires significant resources and ongoing review of our technologies, systems and practices, as well as those of any third-party collaborators, service providers, contractors or consultants that store, process or transfer personal data on our behalf. Compliance with the GDPR and other similar laws or regulations associated with the enhanced protection of certain types of sensitive data, such as healthcare data or other personal information from our clinical trials, could require us to change our business practices and put in place additional compliance mechanisms, may interrupt or delay our development, regulatory and commercialization activities and increase our cost of doing business. Any failure or perceived failure by us to comply with such laws and regulations could lead to government enforcement actions, private litigation and significant fines and penalties against us and could have a material adverse effect on our business, financial condition or results of operations. There is also the threat of consumer class actions related to these laws and the overall protection of personal data. Even if we are not determined to have violated these laws, government investigations into these issues typically require the expenditure of significant resources and generate negative publicity, which could harm our reputation and our business.
Our use of new and evolving technologies, such as artificial intelligence, or AI, may present risks and challenges that can impact our business, including by posing cybersecurity and other risks to our confidential and/or proprietary information, including personal information, and as a result we may be exposed to reputational harm and liability.
We may use AI, and our vendors may incorporate AI into their offerings, both through our own development and implementation of AI and through the adoption of commercially available tools. The use of AI presents risks and challenges that could adversely affect our business, including cybersecurity, data privacy, IT, confidentiality, regulatory, legal, operational, competitive, reputational and intellectual property risks. Specifically, risks related to accuracy, bias, artificial intelligence hallucinations, discrimination, harmful content, misinformation, fraud, scams, targeted attacks (including model poisoning or data poisoning), surveillance, data leakage, environmental and other harms may flow from our development or use of AI technologies. For example, use of certain AI tools may increase the risk of unauthorized disclosure of confidential information, compromise of proprietary intellectual property, or