Data Regulators in China, the UK, and Europe Are Making Moves – What Alternative Data Users Should Know

Apr 5, 2024 | Blog, Featured

Alternative Data Compliance

Regulatory activities in the UK, Europe, and China were discussed recently as part of Eagle Alpha's monthly alternative data-focused workshops for asset managers, delivered via its Navigator subscription in collaboration with legal partners Schulte Roth & Zabel. Anna Maleva-Otto, a partner at Schulte Roth & Zabel, and Samuel Yang, a partner at AnJie Law, discussed the developments that shaped the data landscape in 2023 and what can be expected in 2024. This article highlights three recent updates covered during the session, from the European and Chinese markets, for users concerned with alternative data compliance.

Update 1: Cross-Border Data Transfers 

China 

In September 2023, the Cyberspace Administration of China (CAC) released draft regulations aimed at easing cross-border data transfer restrictions. This move is part of a broader strategy to attract foreign investment by relaxing control over data flows. The draft regulations introduced several exemptions to the existing stringent requirements for cross-border data transfers: 

  1. Data Originating Overseas: Entities are not required to report or file with regulators for data that originates from outside China. 
  2. Data for Service Provision: If the data transfer is necessary for providing services to individuals in China, no filing with the government is required. 
  3. Employee Data: Foreign-invested companies can transfer employee data overseas as part of global HR management without the current legal regime’s burdensome requirements. 
  4. Emergency Personnel Information: Data necessary for protecting life, health, and property in emergencies is exempt from reporting and filing requirements. 
  5. Small-Scale Data Exports: Transfers involving data of fewer than 10,000 individuals within a year are exempt from filing or contractual obligations. 


In March 2024, the CAC revised its cross-border data transfer rules. The security assessment threshold is now set at transfers of personal data of more than one million individuals, or sensitive personal data of more than 10,000 individuals, within a calendar year. A new exemption has been introduced for annual transfers of personal data (excluding sensitive personal data) of fewer than 100,000 individuals. All overseas transfers of ‘important data,’ as defined by the Data Security Law, must still undergo a security assessment; however, organizations are not required to apply for this assessment unless their data has been officially classified as ‘important data.’

These exemptions mark a significant step towards relaxing controls on cross-border data transfers, potentially reducing compliance timelines from three to six months to considerably less. The discussion also clarified that the current regulations primarily concern transfers from mainland China to other jurisdictions, treating Hong Kong, Macau, and Taiwan distinctly due to their separate legal systems. The potential for future mechanisms enabling international data transactions via platforms such as the Shanghai or Shenzhen data exchanges was flagged as a development to watch in the alternative data compliance space.

Europe  

Data transfers between Europe and the US also changed last year, affecting alternative data compliance. In July 2023, the European Commission adopted an adequacy decision for the EU-US Data Privacy Framework, allowing the free flow of personal data from the EU to US companies participating in the framework.

This decision was made in response to new safeguards introduced in the United States, including an Executive Order issued by President Biden and a regulation from the US Attorney General, designed to address concerns raised by the European Court of Justice. These safeguards ensure that US intelligence agencies can access data only when necessary and proportionate and establish a fair and impartial mechanism for addressing complaints from European individuals regarding data collection. 

In September 2023, the UK government approved the UK-US Data Bridge, allowing the flow of personal data to US entities that have self-certified to the EU-US Data Privacy Framework (DPF) and extended their certification to cover UK data. The bridge eliminates the need for additional data transfer mechanisms, simplifying UK-US data transfers.

Update 2: AI Regulation 

China 

China lacks a unified Artificial Intelligence (AI) law but has been legislating actively in this area, with a more comprehensive and possibly unified AI law planned by 2030. Current regulations address security assessments, regulatory filings, and ethical reviews, especially where AI technologies might influence public opinion. There is also an emphasis on global cooperation in AI governance, highlighting the need to avoid the weaponization of AI technologies.

According to Samuel Yang, China's regulatory frameworks for cybersecurity, data security, and personal information protection are closely linked, with each law addressing different aspects of data and privacy protection. Forthcoming AI laws are expected to integrate with these existing laws, focusing specifically on AI while referring to the cybersecurity and data protection regimes where relevant.

Europe 

Anna noted that UK regulators, including the Bank of England and the Financial Conduct Authority (FCA), are adopting a “wait and see” approach towards AI regulation. They prefer to work within the existing frameworks of data protection and operational resilience, particularly for financial services, rather than introducing a comprehensive AI act. Their focus is on reviewing public feedback on AI, promoting cooperation among regulators to address risks such as fairness and bias, and incorporating AI governance into the senior management accountability framework.

In January 2024, the Information Commissioner’s Office (ICO) launched a consultation series on generative AI to explore how data protection law should apply to the development and use of this technology. The first consultation focuses on whether it is lawful to train generative AI models on personal data collected through web scraping. Separately, the AI Regulation Bill proposed by Lord Holmes, which had its second reading in March 2024, recommends an AI authority to coordinate regulation across industries, the use of regulatory sandboxes for testing, and the appointment of responsible AI officers.

In contrast, the EU has moved towards a comprehensive regulatory framework with the AI Act, expected to apply from 2026. The act aims to provide a clear definition of AI and to regulate its use across all sectors, with specific prohibitions on high-risk applications such as remote biometric identification in public spaces and social scoring. The EU’s approach is more prescriptive and restrictive than the UK’s flexible stance.

Update 3: The UK’s Focus on Alternative Data Compliance 

The FCA’s review of wholesale data, including trading and credit rating data, reflects its dual role as a competition authority and a data regulator. The focus is on addressing barriers to competition, privacy, and ethical risks associated with alternative data, particularly ESG data. The UK is looking to make alternative data more accessible and competitively priced, with further guidance expected on data governance and privacy. In March 2024, the FCA confirmed that it would not intervene in the wholesale data market, to avoid unintended consequences.

Both the UK and EU are emphasizing the importance of data anonymization, with forthcoming guidance from the UK’s Information Commissioner’s Office. The challenge of assessing whether data has been properly anonymized, especially in the context of due diligence of alternative data products and vendors, is highlighted as a significant risk area. 

While the UK and EU are more focused on the privacy and ethical implications of AI and alternative data, Anna believes the SEC appears more concerned with the implications of AI for policies and procedures and with trading on material non-public information (MNPI). The US approach is more enforcement-oriented, emphasizing the need for firms to have detailed policies and procedures governing AI use.

In February 2024, FTC Chair Lina Khan highlighted that sensitive personal data such as health, location, and web browsing history should be excluded from the training of artificial intelligence models, and that companies must actively inform users if they plan to repurpose collected data for AI training. These concerns over privacy and security have grown with the rapid advancement of generative AI technology, which can mimic individuals.

Join us for the upcoming Alternative Data Conference on May 16th in London, bringing the investment and data community together in one place. The event will feature presentations on AI applications in data management, talent trends, introductions to new vendors, technical showcases, and compliance discussions led by Schulte Roth & Zabel LLP. With 25 vendors, 1-to-1 meetings, and extensive networking opportunities, the conference promises insightful content, with past speakers from Microsoft, J.P. Morgan, and BlackRock, among others. Register your interest here.