Big Data and the financial industry
June 2020 | SPOTLIGHT | BANKING & FINANCE
Financier Worldwide Magazine
Global privacy law has never been more unsettled. Even though privacy law governing the personal information of consumers and employees is a relatively new field, we are in a period of enormous upheaval around the world. The General Data Protection Regulation (GDPR) brought a new worldwide focus to privacy issues by establishing a broad set of overarching principles with meaningful extraterritorial scope. In the US, the California Consumer Privacy Act (CCPA) has created both an enormous cottage industry for privacy compliance and growing, although temporarily stalled, pressure for a US national privacy law. And that was before the coronavirus pandemic created a new series of pressures on privacy law, particularly around the potential use of personal data, such as location data, for a broad variety of public interest purposes.
One of the critical elements of this evolution is the emergence of Big Data principles around the world, and the recognition that, while Big Data presents enormous opportunities for both consumers and industry, it also carries substantial risks for consumers, many of which may not yet be clear. To the extent that governmental entities have sought to regulate or debate Big Data analytics, most of the attention has gone to developing both a recognition of these benefits and an understanding of how the risks may upend existing legal protections for a variety of consumers, particularly those in at-risk groups. In today’s business environment, harnessing the power of Big Data requires thoughtful analysis of the evolving legal structure and, because that structure is clearly incomplete, critical attention to issues such as ethics and best practices, areas where the role of a corporate privacy officer is particularly important.
So what is Big Data? According to the US Federal Trade Commission (FTC), Big Data refers to “a confluence of factors, including the nearly ubiquitous collection of consumer data from a variety of sources, the plummeting cost of data storage, and powerful new capabilities to analyze data to draw connections and make inferences and predictions”. This concept – which can also apply more generally to all data being generated in an increasingly electronic and digital environment – reflects a series of interrelated developments, including growing processing power from increasingly powerful computers, low-cost and high-volume storage, and the analytics capabilities and technical skills needed to harness and manage this data.
Some of the tensions between Big Data and privacy are clear – those collecting Big Data want as much data as possible, to use as the basis of the models businesses create to explain, predict and affect behaviour. To them, more data means better models. The key concept is “gather it now, figure out why later”. At the same time, these analytics make clear that models must test large volumes of data to identify what is important. For example, in one recent study, healthcare researchers found that certain data elements, including number of cars, income and marital status, played a strong role in evaluating emergency room utilisation. That is certainly not an obvious finding, and one which presumably required testing a number of other variables before determining that these mattered.
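To make the ‘gather many variables, then see which matter’ idea concrete, the following is a minimal, purely illustrative sketch in Python. It fits a standard model to synthetic data with hypothetical variable names – nothing here is drawn from the study mentioned above – and then ranks how strongly each candidate variable contributed to the model’s predictions.

# Illustrative only: synthetic data and hypothetical variable names,
# not drawn from the healthcare study discussed in the article.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5_000

# Candidate variables a modeller might 'gather now' without knowing
# in advance which will turn out to matter.
features = ["num_cars", "income", "marital_status", "age", "zip_density"]
X = np.column_stack([
    rng.integers(0, 4, n),          # num_cars
    rng.normal(50_000, 15_000, n),  # income
    rng.integers(0, 2, n),          # marital_status (0/1)
    rng.integers(18, 90, n),        # age
    rng.random(n),                  # zip_density
])

# Synthetic outcome: emergency-room use driven by only a few variables.
signal = 0.8 * X[:, 0] - 0.00002 * X[:, 1] + 0.6 * X[:, 2]
y = (signal + rng.normal(0, 1, n) > signal.mean()).astype(int)

# Fit a model, then rank every candidate variable by how much it mattered.
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, score in sorted(zip(features, model.feature_importances_),
                          key=lambda t: -t[1]):
    print(f"{name:>15}: {score:.3f}")

Run as written, the ranking surfaces the few variables that actually drive the synthetic outcome – the same winnowing exercise, at toy scale, that real Big Data teams perform across far larger variable sets.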
At the same time, there is a recognition that this kind of analytics can be both valuable and risky. One critical report from the Obama White House, ‘Big Data: Seizing Opportunities, Preserving Values’, noted: “An important finding of this review is that while Big Data can be used for great social good, it can also be used in ways that perpetrate social harms or render outcomes that have inequitable impacts, even when discrimination is not intended. Small biases have the potential to become cumulative, affecting a wide range of outcomes for certain disadvantaged groups... Society must take steps to guard against these potential harms by ensuring power is appropriately balanced between individuals and institutions, whether between citizen and government, consumer and firm, or employee and business.”
Similarly, in its January 2016 report ‘Big Data: A Tool for Inclusion or Exclusion?’, the FTC concluded that “The analysis of [Big] Data is often valuable to companies and to consumers, as it can guide the development of new products and services, predict the performance of individuals, help tailor services and opportunities and guide individualized marketing”. It added: “At the same time, advocates, academics and others have raised concerns about whether certain uses of Big Data analytics may harm consumers, particularly low-income and underserved populations”.
For companies evaluating Big Data opportunities, it is crucial to think beyond formal legal requirements. Those requirements may come into play, but looking solely at them will be insufficient, particularly in an industry like financial services where the risks to consumers are particularly significant. Even understanding which laws to think about is challenging – in addition to the broad and growing range of privacy laws, including ‘general’ laws like the GDPR, industry-specific laws like the Gramm-Leach-Bliley Act and ‘narrow’ laws dealing with specific new technologies such as facial recognition, it also is critical to think about civil rights laws, intellectual property, insider trading and other evolving areas of law. In addition, it is important to think about potential oversight – this may come from government regulators, but is equally (and perhaps more) likely to come from the media, customers and plaintiffs’ attorneys.
What should companies be thinking about? Remember, there is a critical role here for lawyers, privacy officers and compliance officers in guiding your company on these issues. Make sure you know what the company is doing in this area – often, Big Data teams operate with data that may be perceived as anonymous or aggregated, and therefore outside the scope of privacy law. But when the findings and algorithms produced by these analytics affect people, the broader issues come into play. Be appropriately transparent about what you are doing, so that your stakeholders can understand it. Be smart and responsible in your data practices – making decisions and taking action without careful consideration of the implications may be the biggest risk you face. Make sure your security practices are reasonable. And be a participant in the ongoing public debate on these issues.
In addition, companies should drill down on the details of their analytics programmes. Think about the sources of your company’s data. Understand what you are doing with that data and what conclusions you are drawing from it. To whom are you giving your data, whether in raw, anonymised or analysed form? Which of the parties you have relationships with might want to use your data? What controls do you have over what others do with your data?
For companies across the financial industry, as well as their lawyers, privacy officers and compliance officials, Big Data presents an ongoing challenge. It requires an understanding of the company’s needs, the data being used and disclosed, and the decisions being made, along with a careful grasp of both the evolving legal structure and the rapidly changing public understanding of the risks and benefits of Big Data. It will require creativity, thoughtfulness and a nuanced understanding of evolving standards in a privacy system driven both by law and by changing public and government attitudes.
Kirk J. Nahra is a partner and co-chair of the global cyber security and privacy practice at WilmerHale. He can be contacted on +1 (202) 663 6128 or by email: kirk.nahra@wilmerhale.com.
© Financier Worldwide
BY Kirk J. Nahra, WilmerHale