A woman attempting to chaperone her daughter’s Girl Scout troop on a trip to a Rockettes show at Radio City Music Hall was denied entry based on facial recognition technology. Security personnel subsequently revealed that she was on a list of excluded attorneys because her firm was involved in ongoing litigation against Madison Square Garden (MSG) Entertainment, which owns Radio City Music Hall. This incident illustrates one of the consequences of allowing private corporations to use facial recognition technology.
Since the Supreme Court overturned Roe v. Wade on June 24, 2022, the Dobbs v. Jackson Women’s Health Organization ruling that gutted the long-established right to an abortion has been a constant focus, both inside and outside of the legal and healthcare communities. Notably, the ruling has remained a central topic within government, federal and state, and across the tech sector. These Dobbs-related conversations share a theme: health data privacy, and more specifically, data privacy surrounding reproductive healthcare.
In Dobbs v. Jackson Women’s Health Organization (Dobbs), the US Supreme Court ruled that abortion is not a fundamental right protected by the Constitution. The decision prompted additional abortion protections in California, Michigan, and Vermont, and led many patients, providers, regulators, and tech companies to rethink data privacy. However, with most abortions now banned in at least 13 states, this patchwork of state abortion laws, combined with the lack of any comprehensive national privacy law, puts patient privacy at risk.
In June 2022, a nonprofit news site called The Markup released a report stating that hospitals using Meta Pixel may be transmitting patient data to Meta Platforms, Inc. (formerly Facebook, Inc.). Since the report’s release, many of the hospitals it identified have removed the pixel technology from their websites. In addition, some hospitals have released public breach notices and reported potential data privacy breaches to the US Department of Health and Human Services (HHS) Office for Civil Rights (OCR). Most recently, on October 20, 2022, Advocate Aurora Health, a large health system located in the Midwest, publicly announced a potential pixel breach that may affect as many as three million patients.
On June 24, 2022, the Supreme Court officially overturned Roe v. Wade. In doing so, it declared that there was no longer a constitutional right to abortion, leaving the procedure’s legality to the states’ police power. Immediately after the decision, trigger laws went into effect across a quarter of the states, making abortion illegal. Post-Dobbs, information collected on personal devices, especially through period-tracking and telemedicine apps, is at risk of being exposed and used as criminal evidence.
Conversation surrounding the hodgepodge of state data privacy legislation in the U.S. has long been a source of frustration both domestically and abroad. 2021 saw a drastic uptick in awareness of the need for meaningful, comprehensive consumer privacy laws. With data privacy and cybersecurity repeatedly making front-page news over the last year, and even becoming a high priority within the Biden Administration, privacy has become one of the few issues on which people across the political spectrum can agree. But will 2022 be the year that comprehensive federal privacy legislation becomes a reality? Don’t count on it.
There is no doubt that the COVID-19 pandemic has affected almost every aspect of life for people around the globe. While the internet has allowed people to stay connected and continue working from home, it has also presented an opportunity for cybercriminals to take advantage of susceptible remote working setups. Cybercrime has significantly increased since the start of the pandemic, prompting corporations to mitigate the risk of a data breach against an onslaught of new vulnerabilities to their internal systems.
The Pandora Papers leak of October 2021 shone a light on the massive and intricate web of offshore accounting that allows vast amounts of wealth to be hidden throughout the world. One of the most shocking revelations of the Papers was how heavily the United States was implicated in creating and perpetuating this system. Legislators have since been pressured to find a way to crack down on this offshore money. One proposal for addressing the problem is to amend the United States’ principal criminal financial legislation, the Bank Secrecy Act.
On Friday, February 26, 2021, U.S. District Court Judge James Donato approved a $650 million settlement against tech giant Facebook for violating the Illinois Biometric Information Privacy Act. Chicago attorney Jay Edelson filed the class action lawsuit in 2015, alleging that Facebook had failed to obtain users’ consent before using facial recognition technology to scan and digitally store uploaded photos.
On December 12, 2020, the European Commission (the “EC”) issued a highly anticipated draft of newly revised standard contractual clauses (“new SCCs”) that may be used by European Union-based companies to safeguard transfers of personal data to third countries, such as the US, in compliance with GDPR Art. 46(1). The release comes at a decidedly inopportune time, following on the heels of the Court of Justice of the European Union’s (CJEU) Data Protection Commissioner v. Facebook Ireland Limited and Maximillian Schrems (“Schrems II”) decision, which casts serious doubt on the adequacy of SCCs alone to safeguard against the “high risks” involved in EU-to-US data transfers. For many data protection experts, the language of the revised SCCs only adds to the confusion, raising even more questions. One question in particular stands out: for transfers to importers directly subject to the GDPR, are SCCs really necessary?