Author:

Richard Horton

It’s Not Too Early to Start Worrying About Discriminatory Algorithms in Your Code: A Practical Approach to Self-Regulation

There’s no doubt that remote work, brought on by the coronavirus pandemic, will accelerate the digital revolution already underway. Consumers’ growing appetite to conduct their business online, rather than in person, has fueled the proliferation of digitally accessible products and services. For instance, movie theaters have closed their doors while content streaming services have experienced exponential growth. And while the restaurant industry as a whole has suffered, ‘virtual’ kitchens and grocery delivery apps have picked up steam. A critical question arising from these trends is “what can be done to eliminate biases in the algorithms that drive these digital transactions?”

A Practical Approach to Post-Schrems II Remediation of Cross-Border Data Transfers to the U.S. and Other “High Risk” Third Countries

On July 16, 2020, the Court of Justice of the European Union (“CJEU”) issued a landmark decision that summarily and immediately invalidated the EU-US Privacy Shield. That regulatory program, established between the European Commission and the U.S. Department of Commerce, allowed the personal data of EU residents to be transferred from the EU to the US without violating the data transfer restrictions of the General Data Protection Regulation (“GDPR”). The decision went on to cast serious doubt on the sufficiency of standard contractual clauses to adequately protect data transferred to any third country, not just the US. Several months later, data exporters in the EU are still sorting through the wreckage of their privacy programs and waiting for practical advice on the way forward.

Relax, After GDPR’s Schrems II, Some Companies Transferring Personal Data from the EU to the US May Actually Have Fewer Challenges Than You Thought

On December 12, 2020, the European Commission (the “EC”) issued a highly anticipated draft of newly revised standard contractual clauses (“new SCCs”) that may be used by European Union-based companies to safeguard transfers of personal data to third countries, such as the US, in compliance with GDPR Art. 46(1). The release comes at a decidedly inopportune time, following on the heels of the CJEU’s Data Protection Commissioner v. Facebook Ireland Limited and Maximillian Schrems (“Schrems II”) decision, which cast serious doubt on the adequacy of SCCs alone to safeguard against the “high risks” involved in EU to US data transfers. And for many data protection experts, the language of the revised SCCs only adds to the confusion, raising even more questions. One question in particular stands out: for transfers to importers that are directly subject to the GDPR, are SCCs really necessary?

Covid-19 Tenant Eviction Long-Term Relief: Designing a More Effective Data Privacy Remedy in Tenant Screening

Covid-19 has not only damaged the health and physical well-being of those stricken by the potentially deadly coronavirus, but it has also ravaged the livelihoods and financial stability of many millions more people around the world. The virus spread across the U.S. with incredible speed, infecting more than 100,000 people by the end of March 2020. In many ways, the pandemic’s unexpected and rapid arrival caught households financially unprepared and ill-equipped to survive the economic shutdown unscathed. Those who have experienced rent hardship and have been, or will soon be, subject to eviction for non-payment of rent must recover not only from the short-term challenge of finding shelter and putting their lives back together, but also from the long-term struggle of finding suitable housing with an often disqualifying and indelible mark on their rental history.

Privacy Lessons Learned from Litigation: Video Surveillance of the Robert Kraft Massage Parlor-Prostitution Sting Operation

The criminal case against Robert Kraft, owner of the NFL’s New England Patriots franchise, took an astounding turn when the Florida Court of Appeal handed down its ruling on Kraft’s privacy objections to law enforcement’s surveillance video evidence showing the billionaire soliciting prostitution at a local spa. Kraft moved to suppress the evidence, arguing that Florida law enforcement’s surreptitious, non-consensual video-only recording of the premises of a private business open to the public ran afoul of his, and others’, Fourth Amendment right to be free from unreasonable government searches. The appellate ruling not only affirmed the Palm Beach County trial court’s decision in Kraft’s favor, but also served up an interesting compliance lesson on the privacy protections required of law enforcement during surreptitious video surveillance operations.

Privacy Lessons Learned from Litigation: The Unfair and Deceptive Practices Lawsuit Against Zoom

Yet another privacy and data security-related lawsuit has been filed against Zoom Video Communications, Inc. (“Zoom Inc.”). Zoom Inc. has been the subject of several complaints related to its video-conferencing service since its meteoric rise in popularity during the coronavirus pandemic and related quarantine measures beginning in March 2020. In this particular case, there are compliance lessons to be learned from the unfair and deceptive practices claims alleged against Zoom Inc. in the plaintiff’s D.C. Superior Court filing.