There is no doubt that the COVID-19 pandemic has affected almost every aspect of life for people around the globe. While the internet has allowed people to stay connected and continue working from home, it has also presented an opportunity for cybercriminals to take advantage of vulnerable remote working setups. Cybercrime has increased significantly since the start of the pandemic, prompting corporations to work to mitigate the risk of data breaches amid an onslaught of new vulnerabilities in their internal systems.
The recent Pandora Papers leak in October 2021 shone a light on the massive and intricate web of offshore accounting that allows vast amounts of wealth to be hidden throughout the world. One of the most shocking revelations of the Papers was how heavily the United States was implicated in creating and perpetuating this system. As such, legislators have been pressured to find a way to crack down on this sort of offshore money. One way they have proposed addressing the problem is by amending the United States’ current criminal financial legislation, the Bank Secrecy Act.
On Friday, February 26, 2021, U.S. District Court Judge James Donato approved a $650 million settlement in a class action against tech giant Facebook for violating the Illinois Biometric Information Privacy Act. Chicago attorney Jay Edelson filed the class action lawsuit in 2015, alleging that Facebook had failed to obtain consent from users before using facial recognition technology to scan and digitally store uploaded photos.
On December 12, 2020, the European Commission (the “EC”) issued a highly anticipated draft of newly revised standard contractual clauses (“new SCCs”) that may be used by European Union-based companies to safeguard transfers of personal data to third countries, such as the US, in compliance with GDPR Art. 46(1). The release comes at a decidedly inopportune time, as it follows on the heels of the Court of Justice of the European Union’s (CJEU) decision in Data Protection Commissioner v. Facebook Ireland Limited and Maximillian Schrems (“Schrems II”), which cast serious doubt on the adequacy of SCCs alone to safeguard against the “high risks” involved in EU-to-US data transfers. And for many data protection experts, the language of the revised SCCs only adds to the confusion, raising even more questions. But one question in particular seems to be prominent among others: for transfers to importers directly subject to the GDPR, are SCCs really necessary?
Advanced data-driven infrastructure is now essential for sports entities to remain competitive, yet few structures are in place to manage the risks inherent in the collection of this sometimes highly personal information. Data is utilized in virtually every aspect of the game, including enhancing player performance, improving player health, deepening fan engagement, and sharpening betting predictions. These developments do not come without risks to the rights of those from whom the data is extracted.
As the United States continues to grapple with the effects of the coronavirus pandemic, the U.S. Department of Health and Human Services (“HHS”) announced new rules extending compliance dates and timeframes under the Cures Act. The agency’s new rules—most of which take effect on Dec. 4, 2020—are aimed at giving IT developers and health care providers flexibility in responding to the coronavirus pandemic.
Covid-19 has not only damaged the health and physical well-being of those stricken by the potentially deadly coronavirus, but it has also ravaged the livelihoods and financial stability of many millions more people around the world. The virus spread across the U.S. with incredible speed, with more than 100,000 people already infected by early March. In many ways, the unexpected and rapid arrival of the pandemic caught many households financially unprepared and ill-equipped to survive the economic shutdown unscathed. Those who have experienced rent hardship and have been, or will soon be, subject to eviction for non-payment of rent must recover not only from the short-term challenges of finding shelter and putting their lives back together, but also from the long-term struggle of finding suitable housing with an often disqualifying and indelible mark on their rental history.
There seems to be no end in sight to the various concerns associated with COVID-19, and experts are hesitant to say if and when life as we knew it will ever return to “normal.” As the pandemic persisted, companies large and small quickly realized that jobs we all assumed had to be done in an office can, in fact, be done from the comfort of one’s home. #WFH is a trending social media hashtag standing for “work from home,” and posts using this hashtag range from how to dress comfortably while remaining professional when working from home to setting up the perfect home office. #WFH, however, is not just a social media trend, but a new normal for many Americans, as employers were forced to allow their employees to work from home due to health concerns related to COVID-19. This gives rise to questions such as: what about safety and security concerns related to employer data? And where do employees draw the line between work and home when working from home? While this may be uncharted territory, top researchers say that #WFH may be the next big thing for companies worldwide.
The use of facial recognition technology in the commercial context generates numerous consumer privacy concerns. As the technology becomes increasingly present in many aspects of our lives, regulation at the state and federal levels is struggling to catch up. Currently, only three states (Illinois, Washington, and Texas) have implemented biometric privacy laws, and only Illinois grants individuals a private right of action.
Data protection measures have increasingly made news headlines ever since the General Data Protection Regulation (GDPR) came into effect in 2018. However, data protection measures did not begin with the GDPR. In the United States, which takes a sectoral approach, regulations have been in place for years governing children’s online privacy (COPPA), health information (HIPAA), spam (CAN-SPAM), and even video rental history (VPPA). Despite these regimes being implemented years ago, large companies still fail to properly comply with the requirements they set forth. Recently, a settlement between YouTube and the FTC brought to light the importance of compliance with COPPA.