On February 9, a group of senators led by Tammy Baldwin of Wisconsin and Bill Cassidy of Louisiana introduced a new bill, the Health Data Use and Privacy Commission Act (the “Act”), in an attempt to revitalize current legislation regarding the protection and use of health data. The bill also has the support of a number of stakeholders from within the healthcare industry, including Epic, IBM, and Teladoc Health, as well as professional associations such as the American College of Cardiology, the Association for Behavioral Health and Wellness, and the Association of Clinical Research Organizations.
Remote work was once seen as a perk: a day spent working at home in your sweatpants on the couch. Now, however, some employees are stuck working from home until further notice, or perhaps even until they retire. This shift has made it much harder for businesses to keep the information of their workers and customers safe, as employees rely on additional technologies to do their jobs remotely. The average employee may never think about the challenges associated with data security, but it is important to shed some light on this subject so that more people understand its importance, and understand why the lack of data security laws in the US could be so detrimental to any company doing business here. With people working from home all over the country, company and consumer information is more vulnerable than ever, and without comprehensive data security regulations in the US, there is no end in sight.
While the United States does have some federal data privacy regulations in place, the most comprehensive regulations exist at the state level, with the degree of protection varying from state to state. Recently, there has been growing discussion about whether the United States should implement a federal data privacy law. Proponents suggest modeling it on the European Union’s General Data Protection Regulation (GDPR), which regulates consumer data privacy and protects consumers from data breaches. The question is especially significant because states are taking matters into their own hands by passing their own data privacy regulations, each varying slightly, which can become confusing for companies trying to comply with more than one.
There’s no doubt that remote work, brought on by the coronavirus pandemic, will accelerate the digital revolution already underway. Consumers’ growing appetite to conduct their business online, rather than in-person, has fueled the proliferation of digitally accessible products and services. For instance, movie theaters have closed their doors while content streaming services have experienced exponential growth. And while the restaurant industry, as a whole, has suffered, ‘virtual’ kitchens and grocery delivery apps have picked up steam. A critical question that arises from these trends is “what can be done to eliminate biases in the algorithms that drive these digital transactions?”
This spring I had the pleasure of attending a conference entitled Digital Platforms: Innovation, Antitrust, Privacy & the Internet of Things hosted by the UIC John Marshall Law School Center for IP, Information & Privacy Law. Throughout the day, panelists spoke about a variety of intellectual property topics, including artificial intelligence, antitrust issues, and more. But for me, the highlight of the afternoon was the session on privacy issues. Here is a bit of what I learned…
Data privacy, and more specifically user privacy, has become a focus for many in the past year. Some may say that the European Union began this “trend” with the implementation of the General Data Protection Regulation (GDPR), with California soon following in its footsteps with the California Consumer Privacy Act (CCPA). More quietly, however, New York has created the Stop Hacks and Improve Electronic Data Security (SHIELD) Act in the interest of protecting personal information. The SHIELD Act was enacted on July 25, 2019 as an amendment to the General Business Law and the State Technology Law, adding breach notification requirements and stronger rules for enforcement against businesses handling personal information. The SHIELD Act recently went into effect on March 21, 2020.
Data protection measures have been increasingly crossing news headlines ever since the General Data Protection Regulation (GDPR) came into effect in 2018. However, data protection measures did not begin with the GDPR. In the United States, which takes a sectoral approach, regulations have been in place for years governing children’s online privacy (COPPA), health information (HIPAA), spam (CAN-SPAM), and even video rental history (VPPA). Despite these laws having been in place for years, large companies still fail to properly comply with the requirements they set forth. Recently, a settlement between YouTube and the FTC brought to light the importance of compliance with COPPA.
New data privacy regulations require questioning both current and future technologies. Recently, Amazon introduced a store concept that eliminates everyone’s least favorite things about shopping: long lines and small talk. Amazon Go is the grocery store of the future; these stores allow consumers to walk in, pick up the items they need, and walk right back out. That’s it. No long lines, no cashiers, no shopping carts. However, as great as this concept seems, there are still concerns from a data privacy standpoint, as Amazon needs to collect personal data from its consumers in order to lawfully operate these checkout-less stores.
On September 12, 2018, the European Parliament approved amendments to the Directive on Copyright in the Digital Single Market, commonly known as the EU Copyright Directive (the “Directive”). The amendments primarily cover copyright protection over internet resources. Two parts of the Directive have caused concern: Articles 11 and 13. Article 11, also referred to as the “link tax,” provides publishers with a method to collect revenue from news content shared online. Article 13, also referred to as the “upload filter,” holds Internet platforms, such as Facebook and Twitter, liable for copyright infringement committed by their users. Platform providers both large and small that would have to comply with these new regulations have declared that the articles place a heavier burden on service providers. Critics of the amendments also say the requirements are likely to lead to increased taxation and more lawsuits. The final vote on the Directive is scheduled for January 2019.
On July 6, the Information Commissioner’s Office (ICO) issued its first Enforcement Notice to AggregateIQ (AIQ) under the General Data Protection Regulation (GDPR) and the United Kingdom’s Data Protection Act (DPA). The GDPR is a law regulating data protection and privacy, as well as the export of personal data outside of the European Union (EU); it became enforceable on May 25, 2018. The DPA supplements the GDPR and regulates the processing of personal data. The ICO is a regulatory office in the UK that enforces regulations under the DPA and GDPR. AIQ is a Canadian digital advertising, web, and software development company that was charged with violations regarding the use of data analytics in political campaigning. This article will address the AIQ enforcement notice and how companies can ensure compliance with the GDPR to avoid receiving an enforcement notice.