Dhara Shah, Senior Editor, Loyola University Chicago School of Law, JD 2020. Data privacy, and more specifically user privacy, has become a focus for many in the past year. Some may say that the European Union began this “trend” with the implementation of the General Data Protection Regulation (GDPR), with California soon following in …
The California Attorney General’s office released an updated draft of the proposed regulations under the California Consumer Privacy Act (CCPA) on February 10th. This updated draft follows the four public hearings held in December 2019 and over 1,700 pages of submitted comments. Comments are still being accepted as of the posting of this article, and if no new changes are made, a final rulemaking record will be submitted.
In 2008, the Illinois legislature introduced and passed the Biometric Information Privacy Act (BIPA), the first law of its kind in the US. BIPA was passed to protect individuals against the unlawful collection and storage of biometric information. While many states have since enacted similar laws, BIPA remains the most stringent among its contemporaries.
The California Consumer Privacy Act (CCPA) is the first step away from the sectoral approach that United States privacy law has followed for many years. Although the Act is set to take effect on January 1, 2020, the first draft guidance was only recently published. Set forth by California’s Attorney General, Xavier Becerra, the guidance states how the CCPA will be enforced. As is standard in notice-and-comment rulemaking under administrative law, a public consultation period is now in effect and will remain open for comments and hearings until December 6, 2019.
Despite numerous efforts by industry groups and tech companies over the past few months to water down, and ultimately halt, the first comprehensive U.S. consumer data privacy law, the California Consumer Privacy Act of 2018 (“CCPA” or “the Act”), the CCPA’s final language was set on September 13, 2019, the end of California’s legislative calendar, and the law will go into effect on January 1, 2020. Its goal is to give California residents control over the personal information that companies collect and process.
The Children’s Online Privacy Protection Act (“COPPA”) prohibits the unfair or deceptive collection, use, and disclosure of children’s personal information on the internet. COPPA covers both website operators and app developers, and it prohibits the collection of personal information from children without verifiable parental consent. On February 27, 2019, the Federal Trade Commission (“FTC”) filed a complaint in U.S. District Court against TikTok, previously known as Musical.ly. The complaint alleged that Musical.ly knowingly violated COPPA when it collected data from children without parental consent. Musical.ly settled for $5,700,000.00, the largest civil penalty the FTC has obtained for violations of COPPA.
Cook County General Administrative Order 18-1 pertains to the Standard HIPAA Qualified Protective Orders (QPOs) that will be permitted in Cook County. These orders will be allowed only in cases in litigation where the Plaintiff and Plaintiff’s counsel authorize disclosure of a litigant’s protected health information (PHI). The order also requires all entities that receive PHI to either return the documents to the Plaintiff or destroy them at the end of the case. These changes mean that Plaintiffs’ attorneys will see a change in the handling of their clients’ medical records and other documents containing PHI covered under a QPO.
The Federal Trade Commission (“FTC”) recently proposed amendments to two rules under the Gramm-Leach-Bliley Act (“GLBA”): the Privacy Rule and the Safeguards Rule. The Safeguards Rule, which went into effect in 2003, requires financial institutions to develop, implement, and maintain a comprehensive information security program. The Privacy Rule, which went into effect in 2000, requires financial institutions to inform customers about their information-sharing practices and allows customers to opt out of having their information shared with certain third parties. The recent amendments to these two rules are intended to further protect consumers’ data from third parties. However, the changes could also adversely affect businesses.
On September 12, 2018, the European Parliament approved amendments to the Directive on Copyright in the Digital Single Market, commonly known as the EU Copyright Directive (the “Directive”). The amendments primarily cover copyright protection for internet resources. Two parts of the Directive have caused concern: Articles 11 and 13. Article 11, also referred to as the “link tax,” provides publishers with a method to collect revenue from news content shared online. Article 13, also referred to as the “upload filter,” holds internet platforms, such as Facebook and Twitter, liable for copyright infringement committed by their users. Platform providers large and small that would have to comply with these new regulations have declared that the articles place a heavier burden on service providers. Critics of the amendments also say the requirements are likely to lead to increased taxation and more lawsuits. The final vote on the Directive is scheduled for January 2019.
Direct-to-consumer genetic testing kits have exploded in popularity over the last decade. Ancestry.com and 23andMe proudly state that ten million and five million customers, respectively, have used their DNA testing services. One study projects that improvements in technology and growing popularity will cause DNA testing to increase tenfold by 2021. Many experts in the fields of genetics and bioethics have expressed concern about the ability of regulators and privacy infrastructure to keep pace with the expansion of these genetic services. We may not yet understand the full implications of having such large banks of genetic information, but here are five reasons to be concerned.