Privacy & Security
With the recent change to New York’s abortion law, legislators granted women the affirmative right to abortion under the state’s public-health law. Under the Reproductive Health Act, restrictions on abortion past twenty-four weeks are removed, legalizing abortion up until the day of birth. The bill was passed on the 46th anniversary of the Roe v. Wade decision and comes as a reaction to the confirmation of conservative Supreme Court Justice Brett Kavanaugh, protecting women’s access to abortion should Roe v. Wade be overturned. The change has proven controversial, with advocates and critics at odds over its potential future effects.
New data privacy regulations require questioning both current and future technologies. Amazon recently introduced a store concept that eliminates everyone’s least favorite things about shopping: long lines and small talk. Amazon Go is the grocery store of the future; these stores allow consumers to walk in, pick up the items they need, and walk right back out. That’s it. No long lines, no cashiers, no shopping carts. However, as great as this concept seems, there are still concerns from a data privacy standpoint, as Amazon must collect personal data from its consumers in order to lawfully operate these checkout-free stores.
On September 12, 2018, the European Parliament approved amendments to the Directive on Copyright in the Digital Single Market, commonly known as the EU Copyright Directive (the “Directive”). The amendments primarily cover copyright protection over internet resources. Two parts of the Directive have caused concern: Articles 11 and 13. Article 11, also referred to as the “link tax,” provides publishers with a method to collect revenue from news content shared online. Article 13, also referred to as the “upload filter,” holds internet platforms, such as Facebook and Twitter, liable for copyright infringement committed by their users. Platform providers large and small that would have to comply with the new rules have declared that the enactment of these articles places a heavier burden on service providers. Critics of the amendments also say the requirements are likely to lead to increased taxation and more lawsuits. The final vote on the Directive is scheduled for January 2019.
The FDA regulations on human subject protection and Institutional Review Boards (IRBs) provide guidance to protect the rights, safety, and welfare of subjects who participate in FDA-regulated clinical investigations. The regulations conform with the requirements set forth by the Department of Health and Human Services (HHS) Federal Policy for the Protection of Human Subjects (45 CFR 46, Subpart A). In order to reduce confusion and the burdens associated with complying with both the FDA regulations and the HHS policies regarding human subject protections, the FDA is revising the current “Common Rule.”
Immediately upon introduction, mobile medical applications became favored by physicians and patients alike because the applications are user friendly and allow patients to understand their care and participate in more meaningful discussions with their providers about their health. Due to the rapid development of technology and the resulting surge of mobile medical applications flooding the market, the Food and Drug Administration has issued three guidances on how it plans to regulate mobile medical applications. To remain compliant with the FDA guidances, mobile medical application manufacturers must meet the seven categories of requirements laid out in Appendix E of the FDA’s 2015 guidance and also comply with any further guidance that is released.
Direct-to-consumer genetic testing kits have exploded in popularity over the last decade. Ancestry.com and 23andMe proudly state they have had ten million and five million customers, respectively, using their DNA testing services. One study projects that improvements in technology and popularity will cause DNA testing to increase tenfold by 2021. Many experts in the field of genetics and bioethics have expressed concern regarding the ability of regulators and privacy infrastructure to keep pace with the expansion of these types of genetic services. We may not be at a point where we understand the full implications of having such large banks of genetic information, but here are five reasons to be concerned.
Protected Health Information is seeing a surge of breaches on the cybersecurity front due to contractor error. These breaches affect more consumers than other types of data breaches and, in some cases, have the power to cause chaos in national infrastructure. Advances in technology and compliance measures can stem the tide and protect some of the most valuable information in consumers’ lives.
On July 6, the Information Commissioner’s Office (ICO) issued its first Enforcement Notice under the General Data Protection Regulation (GDPR) and the United Kingdom’s Data Protection Act (DPA) to AggregateIQ (AIQ). The GDPR is a law regulating data protection and privacy as well as the export of personal data outside of the European Union (EU); it became enforceable on May 25, 2018. The DPA supplements the GDPR and regulates the processing of personal data. The ICO is a regulatory office in the UK that enforces regulations under the DPA and GDPR. AIQ is a Canadian digital advertising, web, and software development company that was charged with violations regarding the use of data analytics in political campaigning. This article will address the AIQ enforcement notice and how companies can ensure compliance with the GDPR to avoid receiving an enforcement notice.
On June 28, 2018, California took a page out of the European Union’s (EU) book and signed the California Consumer Privacy Act (CaCPA) into law. The CaCPA is a landmark privacy bill that will come into effect on January 1, 2020, and it is being closely compared to the General Data Protection Regulation (GDPR).
What does this mean for California businesses and residents? In short, more privacy and more control over data. Key aspects include allowing consumers to request what data an organization has collected about them, granting consumers the right to have their data fully erased, protecting children’s data, and making verification processes more stringent for businesses.
Sei Unno, Associate Editor, Loyola University Chicago School of Law, JD 2019
Facial recognition has become mainstream, whether the laws are ready or not. Video games are using facial recognition to check the ages of their users, and cars are being equipped with technology to identify drivers who are fatigued or distracted. In the U.S., states …