Data protection measures have increasingly made news headlines ever since the General Data Protection Regulation (GDPR) came into effect in 2018. However, data protection measures did not begin with the GDPR. In the United States, which takes a sectoral approach, regulations have for years governed children’s online privacy (COPPA), health information (HIPAA), spam (CAN-SPAM), and even video rental history (VPPA). Despite the age of these regimes, large companies still fail to properly comply with their requirements. Recently, a settlement between YouTube and the FTC brought to light the importance of compliance with COPPA.
On August 29, 2019, the Environmental Protection Agency (“the EPA”) announced a proposed reconsideration amendment to an Obama Administration rule regulating the natural gas industry’s methane emissions. The proposal responds to President Trump’s order directing federal agencies to review their actions, purportedly to eliminate unnecessary resource burdens. The EPA asserts that the changes will remove regulatory duplication and save the industry millions of dollars, but those savings may come at the expense of the planet’s climate.
At first, the story of John Kapoor’s rise to the top of the pharmaceutical industry sounds like the American dream played out in real life. The first in his family to attend college, Kapoor graduated from Bombay University in India with a degree in pharmacy. He came to the United States after securing a fellowship at the University at Buffalo, and earned his Ph.D. in 1972. His scientific and business savvy was evident from the start: within a decade, Kapoor took over a struggling pharmaceuticals business, turned it around, and netted a personal gain of $100 million. From there Kapoor became a serial entrepreneur, with INSYS Therapeutics marking the pinnacle of his success. The company made him a billionaire, but later made him the target of a criminal racketeering investigation and the face of one of America’s darkest problems.
Artificial intelligence is all around us. Whether it exists in your iPhone as “Siri” or in complex machines detecting diabetic retinopathy, it is constantly advancing and becoming a regular part of modern life. As with any new technology, regulating artificial intelligence is becoming increasingly difficult. The question facing us now is how to regulate the technology without inadvertently hindering its growth. Recently, the Food and Drug Administration has attempted to take steps toward further regulation of artificial intelligence by introducing a review process for medical artificial intelligence. This is just one instance of how regulation may affect the evolution of artificial intelligence.
The Children’s Online Privacy Protection Act (“COPPA”) prohibits unfair or deceptive collection, use, and disclosure of children’s personal information on the internet. COPPA covers both website operators and app developers, and it prohibits the collection of children’s personal information without verifiable parental consent. On February 27, 2019, the Federal Trade Commission (“FTC”) filed a complaint in U.S. District Court against TikTok, previously known as Musical.ly. The complaint alleged that Musical.ly knowingly violated COPPA when it collected data from children without parental consent. Musical.ly settled for $5.7 million, the largest civil penalty the FTC has obtained for violations of COPPA.
On March 12, 2019, the Department of Justice (“DOJ”) announced revisions to the Corporate Enforcement Policy under the Foreign Corrupt Practices Act. The changes now require company oversight of ephemeral messaging apps used by any employee, stockholder, or agent who discusses business records via the messaging platform. Publicly traded companies must now establish internal compliance policies governing the use of ephemeral messaging services and provide ongoing oversight of those services; some may even choose to prohibit such apps for business purposes entirely.
During his election campaign, Governor-elect J.B. Pritzker heavily advocated for Illinois to be more accommodating to recreational marijuana use. Illinois has already legalized medical marijuana, and new bills are being introduced to make it more accessible. If recreational marijuana is legalized, Illinois will join ten states and the District of Columbia in authorizing it.
From Siri to Alexa to deep learning algorithms, artificial intelligence (AI) has become commonplace in most people’s lives. In a business context, AI has become an indispensable tool for accomplishing organizational goals. Due to the complexity of the algorithms required to make quick and complex decisions, a “black box problem” has emerged for those who deploy these increasingly elaborate forms of AI. The “black box” simply refers to the opacity that shrouds the AI decision-making process. While no current regulation explicitly bans or restricts the use of AI in decision-making, many tech experts argue that the black box needs to be opened in order to deconstruct not only the technically intricate decision-making capabilities of AI, but also the compliance-related problems this type of technology may cause.
On March 10, 2019, Ethiopian Airlines Flight 302, en route to Nairobi, Kenya, crashed shortly after take-off, leaving no survivors. It became the carrier’s deadliest crash and its first fatal crash since January 2010. Most notably, however, it was the second fatal crash of Boeing’s new 737 MAX jet in less than five months, following the Lion Air Flight 610 accident in October 2018. The day after the tragedy, Ethiopian Airlines grounded its entire Boeing 737 MAX 8 fleet until further notice. Many other airlines suspended operations of the aircraft as well, and numerous countries banned the 737 MAX from their airspace.
The Federal Trade Commission (“FTC”) recently proposed two amendments to the Privacy Rule and the Safeguards Rule under the Gramm-Leach-Bliley Act (“GLBA”). The Safeguards Rule, which went into effect in 2003, requires financial institutions to develop, implement, and maintain a comprehensive information security program. The Privacy Rule, which went into effect in 2000, requires financial institutions to inform customers about their information-sharing practices and allows customers to opt out of having their information shared with certain third parties. The recent amendments to these two rules are intended to further protect consumers’ data from third parties. However, the changes could also adversely affect businesses.