TikTok continues to rise in popularity, though its history of complaints and lawsuits paints a different picture. On February 27, 2019, the Federal Trade Commission (FTC) settled with TikTok for $5.7 million in response to a child privacy complaint. This settlement was the largest civil penalty ever obtained in a children's privacy case, prompting TikTok to take corrective action by hiring compliance-focused employees. Consumer groups now argue that TikTok has failed to make such changes and continues to “flout the law.” In response to national security concerns, President Trump signed an executive order on August 6, 2020 effectively banning the application in the U.S.
Today the healthcare industry is being transformed by the latest technology as it confronts the challenges of the 21st century. Technology helps healthcare organizations meet growing demands and deliver better patient care by operating more efficiently. As the world population continues to grow and age, artificial intelligence (AI) and machine learning will offer new and better ways to identify disease and improve patient care.
In March 2019, Senator Brian Schatz and Senator Roy Blunt introduced a bill in Congress designed to provide oversight for facial recognition technology, known as the Commercial Facial Recognition Privacy Act. If passed, this law could change the way Americans deal with privacy.
From Siri to Alexa to deep learning algorithms, artificial intelligence (AI) has become commonplace in most people’s lives. In a business context, AI has become an indispensable tool for accomplishing organizational goals. Due to the complexity of the algorithms required to make quick and complex decisions, a “black box problem” has emerged for those who utilize these increasingly elaborate forms of AI. The “black box” refers to the opacity that shrouds the AI decision-making process. While no current regulation explicitly bans or restricts the use of AI in decision-making processes, many tech experts argue that the black box of AI needs to be opened in order to deconstruct not only the technically intricate decision-making capabilities of AI, but also the possible compliance-related problems this type of technology may cause.
In a world where our reliance on technology and the cloud is increasing exponentially, the growth of data security has stagnated. The European Union (EU) passed the General Data Protection Regulation (GDPR) in hopes of ensuring that consumer data is protected rather than hoarded by businesses. The effects of the GDPR, however, have extended beyond the borders of the European Union. In a world where our actions reach across borders with just the click of a button, the GDPR’s impact circles the globe as well. The GDPR has pushed a shift in data privacy and regulation for companies both inside and outside the EU, as it aims to protect European citizens no matter where they are in the world. This international reach has not only driven U.S. companies to comply, but has also prompted U.S. states to create GDPR-inspired laws to protect their own citizens. The GDPR has started a trend that will soon become the norm and finally push compliance to keep up with the exponential growth of technology.
Technological advances in aviation have turned what was once a matter of science fiction into reality. With that advance in technology comes a need to regulate those technologies and their integration into daily life. In 2016, the Federal Aviation Administration (“FAA”) finalized its first iteration of the rules that would begin to mold how drones are used.
On October 28, 2017, the National Highway Traffic Safety Administration (“NHTSA”) announced that it is striving to roll back strict regulations currently slowing the production of self-driving cars, in an attempt to increase the production and deployment of driverless vehicles. In the Rulemaking Report released by the Department of Transportation (“DOT”), NHTSA seeks comments to “identify any unnecessary regulatory barriers to Automated Safety Technologies, and for the testing and compliance certification of motor vehicles with unconventional automated vehicles designs, especially those equipped with controls instead of a human driver.”
It is no secret that streaming services have been a highly controversial issue in the entertainment industry in recent years. Artists from all over the world have been affected by the rise of music streaming; many believe it is no different than piracy. Nevertheless, Spotify operates in fifty-eight countries with a user base of over fifty million subscribers globally, twelve and a half million of whom are paying subscribers. As Spotify has grown, questions have arisen surrounding the rights that artists, producers, and writers hold in the music the public accesses through streaming. As technology advances, the music industry will continue to change. The recently filed lawsuits against Spotify show that this is an underdeveloped area of the law that needs to be explored. The decisions regarding Spotify’s streaming service and its compliance with copyright laws will have major implications not just for Spotify, but for the entire music industry.
by William Devine, Guest Contributor
Apple has developed and distributed a curriculum that will teach students at 30 community colleges around the country to write code and create apps. What prompts this gift? A belief that we all bear responsibility for sustaining a functional economy. At a time when some corporate leaders and their legal teams focus on the perils of overregulation, the greatest regulatory risk an enterprise confronts may not be high compliance hurdles, but rather the possibility that regulators can’t keep the economy functioning well enough for the enterprise to do its most commercially inventive and societally valued work.