The Future of Privacy in Tech Might Just Depend on Trust

Sydney Mann

Associate Editor

Loyola University Chicago School of Law, JD 2025

It is fair to say that privacy is a priority for nearly all companies, and for technology organizations in particular. Many have had to quickly adopt and develop robust compliance programs, documentation, reporting, and consumer request systems to comply with global privacy laws or face serious fines and consequences. In the United States alone, nine states (California, Virginia, Connecticut, Colorado, Utah, Iowa, Indiana, Tennessee, and Montana) have enacted comprehensive privacy laws, with an additional 16 bills pending in state legislatures. Beyond these, standalone laws such as the Illinois Biometric Information Privacy Act, signed into law in 2008, make privacy even more important.

Many organizations have responded to these regulatory challenges by adopting tools such as OneTrust or BigID. In turn, consumers are faced with accepting or rejecting privacy terms and conditions under the guise of taking control and ownership of their data. The issue is that compliance is no longer enough – consumers click through pop-ups simply to access what they really want, all the while losing faith in the brands they are directly engaging with. With more privacy laws and regulations on the horizon, consumers are going to demand something more from technology businesses, and that is trust.

What is trust?

‘Trust’ in the data economy can be understood as a combination of privacy policy, communications, and marketing that increases consumer confidence that the organization collecting their information is actually being a good steward of such lucrative details.

Although it sounds new, it’s really not. In 2016, two law professors, Neil Richards and Woodrow Hartzog, of Washington University in St. Louis School of Law and Boston University School of Law respectively, published the article Taking Trust Seriously in Privacy Law in the Stanford Technology Law Review. At a time when the European Union’s General Data Protection Regulation (GDPR) had not yet gone into effect, the scholars described a consent gap and argued that any solution requires organizations to develop genuine trust with consumers.

Nearly seven years after publication, organizations have internalized this message, expanding their privacy compliance programs beyond check-the-box consent forms to dedicated “trust centers” that offer consumers a new level of transparency into the information collected about them.

How major industry players are building consumer trust

Apple, Microsoft, and Google are three industry leaders that have developed such sites. Apple’s Privacy Center can be found under “Apple Values.” There, consumers can view Transparency Reports from around the world and across a range of time periods. Coupled with familiar data control features, the organization emphasizes transparency around the technology the consumer already has access to, whether through a purchase of hardware or software, informing the consumer while also strengthening the trustworthiness of its brand.

Microsoft, too, has a thorough Trust Center aimed at informing its array of customers about independent audit reports, how data is secured, and regional compliance guidance for its products. Unlike Apple’s Privacy Center, the Microsoft Trust Center focuses on the privacy concerns businesses have when purchasing its various software or hardware products. Given that an estimated four out of every five Fortune 500 companies use Microsoft Office 365, this is not surprising. To that end, the Microsoft Trust Center contains publicly available documentation on how the organization protects the physical security of its Azure cloud platform and infrastructure.

Finally, Google has also developed a Safety Center, geared toward individual users of its products. The site defines what different privacy terms mean to Google and how those practices in turn safeguard consumer data. Some of the responsible data practices defined include data minimization, privacy reviews, and data transparency. Through the Center, consumers also have access to the Google Business Data Responsibility page, which outlines the organization’s internal compliance program and further supports Google’s commitment to transparency for all who use its products.

Overall, these organizations, though sweeping in size and scale, show how privacy can be approached in a consumer-centric, trust-based manner that goes beyond typical compliance, an approach that can be replicated by businesses in any industry facing the next wave of privacy regulation. It is not presumptuous to say that privacy is never going away. As predicted by legal scholars Neil Richards and Woodrow Hartzog, and as shown by the growing volume of upcoming legislation, privacy is here to stay. However, the approach organizations take toward staying compliant will need to change as consumers crave greater transparency and develop a continued distaste for check-the-box compliance. Such a shift can only be built on a foundation of genuine trust.