With liberty and privacy for some: Widening inequality on the digital frontier

We deserve meaningful privacy protections that no company can afford to omit from their products. We deserve a both/and approach: Privacy that is both meaningful and widely available.
Cillian Kieran is CEO and co-founder of Ethyca, a New York-based privacy company.

Privacy is emotional — we often value privacy most when we feel vulnerable or powerless, as when confronted with creepy data practices. But in the eyes of the court, emotions don’t always constitute harm or a reason for structural change in how privacy is legally codified.

It might take a material perspective on widening privacy disparities — and their implication in broader social inequality — to catalyze the privacy improvements the U.S. desperately needs.

Apple’s leaders announced their plans for the App Tracking Transparency (ATT) update in 2020. In short, iOS users can now deny an app permission to track their activity across other apps and websites. Since the update rolled out, roughly three-quarters of iOS users have opted out of cross-app tracking.
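
For developers, ATT surfaces as a single system prompt gating access to the device’s advertising identifier (IDFA). Here is a minimal Swift sketch of how an app requests that permission; the function name is illustrative, and the app’s Info.plist must also include an NSUserTrackingUsageDescription string:

    import AppTrackingTransparency
    import AdSupport

    // Ask the user for permission to track. iOS shows the system
    // prompt once and caches the choice for subsequent launches.
    func requestTrackingPermission() {
        ATTrackingManager.requestTrackingAuthorization { status in
            switch status {
            case .authorized:
                // Tracking allowed: the advertising identifier is available.
                let idfa = ASIdentifierManager.shared().advertisingIdentifier
                print("Tracking authorized; IDFA: \(idfa)")
            case .denied, .restricted, .notDetermined:
                // Tracking refused or unavailable: the IDFA reads as all zeros.
                print("Tracking not authorized")
            @unknown default:
                print("Unknown authorization status")
            }
        }
    }

If the user declines, the IDFA is zeroed out and cross-app tracking via that identifier breaks — which is exactly the mechanism behind the opt-out numbers above.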

With less data available to advertisers looking to develop individual profiles for targeted advertising, targeted ads for iOS users look less effective and less appealing to ad agencies. As a result, new findings show that advertisers are spending about one-third less on iOS advertising.

They are redirecting that capital into advertising on Android devices, which account for roughly 42% of U.S. mobile OS market share, compared to iOS at roughly 58%.

Beyond a vague sense of creepiness, privacy disparities increasingly pose risks of material harm: emotional, reputational, economic and otherwise. If privacy belongs to all of us, as many tech companies say, then why does it cost so much? Whenever one user base gears up with privacy protections, companies simply redirect their data practices along the path of least resistance, toward the populations with fewer resources, legal or technical, to control their data.

More than just ads

As more money goes into Android ads, we can expect advertising techniques to become more sophisticated, or at least more aggressive. It is not illegal for companies to engage in targeted advertising, so long as it is done in compliance with users’ legal rights to opt out under relevant laws like the CCPA in California.

This raises two immediate issues. First, residents of every state except California currently lack such opt-out rights. Second, granting some users the right to opt out of targeted advertising strongly implies that there are harms, or at least risks, to targeted advertising. And indeed, there can be.

Targeted advertising involves third parties building and maintaining behind-the-scenes profiles of users based on their behavior. Data gathered on app activity, such as fitness habits or shopping patterns, can support further inferences about sensitive aspects of a user’s life — a medical condition implied by pharmacy purchases, for instance.

At this point, a representation of the user exists in an under-regulated data system containing data — whether inferred correctly or not — that the user never consented to sharing. (Unless the user lives in California, but let’s suppose they live anywhere else in the U.S.)

Further, research finds that targeted advertising, in building detailed profiles of users, can enable discrimination in housing and employment opportunities, sometimes in violation of federal law. Targeted advertising can also impede individuals’ autonomy, preemptively narrowing their window of purchasing options even when they would prefer to see the full range. On the other hand, targeted advertising can support niche or grassroots organizations by connecting them directly with interested audiences. Whatever one’s stance on targeted advertising, the underlying problem is that users often have no say in whether they are subject to it.

Targeted advertising is a massive and booming practice, but it is only one practice within a broader web of business activities that do not prioritize respect for users’ data. And these practices are not illegal in much of the U.S. Absent the law, only your pocketbook can keep you clear of data disrespect.

Privacy as a luxury

Prominent tech companies, particularly Apple, declare privacy a human right, which makes complete sense from a business standpoint. In the absence of the U.S. federal government codifying privacy rights for all consumers, a bold privacy commitment from a private company sounds pretty appealing.

“If the government isn’t going to set a privacy standard, at least my phone manufacturer will.” Even though only 6% of Americans claim to understand how companies use their data, it is companies that are making the broad privacy moves.

But if those declaring privacy a human right only make products affordable to some, what does that say about our human rights? Apple products skew toward wealthier, more educated consumers compared to competitors’ products. This projects a troubling future of ever-widening privacy disparities between the haves and the have-nots, complete with a feedback loop: Those with fewer resources to acquire privacy protections also have fewer resources to navigate the technical and legal challenges that come with a practice as convoluted as targeted advertising.

Don’t take this as me siding with Facebook in its feud with Apple about privacy versus affordability (see: systemic access control issues recently coming to light). In my view, neither side of that battle is winning.

We deserve meaningful privacy protections that everyone can afford. In fact, to turn the phrase on its head, we deserve meaningful privacy protections that no company can afford to omit from their products. We deserve a both/and approach: privacy that is both meaningful and widely available.

Our next steps forward

Looking ahead, there are two key areas for privacy progress: privacy legislation and privacy tooling for developers. I again invoke the both/and approach. We need lawmakers, rather than tech companies, setting reliable privacy standards for consumers. And we need widely available developer tools that leave developers no reason — financial, logistical or otherwise — not to implement privacy at the product level.

On privacy legislation, I believe that policy professionals are already raising some excellent points, so I’ll direct you to some of my favorite recent writing from them.

Stacey Gray and her team at the Future of Privacy Forum have begun an excellent blog series on how a federal privacy law could interact with the emerging patchwork of state laws.

Joe Jerome published an outstanding recap of the 2021 state-level privacy landscape and the routes toward widespread privacy protections for all Americans. A key takeaway: The effectiveness of privacy regulation hinges on how well it harmonizes the needs of individuals and businesses. That’s not to say that regulation should be business-friendly, but rather that businesses should be able to reference clear privacy standards so they can confidently and respectfully handle everyday folks’ data.

On privacy tooling: if we make privacy tools readily accessible and affordable for all developers, we leave tech with zero excuses not to meet privacy standards. Take the issue of access control, for instance. Today, engineers attempt to bolt manual controls over which personnel and end users can access various data onto a complex data ecosystem already populated with sensitive personal information.
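
As a concrete illustration, consider what “nuanced access control” might look like as a first-class primitive. The following Swift sketch is hypothetical — not any vendor’s API — and encodes a deny-by-default policy mapping staff roles to the categories of personal data they may read:

    // A minimal, hypothetical sketch of policy-driven access control
    // over personal data. Types and rules are illustrative only.
    enum Role { case engineer, support, analyst }
    enum DataCategory { case contactInfo, purchaseHistory, healthData }

    struct AccessPolicy {
        // Which data categories each role may read; absent roles get nothing.
        let allowed: [Role: Set<DataCategory>]

        func canRead(_ role: Role, _ category: DataCategory) -> Bool {
            allowed[role]?.contains(category) ?? false
        }
    }

    let policy = AccessPolicy(allowed: [
        .support: [.contactInfo],
        .analyst: [.purchaseHistory],
        .engineer: []  // no raw personal data for engineers by default
    ])

    print(policy.canRead(.support, .healthData))      // false: deny by default
    print(policy.canRead(.analyst, .purchaseHistory)) // true: explicitly granted

The point is less the code itself than when it exists: a policy like this is easy to declare before production and painful to retrofit across a live data ecosystem.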

The challenge is twofold. First, the horse has already bolted: technical debt accumulates rapidly, and privacy has long sat outside the software development lifecycle. Engineers need tools that enable them to build privacy features like nuanced access control before code reaches production.

This leads to the second aspect of the challenge: Even if engineers overcame all of that technical debt and could make structural privacy improvements at the code level, what standards and widely available tools could they turn to?

As a June 2021 report from the Future of Privacy Forum makes clear, privacy technology is in dire need of consistent definitions, which are a prerequisite for widespread adoption of trustworthy privacy tools. With more consistent definitions and widely available developer tools for privacy, these technical transformations can translate into material improvements in how tech at large — not just tech of Brand XYZ — gives users control over their data.

We need privacy rules set by an institution that is not itself playing the game. Regulation alone cannot save us from modern privacy perils, but it is a vital ingredient in any viable solution.

Alongside regulation, every software engineering team should have privacy tools immediately available. When civil engineers are building a bridge, they cannot make it safe for a subset of the population; it must work for all who cross it. The same must hold for our data infrastructure, lest we exacerbate disparities within and beyond the digital realm.
