FTC wants to protect women's privacy

If the world of cybersecurity seems freewheeling, without rules or adult supervision, that's because it kind of is. The United States has no federal data privacy law. However, a government agency, the Federal Trade Commission (FTC), can fine companies that lie about or deceive consumers regarding how they use data. (Remember Facebook's Cambridge Analytica scandal?) Maneesha Mithal, Associate Director of the FTC's Division of Privacy and Identity Protection, explains how the agency is holding companies accountable and what still needs to be done. (Ahem, pass some legislation!)

Marie Claire: How does the FTC regulate privacy?

Maneesha Mithal: We sue companies that don't do enough to protect your data. We've filed lawsuits against Uber, the credit bureau Equifax, and the hotel chain Wyndham for failing to properly protect their customers' sensitive data, leading to massive data breaches. In the Equifax case alone, we alleged that the company's lax practices allowed hackers to access the Social Security numbers and other data of more than 147 million people. We also sue companies that make false claims about how they collect, use, and share data. For example, we accused Facebook of sharing people's information with certain apps in violation of their chosen privacy settings. We won $5 billion from the company (in 2019), the largest privacy penalty ever, and imposed significant changes to the way the company approaches privacy.

MC: Research supports the notion that data privacy concerns disproportionately affect women, people of color, and other minorities. How can we better protect these populations?

MM: To protect women's privacy, we have filed lawsuits against websites that post revenge porn, against stalkerware apps (where a perpetrator installs an app on the victim's phone that the victim knows nothing about), and against apps that covertly sell location and health data to others. When it comes to protecting minority groups, we are working on algorithmic discrimination. Companies can accumulate large data sets and apply algorithms to detect patterns, gaining valuable insights about their customers and job applicants. While these algorithms can help companies uncover systemic biases, they can also lead to discrimination. We have urged firms to rigorously test their algorithms to root out illegal bias.

MC: What do you think is the most pervasive privacy issue today?

MM: The coronavirus outbreak has pushed important privacy issues to the forefront. For example, we are hearing concerns about the privacy of the videoconferencing services that all of us have come to rely on. We are educating consumers on how to prevent uninvited people from joining their meetings, how to make sure their video and audio aren't inadvertently broadcast, and how to avoid security problems by not clicking on unknown links.

More broadly, we see concerns about sharing health data to facilitate contact tracing. We need to balance privacy with the need to use data for better health outcomes. When consumer data is used for public health, we have urged companies to implement privacy-protective techniques, use anonymized, aggregated data whenever possible, and delete the data once the pandemic is over.
