If you have ever felt that your privacy has been invaded, consider this: smartphone applications now know more about you than ever before.
Many smartphone applications sell personal information such as user activity within an application, location, contacts, and phone ID, as well as other typically anonymized data to third parties. This information is used by marketers to better target certain demographics.
In February, the Federal Trade Commission released a report including a broad new set of non-binding privacy guidelines for makers and developers of apps. These recommended guidelines state that app developers and marketers should not collect personal information or track user whereabouts if it is not necessary for their services.
Rebecca Jeschke is the Media Relations and Digital Rights Analyst for the Electronic Frontier Foundation, an organization that educates and protects technology users. She said that despite the FTC’s recent efforts, she remains concerned about people’s privacy.
Jeschke, a Redwood alumna, said she believes that the privacy settings automatically set by apps are not clear enough.
“We put a lot of trust in these apps and in their privacy settings, yet we don’t necessarily know if they have been set correctly,” she said.
Social media apps such as Facebook especially concern activists like Jeschke.
Jeschke said that Facebook changes its ground rules regarding privacy and disclosure with each new update, a practice that is controversial in itself.
“Most privacy policies are ridiculously long and hard to understand,” she said. “Not only are the policies complex, but so is the issue of privacy itself. It isn’t a simple problem.”
Jeschke said that the privacy settings automatically set by apps allow personal information to be sold to third parties. She said this breach of privacy is neither ethical nor a voluntary exchange between company and user.
“Often the people who create these free apps will say things like ‘It’s free. We have to make our money somehow, so this is only fair,’ but the way I would come back is, ‘It’s only fair if people know what’s happening. If this is really a fair exchange, let me know and let me make that decision.’”
Jeschke said she holds companies to a higher ethical standard than they have so far met. “There needs to be a general understanding of what [information] is okay [to take] and what is not okay.”
Jeschke said she thinks one way to establish clear general rules is through “opt-in” and “opt-out” models. Under an “opt-out” model, the default selection in the terms and conditions requires giving up privacy, whereas an “opt-in” model requires that the user make a conscious choice to give up whatever private information is requested.
On Facebook, the user currently has to opt out of the “recommended” privacy settings in order to set their own.
According to Jeschke, the “opt-in” model is more user-friendly because the user chooses what they want public or private on a “case by case basis.”
Google+, for example, follows an “opt-in” model. Each time users want to post a status on their page, Google+ lets them choose which group, or circle, to share the information with.
Another dimension of the privacy issue is whether the private information provided is made anonymous for statistical use or linked directly to a given identity.
“You don’t need to have your name or your IP address or your credit card number personally attached to you to have it be personally identifiable,” Jeschke said. “People become identifiable in many ways.”
Jeschke said that privacy law is hard to interpret. “There is the wiretapping law, for example, which means you can’t read everyone’s email, but there are exceptions to that [law] as well.”
“Certainly, electronic privacy law with regard to the government—when the government can collect electronic information about you from a service provider—is woefully outdated,” she said.