The long-brewing behind-the-scenes tensions among privacy, big data, and mobile technology finally came to a head last week in the public relations disaster known as #Ubergate. Uber’s meteoric rise to the pinnacle of the rideshare start-up economy has been fueled in part by its collection and use of sensitive consumer geolocation information. An Uber executive’s recent freewheeling remarks about the potential abuse of that data have ignited a firestorm of controversy, bringing to the fore additional allegations of questionable data usage practices. #Ubergate serves as a cautionary tale to any start-up collecting and using sensitive personal location information: invest early in privacy policies, practices, and ethics.

UBER

Uber is a popular ridesharing service operating worldwide that uses a smartphone app to receive requests for trips and then dispatch available drivers to riders. Founded in 2009, Uber reportedly made its privacy policy publicly available only last Tuesday. According to that policy, Uber collects “Personal Information” such as a rider’s email, password, name, mobile phone number, zip code, credit card information, and user photo. It also collects “Usage Information,” including a rider’s Internet browser, IP address, and geolocation data gathered during Uber trips. Some or all of this information may then be shared with the rider’s driver and his or her affiliated company. The privacy policy states that Uber may also share a rider’s Personal Information and Usage Information with third parties (parent, subsidiaries, and affiliates) for unspecified “internal reasons.” A post on Uber’s blog about its privacy policy also states that Uber has a strict policy prohibiting all employees at all levels from accessing a rider’s or driver’s data, with the only exception being for “a limited set of legitimate business purposes.”

Uber came under fire last week for a number of allegedly unorthodox consumer data collection and usage practices, reportedly contrary to its stated policy:

  • On November 17, BuzzFeed reported that Uber Senior Vice President Emil Michael suggested that Uber should consider spending “a million dollars” to hire a team of opposition researchers and journalists to dig up damaging information on media outlets critical of Uber.
  • The following day, BuzzFeed reported that Uber’s top New York executive had used the company’s “God View” tool, a function that allows Uber corporate employees to see Uber customer activity in real time, to track a reporter’s Uber travel without her consent on at least two separate occasions.
  • Uber employees have reportedly used the God View tool to display at parties a real-time map of identified riders using the service.
  • Uber reportedly tracked what it termed “Rides of Glory” – riders dropped off between the hours of 10 PM and 4 AM on a Friday or Saturday night who were then picked up again about six hours later – to determine which Uber users had “one-night stands.”
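
The pairing heuristic described in that last bullet is trivially easy to implement once a company retains trip-level history, which is part of why regulators treat geolocation data as sensitive. A minimal sketch of such a query (Python; the field names and the exact time window are hypothetical, since Uber never published its methodology):

```python
from datetime import datetime, timedelta

def late_night_return_trips(trips):
    """Flag riders dropped off late on a Friday or Saturday night who were
    picked up again roughly six hours later.

    `trips` is an iterable of (rider_id, dropoff_time, next_pickup_time)
    tuples -- a hypothetical flattening of a trip log.
    """
    flagged = []
    for rider_id, dropoff, next_pickup in trips:
        # Friday/Saturday evening (weekday 4 = Fri, 5 = Sat), or the
        # small hours spilling into Saturday/Sunday morning.
        late_night = (dropoff.weekday() in (4, 5) and dropoff.hour >= 22) or (
            dropoff.weekday() in (5, 6) and dropoff.hour < 4)
        gap = next_pickup - dropoff
        if late_night and timedelta(hours=4) <= gap <= timedelta(hours=8):
            flagged.append(rider_id)
    return flagged
```

The point is not the cleverness of the code – there is none – but that a few lines over retained location data suffice to build an intimate behavioral profile no rider ever consented to.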

Uber has vehemently denied that Michael’s comment reflects its practices, and is investigating the alleged tracking of a journalist. In the wake of the scandal, Uber has publicly hired outside data privacy legal counsel to quickly review and address these privacy issues. And as a consequence of these reports, Uber is contending with considerable customer backlash on social media and a rash of customer requests to permanently delete their accounts.

Congressional Scrutiny

The rising customer backlash, fueled by intense media coverage, got the attention of Senator Al Franken, chairman of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law. On November 19, Senator Franken wrote a letter to Uber CEO Travis Kalanick requesting very specific information on the company’s privacy policies and practices. Citing “serious concerns” about the “scope, transparency, and enforceability” of Uber’s privacy policies, Senator Franken set out a list of 10 questions focused on alleged inconsistencies between Uber’s stated data privacy policy and its actual practices. Among them was a request for Uber to identify its stated “limited set of legitimate business purposes” that grants Uber employees access to riders’ Usage Information, including “sensitive geolocation data.” Senator Franken also questioned Uber’s lack of transparency in its privacy policy, which states Uber may share customers’ Personal Information and Usage Information with its “parent, subsidiaries, and affiliates for internal reasons,” without any further explanation. Further, the letter questioned Uber’s indefinite storage of consumer data and asked Uber to justify why it does not delete consumer information immediately after a transaction. The letter closes by asking Uber to identify, on or before December 15, 2014, the concrete changes it will make to its employee training.

Location-Based Services, the FTC, and Privacy

Over the past few years, the FTC and other regulators have become increasingly concerned with the privacy implications of mobile and geolocation data and mobile app data security. Location-based services – applications that provide information to users based on their location – are growing at an exponential rate with the rapid rise of smartphones and tablets. These services provide tremendous benefits to consumers in the form of navigation (e.g., Google Maps), local search (e.g., Yelp), check-in (e.g., Foursquare), and social media (e.g., Facebook). Location-based services are what enable Uber to get you a ride as quickly as you want one. But the same capability that makes geolocation services so appealing – providing consumers with real-time information tailored to their location – also raises serious privacy concerns, because companies can collect and compile, without consumer knowledge or consent, detailed records and profiles of where consumers work, eat, and visit; the events they attend; the people they socialize with; and more.

While no law currently requires consumer consent for the collection of geolocation information, companies’ privacy practices must not be “unfair or deceptive” under Section 5 of the FTC Act and the agency’s long-standing rules and regulations. For instance, Snapchat recently found itself on the wrong side of an FTC complaint and settlement because its privacy policy represented that it did not collect location-based information when it in fact did. The FTC also settled with Goldenshores Technologies, developer of the Brightest Flashlight mobile app, for failing to adequately disclose the collection and sharing of consumer geolocation information in its privacy policy.

On the policy front, the FTC has taken the position that geolocation data is “sensitive” information deserving a greater level of privacy protection. The agency has issued several reports outlining its privacy concerns with mobile devices and location-based services and signaling where it recommends the industry should head. In the FTC’s seminal 2012 report, Protecting Consumer Privacy in an Era of Rapid Change (“2012 Privacy Report”), the Commission made plain its “particular concerns of location data in the mobile context” and called on “entities involved in the mobile ecosystem to work together to establish standards that address data collection, transfer, use, and disposal, particularly for location data.” On the heels of the 2012 Privacy Report, the FTC issued Marketing Your Mobile App: Get It Right from the Start, to educate small businesses on basic privacy principles, and Mobile App Developers: Start with Security, to provide guidance to app developers on mobile security. In February 2013, the FTC released a staff report titled Mobile Privacy Disclosures: Building Trust Through Transparency (“2013 Mobile Report”), which examines the risks mobile technologies pose to consumer privacy, including (i) the unprecedented growth of consumer data being collected over mobile; (ii) the precise information collected about a user’s location that can be used to build detailed profiles of consumers in unanticipated ways; and (iii) the difficulty in conveying consumer data collection policies and practices to consumers in an understandable way over small smartphone screens.

Recognizing the void in legal and regulatory coverage, the Senate Judiciary Subcommittee on Privacy, Technology, and the Law introduced earlier this year Senate Bill 2171, the Location Privacy Protection Act of 2014 (LPPA). This bill – sponsored by Senator Franken himself – would, among other things, require consumer consent before companies could track geolocation data, and require companies collecting the location data of 1,000 or more devices to post online the kinds of data they collect, how they share and use it, and how people can opt out of data collection. The FTC testified in favor of the LPPA before the Senate Judiciary Committee this past June.

And it is not just the federal government weighing in on consumer geolocation privacy protection. For example, the California Attorney General’s office has been particularly active, and in January 2013 published a report titled Privacy on the Go: Recommendations for the Mobile Ecosystem, which followed a number of data privacy enforcement actions.

“Privacy by Design” and the Keys to Avoiding an Uber-Size Mess

Privacy and geolocation services are just part of the bigger picture. Uber could have avoided its current debacle by heeding the FTC’s 2012 Privacy Report’s recommendations for businesses to protect consumers’ privacy and the agency’s subsequent guidance on mobile apps. The importance of an accurate and clear privacy policy for any company collecting online personally identifiable consumer information goes without saying. The FTC has brought dozens of enforcement actions over the past five years for inaccurate privacy policies that may be considered misleading to consumers. The 2012 Privacy Report goes deeper than just having an adequate privacy policy. The report, among other things, calls on companies to adhere to the principle of “Privacy by Design,” which means that “[c]ompanies should promote consumer privacy throughout their organizations and at every stage of development of their products and services.”

To implement “Privacy by Design,” the FTC recommends that companies incorporate substantive privacy protections and data management procedures into their practices such as:

  • Data Security: Companies must provide reasonable security to adequately protect consumers’ personal information, such as SSL encryption.
  • Reasonable Collection Limits: Companies should limit their consumer data collection to “that which is consistent with the context of a particular transaction or the consumer’s relationship with the business, or as required or specifically authorized by law.”
  • Sound Data Retention: Companies should implement “reasonable restrictions” on the retention of consumer data and “dispose of it once the data has outlived the legitimate purpose for which it was collected.”
  • Data Accuracy: Companies should maintain reasonable accuracy of consumers’ data depending on the nature of the information collected and provide consumers access and the opportunity to correct erroneous information.
  • Designated Privacy Personnel: Companies should appoint specified personnel responsible for the execution of the data privacy program.
  • Privacy Risk Assessments: Companies should perform data privacy assessments that, at a minimum, address employee training and management, and product design and development.
  • Design and Development: Companies should implement controls designed to address the risks identified by the risk assessments.
  • Encryption/Anonymization Tools: As privacy-enhancing technology (“PET”) evolves, companies should take a flexible approach toward protecting consumer privacy throughout the life cycle of their products and services, including through the use of PETs such as encryption and anonymization tools.  
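
To make a few of these controls concrete, here is a minimal sketch – in Python, with entirely hypothetical record and field names – of how a location-based service might implement the “sound data retention” and anonymization recommendations: coarsening stored coordinates so they no longer pinpoint a doorstep, and purging trip records once the retention window lapses.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=90)  # hypothetical retention window

@dataclass
class TripRecord:
    rider_id: str
    lat: float
    lon: float
    completed_at: datetime

def coarsen(record: TripRecord, places: int = 2) -> TripRecord:
    """Round coordinates to ~1 km precision (2 decimal places),
    a simple anonymization/privacy-enhancing step."""
    return TripRecord(record.rider_id,
                      round(record.lat, places),
                      round(record.lon, places),
                      record.completed_at)

def purge_stale(records, now=None):
    """Drop records that have outlived the retention window
    ("sound data retention")."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r.completed_at <= RETENTION]
```

The specific precision and retention period here are illustrative only; the legal point is that some deliberate, documented limit exists, rather than indefinite storage of precise trip histories.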

Applying these privacy-by-design principles, the FTC’s 2013 Mobile Report recommends the following privacy protections for mobile apps:

  • Create and Publish a Privacy Policy: Every mobile app should have a privacy policy that is made publicly available through the platform’s app store. Developers need to plainly explain what information the app collects and what is done with the data and then honor those promises to consumers.
  • Create Just-in-Time Disclosures: Just-in-time disclosures should inform consumers of any data collection or use beyond what the mobile platform itself discloses. For example, an app developer may rely on the platform’s disclosure that the app will collect geolocation data, but if the developer decides to share that information with third parties, it should provide its own just-in-time disclosure and obtain the consumer’s affirmative express consent.
  • Improve Coordination with Third Parties: Mobile app developers should work with ad networks and third parties that provide services for apps in order to provide truthful disclosures to consumers.
  • Participate in Self-Regulatory Programs: Mobile app developers should participate in trade associations and industry organizations that can provide industrywide guidance on how to make uniform, short-form privacy disclosures.
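
The just-in-time-disclosure recommendation reduces, in engineering terms, to a simple gate: location data never flows to a third party unless the specific user has affirmatively opted in to that specific use. A minimal sketch (Python; all names are hypothetical, not any real Uber or platform API):

```python
class ConsentError(Exception):
    """Raised when an operation lacks the required affirmative consent."""

# Hypothetical in-memory consent ledger: user_id -> set of consented purposes.
_consents = {}

def record_consent(user_id, purpose):
    """Store a user's affirmative opt-in for one specific purpose,
    e.g. after a just-in-time disclosure is shown and accepted."""
    _consents.setdefault(user_id, set()).add(purpose)

def share_location(user_id, coords, partner):
    """Pass coordinates to a third party only after an explicit opt-in."""
    if "third_party_sharing" not in _consents.get(user_id, set()):
        raise ConsentError(
            "user %s has not consented to third-party sharing" % user_id)
    return {"partner": partner, "coords": coords}  # stand-in for the real call
```

The design choice worth noting is that consent is scoped per purpose, not granted globally – which is exactly the gap Senator Franken’s letter probes when it asks what “internal reasons” and “legitimate business purposes” actually cover.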

Conclusion

Any enterprise – start-up or otherwise – that collects and uses consumer geolocation information should pay close attention to the FTC’s guidance on privacy by design and its application to mobile applications. In the wake of #Ubergate, there will undoubtedly be more congressional and regulatory scrutiny for mobile app location-based services, which were already garnering substantial attention because of their privacy implications. But implementing a top-down privacy-by-design approach to consumer privacy is not only good for the regulators, it is also good for business. Consumers are becoming increasingly concerned with the amount of personal information companies are collecting from them and their inability to do anything about it. And Uber’s alleged Big Brother (or should we say, “big bro”?) privacy excesses and abuses may be the case that makes these concerns concrete. Companies that affirmatively place themselves ahead of the privacy curve – instead of at the end of it or worse – will be better positioned to compete for and secure an asset that may in the long run be worth even more than the data collected – consumer trust.