Third-party apps put Gmail at risk. Here is how to mitigate that risk with minimal impact on profit

FM
3 min read · Oct 21, 2020

Gmail third-party apps today can engage in activities that put the entire company at risk, for example by selling users’ private emails to external parties. Should Google regulate this behavior? We strongly believe it should.

Turning Gmail into a platform can enable external developers to create products that significantly enhance the user experience, accelerating innovation at virtually no cost and building wider moats around the product. We strongly believe that Google stands to benefit in the long term from a platform strategy for Gmail.

However, allowing external parties to build on top of Gmail also poses significant risks for Google. We believe that proactively taking a regulatory approach to the platform is required for several reasons:

  • Spillovers. Any negative actions by external parties will directly impact Google’s bottom line and reputation, potentially even resulting in legal liability.
  • Ethics and transparency. Google has a moral imperative to take care of its users. Long and difficult consent forms often leave customers unaware of the risks they expose themselves to by using external apps. If there are minimal interventions Google can take to prevent that, we believe it is both the right thing to do and in everyone’s best interest.
  • External regulation. If Google does not regulate the platform, a public agency may end up doing so on less favorable terms. We believe that proactively regulating the platform reduces the risk of external regulation.

So the question is, how can Google mitigate some of these risks while capturing the most value?

There are several parameters to play with: which apps are granted access, to which data, in which form, and how user consent is asked for. Overall, we have identified several quick wins and an alternative model to explore.

Low-hanging fruit

  • Explicit consent forms: users should be clearly notified when an app requests access to their private information or asks for permission to sell their data externally. This should not require the user to exhaustively analyze a long legal contract; instead, very explicit notifications and alerts should spell out the risks they are agreeing to and their implications.
Exhibit 1: Microsoft user consent request
  • Forbid third-party application developers from sharing the data externally. This measure would not eliminate risk entirely: Google partially loses control of the data once it has been shared, and third-party developers can still engage in illegal activity. However, we find this measure would enable the platform vision while putting limits only on the applications that pose the highest risks to privacy.

An alternative to explore

  • Running apps in-house. One idea to consider is to have apps that require access to users’ private information run within Google’s systems, limiting the access developers have to the actual data. For example, a developer could share an app with Google; from that point on, the only way for the developer to interact with the app would be through an API in which data access is limited. For app development, a sandbox environment could be created for the developer to build and test functionality against fake data. Revenue for developers could come from (premium) subscriptions. However, Google would still have to review each application to ensure that API commands respect privacy and that there are no other data leakages.
