Monday, March 14, 2011

Privacy and Security: if it’s your app, then it’s your @$$

This is the first of a continuing series to help smartphone app developers make more informed design, development, and policy decisions. The objective is to raise material issues, options, and risks specific to mobile app development (but not to give legal advice).

Specific topics to come include:

  • PII: what definitions are out there and which ones can you not afford to ignore?
  • Opt-in Opt-out: what should your defaults be? How often must you ask? Do you need separate opt-ins per app? Per app version? For regular use AND exception reporting?
  • Data retention and reuse: do you own your own data? Do you need to care about partner data policies or only your own?
  • App hardening: what risks stem from app reverse engineering and/or tampering? How do you know if you should care?

This first entry will delve into the broader motivations behind this series and call out aspects of the mobile app development experience that set it apart from other platforms and markets.

It’s impossible to guarantee that an app will never do harm

Evolving technologies, emerging and divergent regulations, and shifting social and ethical mores make it simply impossible to define a concise, bullet-proof set of policies and development patterns guaranteed to do no harm to either user or developer. Effective risk management must, by necessity, be a practice governed to a significant degree by subjective guidelines.

When God wants to punish you, He answers your prayers

It’s no secret: the convergence of technical, social, economic, and market forces that we call “the smartphone” is a mega-opportunity for those who “get it right.” And what’s greasing the skids for this disruptive force of change? Apps, of course. Apps are on the front line, hooking consumers and driving the smartphone revolution. Pretty cool, right? Most definitely.

Swimming in deep waters is always cool, but sometimes it can be deadly too.

The most powerful corporations, the smartest entrepreneurs, and the crème de la crème of investors are racing to meet exploding smartphone demand. And when that much money, information, and power are on the move, criminals (of all kinds), lawyers, regulators, law enforcement, and every genus of government will be right there with us.

These are deep waters indeed.

To serve an app: Is it a cookbook?

Do app developers need to care about this stuff? Consider that when a developer ships their first phone app, they will most likely have already entered into as many as five different (and almost certainly contradictory) binding legal agreements that include assignments of personal liability tied to the developer’s privacy policy and their app’s security. Each tool supplier, marketplace owner, platform provider, ad-server provider, and (last but not least) user demands this assurance.

As an app moves across platforms and international borders, the developer’s legal obligations multiply and, with them, so does their material risk.

To be clear, there is nothing inherently wrong or unreasonable here, but it IS unprecedented.

The list of potential concerns grows considerably when one weighs ethical questions (what’s the right thing to do versus the merely legal thing) on one hand and technical and development security requirements on the other.

In order for a developer to ensure that both their users’ and their own interests are being addressed, application design and development priorities must account for operational and contractual requirements (in addition to market-driven feature functionality, of course).

What level of due diligence and development investment is appropriate?

The diversity of the interested parties and the number of governing agreements all covering essentially the same act (running a single app on a phone) make it especially important that the developer understand:
  • Definitions: For example, do all parties use the same definition for PII (personally identifiable information) and do they use it consistently (when defining the developer’s obligations versus their own)?
  • Obligations: For example, are you agreeing (and indemnifying the other party) to adhere to multinational regulations, many of which you may not even know exist?
  • Rights and privileges: For example, what usage and commercial rights to application usage, user behavior, and other data are conveyed? Are these subject to change?
  • Notifications: Under what conditions, through what channels, and in what timeframes must which information be communicated? Does the developer have special obligations of their own?
Knowledge management versus risk management

Most people divide the world into things they know and things they don’t, and then try to manage their risk within that “circle of knowledge.”

Knowledge management is not risk management, and it often misses the danger zone where risk hides: in “the things we don’t know that we don’t know” and “the things we think we know, but don’t.”


If you “don’t know you don’t know”, then you miss the chance to educate yourself or ask for expert advice.

Similarly, if you think you know something, but you’re wrong, you may find yourself exposed or missing an opportunity.

The single-minded objective of this series will be to shrink that “danger zone” for mobile app developers: to make sure that you don’t get bitten by security, regulatory, or social gotchas that you didn’t even know were out there.

The takeaway from today’s installment is a very simple one: if it’s your app that gets jammed up, rest assured, it’s going to be your @$$ on the line.