Thursday, October 11, 2012

Who cares about application analytics? Lots of people for lots of reasons…


The results are coming in from our most recent survey on the current state of application lifecycle management and the use of application analytics.

Most everyone agrees that analytics are powerful; it's why they're powerful that gets interesting. 77% of development staff and their management identified “insight into production application usage” as influential, important, or essential to their work, and 71% identified “near real-time notification of unhandled, caught, and/or thrown exceptions” in the same way (the other choices were “moderately important” and “no importance”).

…but where specifically do application analytics have the greatest impact?

Usage, behavior and patterns

Figure 1: Where does insight into production application usage matter?

Developers need to know where and how to prioritize the work that’s right in front of them, and nothing makes supporting users more straightforward than having direct insight into what those users have been doing in production.

While third in the cumulative vote count, Product planning was ranked first in the “essential” categorization. If you don’t know what’s happening around you, there’s no way you can confidently plan for the future.

Unhandled, thrown and caught exceptions

Figure 2: Where does insight into production incidents (all manner of exception) matter?

Not surprisingly, everyone can agree that visibility into exceptions and failures in production provides critical insight into how future iterations of an application should be tested. The fact that 22% of respondents did NOT see exception analytics as being at least influential in customer support is somewhat surprising and will be the subject of future analysis; one potential explanation may lie in the obstacles development organizations face (or perceive) in actually implementing true feedback-driven customer support and development processes.
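To make the “near real-time notification” idea concrete, here is a minimal, hypothetical Java sketch of an application forwarding unhandled exceptions as they occur. The class name and the reportException placeholder are assumptions for illustration only; they are not PreEmptive’s API, and a real implementation would queue, batch, and encrypt its reports.

```java
// Hypothetical sketch: surface unhandled exceptions in near real time.
// The reporting endpoint is a stand-in, not PreEmptive's actual API.
public class UnhandledExceptionReporter {

    public static void install() {
        Thread.setDefaultUncaughtExceptionHandler((thread, throwable) -> {
            // Forward a summary immediately so support and development see the
            // failure as it happens, rather than waiting for a user to file a ticket.
            reportException(thread.getName(), throwable);
        });
    }

    private static void reportException(String threadName, Throwable t) {
        // Placeholder transport: a real implementation would send this over an
        // encrypted channel to an analytics service.
        System.err.printf("unhandled exception on %s: %s%n", threadName, t);
    }

    public static void main(String[] args) {
        install();
        throw new IllegalStateException("demo failure"); // triggers the handler
    }
}
```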

What’s getting in the way?


When comparing usage monitoring with exception monitoring, respondents are mostly consistent in their ranking of obstacles. In fact, the consistency is striking when you consider how much the rankings of the use cases themselves diverge across those two categories.
Figure 3: What are the obstacles preventing development organizations from implementing effective application analytics solutions today?

While the specific numbers vary somewhat, developers, product owners, and management focus first on security and privacy concerns (see my last post), followed closely by performance and stability (let’s call that Quality with a capital “Q”) and a lack of best practices, which is understandable given that application analytics is only now emerging alongside new platforms, tools, and methodologies.

PreEmptive Solutions and Application Analytics


The respondents’ agreement on “obstacles” also indicates that a single technology solution, combined with appropriate processes and patterns designed to address those obstacles, should be able to meet user and organizational requirements across all of these use cases and scenarios. And, coincidentally, that is exactly what PreEmptive Analytics has been built to accomplish.

For more information on PreEmptive Analytics, visit www.preemptive.com/pa

For an article I wrote for MSDN and the launch of Visual Studio 2012, check out Application Analytics: what every developer should know.

Sunday, October 7, 2012

Security and privacy concerns identified as most common obstacle to implementing application analytics


This is the first installment of a series of posts on the state of application analytics and modern application development patterns and practices.

In a recent survey that includes responses from hundreds of development organizations, two-thirds identified application analytics as either essential or important in one or more of the following categories: Product planning, Development prioritization, Test plan definition, Customer support, and/or Development ROI calculation.

Among this group, where application analytics has the greatest impact, the following were identified as the most serious obstacles to implementation.


Obstacles preventing the use of application analytics in my organization 

Half of all respondents identified security and privacy, a response rate 20% higher than the next two closest obstacles (lack of expertise and general quality concerns).

The emphasis on security and privacy is even more pronounced inside larger development teams. Nearly 3 out of every 4 development organizations with more than 50 people identified privacy and security as an impediment, making them 50% more likely to do so than development teams of between 5 and 15 people.

Correlating perceived obstacles to implementing analytics with development organization size

In fact, an organization’s size appears to have a significant influence on virtually every perceived obstacle; larger organizations appear to be more concerned with performance, quality and connectivity while smaller organizations struggle with awareness of analytics solutions, development best practices, and the required integration of their development and operations processes.  

One might generalize that, due to the complexities that come with size, larger organizations have had to move to more tightly integrated platforms and practices, putting them in a better position to implement application analytics (and so they focus on the potential risks stemming from an implementation), whereas smaller teams may not have as entrenched a “feedback-driven,” integrated approach to development. As such, they are more likely to struggle with how to move forward (keep in mind that all respondents identified application analytics as either essential or important).

Privacy and Security and PreEmptive Analytics



Regardless of development team size, privacy and security is the number one perceived obstacle – and PreEmptive Analytics is unique in its approach to this critical requirement. PreEmptive Analytics includes the following:
  • Development teams own their own data. PreEmptive asks for no rights to aggregate, inspect or resell your data.
  • A two-level opt-in switch ensures that users consent to transmitting runtime data for both regular usage AND application exceptions. The logic itself can be injected post-build for .NET and Java and can always be defined by the development organization (see the sketch after this list).
  • All data is, by default, encrypted on the wire.
  • Device IDs (if they are collected at all) are hashed before they are transmitted.
  • Tamper-detection and defense can be used to detect and defend against any attempt to alter or redirect runtime data transmission.
  • Obfuscation can be used to prevent third parties from inspecting what is being collected and transmitted.
  • Unique keys identify both the organization and the application source for data.
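
To illustrate the two-level opt-in and device-ID hashing points above, here is a minimal, hypothetical Java sketch. The AnalyticsGate class, its method names, and the console-based transmit placeholder are assumptions for illustration only; they do not represent PreEmptive’s injected code or API.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

// Hypothetical sketch of a two-level opt-in gate and device-ID hashing.
// Names are illustrative; PreEmptive's post-build injected logic differs in detail.
public class AnalyticsGate {

    private final boolean usageOptIn;      // level 1: routine usage data
    private final boolean exceptionOptIn;  // level 2: exception reports

    public AnalyticsGate(boolean usageOptIn, boolean exceptionOptIn) {
        this.usageOptIn = usageOptIn;
        this.exceptionOptIn = exceptionOptIn;
    }

    // Only transmit usage events when the user has opted in to usage data.
    public void trackUsage(String feature) {
        if (!usageOptIn) {
            return; // no consent: drop the event locally
        }
        transmit("usage", feature);
    }

    // Exception reports require their own, separate consent.
    public void trackException(Throwable t) {
        if (!exceptionOptIn) {
            return;
        }
        transmit("exception", t.getClass().getName());
    }

    // Hash a device identifier before it ever leaves the machine,
    // so the raw ID is never transmitted.
    public static String hashDeviceId(String rawDeviceId) {
        try {
            MessageDigest digest = MessageDigest.getInstance("SHA-256");
            byte[] hash = digest.digest(rawDeviceId.getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : hash) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 not available", e);
        }
    }

    private void transmit(String category, String detail) {
        // Placeholder: a real implementation would send over an encrypted channel.
        System.out.printf("sending %s event: %s%n", category, detail);
    }
}
```

In this pattern, usage events and exception reports are gated by separate consents, and a raw device identifier never leaves the machine; only its hash is handed to the transport.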
For more information on how PreEmptive Analytics addresses the number one obstacle for implementing application analytics (as ranked by those that need it the most), visit www.preemptive.com/pa

Wednesday, October 3, 2012

Today’s DevOps: Pushmi-pullyu or kick#@s crew?


Most of us know Dr. Dolittle’s pushmi-pullyu, that special beast with two heads that go in opposite directions whenever it tries to move. That this creature could only exist in a fantasy world free from Darwinian forces is obvious; it is doomed to failure because, while its blood may circulate, the pushmi-pullyu is literally of two minds whose “selfish interests” are forever at odds.

Contrast that with the two-person scull, where rowing is executed in precise synchrony achieved through coordination and continuous feedback. Failure to stay in sync will cause the boat to list to one side, slow forward progress, impede steering, frustrate rowers, and serve as the root cause of numerous injuries.

Of course, competitive rowers invest in the best platform too (their racing boat or "shell"). Competitive shells are designed to reduce all manner of friction. A shell’s rigging is built to meet the distinct needs of each rower; accommodating unique requirements that stem from a rower’s relative position inside the boat (which should NEVER be confused with a conflict of interests between the two rowers).  

In fact, the underlying design principles presume that rowers share a common goal (win a race), have entered into a contract (to coordinate and synchronize), and are committed to working within an integrated platform (their shell). The best rigging is the one that finds a way to meet each rower’s unique requirements as measured by the achievement of their common goal.

The hallmarks of a successful DevOps organization (aka “the kick#@s crew”)


What’s it take to build a kick#@s crew and avoid breeding your own doomed pushmi-pullyu?

·        Agree on common goals (Dev ROI?)
·        Adopt processes designed to coordinate and synchronize DevOps activities through continuous feedback (Agile?)
·        Invest in an integrated DevOps platform built to reduce friction and able to meet the unique requirements of both development and operations.
·        Always be mindful that, if these unique requirements are not met (as in our sculling example), you will suffer slowed progress, impeded agility, frustrated stakeholders, and all manner of inefficiency and loss.

These principles sit at the heart of PreEmptive Analytics and have helped ensure our success across industries, platforms and runtime environments.

I first blogged on the importance of understanding role based “special interests” relating to development and operations feedback almost two years ago to the day: Application analytics: a new game brings new rules  (10/12/2010)

For a more contemporary discussion of these topics, check out my article inside the MSDN Visual Studio Library: Application Analytics: what every developer should know

For more information on PreEmptive Analytics, visit www.preemptive.com/pa

STROKE!