Monday, September 7, 2009

Walter Cronkite for CIO!

I keep six honest serving-men
(They taught me all I knew);
Their names are What and Why and When
And How and Where and Who.

Rudyard Kipling, Just So Stories

The "5 W’s" serves as one of the most basic formulas in journalism (police investigation and research too). The power of “Who? What? When? Where? Why? (And How?)" stems from the fact that each question requires a factual answer that cannot be answered with a simple “yes” or “no”.

How many botched projects, misinformed acquisitions, and overhyped technologies could have been nipped in the bud had the original proposals been subjected to this most basic journalistic benchmark?

Who specifically are the stakeholders? (people who care) Whose job responsibilities will change? (not at all the same as stakeholders)

What exactly will change for each of the stakeholders and those who will see their day-to-day tasks change?

When will these changes occur (as steps within a process flow and/or in what sequence)?

Why will any of the participants “opt-in” or cooperate? What’s in it for them?

How exactly will proposed changes be implemented? How will the proposed technology set all of this in motion?

New technology promises all kinds of life-changing opportunities – but the distance between technology and adoption is much more than “the last mile” of a vision – it’s the difference between vision and victory.

A case in point – we have been focusing on bringing “runtime intelligence” to market – a genuinely unique approach to application monitoring. What makes our approach unique is that it is designed to “serve the selfish interests” of two communities that have historically had very different priorities and worldviews. By serving a much larger constituency, we are able to drive higher adoption, increase collaboration, and solve “unsolvable” problems for the very first time.

Typically, applications are monitored by EITHER developers OR operations. Developers are mostly concerned with debugging and general usability issues. IT operations will often focus on performance, security, and licensing. In fact, BOTH groups of stakeholders suffer from their isolation from one another. For example, a software vendor wants to build features that are of value to the widest possible set of users, while a single company (operations) cares only about its own parochial needs (and doesn't want to pay for "over-engineering"). The software vendor worries about piracy and IP theft – operations worries about sensitive information loss and operational risk. These inherent conflicts (and many others) between developers and operations undermine both groups' agendas and impede their success.

Runtime Intelligence may be the first solution that addresses application developer demand for near real-time visibility into adoption and usage in the field while simultaneously helping operations automate their IT policies and reconcile application investments with business performance.
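For the developers in the audience, here is what "serving both constituencies" can look like at the code level. This is a minimal, illustrative Python sketch – not our actual product (Runtime Intelligence instruments .NET assemblies at build time rather than asking you to hand-write probes), and the function names and event fields are my own invention:

    import functools
    import json
    import time

    def track_feature(feature_name):
        """Wrap a function so that every call emits one event carrying
        something for each constituency: feature/adoption data for the
        developer, timing and failure data for operations."""
        def decorator(fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                start = time.monotonic()
                succeeded = True
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    succeeded = False
                    raise
                finally:
                    emit_event({
                        "feature": feature_name,  # developer: what gets used
                        "elapsed_ms": (time.monotonic() - start) * 1000.0,  # operations: performance
                        "succeeded": succeeded,  # both: quality and risk
                        "timestamp": time.time(),
                    })
            return wrapper
        return decorator

    def emit_event(event):
        # Stand-in for posting to a telemetry endpoint.
        print(json.dumps(event))

    @track_feature("export_report")
    def export_report():
        pass  # the application's real work goes here

The point is the single event stream – one probe, two (or more) selfish interests served.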

Our breakthrough is, in large part, due to our focus on making sure we have solid answers for the 5 W's (and 1 H).

"And that's the way it is."

Tuesday, September 1, 2009

Are developers just big babies? The good ones are!

In her latest book, The Philosophical Baby, Alison Gopnik points out that babies are far from the self-centered, myopic beings we often take them for. They exhibit all of the characteristics (both good and bad) of adults – and in some ways they are superior. Babies, Gopnik would assert, have malleable, complex minds and a drive for discovery, and are enthralled by every subtlety that surrounds them.

Gopnik compares babies to the research and development department of the human species, while adults take care of production and marketing. Like little computer scientists, babies draw accurate conclusions from data and statistical analysis, conduct experiments, and are even capable of counterfactual thinking (the ability to imagine different outcomes that might happen in the future or might have happened in the past).

In short, babies can
· Observe their environment and absorb salient facts,
· Connect consequences that stem from the events they have observed,
· Predict future outcomes based upon the previous observations and their consequences,
· Develop a vision for the future – predictions built from "what if" scenarios based upon hypothetical (versus observed) events and consequences.

(The fact that babies have these innate characteristics is consistent with the evolutionary perspective on creativity discussed in my earlier entry, Software as Fiction.)

So why are good developers just big babies?

Good developers move beyond the strict functionality of the code that they write – they move beyond higher order concepts of system quality – they even move beyond caring about and optimizing their work to maximize the value of the code they write. They have the ability to imagine wholly different worlds where the underlying assumptions, constraints and, by extension, their criteria for success may be completely different. This is what we call a “market disruption” like the Internet, cell phones, etc…

In short, good developers can (a toy sketch follows this list):
• Unit test (observe)
• Profile applications (consequences)
• Calculate business impact and mitigate security risk (predict)
• Develop a vision for the future – predictions built from "what if" scenarios based upon hypothetical (versus observed) events and consequences.
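To make the ladder concrete, here is a toy Python sketch of the first three rungs – the function and the traffic figure are invented for illustration:

    import timeit

    def discount(price, rate):
        return price * (1 - rate)

    # Observe: a unit test pins down a fact about the code's behavior.
    assert discount(100.0, 0.25) == 75.0

    # Consequences: profiling connects the code to its runtime cost.
    per_call = timeit.timeit(lambda: discount(100.0, 0.25), number=100_000) / 100_000

    # Predict: extrapolate from the observation to a hypothetical future load.
    daily_calls = 5_000_000  # "what if" traffic, purely illustrative
    print(f"projected CPU-seconds per day: {per_call * daily_calls:.2f}")

The fourth rung – imagining a world where the assumptions themselves change – is the one no tool automates.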

This is why it's always good to let developers have some play time (and some milk and cookies too).

Thursday, August 20, 2009

How do I love thee? Let me count the ways.

- Sonnet 43, Elizabeth Barrett Browning

What’s love got to do with people and software? (apologies to Tina Turner’s Private Dancer)

Hint: if people live for love, then (software) businesses live for money.

When all is as it should be – love is at the heart of our life and value drives our business. Both are vital – and both are very, very hard to measure. It's figuring out WHAT to measure that is so difficult. What really matters? How many poems someone writes? How heavily software is being used? Measuring is easy – measuring the right stuff is what is so very, very hard.

At PreEmptive, we have been focusing on Microsoft's Azure. For those who are non-technical (or who think they are technical but live in a cave and can't see the horizon), Azure is a massive Microsoft entry into "cloud computing" – an approach that takes all the worry, hassle and expense of managing computers away (into a cloud), making software very much like a phone service – all (or most) equipment is shared by massive numbers of people and managed for you. Cloud users simply pay to use the service.

Unlike phones, however, software builders and buyers are not all that used to this business model. Technically, it's not that big a deal – but from a business perspective, figuring out what you pay for, how it's measured, and what the costs will actually be – this is all new! Ms Browning measures her love in "depth, breadth, and height" – could software value be harder to measure than love?

If wireless phone companies did not charge by the minute, no one would count minutes – we would just talk. Well, developers have just been “talking” for their entire professional careers – and now they have to start structuring their work around these new rules to avoid waste and expense. Azure (and other competitive cloud platforms) is not really a technology innovation as much as it is a major shift in business model.
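A back-of-the-envelope sketch of the new arithmetic, in Python – the rates below are placeholders I made up, not actual Azure pricing:

    def monthly_cost(compute_hours, gb_stored, gb_transferred,
                     rate_compute=0.12, rate_storage=0.15, rate_bandwidth=0.10):
        """Metered billing: every unit consumed is a unit paid for."""
        return (compute_hours * rate_compute
                + gb_stored * rate_storage
                + gb_transferred * rate_bandwidth)

    # Two designs with identical functionality but very different bills:
    always_on = monthly_cost(2 * 24 * 30, gb_stored=50, gb_transferred=100)  # two instances, 24x7
    on_demand = monthly_cost(2 * 8 * 22, gb_stored=50, gb_transferred=100)   # scaled to business hours
    print(f"always-on: ${always_on:.2f}/month vs. scale-to-need: ${on_demand:.2f}/month")

The code is trivial – the new discipline is running this kind of arithmetic before you architect the application, not after the first bill arrives.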

So, whether you are a romantic (like Ms Browning “I love thee to the depth and breadth and height”) or perhaps more hardened like Papillon Soo in Stanley Kubrick’s Full Metal Jacket, one thing is for sure - Microsoft’s Azure wants to “love you long time.”

How deep, wide, high or long is the question.

Check out this article in SD Times – "PreEmptive's Dotfuscator instruments Azure applications" by David Worthington – where Dave makes many of the very same points in a much more professional manner.

For a more commercially-centered view on all of this, read my preemptive blog entry.

Spread the love!

Thursday, July 9, 2009

Update to last post... I hate to say I told you so...

Right from the NY Times headlines - Cyberattacks Jam Government and Commercial Web Sites in U.S. and South Korea. The article in part reads "SEOUL, South Korea — A wave of cyberattacks aimed at 27 American and South Korean government agencies and commercial Web sites temporarily jammed more than a third of them over the past five days, and several sites in South Korea came under renewed attack on Thursday."

I have seen the list - it's more than 27 sites, and it is more of a probe than a serious attack. The attackers are learning from our response and refining their strategy. The fact that these attacks are being characterized as primitive should not make us feel any more secure - sorry - I promise not to turn this blog into a paranoid rant... (unless it's already too late ;) - the next few postings will be cheery and sunny (even if it kills me).

Tuesday, June 30, 2009

A darker side of application and human behavior

In DC today at a security conference (Gartner) – and it has prompted the following. I use this blog to explore the symmetry between applications and their human progenitors – today's posting focuses on a darker side: the military, terrorism and war.

I have just left a presentation led by David Sanger, Chief Washington Correspondent for The New York Times. He focuses more broadly on foreign policy, globalization, nuclear proliferation, and the presidency – today’s discussion was on Cyber threats rather than nuclear or trade.

Applications are now soldiers, terrorists, saboteurs and secret agents.

Did you know that denial of service attacks (techniques for bringing down phone, power, broadcast and financial networks) are now a standard tactic in every army’s war book?

Just as air bombing is standard before a land battle begins, so are denial of service attacks.
  • Estonia experienced a devastating cyberattack in 2007 following a decision to move a statue memorializing Russian soldiers who fought during World War II. Pro-Russian hackers took down bank and school websites on Estonian networks.
  • Russia used denial of service attacks before attacking Georgia last year.
  • And earlier this week, Iranian news websites and those belonging to political organizations were hit following the contested re-election of President Mahmoud Ahmadinejad.
Did you know that the US power grids and financial markets are continuously probed for weaknesses to be exploited at some future date?

…and guess what? Unlike human terrorists, the origin of these attacks cannot easily be determined. There are no borders to defend. And even when you find the source computers that are launching the attacks, they are rarely in the country of origin (Russia's attack against Georgia emanated from Turkey). How do you think Turkey would feel if Georgia bombed Turkey to defend itself?

If you haven't already heard, Obama will soon be appointing a "Cyber Czar" – and before you buy into some hack (the media equivalent of a computer hacker) complaining that we should be focusing on "the real threats" – overseas, our economy, etc. – remember your history. Think of the Maginot Line – and be grateful that we have a president who actually uses computers and understands their role as the literal "work horse" of the 21st century and, now, the emergence of an entirely new "military front."

Friday, June 12, 2009

Stuff I took the time to post on LinkedIn that may be worth repeating

I got caught up in a LinkedIn discussion thread on the influence of analysts on application vendors and software categories - I think it bears repeating... the original question was in part ...
Do industry analysts have too much influence on software vendors, who call their products GRC or CCM/T - terms used by analysts?

There were a few comments before I wrote...

I probably read way too much into this question – but it hit a nerve, and so here is a rather lengthy reply (for a LinkedIn comment anyhow).

At the most abstract level – the etymology of terms like GRC are no different than any other phrase or term in natural language – like heat off of an engine, meanings are generated through usage (which often diverges substantially over time from first use). This means that even though careful and deep thinkers take the time to carefully craft a coherent and fully realized definition of GRC – this is not, at the end of the day, the actual meaning of GRC. There are two scenarios here – a) people using the term with a shallower or incomplete understanding and b) people intentionally reusing the term to mean something slightly (or entirely) different. In either case, whoever gets the most air time generally wins.

Looking at the second use case – intentional misuse – this is extremely common in the commercial world (not just hi-tech). What does "natural" mean? How about "fat free"? Hi-tech examples are numerous too – enterprise content management (ECM) is another good example. In fact, I wrote a short column on this way back in 2002 (before blogs were big) entitled "Enterprise: how long is a piece of string?" http://gilbane.com/columns.pl?view=5 …here I offer my own musings on the tension between vendors, consumers and analysts at length – but the topic was not GRC – it was ECM. Is it surprising that vendors like IBM, Oracle, EMC and others are players in both?

To be clear, motivations are not always malicious or deceptive – as long as analysts need to produce a body of work that is organized, integrated and expandable (and commercially valuable) – they will develop (and insist upon controlling) their own taxonomies.

As long as suppliers are most interested in solving problems competitively and profitably, they will emphasize and focus only on problem domains where they are effective (no vendor paints a worldview with a hole in the middle).

And as long as enterprise consumers are focusing their scarce resources on the most material/pressing challenges and opportunities in front of them, they will ignore skills, technologies and opportunities that do not address their selfish interests. Each group works to influence the other two – but the tension is natural – and I believe healthy.

In my view, all three players are correct to do this (in fact, this is more of an ideal than a common practice). So, I guess the short answer from my perspective is that mapping capabilities to features or categories is, by design, an imprecise means of communicating priorities and intentions – and should never be relied upon to replace detailed and deliberate assessments/evaluations/recommendations.

Buyer beware – or – he who controls the language, controls everything – or – meaningful ambiguity is a good thing…

A few complimentary notes were posted on the above :) and then an interesting post came from Michael Rasmussen - a very effective analyst and thought leader in his domain of governance, risk and compliance management....

Michael wrote "Yes, industry analysts do have too much influence on defining and categorizing software. Particularly in markets such as GRC. I left Forrester after seven years because I was continually frustrated - my definition and approach to GRC was broader than Forrester's audience. Forrester, Gartner, and their peers are good at reaching the IT audience - so GRC (as a software category) often gets trapped within IT. Occasionally it breaks out into other areas such as finance where they have some traction. They fail to understand GRC's role in EH&S, Quality, CSR, and many other areas. "

...and that's when I went a little overboard for a LinkedIn discussion (it took two posts to fit it in - here it is)

POST 1

Perhaps because I too have spent many years in this business (over 20 as an ISV and even 2 as an analyst), I cannot resist the temptation to connect Michael’s point of view with my earlier post. Sadly for all of you, because the post was too long, you will have to read this post AND THEN THE NEXT POST for the punchline...

The ISV-enterprise-analyst knot

If you deconstruct the influence of analysts on software categories, you will see that ISVs are keenly focused on their customers' needs (this is the ISV's primary focus). In the same fashion, the ISV customers' primary focus stems from their own target customers' requirements (whoever they are). Since assessing IT options (or HR policy or tax law or…) is NOT the ISV customers' primary focus, they look to outside support that is, ideally, expert and independent. With regard to IT, they look to IT analysts. IT analyst firms in this particular scenario have enterprise IT as their primary customers (focus) too. So – the ISV sells to the enterprise, which is then sold to by analyst firms who provide "independent" guidance. The result is a tightly woven financial, professional and organizational knot, leading to the following good, bad and twisted consequences:

1) This dynamic discourages innovation and transformational solutions: The enterprise IT group is generally not incented to modernize or re-engineer their IT strategy on their own initiative – and so, typically, do not look to analysts for this kind of advice – they want guidance with minimal risk, a proven (therefore established) approach, using equally stable technologies and suppliers.

2) IT analyst firms deliver what their customer-base wants – that means a topology and best practices that emphasize a "rear-view mirror" perspective. This is especially true for companies like Gartner and Forrester because of their enterprise client base. (Note that Michael appears to validate this when he writes that analyst firms are "good at reaching the IT audience." That's no accident or even a handicap from a business perspective – that is their North Star for hitting their revenue goals. This is the high-order bit, the organizing principle, their raison d'être.)

3) ISVs must "set the table" to win sales. ISVs try to influence (or appease) analysts as a tactic to influence their shared customer-base. The influence on ISVs (who ultimately must label their software as "grc" or "ecm" or whatever) is, therefore, indirect. If the enterprise IT customer were willing to pay analysts to produce transformational business and operational re-engineering recommendations – then that's what analyst firms would immediately start to focus on. But, to date, market forces rarely lean in that direction.

4) Sometimes the market does demand transformation. Disruptive technologies like the Internet, regulations like Sarbanes-Oxley or economic forces like the rise of India and China may force businesses to place new demands on IT that get passed to analyst firms, which then generate short bursts of transformational analyst output. NOTE - this is the exception, and it lasts just long enough to address the threat and never long enough to reap all of the potential value. This is why so many companies will stop GRC investments once individual regulatory obligations appear to be met but well before an integrated and effective GRC transformation is even in view. It is organizational and professional entropy.

5) In order to transform businesses, you must cut the knot – If the success of a new business practice or technology requires organizational change and/or a re-education of professionals (inside any of these three organizational threads) – the interlocking dependencies of the ISV-enterprise-analyst knot must be severed.

Is this inherently bad? Read my next post please.... (it's WAY shorter)

POST 2

Is this inherently bad? Of course, if you’re a spirit who thrives on transformational change – this will be extremely frustrating (and I count myself among that number). But I have to say that this is not the only view.

As our founding fathers recognized – perhaps the greatest threat to our liberty stems not from dictatorship, but from "a tyranny of the masses." Like the executive, legislative and judicial branches of government, our "knot" of enterprise IT, ISV and analyst may slow things down to a maddening degree – but it also protects us from swinging corporate strategy and operations too often or too far in any one direction. (Hey, let's throw out our computers and just use iPhones!)

Now, I would never presume to speak for or represent Michael’s views – but I did have the good fortune to be one of Michael’s clients when he was at Forrester and have had some experience with him in his subsequent “expanded” and independent role as well. My experience of Michael is that he is a man who is not readily satisfied with the status quo and is energized when he sees a way to materially transform the way people work – and by extension – the way they live.

Ironically, as Michael succeeds in his almost evangelical mission to raise our collective consciousness as to what GRC SHOULD mean, organizational changes within enterprises to better align with good GRC practices will be one sure result. This will in turn lead to a spike in demand for more sophisticated/expansive analyst services around “true GRC” and this will in turn bring “the new GRC” into the analyst firm mainstream. …and in a decade or so, someone will rail against these firms for stifling the next dimension of business/social/operational/financial management. Who knows, perhaps it will still be Michael.

Remember – an "end-to-end solution" is just a silo seen from the inside. (…and apologies in advance to Michael if I have in any way misrepresented or dumbed down his outlook beyond recognition)…

Sunday, May 31, 2009

Software as Fiction

I caught an installment of The Bob Edwards Weekend on PRI this week where he was interviewing Denis Dutton, a philosopher and author of The Art Instinct – Beauty, Pleasure, and Human Evolution. Dutton is also the founder and editor of the website Arts & Letters Daily which was named by the Guardian as the “best Web site in the world.”

Anyhow, and to oversimplify, his premise is that art is much more than heat thrown off of a cultural engine – rather, art sits at the heart of our evolutionary advantage. Art provides a safe, effective means to learn life's tough lessons without actually having to suffer the scars or take the risks inherent in the real world. Fiction gave our species the ability to adapt and survive better than our less creative Neanderthal competitors.

My first thought was of one of my father's stories – "The Zebra Storyteller". Check out the following analysis from The Norton Introduction to Literature: "'The Zebra Storyteller' suggests that the purpose of stories is to prepare us for the unexpected. Though the storyteller (a zebra in the story) thinks he is just spinning stories out of his own imagination in order to amuse, his stories prove to be practical. When the extraordinary occurs—like a Siamese cat speaking Zebraic—the storyteller is prepared because he has already imagined it, and he alone is able to protect his tribe against the unheard-of."

In the context of Dutton’s thesis, the Zebra Storyteller describes how fiction emulates the science that establishes fiction as an emulator!

The Zebra Storyteller is included here at the end of this post.

What’s this have to do with software? If fiction is a safe way to explore and grow – what are computer games? Simulators for airplanes or war games? Test cases that are a part of every application development cycle? We typically think of software as a means of automation that increases productivity, improves quality, etc. – but if Dutton is right, software plays an equally important (or even more important) role as a "low-cost, low-risk surrogate experience."

…and for me – this leaves room (establishes the permanent need) for the truly creative developer who is not chained to a formal spec…
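In that spirit, a test suite can be read as a small anthology of fiction – a zebra storyteller for code. A toy Python example (the function and its improbable inputs are invented for illustration):

    def parse_amount(text):
        """Parse a currency amount like '$1,234.56' into integer cents."""
        cleaned = text.strip().replace(",", "").replace("$", "")
        return round(float(cleaned) * 100)

    # "Non-fiction": inputs we have actually observed from real users.
    assert parse_amount("$1,234.56") == 123456

    # "Fiction": counterfactual inputs no user has sent – yet. Imagining
    # them now is how the code survives meeting them later.
    for improbable in ["  $0.00 ", "$999,999,999.99", "-$5.00"]:
        assert isinstance(parse_amount(improbable), int)

When the extraordinary finally occurs in production, the program – like the zebra storyteller – has already imagined it.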

Programmers as poets writing software sonnets - diggit!

The Zebra Storyteller
by Spencer Holst

Once upon a time there was a Siamese cat who pretended to be a lion and spoke inappropriate Zebraic.

That language is whinnied by the race of striped horses in Africa.

Here now: An innocent zebra is walking in a jungle, and approaching from another direction is the little cat; they meet.

“Hello there!” says the Siamese cat in perfectly pronounced Zebraic. “It certainly is a pleasant day, isn’t it? The sun is shining, the birds are singing, isn’t the world a lovely place to live today!”

The zebra is so astonished at hearing a Siamese cat speaking like a zebra, why, he’s just fit to be tied.

So the little cat quickly ties him up, kills him, and drags the better parts of the carcass back to his den.

The cat successfully hunted zebras many months in this manner, dining on filet mignon of zebra every night, and from the better hides he made bow neckties and wide belts after the fashion of the decadent princes of the Old Siamese court.

He began boasting to his friends he was a lion, and he gave them as proof the fact that he hunted zebras.

The delicate noses of the zebras told them there was really no lion in the neighborhood. The zebra deaths caused many to avoid the region. Superstitious, they decided the woods were haunted by the ghost of a lion.

One day the storyteller of the zebras was ambling, and through his mind ran plots for stories to amuse the other zebras, when suddenly his eyes brightened, and he said, “That’s it! I’ll tell a story about a Siamese cat who learns to speak our language! What an idea! That’ll make ’em laugh!”

Just then the Siamese cat appeared before him, and said, “Hello there! Pleasant day today, isn’t it!”

The zebra storyteller wasn’t fit to be tied at hearing a cat speaking his language, because he’d been thinking about that very thing.

He took a good look at the cat, and he didn’t know why, but there was something about his looks he didn’t like, so he kicked him with a hoof and killed him.

That is the function of the storyteller.
--------------------------------------------------------------------------------

©Spencer Holst. From THE ZEBRA STORYTELLER, Station Hill Press

Thursday, March 12, 2009

I will be presenting at the MIT SPAM Conference 2009

Well, this is not really a posting about how applications are like people - it's just an FYI that I will be presenting at The MIT Spam Conference 2009, being held at (surprise) MIT. It is a two-day conference on March 26-27, 2009.

You can register (sponsor- and MIT-funded - no charge) by emailing the chair, Kathy Liszka, Professor of Computer Science at the University of Akron, at liszka@uakron.edu (subject: SC2009).

The agenda is reasonably technical and is as follows: (I hope to see you there!)

Thursday March 26, 2009
9:30 a.m. breakfast

10:00 a.m. chair opening
Kathy Liszka / Bill Yerazunis

10:15 a.m. keynote
Robert Bruen: ICANN Policy Enforcement

10:45 a.m. keynote
Garth Bruen: The Future of Anti-Spam: A Blueprint for New Internet Abuse Tools

11:15 a.m. paper
Adrian McElligott: Email Permission Keys

11:45 a.m. lunch

1:30 p.m. paper
Claudiu Musat: Spam Clustering Using Wave Oriented K Means

2:00 p.m. paper
Sebastian Holst: “Account-free” Email Services to Combat Phishing, Brand Infringement, and Other Online Threats

2:45 p.m. paper
Nathan Friess: A Kosher Source of Ham

3:15 p.m. paper
Didier Colin: A Selective Learning Model For Spam Filtering

3:45 p.m. presentation
Rudi Vansnick: Is Spam in Europe easier to handle?

6:00 p.m. reception
Courtesy of Comcast

Friday March 27, 2009
9:00 a.m. breakfast

9:30 a.m. paper
Tim Martin: Phishing for Answers: Exploring the Factors that Influence a Participant's Ability to Correctly Identify Email

10:00 a.m. paper
Reza Rajabiun: IPv6 and Spam

10:45 a.m. workshop
Adrian McElligott: How to integrate Email Permission Keys in to an existing Spam Filter in 5 easy steps

11:15 a.m. paper
Henry Stern: The Rise and Fall of Reactor Mailer

11:45 a.m. lunch

1:00 p.m. presentation
Andra Miloiu Costina: Do humans beat computers at pattern recognition?

1:30 p.m. paper
Cesar Fernandes: An Economic Approach to Reduce Commercial Spam

2:15 p.m. paper
Alexandru Catalin: Phishing 101

2:45 p.m. paper
Areej Al-Bataineh: Detection and Prevention Methods of Botnet-generated Spam