
April 29, 2010 syndicated on BeyeNetwork

BeyeNetwork - Global coverage of the business intelligence ecosystem

I have been a member of the Business Intelligence Mecca that is BeyeNetwork for quite some time, but as of this week I am delighted to announce that they will be syndicating this blog.

Thank you to all at BeyeNetwork for setting this up.


Filed under: business intelligence, site update, social media Tagged: beyeblogs, beyenetwork

Posted by Peter Thomas at 6:52 PM

April 28, 2010

New Adventures in Wi-Fi – Track 2: Twitter

New Adventures in Wi-Fi (with apologies to R.E.M.)

Forming the second part of the trilogy that commenced with:

New Adventures in Wi-Fi – Track 1: Blogging


To tweet, or not to tweet. That is the question.

First of all some caveats:

I am not a social media expert, nor an expert in any of its many variants.
I do not work in marketing or PR.
I will not be encouraging you to unleash the power of FaceTube/YouSpace/MyBook to make the world a better place (and your bank vault a fuller one), or to sell a million more of your product.
I cannot claim to have some secret formula for success in the world of on-line communication (indeed I tend to be allergic to such things as per Recipes for Success?).

If you want all the answers, then please look elsewhere. Good luck with your search!


I am an IT person, with a reasonable degree of commercial awareness and a background in sales and sales support.
I have been involved in running web-sites and various on-line communities since 1999.
I do author a business, technology and change blog that has been relatively well-received (why else would you be reading this?)
I think that social media can be an extremely useful way of interacting with people, expanding your network and coming into contact with interesting new people.

This is the middle chapter of a series of articles about the experiences of a neophyte in the sometimes confusing world of social media. View this article as akin to Herodotus describing crocodiles and you won’t go far wrong. If you learn something useful, then that’s great. If not, I hope that my adventures prove a harmless diversion for the reader.

I thought of adding a fourth zero, but that seemed much too applied. For the avoidance of doubt this illustration should not be taken as an endorsement of Ab Initio.

I covered some of my previous forays in what has now come to be called social media in my earlier article, so I won’t revisit this here. The main focus of this piece is Twitter, a service that I joined back in December 2008, a couple of months after establishing this blog. It took me some time to figure Twitter out and I am not sure that I entirely “get it” in full.

In a recent article – How I write – I referred to many of my blog posts flowing quickly and easily. I must admit that writing this piece is proving to be something more of a struggle. Perhaps this reflects the fact that making progress on Twitter was also anything but easy. Indeed I felt that for a long time I was blundering about without any real idea about how to use the medium, or what I wanted to use it for. It also probably reflects my admitted lack of expertise in social media.

An aside for fellow pedants:

One in a million

Twitter is positioned as a micro-blogging service. This terminology offends the scientific bent of my mind. Micro (μικρός) implies 10⁻⁶ or one millionth. I tend to write relatively long blog posts and the average size of one of my articles is about 1,200 words; this equates to just over 7,000 characters. Twitter’s 140 character limit (originally set as the length of an SMS) is one fiftieth of this figure, so a more accurate description of Twitter would be a centi-blogging service; for less verbose bloggers maybe deci-blogging would also work.
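The back-of-the-envelope arithmetic in this aside is easy to check; the short Python sketch below uses only the figures quoted above:

```python
# Figures from the paragraph above: an average post of ~1,200 words,
# or just over 7,000 characters, against Twitter's 140-character limit.
avg_post_chars = 7000
tweet_limit = 140

ratio = avg_post_chars // tweet_limit  # how many tweets fit in one average post
print(f"One tweet is roughly 1/{ratio} of an average post")
print(f"'centi' would be 1/100 and 'micro' 1/1,000,000 - so 1/{ratio} "
      "sits between deci- and centi-blogging")
```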

Many aficionados of Twitter claim that it is the ideal way to promote your product, your service and/or yourself (or all three at the same time). The same people also say that it is a great tool for listening to existing and potential customers, obtaining information about what they like and dislike and picking up on trends. All this may very well be true, but this is not how I have come to use Twitter and I will not be covering any of these aspects here.

For me the facility is not really about reaching a wide audience – however much I may be passionate about areas such as Business Intelligence, I realise that not everyone will feel the same. Instead it has been a great way to discover the members of a broader worldwide technology community focussed on areas such as databases and data warehousing, BI tools and approaches, numerical and text-based analysis and general technology industry issues.
So what is all the fuss about anyway?

How come it doesn't recognise

Twitter started as a way to post updates from your mobile ‘phone by texting a message to a number (07624 801 423 here in the UK). The messages would generally be about the sorts of things that you would be doing when you don’t have access to a PC, but do to a mobile ‘phone. For example:

These messages were then posted on-line and could be read by other people. If these people found your output interesting (and let’s face it, who could not be captivated by the examples I quote above), then they could subscribe to your posts (or follow your tweets in the lingo). When someone follows you, you are notified and can return the compliment if you wish. In this way the network of people with whom you can share your updates grows.

At some point people began to realise that you could skip the mobile bit and use your computer to post tweets directly on-line. This opened up an entirely new type of post and the rest, as they say, is history. I have tweeted via my mobile ‘phone recently, but only by first loading Opera Mini and going to the Twitter site. I suspect that there are people out there who have never sent an SMS to their Twitter account.

Hamlet Act 2, scene 2, 86–92

A relic of this history is the aforementioned 140 character limit. Because there is not much room to type, there is a limit to the length of thought that you can share. In turn this means that a defining characteristic of Twitter is brevity. For someone such as me who is not known for having this quality as a core characteristic, this presents something of a challenge. However when you have something exceeding 140 characters to say, the Twitter limit forces the approach of writing it down somewhere else (e.g. on a blog) and then posting a link. A lot of my Twitter posts contain either links to this site or to interesting articles that I have found elsewhere. In this way, Twitter has some attributes akin to a more dynamic version of a social bookmarking site.

The other key characteristic of Twitter is interaction. Most of my other tweets are either passing on comments made by other people, or links posted by them – of course this type of behaviour tends to lead to reciprocation, which binds people together (in a positive sense) and also potentially widens the network available to both. The balance of my tweeting is made up of chatting with people (tweeps if you must) either about industry issues, or – probably more frequently – just shooting the breeze.

To me, rather than [insert appropriate negative power of 10 here]-blogging, Twitter is much more akin to its historical roots of public texting. Instead of SMSing one or a small group of people, you share your abbreviated pearls of wisdom with potentially thousands of people, who have a much easier way of following your train of thought. Of course there is no guarantee that they put the same care and attention into reading your tweets as you did into writing them; more on this later.
Some suggestions for blissful tweeting

Blue skies / Smiling at me / Nothing but blue skies / Do I see +++  Bluebirds / Singing a song / Nothing but bluebirds / All day long

These are some things that have worked for me and seem to make sense. There are lots of alternative perspectives out there, just a google away:

  1. Go to twitter.com and sign up for an account.

    Unless you want to stay anonymous, I would suggest using your real name and a user name that is close to this: I’m @peterjthomas for example.

  2. Fill in your profile and tell people a bit about yourself.

    There is nothing more off-putting than being followed by someone, clicking on their page and finding… nothing. Why would anyone want to listen to what you have to say if you don’t lay down some markers here? While you are at it, think about customising your page to make it a bit more distinctive. But don’t go to town; at least at present, it is not that easy to come up with a scheme that will work on multiple screen resolutions.

  3. Find some people to follow.

    This can be a little easier said than done. What you are most likely looking for is people with similar interests to yourself. There are a number of approaches.

    1. You may already know some peers who use Twitter. As well as following them, go to their page and see who they interact with when speaking about subjects that you also want to talk about. If they don’t have thousands of followers, take a look at the list and also look at who they follow.
    2. Many people in the blogosphere (as well as many corporations) have a Twitter presence and will often advertise this fact. If you have found an interesting blog article – say this one – then scan the site to see if there is a Twitter link; more often than not there will be.
    3. If you end up following someone that you view as being influential in your area, then take a look at the people that he or she tweets with – they will probably also be worth following.
    4. You can also use Twitter search to see what other people are talking about that might be of interest; for example, you can search for references to business intelligence (more on how to tag your tweets later). It may be that some of the people that come up in a search list are worth following.
    5. Finally you can let other people do the hard work for you every Friday. Follow Friday is a Twitter tradition in which people give recommendations of people that they feel others may want to follow. This can be gold-dust for someone hoping to find like-minded people.
  4. Think about how to get people to follow you.

    Maybe a good way to think about this is to consider the exercise that you have just completed to look for people to follow. What would make your Twitter account come into focus in such a process? Whatever you are looking for in someone to follow, similar people will also be looking for, so try to fit the bill.

    If you are looking for people who share cool articles, then share cool articles. If you are looking for people who express opinions about things that are important to you, then express opinions; either on Twitter, or via a blog and post links on Twitter. If you are looking for people who engage with others, then engage with others yourself. You can reference people who are not following you (and indeed who you are not following) just by putting an ‘@’ in front of their name.

    For example even if you are not following me and you post:

    “Wow! that @peterjthomas really knows his business intelligence”

    then first of all I will notice (as you reference me) and second I’m as human as the next person and am likely to at least consider following you, or at the very least sharing your comment with my followers.

    An aside on sharing tweets:

    Twitter etiquette is that you don’t share other people’s tweets without referencing them. So in the above example I might re-tweet your kind comment as:

    “RT @your-name Wow! that @peterjthomas really knows his business intelligence << Thanks"

    The RT stands for re-tweet and the << indicates my additional comments, in this case to say thank you – people do the latter in a number of ways. An alternative to using RT is as follows:

    “Wow! that @peterjthomas really knows his business intelligence (via @your-name)”

    Not only is this polite, but now @your-name and @peterjthomas are linked – if I am worth following, then getting a mention from me is a worthwhile objective.

    Of course the other two keys to gaining followers are the same as for getting people to read your blog: first share links that are worth sharing (particularly if they are your own work) and second try to engage with people and refrain from being a passive by-stander.
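The RT convention described above is an informal one rather than part of Twitter itself, but it is regular enough to sketch as a small formatting helper. The function below is purely illustrative; its name and the hard 140-character check are my own choices:

```python
def retweet(original_author: str, text: str, comment: str = "") -> str:
    """Format a re-tweet using the informal 'RT @name text << comment'
    convention, refusing anything that would not fit in a tweet."""
    result = f"RT @{original_author} {text}"
    if comment:
        result += f" << {comment}"
    if len(result) > 140:
        raise ValueError(f"Re-tweet is {len(result)} characters; limit is 140")
    return result

print(retweet("your-name",
              "Wow! that @peterjthomas really knows his business intelligence",
              comment="Thanks"))
```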

One thing that is probably dawning on any Twitter novices right now is that the above are not discrete activities that you do once and then are finished with. If you want to get the most out of Twitter, then you will have to keep doing them.
More advanced techniques

Paul Dirac - the UK's greatest physicist since Newton

Unless you are looking to create a social media presence for a Fortune 500 company (assuming that there are any left who have not already created such a thing), then the above pointers are probably more than enough to get you started. Like me you may then just muddle through, hopefully learning from your mistakes. Alternatively, there are any number of guides out there which may or may not strike a chord with you and suit your personal style; just search for them.
Be yourself

On the subject of personal style, I’d suggest (as I also suggested in my article on blogging) that you be yourself on Twitter. Even within 140 characters, trying to be something that you are not comes across as fake; people aren’t impressed. On the same subject, treat people as you would face-to-face. If you are trying to sell something – even just your personal brand – then would you ram this down people’s throats in person? If not, then why would it be OK to do this on Twitter? A more low key approach is likely to lead to engagement and a better outcome than blowing your trumpet from the roof-tops (I know, I have tried the latter and it doesn’t work too well).
Use hash-tags

Above I mentioned tagging your posts. So if you write something about cloud computing, you might want to tag it with a key word, e.g. “cloud”. Though Twitter’s own search engine and the various other tools that you can employ on Twitter data will search for any occurrence of specified text, it is still traditional to use hash tags, so in the above example a tweet might look like:

“I have just come across a great article summarising new developments in cloud computing – #cloud”

As ever the incomparable xkcd has an insight on this world that is both acerbic and insightful:

I learnt everything I know about title/alt text from Randall Munroe


To see a slightly more positive use of Twitter search and hash-tags, try looking up coverage of a recent Teradata analyst event using #td3pi.
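Under the hood, a hash-tag search is little more than pattern matching on tweet text. The sketch below is a rough approximation of my own devising, not Twitter’s actual tokenisation rules:

```python
import re

# Crude approximation of hash-tag extraction: a '#' followed by word characters.
HASHTAG = re.compile(r"#(\w+)")

def hashtags(tweet: str) -> list:
    """Return the hash-tags found in a tweet, lower-cased for matching."""
    return [tag.lower() for tag in HASHTAG.findall(tweet)]

example = ("I have just come across a great article summarising "
           "new developments in cloud computing - #cloud")
print(hashtags(example))  # prints ['cloud']
```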
Shorten your URLs

On the subject of links, the 140 character limitation means that you don’t want to waste space with long URLs. Using a URL shortener is mandatory; there are many such tools out there.
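To see why shortening matters, consider the character budget of a tweet. The sketch below assumes a shortened link of roughly 20 characters (real lengths vary by service) and uses an entirely hypothetical long URL:

```python
TWEET_LIMIT = 140
SHORT_URL_LEN = 20  # rough length of a shortened link; varies by service

def chars_left(message: str, url_length: int) -> int:
    """Characters left in a tweet made up of message + one space + link."""
    return TWEET_LIMIT - (len(message) + 1 + url_length)

# Hypothetical example values, for illustration only.
long_url = "http://example.com/2010/04/a-very-long-blog-post-title-with-many-words/"
message = "Worth a read on cloud computing"

print(chars_left(message, len(long_url)))   # room left with the full URL
print(chars_left(message, SHORT_URL_LEN))   # room left with a shortened link
```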
Check out the wide range of Twitter-related tools

Now that the subject of tools has come up, there is an entire hinterland of Twitter-related tools that can do a wide range of things to help you. These include:

  1. Twitter platforms

    These help to manage your entire Twitter experience, from reading other people’s posts to making your own (sometimes doing link shortening for you automatically). If you are successful in finding people to follow and attracting people to follow you, then there will come a time when the noise level becomes unmanageable. This type of tool can help by providing filters and groups, which enable you to make sense of a tsunami of tweets, organise them and prioritise your time.

    I use TweetDeck, but again there are many alternatives.

  2. Twitter add-ins

    These are generally what you would employ on your blog or other internet site to allow people to easily tweet your content. There are several very slick and attractive looking options out there, just take a look at a handful of sites and take your pick. I’m staying old-school for the present and hand-coding my Twitter links.

  3. Twitter analytics

    This is rather a grand name which covers everything from the trend of how many people are following you through to quite sophisticated analysis. Rather than provide a list, take a look at the one that Pam Dyer has put together here.

  4. Other

    There are a lot of fun Twitter-related applications out there. Just one example to whet your appetite is the following app, written by @petewarden, which graphs your relationships to other people on Twitter and gives a very visual perspective on the totality of your tweeting:



In closing

I chose to close this article with the above image for a reason. To me it captures the essence of what Twitter is about; forming a network of associations with people who can enrich your understanding, provide you with fresh perspectives, or even simply make you smile. If you enjoy reading this blog and are looking for people to follow who might share your world-view then clicking on the above graphic and checking out some of the people I interact with most may be a good starting point.

If you choose to take the plunge with Twitter then good luck and I hope that you get as much out of it as I have. You can also then do me a favour and share this article with your followers!



Filed under: social media Tagged: follow friday, hash-tag, tweetdeck

Posted by Peter Thomas at 3:02 AM

April 25, 2010

How I write

Drawing Hands, Maurits Cornelis Escher, 1948 - probably a suitable image for rather a self-referential blog article
During a conversation with an associate last week, she commented that a lot of effort must go into creating new content for this site and posed the question of where I find the time. Part of the answer is writing in the evenings and at weekends (tick both boxes for this piece), or grabbing moments when I am travelling on business. However, another piece of the jigsaw puzzle is quite different and relates to how I write.

The same person was kind enough to say that she found my writing to be coherent and well-structured. In her mind, this implied an equally structured approach to composition. Sadly nothing could be further from the truth. In general I have an idea (sparked by a conversation on-line, something that happened in the work day, or reading an article) and create a draft as soon as I have time and access to the Internet. Often this is no more than a working title, maybe a line describing my idea and, if this is what inspired me to write, a link to the relevant web-page for future reference.

Sometimes I come back to these ideas as soon as I have time to write more fully. However, on occasion the gestation period is longer, either because other topics have consumed my attention, or because my thoughts have not matured enough to put fingertip to keyboard. At present for example, I have 17 drafts sitting in WordPress, the earliest of which dates back to December 2008 (maybe it will see the light of day at some point).

When I do get round to starting to write, the process is normally very fluid. Most frequently I will substantially finish a piece at a single sitting. The way that I write tends to be first via a stream of consciousness, which results in a large block of text. Next I restructure (I would be lost without cut and paste) and finally I trim (yes I do sometimes reduce the length of my articles), or expand as seems to make sense at the time.

If on re-reading I feel that I have not been clear enough in making my points, I might re-write a section, or add some clarifying comments. Sometimes I will change the order in which paragraphs appear in order to improve the flow. I may even feel that an initial draft is conflating two fairly distinct ideas and thus split the piece into separate articles; but this is very much an exception. If I ignore correcting typos, punctuation and grammar, then I would estimate that over 80% of any given article will be identical to how I first wrote it.

An idiosyncrasy is that I tend to write in the HTML pane of WordPress and often hand-craft things like tables (one reason I moved to WordPress from Blogger was that the latter didn’t – at least at the time – support HTML tables). I guess this is a hang-over from programming days (not that long ago as, alongside my other responsibilities, I was still programming professionally as recently as 2008). This means that I have greater control over how an article appears, but also leads to me using the “preview” feature quite extensively.

I also tend to spend quite a bit of time either finding suitable illustrations or creating them (I use a combination of Visio and Paint Shop Pro, both tools I have used in a work context for years). Sometimes the ideas for images come as I am in the midst of writing (and a brief search, or a quick bit of design work can give my unconscious time to think about the next bit of text), but equally often I get rather swept along by the prose, push on to completing this and then come back to add images as part of the editing process. Either way, nowadays I seem to spend almost as much time thinking about mouse-over text for images as I do coming up with images themselves.

Going back to the conversation that I mentioned at the beginning, we ended up talking more about how I write. I said that it was normally a relatively rapid process for me. The best analogy that I could come up with was the difference between speaking in your native tongue and translating into a language that you are not 100% fluent in. For me writing about topics in business intelligence, cultural transformation and technology feels very much a natural thing; like speaking English. It’s not something laboured and the words generally flow quickly and easily.

Maybe I am just lucky in this respect. Or perhaps the secret of prolific blogging is to write about things that you both know something about and for which you have an ongoing passion.
An early blog prototype



Filed under: social media Tagged: blogging, writing

Posted by Peter Thomas at 7:56 PM

April 24, 2010

Ann All on measuring the effectiveness of BI – IT Business Edge

The value of your BI precisely 2½ inches

Measuring the effectiveness of business intelligence programmes is something that I both speak and blog about regularly. In fact Measuring the benefits of Business Intelligence remains the most frequently read article on this site (with over 4,000 hits at the time of writing).

  Ann All IT Business Edge  

Ann All from IT Business Edge recently blogged on this topic and included some of my thoughts, starting at the foot of page 1. The article is entitled Will ‘Hair of the Dog’ Help Companies Measure BI Efforts?. I would recommend having a read.

You can also read another article on this site featuring Ann All: ”Big vs. Small BI” by Ann All at IT Business Edge



Filed under: business intelligence Tagged: ann all, it business edge

Posted by Peter Thomas at 12:48 PM

April 23, 2010

The Cloud Circle Forum – London

The Cloud Circle


Earlier this week I attended the inaugural Cloud Circle Forum in London. The Cloud Circle is the UK’s first independent Business and IT focused Cloud Computing Community. It is also the sister community of the Business Intelligence-focussed Obis Omni, an organisation with whom I have a longstanding (though I hasten to add, non-contractual) relationship (a list of Obis Omni seminars at which I have presented appears here, and you can also find some of my articles syndicated on their site).

There was a full programme with the morning being taken up by plenary presentations from Harrogate’s InTechnology (@InTechnology) and CloudOrigin (aka Cloud Computing evangelist Richard Hall – @CloudOrigin), followed by two Windows Azure case studies; one from EasyJet and one from Active Web Solutions (@AWSIpswich) for the Royal National Lifeboat Institution – these were hosted by Microsoft themselves.

The afternoon programme saw delegates split into two work-streams, one focussed on strategy and management, the other on technology. Work-related pressures meant that I was unable to attend this part of the day, which was a shame as several bits of the morning speeches were helpful.

Who you gonna call?

Unfortunately, despite the fact that virtually every organisation and individual I have mentioned so far has a Twitter account and the additional fact that there were hundreds of delegates at the forum, there was virtually no Twitter coverage. Maybe we can get too carried away with the all-pervasiveness of social media sometimes. There are clearly some avenues of professional life where its influence is yet to be fully felt; even IT conferences!

I tweeted some commentary on the InTechnology presentation and the stream may be viewed here while it persists. However by the time that Richard Hall stood up to speak, a combination of a lack of reception (the auditorium was in the basement) and issues with mobile Twitter on my hand-held brought this activity to a halt.
The morning presentations

Note: I don’t want to steal the thunder from any of the speakers and so this article does not cover the content of their presentations in any detail. Instead my aim is to highlight a few points and provide a flavour of their talks.


The InTechnology talk was interesting in parts, in particular their focus on the savings to be achieved in cloud-based telephony alone. One of their speakers also suggested that the benefits of Cloud Computing were potentially reduced if an organisation worked with more than one vendor, which is clearly an aspect to consider.

Their presentations were topped and tailed by two segments of a Cloud Computing-related spoof of The Apprentice. Clearly some money had gone into this and the results were either hilarious or somewhat ill-advised depending on your personal taste. I have to admit to falling closer to the latter camp. While some delegates seemed to enjoy the fun of the fair, I felt the video distracted somewhat from InTechnology’s core message.

I billed Richard Hall as a Cloud Computing evangelist and certainly his tub-thumping upped the tempo. He made some interesting points, which included his assertion that the proprietors of cloud server farms were employing cutting edge technology that was not currently commercially available and might never be. The point here was that cloud providers were becoming true experts in the area with capabilities far beyond normal organisations. This segued into his prediction that there would be only four, or at most five, mega cloud vendors in the future.

Richard did have one slide focusing on the potential drawbacks (or current short-comings) of Cloud Computing, but you could tell his heart wasn’t really in it. One sensed that Richard never met a cloud he didn’t like, even referring to his own personal Road to Damascus during his talk. However one very valid point he made was that the legal agreements and licensing arrangements for Cloud Computing were significantly lagging the flexibility of the technology itself. This chimes with my own experience of the area.

Microsoft Active Web Solutions

The real-life case studies of cloud-based success were perhaps more telling than the earlier sessions. Bert Craven, Enterprise Architect at EasyJet, spoke about how his company had been moving selected elements of their IT assets to the cloud. Interestingly, while the original plan had been to keep some critical applications (the sort for which 99.9% availability is not good enough) in-house, one of these was now in the process of becoming cloud-based.

Richard Prodger, Technical Director of AWS, spoke about the work that his company had been doing with the RNLI – a charity that runs volunteer lifeboats around the coasts of Britain. The specific project was to provide fishermen with devices that would automatically alert the RNLI control centre if they fell overboard and then provide accurate positioning information enabling a faster rescue and thus one that would be more likely to result in success. Richard shared stories of several fishermen who were alive today thanks to the system. Here the cloud was not the original vehicle, but something that was subsequently employed to scale up the service.

Both case studies used Windows Azure as a component. I have not used this toolset, nor have I been briefed on it and so will refrain from any comment beyond stating that both Bert and Richard seemed happy with its capabilities; particularly in securely exposing internal systems to external web-users.
Some thoughts on what I heard and saw

When multiple presenters state that there is no agreed definition of the central subject matter of a seminar and then proceed to provide slightly different takes on this, you know that you are dealing with an emerging technology. That is not to deny the obvious potential of the area, but a degree of maturation is still necessary in this part of the industry before – in Richard Hall’s words – Cloud Computing becomes the future of IT.

There was more than one elephant in the room. The first of these is bandwidth, which is relatively plentiful and relatively cheap in many parts of the world, but equally has neither trait in many others. This will be of concern to a lot of global organisations. Of course this is a problem that will undoubtedly go away in time, but it may dog true enterprise implementations of mission-critical Cloud Computing for some years yet.

Security remains a concern. It may well be true that the experts in Cloud Computing will be an order of magnitude more careful and competent about handling their customers’ data than many internal IT departments. However the point is that they are already handling the data of many customers and one error, or one act of malfeasance by an employee, could have a major impact. You may well be safer flying than driving your own car, but when a plane crashes, people tend to notice.

Future consolidation in Cloud Computing was mentioned by a number of speakers. Although this issue is not solely the preserve of cloud technology, it does raise some concerns about betting on the right horse. As has been seen in many areas of industry, the titans of today may be the minnows of tomorrow. When you are trusting an external organisation with your transactions, it helps to know for certain (or as close as you can get to it) that they will be around in five years’ time.

One of the central pitches of Cloud Computing is “let us look after the heavy lifting and your people can focus on more value-added activities”. While there are certainly economies to be seized in this area, the Cloud Computing industry may be doing itself a disservice by stating that customers can effectively stop worrying about functions moved to the cloud. In my mind a lot of care and attention will need to be put into managing relationships with cloud vendors and into integrating cloud-based systems with the rest of the internal IT landscape (or with other cloud-based systems). It may be that this type of work costs a lot less than the internal alternative, but it is nevertheless invidious to suggest that no work at all is required. This line of attack is reminiscent of some of the turn of the millennium sales pitches of ERP vendors, not all of which turned out to be well-founded.

In finishing this slightly downbeat section (and before a more optimistic coda), I’ll return to the commercial issues that Richard Hall referenced. He claimed – correctly in my opinion – that a major benefit of cloud-based solutions was not only that they scaled-up, but that they scaled-down. The “knob” could be adjusted in either direction according to an organisation’s needs. The problem here is that many parts of the Cloud Computing industry still seem wedded to multi-year fixed licensing deals with little commercial scope to scale either up or down without renegotiating the contract. What is technologically feasible may not be contractually pain-free. In the same vein more flexible termination clauses and guaranteed portability of data from one vendor to another need to be sorted out before Cloud Computing is fully embraced by many organisations.

On a more positive note, the above issues are perhaps the typical growing pains of a nascent industry. No doubt solutions to them will be knocked into shape in the coming years. It is always tempting fate to predict the future with too much accuracy, but at this point it seems all but certain that Cloud Computing will play an increasingly important role in the IT landscapes of tomorrow. If nothing else this is attested to by the number of delegates attending Tuesday’s meeting.

The Cloud Circle are to be commended for getting out in front of this important topic. I hope that their work will disseminate understanding of what is likely to become an important area and enable a wider range of organisations to begin to take advantage of it.

tweet this Tweet this article on
Bookmark this article with:
| Facebook | | digg | Reddit | Stumble


Filed under: cloud computing, microsoft Tagged: active web solutions, cloudcircle, cloudorigin, intechnology, microsoft, rnli

Posted by Peter Thomas at 4:48 AM

April 21, 2010

Patterns patterns everywhere

Look at the beautiful shapes!


A lot of human scientific and technological progress over the span of recorded history has been related to discerning patterns. People noticed that the Sun and Moon both had regular periodicity to their movements, leading to models that ultimately changed our view of our place in the Universe. The apparently wandering trails swept out by the planets were later regularised by Johannes Kepler, building on the meticulous observations of Tycho Brahe; an outstanding example of a simple idea explaining more complex observations.

In general Mathematics has provided a framework for understanding the world around us; perhaps most elegantly (at least in work that is generally accessible to the non-professional) in Newton’s Laws of Motion and his law of universal gravitation (which explained why Kepler’s laws of planetary movement worked). The simple formulae employed by Newton seemed to offer a precise set of rules governing everything from the trajectory of an arrow to the orbits of the planets and indeed galaxies; a triumph for the application of Mathematics to the natural world and surely one of humankind’s greatest achievements.

The Antikythera mechanism

For centuries it appeared that natural phenomena seemed to have simple principles underlying them, which were susceptible to description in the language of Mathematics. Sometimes (actually much more often than you might think) the Mathematics became complicated and precision was dropped in favour of – generally more than good enough – estimation; but philosophically Mathematics and the nature of things appeared to be inextricably interlinked. The Physicist and Nobel Laureate E.P. Wigner put this rather more eloquently:

The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve.

Dihedral Group 3

In my youth I studied Group Theory, a branch of mathematics concerned with patterns and symmetry. The historical roots (no pun intended[1]) of Group Theory are in the solvability of polynomial equations, but the relation with symmetry emerged over time; revealing an important linkage between geometry and algebra. While Group Theory is a part of Pure Mathematics (supposedly studied for its own intrinsic worth, rather than any real-world applications), its applications are actually manifold. Just one example is that groups lie (again no pun intended[2]) at the heart of the Standard Model of Particle Physics.

However, two major challenges to this happy symbiosis between Mathematics and the Natural Sciences arose. One was an abrupt earthquake caused by Kurt Gödel in 1931. The other was more of a slowly rising flood, beginning in the 1880s with Henri Poincaré and (arguably) culminating with Ruelle, May and Yorke in 1977 (though with many other notables contributing both before and after 1977). The linkage between Mathematics and Science persists, but maybe some of the chains that form it have been weakened.
Potentially fallacious patterns

However, rather than this article becoming a dissertation on incompleteness theorems or (the rather misleadingly named) chaos theory, I wanted to return to something more visceral that probably underpins at least the beginnings of the long association of Mathematics and Science. Here I refer to people’s general view that things tend to behave the same way as they have in the past. As mentioned at the beginning of this article, the sun comes up each morning, the moon waxes and wanes each month, summer becomes autumn (fall) becomes winter becomes spring and so on. When you knock your coffee cup over it reliably falls to the ground and the contents spill everywhere. These observations about genuine patterns have served us well over the centuries.

It seems a very common human trait to look for patterns. Given the ubiquity of this, it is likely to have had some evolutionary benefit. Indeed patterns are often there and are often useful – there is indeed normally more traffic on the roads at 5pm on Fridays than on other days of the week. Government spending does (with the possible exception of current circumstances) generally go up in advance of an election. However such patterns may be less useful in other areas. While winter is generally colder than summer (in the Northern hemisphere), the average temperature and average rainfall in any given month varies a lot year-on-year. Nevertheless, even within this variability, we try to discern patterns to changes that occur in the weather.


We may come to the conclusion that winters are less severe than when we were younger and thus impute a trend of gradually moderating winters; perhaps punctuated by some years that don’t fit what we assume is an underlying curve. We may take rolling averages to try to iron out local “noise” in various phenomena such as stock prices. This technique relies on the assumption that things change gradually. If the average July temperature has increased by 2°C in the last 100 years, then it may seem to make sense to assume that it will increase by the same 2°C ±0.2°C in the next 100 years. Some of the work I described earlier has rigorously proved that a lot of these human precepts are untrue in many important fields, not least weather prediction. The phrase “long-term forecast” has been shown to be something of an oxymoron. Many systems – even the simplest, even those which are apparently stable[3] – can change rapidly and unpredictably, and weather is one of them.
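As a brief aside, the rolling-average technique mentioned above is easy to sketch in code; the three-point window below is an arbitrary choice, and the series is invented purely for illustration:

```python
def rolling_average(series, window=3):
    """Smooth a series by replacing each point with the mean of a sliding window."""
    if not 1 <= window <= len(series):
        raise ValueError("window must be between 1 and the series length")
    return [
        sum(series[i:i + window]) / window
        for i in range(len(series) - window + 1)
    ]

# An invented "noisy" series: an upward drift plus local fluctuations.
noisy = [10, 14, 9, 15, 12, 18, 13, 19, 16, 22]
smoothed = rolling_average(noisy)
print(smoothed)  # first value is (10 + 14 + 9) / 3 = 11.0
```

The smoothing only helps, of course, if the underlying change really is gradual – which is precisely the assumption in question.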

Of course the rules state that you must have a picture of a strange attractor in any article referencing chaos theory - I do however get points for not using the word 'fractal' anywhere in the text!

For the avoidance of doubt I am not leaping into the general Climate Change debate here – except in the most general sense. Instead I am highlighting the often erroneous human tendency to believe that when things change they do so smoothly and predictably. That when a pattern shifts, it does so to something quite like the previous pattern. While this assumed smoothness is at the foundation of many of our most powerful models and techniques (for example the grand edifice of The Calculus), in many circumstances it is not a good fit for the choppiness seen in nature.
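To make the rapid-and-unpredictable point concrete, here is a minimal sketch – not tied to any real weather model – using the logistic map, a staple example from the chaos literature. Two starting values differing by one part in a billion soon bear no resemblance to each other:

```python
def logistic_trajectory(x0, r=4.0, steps=100):
    """Iterate the logistic map x -> r*x*(1-x), a textbook chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-9)  # differ by one part in a billion

divergence = [abs(x - y) for x, y in zip(a, b)]
print(max(divergence[:5]))  # still minuscule after a few steps
print(max(divergence))      # eventually grows to macroscopic size
```

This sensitivity to initial conditions is exactly why extrapolating a smooth trend can go so badly wrong.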
Obligatory topical section on volcanoes

First published in September 1843 to take part in 'a severe contest between intelligence, which presses forward, and an unworthy, timid ignorance obstructing our progress' [nice use of the Oxford / Harvard comma BTW]

The above observations about the occasionally illusory nature of patterns lead us to more current matters. I was recently reading an article about the Eyjafjallajokull eruption in The Economist. This is suffused with a search for patterns in the history of volcanic eruptions. Here are just a few examples:

  1. Last time Eyjafjallajokull erupted, from late 1821 to early 1823, it also had quite viscous lava. But that does not mean it produced fine ash continuously all the time. The activity settled into a pattern of flaring up every now and then before dying back down to a grumble. If this eruption continues for a similar length of time, it would seem fair to expect something similar.
  2. Previous eruptions of Eyjafjallajokull seem to have acted as harbingers of subsequent eruptions of Katla [a nearby volcano].
  3. [However] Only two or three [...] of the 23 eruptions of Katla over historical times (which in Iceland means the past 1,200 years or so) have been preceded by eruptions of Eyjafjallajokull.
  4. Katla does seem to erupt on a semi-regular basis, with typical periods between eruptions of between 30 and 80 years. The last eruption was in 1918, which makes the next overdue.

Planes beware!

To be fair, The Economist did lace their piece with various caveats, for example the above-quoted “it would seem fair to expect”, but not all publications are so scrupulous. There is perhaps something comforting in all this numerology, maybe it gives us the illusion that we can make meaningful predictions about what a volcano will do next. Modern geologists have used a number of techniques to warn of imminent eruptions and these approaches have been successful and saved lives. However this is not the same thing as predicting that an eruption is likely in the next ten years solely because they normally occur every century and it is 90 years since the last one. Long-term forecasts of volcanic activity are as chimerical as long-term weather forecasts.
A little light analysis

Looking at another famous volcano, Vesuvius, I have put together the following simple chart.

Spot the pattern?

The average period between eruptions is just shy of 14 years, but the pattern is anything but regular. If we expand our range a bit, we might ask how many eruptions occurred between 10 and 20 years after the previous one. The answer is just 9 of the 26[4], or about 35%. Even if we expand our range to periods of calm lasting between 5 and 25 years (so 10 years of leeway on either side), we only capture 77% of eruptions. The standard deviation of the periods between recorded eruptions is a whopping 12.5 years; eruptions of Vesuvius are not regular events.
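For anyone who wants to replicate this sort of arithmetic, a sketch follows; the gaps listed are placeholder figures, not the actual Vesuvius record:

```python
from statistics import mean, pstdev

def interval_stats(gaps, low, high):
    """Mean, standard deviation and share of gaps falling within [low, high]."""
    in_band = sum(1 for g in gaps if low <= g <= high)
    return mean(gaps), pstdev(gaps), in_band / len(gaps)

# Placeholder inter-eruption gaps in years -- illustrative only.
gaps = [3, 7, 12, 1, 25, 14, 2, 31, 9, 16]
avg, sd, share = interval_stats(gaps, 10, 20)
print(f"mean {avg:.1f} years, sd {sd:.1f} years, {share:.0%} in the 10-20 year band")
```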

One aspect of truly random distributions at first seems counterintuitive: their lumpiness. It might seem reasonable to assume that a random set of events would lead to a nicely spaced-out distribution; maybe not a set of evenly-spaced points, but a close approximation to one. In fact the opposite is generally true; random distributions will have clusters of events close to each other and large gaps between them.

Pseudo-random and truly random

The above exhibit (a non-wrapped version of which may be viewed by clicking on it) illustrates this point. It compares a set of pseudo-random numbers (the upper points) with a set of truly random numbers (the lower points)[5]. There are some gaps in the upper distribution, but none are large and the spread is pretty even. By contrast in the lower set there are many large gaps (some of the more major ones being tagged a, …, h) and significant clumping[6]. Which of these two distributions more closely matches the eruptions of Vesuvius? What does this tell us about the predictability of its eruptions?
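If you would like to see the clumping for yourself, the following sketch scatters points uniformly at random on an interval and compares the largest gap with the average one (the seed is arbitrary, chosen only so the illustration is reproducible):

```python
import random

def gap_sizes(points):
    """Gaps between consecutive points, once sorted."""
    pts = sorted(points)
    return [b - a for a, b in zip(pts, pts[1:])]

random.seed(42)  # reproducible illustration
points = [random.uniform(0, 1000) for _ in range(100)]
gaps = gap_sizes(points)

avg_gap = sum(gaps) / len(gaps)
print(f"average gap {avg_gap:.1f}, largest gap {max(gaps):.1f}")
# For perfectly even spacing the largest gap would equal the average;
# for uniform random points it is typically several times larger.
```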
The predictive analytics angle
As always in closing I will bring these discussions back to a business focus. The above observations should give people involved in applying statistical techniques to make predictions about the future some pause for thought. Here I am not targeting the professional statistician; I assume such people will be more than aware of the potential pitfalls and possess much greater knowledge than I do about how to avoid them. However many users of numbers will not have this background and we are all genetically programmed to seek patterns, even where none may exist. Predictive analytics is a very useful tool when applied correctly and when its findings are presented as a potential range of outcomes, complete with associated probabilities. Unfortunately this is not always the case.

It is worth noting that many business events can be just as unpredictable as volcanic eruptions. Trying to foresee the future with too much precision is going to lead to disappointment; to say nothing of being engulfed by lava flows.
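By way of illustration, presenting a prediction as a range with probabilities need not be onerous. The sketch below bootstraps an invented sales history into percentiles rather than a single point estimate; nothing here is a production-grade forecasting method:

```python
import random

def forecast_range(history, n_sims=10_000, seed=1):
    """Bootstrap a next-period 'forecast' as percentiles, not a point estimate."""
    rng = random.Random(seed)
    sims = sorted(rng.choice(history) for _ in range(n_sims))

    def pct(p):
        return sims[int(p / 100 * (n_sims - 1))]

    return pct(10), pct(50), pct(90)

# Invented monthly sales figures, purely for illustration.
history = [120, 95, 130, 110, 180, 101, 125, 99, 140, 115]
low, mid, high = forecast_range(history)
print(f"next month: 10th percentile {low}, median {mid}, 90th percentile {high}")
```

Reporting the 10th-90th percentile band makes the uncertainty explicit, which a lone point forecast conveniently hides.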

But the model said...

Explanatory notes

[1] The solvability of polynomial equations is of course a question about their roots – specifically whether these can be expressed in terms of radicals.
[2] Lie groups lie at the heart of quantum field theory – an interesting lexicographical symmetry in itself.
[3] Indeed it has been argued that non-linear systems are more robust in response to external stimuli than classical ones. The latter tend to respond to “jolts” in a smooth manner leading to a change in state. The former often will revert to their previous strange attractor. It has been postulated that evolution has taken advantage of this fact in demonstrably chaotic systems such as the human heart.
[4] Here I include the – to date – 66 years since Vesuvius’ last eruption in 1944 and exclude the eruption in 1631 as there is no record of the preceding one.
[5] For anyone interested, the upper set of numbers were generated using Excel’s RAND() function and the lower are successive triplets of the decimal expansion of pi, e.g. 141, 592, 653 etc.
[6] Again for those interested the average gap in the upper set is 10.1 with a standard deviation of 4.3; the figures for the lower set are 9.7 and 9.6 respectively.




Filed under: business intelligence Tagged: chaos theory, eyjafjallajokull, mathematics, non-linear systems, predictive analytics, statistics, vesuvius, volcano, weather prediction

Posted by Peter Thomas at 6:42 PM

April 19, 2010

Informatica interview

While spring cleaning at home at the weekend, I came across a DVD of an interview I did for Informatica back in March 2005. This is still accessible on the Informatica web-site and appears in my video library, but I thought that I had lost my copy of the original.

Having made this discovery, I added it to my selection of videos on

In the interview I stress the need for consistency in management information, discuss the dynamics of the insurance industry, and describe the business value added by Business Intelligence in a pan-European insurance organisation.

Disclosure – Part I: In the work I refer to above, I leveraged Informatica’s toolset (PowerCenter) alongside software from Oracle (RDBMS and PL/SQL), IBM Cognos (PowerPlay and Report Studio) and Microsoft (.NET). I have used tools from other vendors in other projects. While there is clearly a promotional sub-text to the video, it is not a product endorsement and I believe that my comments are generally applicable to any business intelligence / data warehousing project.
Disclosure – Part II: I have already had it pointed out to me – by @ocdqblog and others – that the braces (suspenders if you are from the US; suspenders having quite a different connotation in the UK) were perhaps something of a fashion faux pas. My American partner has long since despaired of my British approach to “co-ordination” of patterns. You may be glad to know that I no longer own the offending item.


Filed under: business intelligence, data warehousing, informatica Tagged: data integration, video

Posted by Peter Thomas at 11:42 AM

How to Measure BI Value – Microsoft Services

Microsoft Services

One of the benefits of the platform is that you can get some indication as to which other parts of the web are directing traffic your way. It was via this facility that I came across an article on Microsoft‘s site linking back to my piece Measuring the benefits of Business Intelligence. The title, sub-title and authorship of the Microsoft post are as follows:

How to Measure BI Value

A thorough assessment will help you demonstrate the effectiveness of your BI investments. We offer 8 factors to consider.

By Paula Klein, TechWeb

You can read the article here.

As always, my aim in writing this column is to remain vendor-neutral; however, the Microsoft piece is not specifically pushing their BI products (though clearly further information about them is only a click away), but rather offering some general commentary.

Again it is interesting to note the penetration of social media (such as this blog) into mainstream technology business.



Filed under: business intelligence, microsoft, social media Tagged:

Posted by Peter Thomas at 8:40 AM

April 16, 2010

Ovum / Butler Group BI Seminar in London

Sarah Burnett, Madan Sheina and Peter Thomas

On 29th April 2010, I will be speaking at this event, along with Sarah Burnett (Sarah’s blog, @sarahburnett) and Madan Sheina. The seminar is entitled Business Intelligence – Effective Strategies for Successful Deployment and you can find further details here.


Ovum - Incorporating Butler Group Ovum provides clients with independent and objective analysis that enables them to make better business and technology decisions.




Filed under: business intelligence Tagged: butler group, ovum, sarah burnett, seminar

Posted by Peter Thomas at 5:55 AM

April 13, 2010

Independent Analysts and Social Media – a marriage made in heaven

Oracle EPM and BI Merv Adrian - IT Market Strategies for Suppliers

I have been a regular visitor to Merv Adrian’s excellent blog since just after its inception and have got to know Merv virtually via twitter (@merv) and other channels. I recently read his article: Oracle Ups EPM Ante, which covered Oracle’s latest progress in integrating its various in-house and acquired technologies in the Enterprise Performance Management and Business Intelligence arenas.

The article is clearly written and helpful; I recommend you take a look if these areas impinge upon you. One section caught my attention (my emphasis):

Finally, Oracle has long had a sizable base in government, and its new Hyperion Public Sector Planning and Budgeting app suite continues the integration theme, tapping its ERP apps (both Oracle E-Business Suite [EBS] and PeopleSoft ERP) for bidirectional feeds.

My current responsibilities include EPM, BI and the third Oracle ERP product, JD Edwards. I don’t work in the public sector, but was nevertheless interested in the concept of how and whether JDE fitted into the above scenario. I posted a comment and within a few hours Merv replied, having spoken to his senior Oracle contacts. The reply was from a vendor-neutral source, but based on information “straight from the horse’s mouth”. It is illuminating to ponder how I could have got a credible answer to this type of question any quicker.

To recap, my interactions with Merv have been via blogs, twitter and other channels. The above is just one small example of how industry experts can leverage social media to get their message across, increase their network of influence and deliver very rapid value. I can only see these types of interactions increasing in the future. Sometimes social media can be over-hyped, but in the world of industry analysis it seems to be a marriage made in heaven.

Analyst and consultant Merv Adrian founded IT Market Strategy after three decades in the IT industry. During his tenure as Senior Vice President at Forrester Research, he was responsible for all of Forrester’s technology research, covered the software industry and launched Forrester’s well-regarded practice in Analyst Relations. Earlier, as Vice President at Giga Information Group, Merv focused on facilitating collaborative research and served as executive editor of the monthly Research Digest and weekly GigaFlash.

Prior to becoming an analyst, Merv was Senior Director, Strategic Marketing at Sybase, where he also held director positions in data warehouse marketing and analyst relations. Prior to Sybase, Merv served as a marketing manager at Information Builders, where he founded and edited a technical journal and a marketing quarterly, subsequently becoming involved in corporate and product marketing and launching a formal AR role.



Filed under: enterprise performance management, oracle, social media Tagged: merv adrian

Posted by Peter Thomas at 7:45 AM

BI and Competition – Bruno Aziza at Microsoft

Bruno Aziza Worldwide Strategy Lead, Business Intelligence at Microsoft


Bruno Aziza, Worldwide Strategy Lead for Business Intelligence at Microsoft recently drew my attention to his article on The Official Microsoft Blog entitled Use Business Intelligence To Compete More Effectively.

My blog attempts to stay vendor-neutral, but much of Bruno’s article is also in the same vein; aside from the banner appearing at the top of course. It is noteworthy how many of the big players are realising that engaging with the on-line community in a sotto voce manner is probably worth much more than a fortissimo sales pitch. This approach was also notable in another output from the BI stable at Microsoft: Nic Smith’s “History of Business Intelligence”, which I reviewed in March 2009. However, aside from these comments I’ll focus more on what Bruno says than on who he works for; and what he says is interesting.

His main thesis is that good BI can “sharpen competitive skills [...] turning competitive insights into new ways to do business”. I think that it is intriguing how some organisations, ideally having already got their internal BI working well, are now looking to squeeze even further value out of their BI platform by incorporating more outward-looking information; information relating to their markets, their customers and their competitors. This was the tenth BI trend I predicted in another article from March 2009. However, I can’t really claim to be all that prescient as this development seems pretty common-sensical to me.
Setting the bar higher

Competition between companies is generally seen as a positive thing – one reason that there is so much focus on anti-trust laws at present. Competition makes the companies involved in it (or at least those that survive) healthier, their products more attuned to customer needs, their services more apt. It also tends to deliver better value and choice to customers and thus in aggregate drives overall economic well-being (though of course it can also generate losers).

Setting the bar higher

In one of my earliest blog articles, Business Intelligence and Transparency, I argued that good BI could also drive healthy internal competition by making the performance of different teams and individuals more accessible and comparable (not least to the teams and individuals themselves). My suggestion was that this would in turn drive a focus on relative performance, rather than settling for absolute performance. The latter can lead to complacency; the former ensures that the bar is always reset a little higher. Although this might seem potentially divisive at first, my experience of it in operation was that it led to a very positive corporate culture.

Although organisations in competition with each other are unlikely to share benchmarks in the same way as sub-sections of a single organisation, it is often possible to glean information from customers, industry associations, market research companies, or even the published accounts of other firms. Blended with internal data, this type of information can form a powerful combination; though accuracy is something that needs to be borne in mind even more than with data that is subject to internal governance.
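Mechanically, blending external market estimates with internal figures can be very simple – the effort goes into sourcing and validating the external numbers. A toy sketch of deriving market share, with all figures invented:

```python
# Invented figures for illustration only.
internal_sales = {"UK": 120.0, "France": 80.0, "Germany": 95.0}   # GBP m, internal
market_size = {"UK": 900.0, "France": 750.0, "Germany": 1100.0}   # GBP m, external estimate

market_share = {
    country: internal_sales[country] / market_size[country]
    for country in internal_sales
    if country in market_size  # external coverage may be patchy
}

for country, share in sorted(market_share.items()):
    print(f"{country}: {share:.1%} estimated market share")
```

The guard on external coverage is a small nod to the accuracy caveat: externally sourced figures arrive on their own terms, not your governance standards.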
A new source of competitive advantage

"Lightning" striking twice in Beijing

Bruno’s suggestion is that the way that companies leverage commonly available information (say Governmental statistics) and combine this with their own numbers is in itself a source of competitive advantage. I think that there is something important here. One of the plaudits laid at the feet of retail behemoth Wal-Mart is that it is great at leveraging the masses of data collected in its stores and using this in creative ways; ways that some of its competitors cannot master to the same degree.

In recent decades a lot of organisations have attempted to define their core competencies and then stick to these. Maybe a competency in generating meaningful information from both internal and external sources and then – crucially – using this to drive different behaviours, is something that no self-respecting company should be without in the 2010s.

You can follow Bruno on twitter at @brunoaziza



Filed under: business intelligence, microsoft, social media Tagged: 100m sprint, bruno aziza, high jump, wal mart

Posted by Peter Thomas at 4:26 AM