Thursday, November 26, 2009

Introducing the IT Market Clock

By Brian Gammage, vice president and Fellow, Gartner

Published: November 26 2009 16:15 | Last updated: November 26 2009 16:15

IT is no longer an emerging set of capabilities and markets – it is a maturing business tool and must be managed as such.

Although new capabilities continue to appear in the market, their adoption and use require them to be integrated into a portfolio of existing IT assets, many of which are already mature.

Some IT assets are no longer required, or no longer deliver sufficient business value to justify the costs of maintaining them. Usually, working to budget means new IT products and services can only be adopted if existing IT assets are retired or replaced.

Every IT product and service has a finite useful life and must eventually be retired or replaced. Correct timing of this retirement/replacement is critical.

The second part of useful life, from maturity to obsolescence, must be considered when managing IT assets throughout their whole life cycles. Most organisations require more-holistic mechanisms for planning IT divestment and reinvestment activity.

Gartner’s IT Market Clock is a new framework that supports strategic investment and divestment decisions. Tools and methodologies that focus only on technology adoption are no longer sufficient to support the decisions required to manage portfolios of IT assets throughout their full lifetime of use.

Gartner’s Hype Cycle, for example, which the IT Market Clock complements, is a buyer’s decision framework for technology adoption, but its view ends when mainstream adoption begins, which typically equates to an adoption level of between 20 and 50 per cent.

Put simply, the Hype Cycle supports “technology hunting” decisions, while the IT Market Clock supports “farming” decisions for assets already in use.

The IT Market Clock uses a clock-face metaphor to represent relative market time. Each point positioned on the IT Market Clock represents an IT asset or asset class: for example, desktop PCs, packaged maintenance and support services or corporate learning systems.

Technology assets are positioned on the IT Market Clock using two parameters. The first is where they currently lie within their own useful market life, from the first time the technology product or service can be acquired and used to the last time it can be viably used.

This determines the rotational position of the asset on the Market Clock – each begins at 0 (called “Market Start”) and moves clockwise round to 12 o’clock.

The second is relative level of commoditisation, ie the ease with which the technology product or service can be interchanged with alternatives. Relative commoditisation determines the distance from the centre of the Market Clock; assets further from the centre are more commoditised.

Commoditisation is a proxy for the balance of market power between buyers/users and suppliers. For most asset classes, relative commoditisation levels begin low, increase steadily as the market matures and then decrease again toward end of life.

The IT Market Clock is divided into quarters, each representing one of four market phases of the useful market life of an IT asset.

The Advantage quarter represents the first stage of market life, during which technologies are often proprietary or highly customised and assets provide differentiated technology, service or capability.

There will usually be limited supply options and high dependence on relevant skills. Users should focus on benefits received.

Choice is the second phase of market life, during which technology assets are subject to increasing levels of standardisation and growing supply options. Users should periodically re-evaluate the level of required customisation, prices and supply choices, as assets in this phase offer the greatest scope for cost savings.

The Cost quarter is the third phase of market life, during which assets reach their highest levels of commoditisation. Differentiation between alternative sources is at its minimum level and competition centres on price. Users should focus on acquisition and switching costs and ensure minimal skill-set dependencies.

Replacement is the final phase of market life, during which assets begin to move towards end of life, usually because they comprise legacy technologies, services or capabilities.

Supply choices and access to the relevant skill sets will be decreasing, leading to rising operational costs, so retirement or upgrade becomes essential. User organisations need to monitor operating costs closely for IT products and services in this final phase of their market life.
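
For readers who prefer to see the mechanics, the sketch below models the positioning rules just described in Python. The class, thresholds and example values are illustrative assumptions, not Gartner’s implementation.

```python
# A minimal sketch of positioning an asset on the IT Market Clock.
# Rotational position comes from where the asset sits in its useful
# market life; radius comes from its relative commoditisation.
from dataclasses import dataclass

PHASES = ["Advantage", "Choice", "Cost", "Replacement"]  # one per quarter

@dataclass
class Asset:
    name: str
    life_position: float    # 0.0 = Market Start, 1.0 = end of useful life
    commoditisation: float  # 0.0 = fully proprietary, 1.0 = fully commoditised

def clock_position(asset: Asset) -> tuple[float, float, str]:
    """Return (hour-hand position, radius, market phase) for an asset."""
    hour = asset.life_position * 12.0    # clockwise from 0 to 12 o'clock
    radius = asset.commoditisation       # distance from the centre
    phase = PHASES[min(int(asset.life_position * 4), 3)]
    return hour, radius, phase

# Example: a mature, heavily commoditised asset class (values invented)
hour, radius, phase = clock_position(Asset("desktop PCs", 0.55, 0.9))
print(f"{hour:.1f} o'clock, radius {radius}, phase: {phase}")
```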

Operating costs rise toward end of market life, highlighting a growing urgency for retirement or replacement. For example, the skills needed to support and maintain mainframes and business applications at end-of-life are in increasingly short supply.

Suppliers and buying organisations can move to offset these issues during the Replacement phase, as, for example, has happened in the UK, with leading financial institutions encouraging universities to place Assembler and Cobol (which is now 50 years old) back on their curriculums.

But while such moves can alleviate immediate problems, each initiative to extend useful life typically comes at higher cost.

Moreover, as more companies move off legacy technologies, the burden of responsibility for maintaining associated skill sets falls to a diminishing number of organisations. The marginal costs of continuing to use technologies as they approach the end of their useful lives will increase.

With a holistic decision framework, user organisations will be able to manage their asset portfolios proactively and determine the right time to adopt and deploy emerging or adolescent technology options, establish road map plans for replacement and upgrade of existing technology assets, and perform reviews with suppliers for best saving opportunities.

Although such a framework is focused on technology assets, the same approach could also be extended and applied to any class of business assets.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

The final frontier of business advantage

By Alan Cane

Published: November 26 2009 18:04 | Last updated: November 26 2009 18:04

Business intelligence, information intelligence, business analytics: whatever you call it, all the evidence is that turning a company’s raw data into information that can be used to improve performance and achieve competitive advantage is the topic du jour in many business leaders’ minds.

A survey carried out this year by the US-based consultancy Forrester Research revealed that of more than 1,000 IT decision makers canvassed in North America and Europe, more than two thirds were considering, piloting, implementing or expanding business intelligence (BI) systems.

“Even in these tough economic times, virtually nobody in our surveys says they are retrenching or reducing their business intelligence initiatives,” says Boris Evelson, a principal analyst for Forrester with more than 30 years’ experience of BI implementation behind him.

What is BI management? It is not about the technical nitty-gritty of data warehousing or cleansing technology. While technologies are important – and most are good and effective, according to Mr Evelson – BI management is about ways of systematically making the most of customer information: what it is and what you can do with it.

More prosaically, it is everything that has to be done to raw data before they can be manipulated to facilitate better decision making.

Dashboard that can give a warning light on overspending

Law firm Clifford Chance has found itself learning about habits it never knew it had since analysing its spending trends through an online service provided by Rosslyn Analytics, a boutique software company based in London, writes Dan Ilett.

“It’s very flexible,” says Julien Cranwell, Clifford Chance’s procurement manager. “You can look at your data to reduce spending. We’ve identified opportunities that we wouldn’t have otherwise seen. It’s made us feel a lot more confident of the data we’ve been using.”

The company, which has 29 offices in 20 countries, used a web-based tool called rapidintel.com. The service works like a dashboard with charts and graphs to give an overview of where money has been spent.

“It aggregates and shares information,” says Charles Clark, chief executive of Rosslyn Analytics. “We extract the data in a few hours and categorise them so they go into certain buckets. We then add other data such as credit card or risk information.

“It’s presented as a ready-to-use report. The data cube never changes but they can see it from so many different angles. It’s one view of all company-wide finance, procurement, accounts payable and spend data.”

“Some of the larger areas of spending have been travel, catering and entertainment,” says Mr Cranwell. “It shows where we have varying levels of spending between offices. We are then in a position of power because we know much more about our spending patterns.

“We’ve also looked at a cost recovery programme. Using Rosslyn’s expertise we’re using a module that works on contract management.”

The firm claims to have seen a return on investment of 100 per cent within two months. “The payback period was very fast indeed,” says Mr Cranwell.

It is also about understanding the business and its processes well enough to know what questions should be asked of the data to improve performance.

The basic idea was pioneered more than a decade ago by the US computer manufacturer Teradata, which combined supercomputer performance with sophisticated software to scan and detect trends and patterns in huge volumes of data.

But it was expensive and ahead of its time. Today, high-performance, low-cost computer systems and cheap memory mean that enterprises can and are collecting and storing data in unprecedented amounts.

However, they are struggling to make sense of what they have.

In Mr Evelson’s words: “We have to find the data, we have to extract it, we have to integrate it, we have to map apples to oranges, we have to clean it up, we have to aggregate it, we have to model it and we have to store it in something like a data warehouse.

“We have to understand what kind of metrics we want to track – times, customers, regions and then, and only then, can we start reporting.”
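
Mr Evelson’s sequence is, in effect, a description of an extract-transform-load pipeline. The sketch below illustrates those stages in miniature; the sources, field names and mapping rules are hypothetical, not drawn from any Forrester material.

```python
# Illustrative ETL-style pipeline following the stages Mr Evelson lists:
# find/extract, integrate, map ("apples to oranges"), clean, aggregate,
# then store for reporting. All sources and fields are hypothetical.
import sqlite3

def extract(sources):
    """Pull raw rows from each source system."""
    return [row for src in sources for row in src()]

def integrate_and_clean(rows):
    """Map differing schemas onto one model and clean the values."""
    cleaned = []
    for row in rows:
        region = (row.get("region") or row.get("sales_area") or "").strip().upper()
        if region:  # drop rows that cannot be mapped to the common model
            cleaned.append({"region": region, "revenue": float(row.get("revenue", 0))})
    return cleaned

def load(rows, db=":memory:"):
    """Aggregate by a chosen metric (region) and store in a mini 'warehouse'."""
    conn = sqlite3.connect(db)
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, revenue REAL)")
    conn.executemany("INSERT INTO sales VALUES (:region, :revenue)", rows)
    # Only now, with integrated and cleaned data, can reporting start.
    return conn.execute(
        "SELECT region, SUM(revenue) FROM sales GROUP BY region").fetchall()

crm = lambda: [{"region": "emea ", "revenue": "120"}]
erp = lambda: [{"sales_area": "EMEA", "revenue": "80"}, {"revenue": "5"}]
print(load(integrate_and_clean(extract([crm, erp]))))  # [('EMEA', 200.0)]
```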

Everybody agrees there is nothing simple about these operations. “It is a very complex endeavour,” says Mr Evelson, “and that is why this market is very immature.”

The business opportunity for BI software has not been lost on IT companies and there has already been significant consolidation in the market, with IBM acquiring, among others, Cognos; SAP buying Business Objects; and Oracle purchasing Hyperion to add BI strings to their respective bows.

Microsoft offers BI software called SharePoint Server and there is considerable interest in open source BI software from younger companies such as Pentaho and Jaspersoft.

IBM alone reckons to have spent $12bn and trained 4,000 consultants over the past few years to develop the tools and knowledge which will encourage intelligence management in its customers.

Ambuj Goyal, who leads the company’s information management initiative, argues that it is a new approach that will “turn the world a little bit upside down”.

“Business efficiency over the past 20 years was all about automating a process – enterprise resource planning [ERP] for example. It generated huge efficiencies for businesses but is no longer a [competitive] differentiator.

“In the past two or three years we have started to look at information as a strategic capital asset for the organisation. This will generate 20, 30 or 40 per cent improvements in the way we run businesses as opposed to the 3 or 5 per cent improvements we achieved before.”

But revolutions are rarely pain-free. According to the Forrester survey: “For many large enterprises, BI remains and will continue to be the ‘last frontier’ of competitive differentiation.

“Unfortunately, as the demand for pervasive and comprehensive BI applications continues to increase, the complexity, cost and effort of large-enterprise BI implementations increases as well.

“As a result, the great examples of successful implementations among Forrester’s clients are outnumbered by the volume of underperforming BI environments.”

In fact, more than two thirds of users questioned said they found BI applications hard or very hard to learn, navigate and use.

The business case for BI management is not helped by the difficulty of making a strong case for return on investment.

It is, for example, hard to decide which tools and processes should be included in the assessment – Microsoft’s SharePoint is much more than a BI tool, and separating out which strands are contributing to improved revenues and which are not is a challenge.

As Mr Evelson notes: “The grey boundary lines around which process and tools to include, the multiple BI components that typically need to be customised and integrated, and the frequent unpredictability of BI system integration efforts all make BI business cases an effort not for the faint of heart.”

How, then, should executives think about business intelligence management? Royce Bell, an information management specialist with the consultancy Accenture, takes a robustly pragmatic view: “Business is made up of processes. Some of them may interact with the outside world, but there is a definite chain of events.

“All that business intelligence is supposed to inform is any decision along that chain of events. The question an executive should be asking is: ‘At this point in the chain, what information do I need?’.

“Going through each and every one of your processes to be able to ask that question is hard. People are disappointed because they haven’t been able to get wisdom simply by piling all the data in one place.

“That [data warehousing and mining] sounds more exciting and more fun than going through your processes to determine what you need.”

Mr Bell believes that many executives are suspicious of the quality of the information provided by BI software: they think the data are “rubbish”, and there is no doubt that transforming data into intelligence requires clean data.

Roger Llewellyn is chief executive of the UK software group Kognitio, which has responsibility for analysing, among other things, telephone calls made by customers of British Telecom and store purchases that use the Nectar loyalty card of supermarket chain, J Sainsbury.

He says that up to 80 per cent of the price of a new contract can be the cost of cleaning the data – converting, in one case, 15 data types to a single standard.
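
To give a flavour of what such cleaning involves, the sketch below normalises one field – a date – arriving in several source formats. The formats are hypothetical examples, not Kognitio’s actual feeds.

```python
# Illustrative flavour of "converting N data types to a single standard":
# the same date arrives in several source formats and is normalised to one.
from datetime import datetime

KNOWN_FORMATS = ["%d/%m/%Y", "%Y-%m-%d", "%d %b %Y", "%Y%m%d"]

def normalise_date(raw):
    """Try each known source format and emit one ISO standard."""
    for fmt in KNOWN_FORMATS:
        try:
            return datetime.strptime(raw.strip(), fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {raw!r}")

print([normalise_date(d) for d in ["26/11/2009", "2009-11-26", "26 Nov 2009"]])
# ['2009-11-26', '2009-11-26', '2009-11-26']
```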

The Sainsbury contract involves the analysis of the 20bn items purchased in the chain’s stores every nine months – enough, if typed on paper, to make an in-tray pile almost 17km high.

How can this huge volume of bits and bytes be turned into useful information?

Mr Llewellyn gives the example of skin creams sold to counter stretch marks. These are bought predominantly by women, so if particular stores show high sales volumes, there are likely to be a lot of pregnancies in those areas – an alert for the store manager to stock up on maternity magazines, baby food and clothing.

And if most of the clothing bought is blue, there will be a lot of baby boys in the region: “From buying a jar of stretch cream, I’ve almost got you for life,” Mr Llewellyn beams.

James McGeever, chief financial officer of the US company NetSuite, which markets BI management software, underlines the importance of clean, unambiguous data in breaking down “silos” – data stored in different places and formats within an organisation: “I believe that if the same piece of data exists in two places then one will be wrong.”

The NetSuite answer for its customers is to convert all the data to one consistent type and store it in one repository: “The physical process of loading the data is not as tough as it may sound. It’s actually deciding what data to store there and how to organise your workflows that is the difficult part.”
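
Mr McGeever’s point lends itself to a simple illustration: when the same field turns up in two systems with different values, a consolidation step should surface the conflict rather than silently keep both copies. The sketch below assumes hypothetical systems and fields; it is not NetSuite’s implementation.

```python
# Illustrative sketch of the "one repository" principle: when the same
# customer field exists in two systems, flag the conflict rather than
# keeping two copies, one of which will be wrong.
def consolidate(records_by_system, key="customer_id"):
    repository, conflicts = {}, []
    for system, records in records_by_system.items():
        for rec in records:
            master = repository.setdefault(rec[key], {})
            for field, value in rec.items():
                if field in master and master[field] != value:
                    conflicts.append((rec[key], field, master[field], value))
                else:
                    master[field] = value
    return repository, conflicts

repo, conflicts = consolidate({
    "billing":   [{"customer_id": 7, "email": "a@example.com"}],
    "marketing": [{"customer_id": 7, "email": "b@example.com"}],
})
print(conflicts)  # [(7, 'email', 'a@example.com', 'b@example.com')]
```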

NetSuite provides executives with tailored “dashboards”, a visual representation of the information important to their jobs.

A well-designed dashboard providing the right amount of pertinent information is a crucial part of BI, according to Peter Lumley and Stephen Black of PA Consulting.

They point out that it is often forgotten that managers have limited time to absorb and act on information which, in any case, may be imperfect – if it were perfect, decision making would be no chore at all. A well-designed dashboard can help managers make the best possible decision from incomplete information.

The information, of course, has to be trusted and that is where technology can play an important part – in the automatic roll-up of data to a central repository: “Every time you go through a stage with manual intervention you have the opportunity for time delay and misinterpretation,” Mr Lumley argues.

And these mis-steps are precisely what business intelligence management hopes to avoid.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Resources: Finding a home for all that data

By Stephen Pritchard

Published: November 26 2009 18:04 | Last updated: November 26 2009 18:04

When companies started to build the first enterprise data warehouses and knowledge management systems, in the late 1970s, there was little doubt that these were projects that demanded significant investment in both time and resources.

The early data warehouse systems certainly required mainframe resources, and running queries took days, if not weeks.

But advances in computing power, as well as improvements in programming, have done much to reduce the infrastructure demands of business intelligence (BI). It is now quite possible to run small-scale BI queries using little more than a data source, a laptop computer and a spreadsheet program.

Some businesses – especially smaller ones – do indeed manage their data analysis this way.

However, BI experts caution that this approach struggles to scale up to support the larger enterprise, and can raise real difficulties in areas such as data governance and lead to companies having multiple master data sets, or “multiple versions of the truth”.

“Many people start with something small in scope, and there is nothing wrong with that,” says Jeanne Harris, a BI specialist at Accenture’s Institute for High Performance Business.

“But if marketing, and finance, and sales have their own scorecards, based on their own data, it will be a Tower of Babel. Very few organisations have done a good job of creating a single view of their data.”

Nor is the hardware challenge one that chief information officers – or users of business data – can completely ignore.

Although processing power has increased in line with Moore’s Law and data storage has also fallen in price, the growth of business data is faster still. Volumes of data are reckoned to double every 12 to 18 months, twice as fast as just three years ago.

Some businesses are reacting by moving to grid-based supercomputers, or by offloading BI processing to private or public “clouds”. Others are deploying solid-state drives in their data warehouses, because of the superior data throughput they offer.

But such systems are expensive and large organisations, in particular, are beginning to struggle with the time it takes to load data into a warehouse or a BI system, especially if it comes from multiple sources.

“With data warehousing appliances [dedicated computers for data processing], the bottleneck is not the speed of the box or the quantity of storage but the time it takes to load the information, especially if you are dealing with demographic information,” says Bill Hewitt, president and chief executive of Kalido, a data management company.

“Even at data loading rates of 10 gigabytes an hour, there is one company that is looking at 39 weeks to load its data.”
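
The arithmetic behind that figure is easy to verify: at 10 gigabytes an hour, 39 weeks of round-the-clock loading implies a data set in the region of 65 terabytes.

```python
# Back-of-envelope check of the loading figure quoted above.
rate_gb_per_hour = 10
hours = 39 * 7 * 24              # 39 weeks of round-the-clock loading
total_gb = rate_gb_per_hour * hours
print(hours, total_gb / 1024)    # 6552 hours, ~64 terabytes
```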

This is leading some companies to consider alternative approaches to analytics, such as stream-based processing. It is also prompting businesses to look at BI tools, as well as broader-based technologies such as enterprise search, that can examine data in situ, rather than require them to be loaded into a warehouse and then processed.

Such technologies could also help businesses to overcome their reliance on data from operational systems, such as customer relationship management or enterprise resource planning. Such transactional data are almost always historic, and lead to BI acting as a “rear view mirror” for management, rather than as an accurate predictor of trends.

“Most organisations don’t use external data but rely on [data from] their operational systems to solve specific problems,” explains Earl Atkinson, a BI expert at PA Consulting Group. As a result, the data will only be as good – and as timely – as the information held in those underlying systems.

Before companies can build enterprise-wide knowledge management or BI systems, they also need to work on the quality of the data. Data can be accurate but partial, or misleading, especially if they were originally gathered for a different purpose.

“A customer, for example, can exist in multiple IT systems,” points out Tony Young, CIO of Informatica, a data management technology vendor. “You need to have a common agreement on who the customer is, for example, if you want to look at their history.

“If I ask a financial person who the customer is, it is the person you bill. Marketing will say it’s the person who responds to a campaign. For sales it might be the person signing the cheque. These are all correct, but they are not common. You have to agree how you are going to treat that information.”
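
Mr Young’s example can be made concrete with a small sketch: three departments each hold a “correct but not common” view of the customer, and an agreed rule decides which becomes the customer of record. The rule and names below are assumptions for illustration only.

```python
# Three departmental views of the "same" customer, all correct but not
# common, reconciled by one agreed rule. Names are hypothetical.
views = {
    "finance":   {"customer": "Acme Ltd (billed party)", "bills_to": "Acme Ltd"},
    "marketing": {"customer": "J. Doe (campaign respondent)"},
    "sales":     {"customer": "A. Smith (cheque signatory)"},
}

def common_customer(views):
    """Agreed rule (an assumption here): the billed party is the customer
    of record; the other views are kept as roles, not discarded."""
    record = {"customer_of_record": views["finance"]["bills_to"], "roles": {}}
    for dept, view in views.items():
        record["roles"][dept] = view["customer"]
    return record

print(common_customer(views)["customer_of_record"])  # Acme Ltd
```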

This, more than hardware assets, network capacity, or even the ability to write complex algorithms to analyse data, goes to the heart of the debate around the resources needed for advanced business intelligence.

Organisations need to decide, early on, which information they are going to use, and be honest about the completeness, or otherwise, of their data sets.

If they do not, the results can be disastrous.

“In the run up to the financial crisis, institutions knew that there were three categories of risk but they only had data for one. So that was the one they thought about,” says Accenture’s Ms Harris. “You need to understand all of the risk variables and how they relate to each other, and this needs different technologies and capabilities in modelling, and in experimental design.”

Organisations also need to consider whether conventional data sources, such as those produced by back-office IT applications, or by more specialist tools, such as a retail point-of-sale system or a supply chain management system, really give the full picture.

Increasingly, companies are looking for ways to mine the information held in “unstructured” data, such as e-mails, presentations and documents, or even video clips or recorded phone calls, to provide a basis for BI, and hence better decision making.

“As much as 80 per cent of the information in a company is unstructured, against just 20 per cent that is structured,” notes Bob Tennant, chief executive at Recommind, a company that specialises in using search technology for information risk management.

“Most business intelligence is focused on that 20 per cent of structured data, as it is pretty high value and easy to deal with. But there are a lot of useful, unstructured data that are not being taken advantage of.”

Tapping into that unstructured information might not be easy. But it is the best, and for some companies, probably the only way to make more use of existing resources, in order to make better business decisions.

..............................................................................................

Q&A: ING Lease UK

ING Lease UK is part of the ING Group – one of the largest financial companies in the world. In 2004, the company acquired three businesses from Abbey National Group.

With 300 employees and 100,000 customers, the company has to ensure its reporting and market perception are as accurate as they can be.

Dan Ilett, for Digital Business, questioned Chris Stamper, chief executive of ING Lease UK, about how it creates useful intelligence from its information.

Digital Business What did you do to improve internal reporting?

Chris Stamper We turned conventional wisdom on its head. We found a tool that allowed the business to assemble all information from disparate data sources into one platform. This allowed us to make decisions in real time.

We ignored the “start small and learn” approach and took the “start big and understand” approach by focusing on the most fundamental question we needed answering which was “where do we make our profit and why?”.

DB What has been your return?

CS As an example, analysis of secondary income opportunity has driven £600,000 ($997,091) of additional annual income.

DB How has using “internal” business intelligence helped?

CS First, it has given us the ability to make decisions based on fact rather than intuition or perception and has provided complete transparency when understanding profit and loss levers.

We have now moved to a “nowhere to hide from the facts” culture, the IT department has been removed from the critical path to information and everyone in the organisation has access to answers. This encourages collaboration and end-to-end thinking.

DB What lessons did you learn from this? What would you tell others to do?

CS That perception-based decision making is a characteristic of sales-led organisations. That culture can be very quickly moved with the right tools and environment.

We now have a strong focus on real data quality.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Displaying the intelligence: Search goes on for a ‘single view of the truth’

By Ross Tieman

Published: November 26 2009 18:04 | Last updated: November 26 2009 18:04

The idea that you can keep tabs on how an organisation is performing from a desktop display while also focusing on its strategic direction is hugely appealing.

Every day, many of us do precisely this in a car: the dashboard monitors its systems and speed, while helping the driver safely negotiate the obstacles of a journey. Could similar displays not help in running a company, a sales department, or a group of hospitals?

In theory, they can.

Most industrial processes today are run by mouse-clicks – from nuclear power stations to cloth-cutting machines. Corporate systems store every digit of data created, whether by the sales staff logging their calls, the accounts clerks issuing invoices, the machines doing the manufacturing or the purchasing manager placing orders for materials.

Yet these glorious, information-rich data are so often compartmentalised in fragmented systems, each designed to serve a particular business or organisational function. Bolting them together to turn data into information about corporate or organisational performance can be an IT chief’s nightmare.

It might seem as though a few wires and some simple software could enable data to flow seamlessly between systems, enabling the chief executive to see the basics, such as sales, deliveries, and how much cash the business is using, when they log on in the morning.

Yet Bill Fuessler, IBM Global Financial Management Lead for business consulting, says this can prove stunningly difficult. “One of the biggest issues is getting commonality of data definition,” he says. “And that problem will last for several years more.”

Standards, and even digital definitions of commonplace business words, may differ in the sales department from those used in marketing, or finance. Combine the data sets, and the “information” simply doesn’t add up. What chief executive would drive a car whose dashboard said it might – or might not – be overheating?

Software companies, however, understand the issues and are working hard on how to extract information from data and reach what Richard Neale, marketing director of SAP BusinessObjects, calls “a single view of the truth”.

For mid-sized companies unencumbered by a long tail of legacy systems and data, or those willing to start again at square one, there are software-as-a-service specialists, such as NetSuite, capable of providing a state-of-the-art system containing every byte of corporate data, fully integrated, on a common set of definitions, accessible at will.

But abstracting information for a corporate, not-for-profit, or even public sector dashboard display is also attainable.

First, you have to discover who wants, or needs, to know what.

In a car there is a speedometer and a fuel gauge, possibly with information on fuel consumption, or distance until you next need to fill the tank. But most of the other dashboard data are displayed only if needed, as an alert – such as when the cooling system fails or a seat-belt is unbuckled.

Business intelligence displays need to follow the same precepts. They have to provide appropriate “mission critical” information for all; to enable users to call up information relevant to their role or task; and to provide appropriate alerts when things go wrong. There is no one-size-fits-all system.

In a car, every driver is engaged in a similar task, but in a company, some users – typically the chief executive or finance chief – need access to a broad range of information, while a departmental head might be interested in particular sub-sets of data. Almost everybody also needs alerts relating to their own areas of responsibility.
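
Those precepts translate naturally into configuration. The sketch below shows one hypothetical way to express them – shared mission-critical metrics, role-specific panels and alerts scoped to each role’s responsibilities; it does not reflect any particular vendor’s product.

```python
# Illustrative dashboard configuration following the precepts above:
# shared "mission critical" metrics, role-specific panels, and alerts
# scoped to each user's area of responsibility. Names are hypothetical.
MISSION_CRITICAL = ["sales", "deliveries", "cash_balance"]

ROLE_PANELS = {
    "chief_executive": ["sales", "deliveries", "cash_balance", "headcount"],
    "finance_chief":   ["cash_balance", "receivables"],
    "it_head":         ["system_uptime", "it_budget_spend"],
}

ALERT_RULES = {
    "finance_chief": [("cash_balance", "below", 1_000_000)],
    "it_head":       [("system_uptime", "below", 99.5),
                      ("it_budget_spend", "above", 250_000)],
}

def dashboard_for(role):
    """Combine shared metrics, role panels and that role's alert rules."""
    panels = dict.fromkeys(MISSION_CRITICAL + ROLE_PANELS.get(role, []))
    return {"panels": list(panels), "alerts": ALERT_RULES.get(role, [])}

print(dashboard_for("it_head"))
```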

That information, as distinct from data, may have to reach them wherever they are. Mr Neale, at SAP, says that increasingly, dashboards are being delivered not just on desktops, but on mobile devices, including smartphones.

The latest generation of SAP BusinessObjects software enables users to have “widgets” on their desktops that highlight particular features of organisational performance.

It can also deliver a sophisticated alert to a smartphone, as a graphic display that enables the user to “mine” the information, calling up detail to establish the nature and cause of the problem to which they are being alerted. An alert could relate to inventory levels, risk, cash balances or even a cost or time over-run on a project.

That list highlights the importance of delivering relevant information to the responsible individual. To be valuable, it has to contain signals that the recipient may need to act upon. The IT boss may need to know if the system is likely to crash, but it’s the finance director who cares about the cash balances, while the IT department overrunning its budget may matter to both.

The desktop remains the presentation location of choice because the size of its display permits a lot of information to be shown.

Historically, many organisations have relied on Excel spreadsheets or Microsoft Office tools to present business information to users.

Today, using modern software, the information can be displayed in the form of gauges, pie-charts, graphs, thermometers, heat-maps – just about any format the user prefers.

What business intelligence data add is the ability to explore the information easily with mouse clicks to discover what happened, where, and why.

A typical NetSuite display is presented on a series of tabs, with pages that might include a meter, top selling items as a bar chart, key performance indicators that provide pop-up graphs, and comparative sales as a chart with variable time-spans. If you have reliable real-time data, you can sort and display it any way you like.

As IBM’s Mr Fuessler says, if a retail company’s sales fall, it is handy to be able to uncover quickly that it happened because of a holiday in Boston that closed three stores, for example, and is not the start of an alarming trend. Inadequate information can lead to false conclusions.

Nigel Rayner, research vice-president at Gartner, says: “When you get the dashboard in, that is when you start to get awkward questions. The chief executive can see revenue is going down, or up, but doesn’t know why. Dashboards are always about reporting. They don’t help you make decisions.”

By definition, dashboards only present current or historic data. But decision-makers want to be able to predict the future. People running large companies, public-sector organisations and even not-for-profits want the IT equivalent of the forward-looking radar that some car-makers have trialled.

As Mr Rayner says: “You need more performance management applications to help people model options.” This is where a lot of corporate IT investment is now going, he says.

But if you are going to start making decisions about business strategy based upon conclusions drawn from computer software you need clean data, and answers to current questions, rather than whatever the system was set up to measure five years ago.

“Most organisations have far too many metrics, without being able to plot cause and effect relationships,” Mr Rayner says. “These are pure business problems, and more technology is not the answer.”

So departmental bosses have to sit down together and agree the questions they want answered, and what they want to measure to get them.

To move from mere dashboards to directing the course of an organisation by drawing on all the information squirreled away within its systems, Mr Rayner outlines a four-stage process. Start by monitoring performance, set up an enterprise metric framework, and add analytic and modelling capabilities with performance management applications. Only then, he says, can you go on to develop a pattern-based business strategy.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Wednesday, November 25, 2009

Gartner: The top 10 mobile applications for consumers in 2012

The IT market research and advisory company Gartner has identified the 10 mobile applications that will be most important for consumers in 2012.

"Mobilanwendungen und Services für Consumer sind nicht mehr nur die Domäne der Mobilfunkbetreiber", kommentiert Sandy Shen, Research Director bei Gartner. "Das wachsende Interesse der Verbraucher an Smartphones, das Engagement der Internet-Player im Mobil-Bereich sowie die Entstehung von Application Stores und branchenübergreifenden Services reduziert die Dominanz der Mobilfunkbetreiber. Jeder Marktteilnehmer hat Einfluss darauf, wie die Anwendung zum Kunden kommt und von ihm wahrgenommen wird. Und die Kunden treffen mit ihrer Aufmerksamkeit und ihrer Kaufkraft die letzte Entscheidung."

From the original press release:

The top ten consumer mobile applications in 2012 will include:

No. 1: Money Transfer

This service allows people to send money to others using Short Message Service (SMS). Its lower costs, faster speed and convenience compared with traditional transfer services have strong appeal to users in developing markets, and most services signed up several million users within their first year. However, challenges do exist in both regulatory and operational risks. Because of the fast growth of mobile money transfer, regulators in many markets are piling in to investigate the impact on consumer costs, security, fraud and money laundering. On the operational side, market conditions vary, as do the local resources of service providers, so providers need different market strategies when entering a new territory.

No. 2: Location-Based Services

Location-based services (LBS) form part of context-aware services, a service that Gartner expects will be one of the most disruptive in the next few years. Gartner predicts that the LBS user base will grow globally from 96 million in 2009 to more than 526 million in 2012. LBS is ranked No. 2 in Gartner’s top ten because of its perceived high user value and its influence on user loyalty. Its high user value is the result of its ability to meet a range of needs, ranging from productivity and goal fulfilment to social networking and entertainment.

No. 3: Mobile Search

The ultimate purpose of mobile search is to drive sales and marketing opportunities on the mobile phone. To achieve this, the industry first needs to improve the user experience of mobile search so that people will come back again. Mobile search is ranked No. 3 because of its high impact on technology innovation and industry revenue. Consumers will stay loyal to some search services, but instead of sticking to one or two search providers on the internet, Gartner expects loyalty on the mobile phone to be shared between a few search providers that have unique technologies for mobile search.

No. 4: Mobile Browsing

Mobile browsing is a widely available technology present on more than 60 per cent of handsets shipped in 2009, a percentage Gartner expects to rise to approximately 80 per cent in 2013. Gartner has ranked mobile browsing No. 4 because of its broad appeal to all businesses. Mobile web systems have the potential to offer a good return on investment. They involve much lower development costs than native code, reuse many existing skills and tools, and can be agile – both delivered and updated quickly. Therefore, the mobile web will be a key part of most corporate business-to-consumer (B2C) mobile strategies.

No. 5: Mobile Health Monitoring

Mobile health monitoring is the use of IT and mobile telecommunications to monitor patients remotely, and could help governments, care delivery organisations (CDOs) and healthcare payers reduce costs related to chronic diseases and improve the quality of life of their patients. In developing markets, the mobility aspect is key as mobile network coverage is superior to fixed network in the majority of developing countries. Currently, mobile health monitoring is at an early stage of market maturity and implementation, and project rollouts have so far been limited to pilot projects. In the future, the industry will be able to monetise the service by offering mobile healthcare monitoring products, services and solutions to CDOs.

No. 6: Mobile Payment

Mobile payment usually serves three purposes. First, it is a way of making payment when few alternatives are available. Second, it is an extension of online payment for easy access and convenience. Third, it is an additional factor of authentication for enhanced security. Mobile payment made Gartner’s top ten list because of the number of parties it affects – including mobile carriers, banks, merchants, device vendors, regulators and consumers – and the rising interest from both developing and developed markets. Because of the many choices of technologies and business models, as well as regulatory requirements and local conditions, mobile payment will be a highly fragmented market. There will not be standard practices of deployment, so parties will need to find a working solution on a case-by-case basis.

No. 8: Mobile Advertising

Mobile advertising in all regions is continuing to grow through the economic downturn, driven by interest from advertisers in this new opportunity and by the increased use of smartphones and the wireless Internet. Total spending on mobile advertising in 2008 was $530.2 million, which Gartner expects to grow to $7.5 billion in 2012. Mobile advertising makes the top ten list because it will be an important way to monetise content on the mobile internet, offering free applications and services to end users. The mobile channel will be used as part of larger advertising campaigns in various media, including TV, radio, print and outdoors.

No. 9: Mobile Instant Messaging

Price and usability problems have historically held back adoption of mobile instant messaging (IM), while commercial barriers and uncertain business models have precluded widespread carrier deployment and promotion. Mobile IM is on Gartner’s top ten list because of latent user demand and market conditions that are conducive to its future adoption. It has a particular appeal to users in developing markets that may rely on mobile phones as their only connectivity device. Mobile IM presents an opportunity for mobile advertising and social networking, which have been built into some of the more advanced mobile IM clients.

No. 10: Mobile Music

Mobile music so far has been disappointing – except for ring tones and ring-back tones, which have turned into a multibillion-dollar service. On the other hand, it is unfair to dismiss the value of mobile music, as consumers want music on their phones and want to carry it with them. Various players are coming up with innovative models, such as device or service bundles, to address pricing and usability issues. iTunes makes people pay for music, which shows that a superior user experience does make a difference.

Further information can be found in the original press release (see above).

25.11.2009, Sabine Minar, Text 100 GmbH

Thursday, November 12, 2009

How texting could transform bank services

By Peter Tanner, managing director of Boomerang SMS Solutions

Published: November 12 2009 17:48 | Last updated: November 12 2009 17:48

Growing numbers of banks and financial institutions are adopting text messaging as part of a raft of measures designed to improve customer communication, enhance service levels and attain competitive advantage.

However, the constraints of traditional text technology have limited the range of services that can be delivered to customers.

Using an auditable, two-way texting solution, however, will enable banks to transform the relevance and quality of their customer service, from ordering new cheque books to checking transaction patterns in a bid to reduce the impact of fraud.

Critically, I believe that by integrating this solution into core banking applications, workflow can be automated, significantly reducing costs by removing the need for manual intervention.

Financial institutions are looking to transform customer interaction with new innovative services and a wider range of communication options. For these institutions, however, economic pressures dictate that such services must be delivered without big investment or ongoing costs. The delivery method must also be simple and widely available to ensure banks can reach as many customers as possible.

As a result, growing numbers of banks recognise that investing in SMS offers excellent value, while enhancing the quality of the service provided. Quick, simple and used by the vast majority of customers, SMS is a useful tool to update customers on account balance, for example, or raise an alert for unusual transaction patterns.

However, this method of communication is still one-dimensional: traditional SMS technologies do not enable a customer’s reply to trigger action. If there is a problem that demands a response from the customer, such as confirming whether a transaction is fraudulent, the bank will be burdened by the time and cost associated with manually handling that customer response, whether at the call centre or in branch.

Next-generation technology, however, can guarantee that multiple outbound messages are specifically matched with their appropriate responses. This is key, as it enables banks to integrate SMS reliably into their workflow processes, transforming the potential range and nature of services available to customers.

Automating the production of texts, just as standard letters are produced today, and triggering database actions on the basis of a customer SMS response eliminates the need for manual intervention at local branches or the call centre, greatly reducing the administrative burden.

For example, a bank sends a text to a customer reporting a suspicious transaction and the customer’s response is automatically recognised by the core software. If the customer responds “Yes” to the question “Is this transaction genuine?”, the system will process the transaction as usual. If the response is “No”, the database will suspend the account and move automatically into its anti-fraud process.

Critically, as long as there is no problem, the bank will need to undertake no manual administrative process: the entire process is handled automatically by the system, providing a quicker, more efficient and less costly means of communicating with the customer.
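
The article does not describe the vendor’s matching mechanism, but one common way to guarantee that a reply is tied to the right outbound message is a per-message correlation token, as in the hypothetical sketch below.

```python
# Illustrative sketch of matching inbound replies to outbound texts and
# triggering workflow automatically. A correlation token per outbound
# message is one possible approach; the vendor's actual mechanism is not
# described in the article. All names are hypothetical.
import uuid

outstanding = {}  # reply token -> (account_id, transaction_id)

def send_fraud_query(account_id, transaction_id, send_sms):
    """Text the customer and remember which transaction the query covers."""
    token = uuid.uuid4().hex[:8]
    outstanding[token] = (account_id, transaction_id)
    send_sms(account_id,
             f"Is this transaction genuine? Reply YES {token} or NO {token}")

def handle_reply(text, process, suspend_and_investigate):
    """Match the reply to its outbound message and act without manual steps."""
    answer, _, token = text.strip().partition(" ")
    if token not in outstanding:
        return  # unmatched or stale reply: route to manual handling
    account_id, transaction_id = outstanding.pop(token)
    if answer.upper() == "YES":
        process(transaction_id)              # transaction proceeds as usual
    else:
        suspend_and_investigate(account_id)  # anti-fraud process begins
```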

With 80 per cent of texts being received within 60 seconds, this full-circle texting technology provides the fastest way to communicate efficiently with customers.

Critically, these messages are inherently secure; texts are extremely hard to intercept and, in the unlikely event that a phone is stolen, actions such as money transfers can be additionally secured via the use of variable PIN codes.

Fears of mobile phishing can also be allayed through the use of specific text number ranges by the bank and supported by additional personal information.

For customers, the appeal of a two-way text solution is clear. Information from the bank is received instantly, irrespective of location, and where a response can be made by SMS, the inconvenience of a lengthy phone call or branch visit is avoided.

The two-way approach also enables customers to access a range of services offered by the banks, starting perhaps with simple options such as a text-based chequebook ordering process.

Indeed, further down the line customers may well be willing to pay for some of these more sophisticated services, such as potential fraud alerts or notification of nearing overdraft limits.

For the customer travelling abroad, the fact that the bank raises a text-based alert of an overseas transaction provides a high level of confidence. The ability to confirm by text, within seconds, that the transaction is genuine removes the risk of the account being suspended, an inconvenient by-product of today’s transaction-tracking technology.

If the transaction is fraudulent, the immediacy of the communication and the automated suspension of the account through core systems boost customer confidence while minimising the customer’s exposure to financial distress.

Indeed, the provision of real-time transaction information via text improves confidence in the quality of service and puts the customer in control. It can also be applied to a range of financial services. From loan applications to insurance policy renewals, and on to the added-value services increasingly offered by card providers, such as booking flights, financial institutions can empower customers to take control of their finances.

Those financial services organisations that have already embraced texting to improve customer services are providing better, more immediate information. But the next generation of texting technology enables banks to transform the quality and immediacy of these services.

Critically, by fully integrating this technology into core applications, banks can achieve this transformation in service and communication while also streamlining processes, increasing automation and driving down manual intervention, yielding significant cost benefits.

By closing the loop with two-way SMS communication, tightly integrated with core systems, financial institutions can improve customer service while driving down administration overheads and reducing the financial and personal impact of fraudulent transactions on both the institution and the customer.

Copyright The Financial Times Limited 2009.

Monday, November 02, 2009

Did IT Work? BPM is finally aligning business and IT

Did IT Work? BPM is finally aligning business and IT
By Stephen Pritchard

Published: November 2 2009 16:44 | Last updated: November 2 2009 16:44

Ensuring that IT is in step with the business is a constant challenge, and any tool that allows applications to be developed for the business quickly, using terminology that line-of-business managers understand, will find a ready market.

One such technology – perhaps the only such technology – is business process management. BPM is not specifically an IT term: rather it is a management practice that sets out to look at how a business runs its processes, improve them, and ensure that the company then keeps running according to that best practice.

IT’s role in BPM is most often associated with a set of development tools that translate business processes or workflows into software. Usually these tools work by modelling the business process visually, so that both IT and non-IT people can see how the application will work.

Once the workflow has been captured, the BPM tool then produces the code for the new software application in a semi-automated way. The idea is to speed up development times, and even allow non-IT specialists to develop quite complex business applications.
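As a rough, hypothetical illustration of what sits beneath the visual model: the captured process often boils down to a declarative definition, a graph of tasks and transitions, that a generic engine executes, which is why little or no application code needs to be written by hand. Real BPM suites add forms, integration adapters and audit trails; the sketch below shows only the core idea.

```python
# Hypothetical sketch: a business process expressed as data (as a visual BPM
# tool might export it) plus a tiny generic engine that executes it.

def log_request(state):
    print("Request logged:", state["request"])
    return "done"

def manager_approval(state):
    # A real system would pause here for human input; this stub reads state.
    return "yes" if state["amount"] < 500 else "no"

def fulfil(state):
    print("Request fulfilled")
    return "done"

def reject(state):
    print("Request rejected")
    return "done"

# The "model": each node names a task and where each outcome leads.
PROCESS = {
    "start":   {"task": log_request,      "next": {"done": "approve"}},
    "approve": {"task": manager_approval, "next": {"yes": "fulfil", "no": "reject"}},
    "fulfil":  {"task": fulfil,           "next": {"done": None}},
    "reject":  {"task": reject,           "next": {"done": None}},
}

def run(process, state, step="start"):
    """Walk the process graph, letting each task's outcome pick the next step."""
    while step is not None:
        node = process[step]
        outcome = node["task"](state)
        step = node["next"][outcome]

run(PROCESS, {"request": "new laptop", "amount": 350})
```

Changing the process then means editing the model, not rewriting application code, which is what makes the iterative, business-facing conversation possible.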

Such is the appetite for business process improvements that companies are continuing to invest in BPM, despite the strictures being placed on other parts of the IT budget.

According to industry analysts Gartner, 37 per cent of companies in North America and western Europe are either thinking about investing in BPM or have already done so.

One of the attractions of BPM, says Michele Cantara, vice president in Gartner’s business of IT research division, is that projects do not have to be on a very large scale in order to produce a return on investment. “Half of the companies in our BPM awards broke even in the first year,” she explains. “These are not large, intergalactic projects. In terms of project costs, the budget is usually in the range of $400,000 to $600,000.”

Often, BPM projects will be significantly smaller than that. As Ian Gotts, chief executive of Nimbus Partners, a BPM vendor, points out, early stage projects are often in the £30,000 to £50,000 range. This can extend to multi-million pound projects with a two to three year implementation period in industries such as the utilities “where the business case justifies it”.

However, both vendors and analysts agree that early-stage BPM works best where the tool is used to capture a structured business flow with well-defined information. It becomes more difficult to model business flows that depend heavily on human decision-making or judgments, or where information is contained in documents or media files rather than databases.

“Companies focus on process improvements, and so they look [first] at documented or automated processes,” says Ms Cantara. “They don’t necessarily look at human tasks that are part of informal work processes; they don’t necessarily look at processes that are more ‘squishy’, ad hoc or collaborative, that might vary from individual to individual or situation to situation.”

None the less, companies are finding that business process management is enabling them to tackle projects more quickly and efficiently than before.

“Modern BPM is a tool that enables a different conversation with the business. It is a visual tool that lets you build both complex and simple business processes in a very visual way,” says Toby Redshaw, CIO of Aviva, the insurance company.

Aviva currently has 23 live BPM projects. One, the “Joiners, movers and leavers” system, tracks staff across their time with Aviva, covering both HR and access to information and systems. It was built in less than 12 weeks using BPM tools from Lombardi.

“It is an important project from an HR but also a controls perspective,” says Mr Redshaw. “We took a process that is complex and difficult, and we delivered in eight weeks with three weeks testing.”

Mr Redshaw believes that development through BPM is, on average, three times faster than conventional development, and business users are more satisfied with the results.

“They say ‘you IT monkeys finally sent us people who speak our language’,” he says, although he points out that the effectiveness of BPM really comes from the more visual and iterative methods it forces upon both business and IT teams.

BPM was also the route taken by another insurance company, Skandia, when it came to modernising its workflow for handling customer correspondence. Originally, staff would log incoming post into a database, which created work items for distribution to departments. These were then transferred to a number of end-user systems.

By using BPM, Skandia was able to centralise its processes into a single workflow system and remove many laborious administrative tasks, freeing employees to spend more time with customers.

“Workflow [a BPM product from vendor Tibco] has automated that,” explains Tim Mann, platform development director at Skandia. “The post is scanned in, and the workflow system knows where to send it. It tells a supervisor [which tasks are waiting] and moves on. It has replaced several end-user systems and human supervision.”
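The mechanics behind such a system can be pictured as a classify-and-queue loop: each scanned item is matched against routing rules, placed on the appropriate team’s work queue, and surfaced to a supervisor as a count of waiting tasks. The sketch below is an invented illustration of that loop, not a description of Tibco’s product; the keywords and team names are hypothetical.

```python
# Invented illustration of the classify-and-queue idea behind a workflow
# system for scanned post. Keywords and team names are hypothetical.

from collections import defaultdict

ROUTING_RULES = {
    "claim": "claims_team",
    "change of address": "customer_records",
    "surrender": "policy_admin",
}

queues: dict[str, list[str]] = defaultdict(list)

def route(document_text: str) -> str:
    """Queue a scanned item for the first team whose rule matches."""
    text = document_text.lower()
    for keyword, team in ROUTING_RULES.items():
        if keyword in text:
            queues[team].append(document_text)
            return team
    queues["manual_review"].append(document_text)  # no rule matched
    return "manual_review"

def supervisor_view() -> dict[str, int]:
    """What a supervisor sees: how many items each team has waiting."""
    return {team: len(items) for team, items in queues.items()}

route("Please note my change of address ...")
route("I wish to make a claim on policy 123")
print(supervisor_view())  # {'customer_records': 1, 'claims_team': 1}
```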

Improvements to local workflow methods are saving Skandia £250,000 a year, and rolling the system out to 10 customer services teams equates to £300,000 in productivity savings. Increasing business volumes at the insurer meant there were no job losses, but the business is “doing more with the same resources”.

In addition, Skandia expects to save £150,000 annually by reducing its reliance on paper, bringing lower costs for printing and also transport and storage.

A further benefit, Mr Mann suggests, comes in the form of improved staff satisfaction. “Staff feel more engaged,” he says. “A lot of the work, when it was paper based, was repetitive. Workflow helps them get through it, and allows them to spend more time dealing with policyholders by e-mail or on the phone. Good customer service is about having good people on the phone; that is where we add value.”

Skandia’s experience supports the argument that BPM can work for smaller, more localised projects as well as for larger, business-wide projects. “A lot of it is about streamlining processes, and changing the way you are working, layering technology over the top and making it much more smooth and efficient,” says Mr Mann.

Copyright The Financial Times Limited 2009.