Wednesday, December 09, 2009

Digital Business

Digital Digest – Managing Intelligence
In this multi-media Digital Business digest we examine how organisations can gather information, analyse it, and serve it up in a meaningful, usable form

Video: business intelligence in action plus panel discussions
Podcast: disparate sources – how to use data from a decentralised business in several languages

The days of the Next Big Thing could be over
Maybe there will be no one idea or invention, but a wave of disruptive technologies

Does IT work? Monitoring staff requires care
Tracking workers via mobile devices raises privacy concerns

Enterprise 2.0 is vital for business
Real benefits await successful adopters of new online tools

Does IT work? Monitoring staff requires care
By Stephen Pritchard

Published: December 9 2009 16:29 | Last updated: December 9 2009 16:29

New devices and faster networks are driving up productivity by giving mobile workers direct access to corporate e-mail and applications on the move.

Analysis by Research in Motion, maker of the BlackBerry, found improvements in productivity in field service and sales of more than 20 per cent – the equivalent of an additional customer visit each day.

But managing an increasingly mobile workforce poses challenges for businesses.

Tools for managing the mobile devices themselves, such as the BlackBerry Enterprise Server, Microsoft’s System Center Mobile Device Manager (SCMDM), or LogMeIn Mobile are now reasonably mature and give strong levels of control over device content management and security. But managing the staff using the devices is more complicated, and more controversial.

Smartphone and personal digital assistant technology allows businesses to monitor where employees are, at any time, via GPS (global positioning system) chips.

With more smartphones and PDAs now offering GPS to support mapping and navigation software, businesses can tap into the data via specialist software that reports employees’ locations by linking location data to a business application, or through fleet management and tracking systems.

Businesses can also monitor their employees much more accurately by looking at the workflow information produced by mobile versions of CRM, salesforce automation, or other enterprise applications.

Monitoring technology, though, raises concerns about employee privacy, as well as the impact such data collection has on workforce autonomy, incentives, and management practice.

Although the technology exists to track exactly where someone is, if not what they are doing, it is often a poor substitute for supervision by experienced foremen and managers.

“Workforce tracking is a natural outgrowth of knowing where your assets are,” says Kevin O’Marah, chief strategy officer at AMR Research, an analyst company that specialises in technology for vertical markets such as retail, distribution and manufacturing.

“Tracking [individual] people is much more sinister, but the technology makes it very obvious where people are. Most of the value in track-and-trace comes from tracing assets such as trucks, and from areas such as speed monitoring. You can find out if a truck has been racing along at 85 miles per hour, and then the driver took a long break. Companies care because of fuel efficiency.”
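
As an illustration of the track-and-trace value Mr O’Marah describes, here is a minimal sketch that scans a log of timestamped GPS speed readings for speeding events and long stationary stretches. The log, thresholds and field layout are invented for the example; a production fleet system would read these from telematics feeds.

```python
from datetime import datetime, timedelta

# Hypothetical log: (timestamp, speed in mph) readings from a truck's GPS unit.
log = [
    (datetime(2009, 12, 9, 8, 0), 62),
    (datetime(2009, 12, 9, 8, 15), 85),   # racing along
    (datetime(2009, 12, 9, 8, 30), 87),
    (datetime(2009, 12, 9, 9, 0), 0),     # stopped
    (datetime(2009, 12, 9, 10, 45), 0),   # still stationary
    (datetime(2009, 12, 9, 11, 0), 55),
]

SPEED_LIMIT_MPH = 70
LONG_BREAK = timedelta(hours=1)

# Flag readings above the speed threshold.
speeding = [(t, s) for t, s in log if s > SPEED_LIMIT_MPH]

# Flag stationary stretches longer than LONG_BREAK.
breaks, stop_start = [], None
for t, s in log:
    if s == 0 and stop_start is None:
        stop_start = t                     # stop begins
    elif s > 0 and stop_start is not None:
        if t - stop_start >= LONG_BREAK:
            breaks.append((stop_start, t))
        stop_start = None                  # moving again

print("Speeding events:", speeding)
print("Long breaks:", breaks)
```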

Drivers of vehicles and industrial plant – often with price tags of $250,000 or more – accept a certain degree of monitoring as part of their jobs. And, according to Bob Walton, president of Qualcomm Enterprise Services, the potential downsides can be offset by providing services the drivers value, such as the ability to complete paperwork and training via an in-cab console.

Use of tracking systems does become more contentious if employees are expected to carry monitored devices outside the cab; extending the technology further, to an individual’s BlackBerry or iPhone, is even more likely to raise concerns.

“It is being done, especially monitoring where people are, in order to route them to the next job,” says Nick White, telecoms director at Deloitte, the professional services firm. “But there is absolutely an issue about privacy.”

Much depends on the degree of autonomy that different types of worker need, or expect. “If you try to control a salesforce to the nth degree, you will get resistance,” says Mr White. “If it is engineering, you want the workforce to be focused on the task, not worrying about what the next job will be.”

Some people will even appreciate a degree of monitoring, for example if they work alone in potentially hazardous areas. Lone worker monitoring has already proved popular among groups including taxi drivers and healthcare workers, who appreciate the improved sense of safety it brings.

Then there is the question of making up lost time, especially for employees who work on commission.

“People cancel appointments, so a salesperson wants to know who is the next best person to call on, who are the nearest customers or perhaps, those who recently ordered from the competition,” says David Perry, a director at Cognito, a specialist mobility vendor.
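
A minimal sketch of the “nearest customer” look-up Mr Perry alludes to, using the standard haversine great-circle formula. The coordinates are invented; a real salesforce system would also weigh order history, appointment windows and competitive intelligence.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS fixes."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical customer sites: (name, latitude, longitude).
customers = [
    ("Acme Ltd", 51.5155, -0.0922),
    ("Bravo plc", 51.5033, -0.1196),
    ("Carter & Co", 51.5405, -0.1430),
]

# Salesperson's current position after a cancelled appointment.
here = (51.5074, -0.1278)

nearest = min(customers, key=lambda c: haversine_km(*here, c[1], c[2]))
print("Next best visit:", nearest[0])
```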

Mitie Pest Control uses device tracking to allocate employees to jobs, to monitor how long jobs take and to ensure customers sign for any work carried out.

Although the company does use the technology to track the productivity of individuals, managing director Peter Trotman stresses this will not work if the result is simply heavy-handed management. There has to be feedback and training for staff who perform less well.

“There was some scepticism and resistance initially, as with any technology,” he says. “But because it replaces tedious paperwork and provides more accurate information, our staff have found it helps. They accept it as a useful tool, not an inconvenient management oversight.”

The days of the Next Big Thing could be over
By Alan Cane

Published: December 9 2009 16:29 | Last updated: December 9 2009 16:29

The main difficulty in predicting the “next big thing” – apart from the ambitious nature of the task – lies in defining just what a “big thing” is.

Is it something that will have a lasting and material impact on society – the emergence of revolutionary inventions – the transistor, for example, or the integrated circuit and the microprocessor?

Or could it be software – the Cobol programming language that changed business data processing and continues to influence its progress?

For some, systems are their “big thing” – the advent of mobile phone networks in the 1980s, followed by the internet, and with it e-mail and the world wide web. Today, many believe “the cloud”, an abstraction that represents the electronically interconnected world, fits the bill.

But as Rob Gear, manager of PA Consulting’s innovation unit, points out: “Some breakthroughs will transform life for certain people in certain geographies but that same breakthrough will have little or no relevance for others. A BlackBerry or iPhone might have transformed the life of your average urban office worker but it has had little or no bearing on the life of the tribesmen of the Masai Mara.”

Mr Gear’s colleague, David Elton, however, thinks that “big things” are less rare than is believed: “These are things that have changed the way we live and work: search engines, text messaging, wikis, bar codes, RFID (radio frequency identifiers), liquid crystal displays and cheap disk storage.”

He says “market moments” – the coming together of technology, price point and market demand – define big things, giving as an example online retailing: “The first time round in 2001-2003, it was a damp squib. The second go, from 2004, took off like a train. The difference: a market moment. People wanted it, the technology was there; they just needed secure online payment mechanisms.”

Some developments have universal significance. Kishore Swaminathan, chief scientist at Accenture’s technology laboratories, believes no single thing is the answer – it is more a phenomenon, or “scale”.

“The necessity that will drive all future inventions of significance is exponential growth,” he says. “We currently understand linear but not exponential growth. Successful companies, inventions and societies will be those that master scale. Three specific areas of necessity will drive invention – energy, health and mega-cities. Scale is not the same as big. The dinosaurs were big, the internet has scale.”

Rudy Puryear, head of Bain & Company’s global IT practice, argues that businesses are facing structural shifts that will “easily trump emerging technologies as the ‘next big thing’.”

He points to IT collapsing under its own weight: “In a recovery, the fact that IT can no longer respond within a reasonable time cycle will come to the fore. We are expecting to see a surge in IT projects that actually address complexity.”

He says that chief information officers must regain the right to take centralised decisions and that outsourcing will change from cost tactic to strategic weapon: “The smartest CIOs will find ways to use outsourcing providers to do more than cut costs.”

Industry experts such as Joerg Heistermann, chief executive of the Americas Region for the business process management software group IDS Scheer, doubt that 2010 will see breakthrough technologies, arguing that existing developments such as cloud computing already offer remarkable possibilities.

“Real innovation is hard,” he says. “It means the destruction of what exists today and requires that we convince people to change . . . an IT industry devoid of supposed breakthroughs would still have plenty of work to do with our bread and butter – continuous improvement.

“Connecting customers and providers, optimising supply chains, streamlining accounting or making interfaces easier to use – these recurring projects are constantly needed to improve any company’s efficiency, customer satisfaction and profitability.”

A number of experts, including Colin Bannister, head of technical sales for Computer Associates UK, also argue that there will be no single “next big thing” but instead waves of disruptive technologies “which will ebb and flow”.

“The risks around them must be managed, complexity removed and company-wide management tools made available to CIOs, if these technologies are to provide added value for businesses within today’s rapid timeframes for payback,” he predicts.

As examples, he cites service-oriented architectures, virtualisation and cloud computing, pointing out that each can increase risk and complexity unless tightly managed.

Growing complexity also worries Karl Havers, head of Ernst & Young’s European technology team, who admits to simple personal requirements: “Let me use three devices instead of a dozen connecting me through the smart grid to my home, shopping, car and family.

“Let that happen far faster than currently and when I want it. Oh, and I would like to be able to rely on simple things like mobile networks to work and not drop calls and the voice quality on my landline to be as good as it used to be when using voice over internet protocol and a remote handset.”

Mr Havers concludes: “The next big idea will be about solving the confusion and plethora of alternatives for people, making things simple and reliable.”

For a contrary view, I spoke to Josh Bernoff, senior vice-president with the consultancy Forrester Research, who says that employees and customers are already taking technology into their own hands with dramatic consequences: “No matter what company you work for, your employees have better technology than you,” he says.

“With their iPhones, their Facebook connections and cheap computing power for rent, they can solve their own problems using technology. They’re building the solutions your company will run on right now, right under the noses of your IT department staff.

“We can tell you about the marketers at Black & Decker who let salespeople use little video cameras to gain an edge on the competition. Or the guy at the US State Department who built his own teleconferencing application to spread US ideas around the world. You can embrace their problem-solving power, or you can hide in a corner,” he challenges.

In fact, an intersection between unified communications (UC) and social networking is already developing, according to Neil Louw, CIO at Dimension Data Europe: “More businesses are realising the potential to harness the burgeoning ‘unified communications mindset’ of their employees – developed through the personal use of tools common to UC and social networking, such as instant messaging, webcams and groups – by introducing enterprise-ready equivalents as part of their UC strategy.”

Cloud computing, however, is high on many lists of likely barnstormers. Hub Vandervort, chief technology officer of Progress Software, believes adoption will be faster than most analysts think because of economics: “It’s a simple empirical model: in a 1,000-machine data centre, efficiency will typically be at 20 per cent to 30 per cent. Getting a further 10 per cent from your infrastructure by moving it to the cloud will save $6m a year – and many data centres are far larger than 1,000 machines,” he says.
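
Mr Vandervort’s quote leaves the cost per machine unstated. A back-of-envelope calculation shows what his figures imply, on the assumption that the $6m saving comes from freeing 10 per cent of a 1,000-machine estate:

```python
machines = 1_000
efficiency_gain = 0.10        # the "further 10 per cent" cited
annual_saving = 6_000_000     # $6m a year, as quoted

machines_freed = machines * efficiency_gain   # 100 machines' worth of capacity
implied_cost = annual_saving / machines_freed
print(f"Implied all-in annual cost per machine: ${implied_cost:,.0f}")
# => $60,000 - that is, the claim assumes roughly $60k per machine per year
# once power, cooling, space, licences and staff are counted in.
```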

Andrew McGrath, commercial director for the communications group ntl:Telewest Business, agrees that the benefits and efficiencies of cloud computing and server virtualisation could prove too good to ignore.

“This, in turn, will make the network underpinning these IT initiatives even more important. As a result, the next big thing for business will be the adoption of Ethernet networks. Capable of transporting huge volumes of data at great speed, they are the key to success for the adoption of technologies that rely on shared services.”

And here is a wild card: IBM believes the hottest technology trend of 2010 will be advanced analytics – software capable of making sense of the mountains of raw data companies are routinely storing these days.

IBM argues that predictive analytics will emerge as an essential tool for competitive advantage, focusing on assets – information – that companies already possess.

But even the best predictive analytics are not enough to tell us unequivocally whether they can be the “next big thing”.

On ft.com: Alan Cane argues that necessity will sort “hot” technologies from the cool in his regular Perspectives column at ft.com/digitalbusiness

Enterprise 2.0 is vital for business
By Andrew McAfee

Published: December 9 2009 16:29 | Last updated: December 9 2009 16:29

Every day, more companies are deploying the technologies of Web 2.0, and also adopting the approaches to teamwork and interaction that have made Wikipedia, Facebook, Twitter, and other Web 2.0 resources so phenomenally popular.

I call this trend Enterprise 2.0 (E2.0), and have made it the subject of much of my research since 2006.

Corporate executives ask three excellent questions about E2.0. What, if anything, is so novel about it? What are the benefits? And the risks?

Enterprise 2.0 is actually something new. It is enabled by technologies that were not widely available 10 or even five years ago. These include blogs, wikis, social networking software such as Facebook, and “microblogging” utilities such as Twitter.

All of these tools share three fundamental properties. First, they are “frictionless” – easy to learn and make use of.

Second, they are free-form, meaning that they do not have pre-defined workflows and do not place users into categories. Instead, everyone starts as equals, contributing to a blank slate. This sounds like a recipe for chaos, but it is not.

The third property shared by all 2.0 technologies, and the most remarkable, is the emergence of patterns and structure in a system without central co-ordination.

To make this concept concrete when I’m speaking, I ask audience members to raise their hands if their organisation’s intranet is easier to search and navigate than the public internet. Very few hands go up, even though intranets are designed and maintained by professionals whose job it is to build navigable web environments.

The internet works better because even though it is radically decentralised and unco-ordinated it is not unstructured. It has a dense structure defined by all the links between pages.

This structure changes continuously and actually becomes more refined as the net grows. It is emergent, rather than imposed. The technology-enabled communities of Enterprise 2.0 work the same way.
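
A toy illustration of that emergence – not anything Enterprise 2.0 vendors ship, just the idea behind link-based search ranking: a simplified PageRank-style iteration in which a page’s importance arises purely from who links to it, with no central co-ordination.

```python
# A toy link graph: each page and the pages it links to.
links = {
    "home":   ["wiki", "blog"],
    "wiki":   ["home"],
    "blog":   ["wiki"],
    "orphan": ["wiki"],
}

# Simplified PageRank: repeatedly pass each page's importance along its
# outbound links; structure emerges from the links alone.
damping = 0.85
rank = {p: 1.0 / len(links) for p in links}
for _ in range(50):
    rank = {
        page: (1 - damping) / len(links)
              + damping * sum(rank[p] / len(out)
                              for p, out in links.items() if page in out)
        for page in links
    }

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page:7s} {score:.3f}")   # "wiki", with three inbound links, tops the list
```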

Beyond better intranet navigation, what benefits can an organisation expect from E2.0?

The consultancy firm McKinsey has conducted three annual surveys on this question. In the most recent, published in September, respondents reported benefits that included better access to knowledge and internal experts, greater employee and customer satisfaction, and higher rates of innovation.

The magnitude of the gains was striking, ranging from 20 per cent (innovation rates) to 35 per cent (access to internal experts).

These self-reported and subjective data must be interpreted with caution, but are still compelling. They indicate that real business benefits await successful adopters of emergent tools and work practices.

Such improvements arise because E2.0 brings much-needed technological support to the informal organisation. The formal organisation is characterised by hierarchical organisational charts and standardised, repeatable business processes.

It received a technological shot in the arm in the mid-1990s when large-scale commercial applications such as enterprise resource planning (ERP) and customer relationship management (CRM) became available. Research suggests these applications significantly boosted productivity and performance. They did so primarily by allowing companies to standardise best practices and by making huge amounts of structured data available for analysis.

These tools, however, did not do as much to support the less formal and structured work of an organisation. And as we all know, the informal organisation is tremendously important. It is where many exceptions are handled, questions answered, and connections made. It is also often where novel ideas are sparked and new threats and opportunities identified.

Yet until now, the informal organisation has been almost entirely unsupported by IT. E-mail works when you know who you want to send a message to, but what about when you do not – when you are not sure who has the knowledge or expertise you are looking for?

The first generation of knowledge management systems attempted to address this challenge, but they were too structured; they did not match the emergent nature of the informal organisation.

E2.0 technologies do. When they get going, it becomes easy to find a bit of knowledge, or a knowledgeable person. It also becomes easy to learn what others are working on, and to be helpful to them. And it becomes possible to float a question to the entire organisation.

As Eric Raymond, the open source software advocate, says: “Given enough eyeballs, all bugs are shallow.” Enterprise 2.0 delivers benefits because it brings all of a company’s eyeballs to bear on challenges and opportunities rather than assigning them only to the “proper” authorities.

Now for the final question: what are the risks of E2.0? I find that they are actually quite small. The tools themselves are comparatively cheap, so financial risk is minimal. The biggest potential threat is that people will misuse the new technologies, either by putting up inappropriate material or by inadvertently revealing secrets.

This very rarely happens in practice, however: my collection of E2.0 horror stories is essentially non-existent. There are two main reasons for this.

First, contributors in corporate environments are almost always identifiable. Without the cloak of anonymity, bad online behaviour is much less common. Second, people know how to behave at work, and most are inclined to do so.

I believe that we are in the early phases of another era of technology-fuelled business improvement. Enterprise 2.0 is bringing significant gains to companies of all sizes, and in all industries.

Given the mismatch between its benefits and risks, and given the competitive imperative to seize all possible sources of advantage, sitting this one out seems like a very bad idea.

Andrew McAfee is a principal research scientist at the Center for Digital Business at MIT. He is the author of Enterprise 2.0, published by Harvard Business Press. His blog is andrewmcafee.org/blog; his Twitter identity is @amcafee

Thursday, November 26, 2009

Introducing the IT Market Clock
By Brian Gammage, vice president and Fellow, Gartner

Published: November 26 2009 16:15 | Last updated: November 26 2009 16:15

IT is no longer an emerging set of capabilities and markets – it is a maturing business tool and must be managed as such.

Although new capabilities continue to appear in the market, their adoption and use require them to be integrated into a portfolio of existing IT assets, many of which are already mature.

Some IT assets are no longer required, or no longer deliver sufficient business value to justify the costs of maintaining them. Usually, working to budget means new IT products and services can only be adopted if existing IT assets are retired or replaced.

Every IT product and service has a finite useful life and must eventually be retired or replaced. Correct timing of this retirement/replacement is critical.

The second part of useful life, from maturity to obsolescence, must be considered when managing IT assets throughout their whole life cycles. Most organisations require more-holistic mechanisms for planning IT divestment and reinvestment activity.

Gartner’s IT Market Clock is a new framework that supports strategic investment and divestment decisions. Tools and methodologies that focus only on technology adoption are no longer sufficient to support the decisions required to manage portfolios of IT assets throughout their full lifetime of use.

Gartner’s Hype Cycle, for example, which the IT Market Clock complements, is a buyer’s decision framework for technology adoption, but its view ends when mainstream adoption begins, which typically equates to an adoption level of between 20 and 50 per cent.

Simply, the Hype Cycle supports “technology hunting” decisions, while the IT Market Clock supports “farming” decisions for assets already in use.

The IT Market Clock uses a clock-face metaphor to represent relative market time. Each point positioned on the IT Market Clock represents an IT asset or asset class: for example, desktop PCs, packaged maintenance and support services or corporate learning systems.

Technology assets are positioned on the IT Market Clock using two parameters. The first is where they currently lie within their own useful market life, from the first time the technology product or service can be acquired and used to the last time it can be viably used.

This determines the rotational position of the asset on the Market Clock – each begins at 0 (called “Market Start”), and moves clockwise round to 12 o’clock.

The second is the relative level of commoditisation, ie the ease with which the technology product or service can be interchanged with alternatives. Relative commoditisation determines the distance from the centre of the Market Clock; assets further from the centre are more commoditised.

Commoditisation is a proxy for the balance of market power between buyers/users and suppliers. For most asset classes, relative commoditisation levels begin low, increase steadily as the market matures and then decrease again toward end of life.

The IT Market Clock is divided into quarters, each representing one of four market phases of the useful market life of an IT asset.

The Advantage quarter represents the first stage of market life, during which technologies are often proprietary or highly customised and assets provide differentiated technology, service or capability.

There will usually be limited supply options and high dependence on relevant skills. Users should focus on benefits received.

Choice is the second phase of market life, during which technology assets are subject to increasing levels of standardisation and growing supply options. Users should re-evaluate the level of required customisation, prices and supply choices periodically as assets in this phase offer the greatest scope for cost savings.

The Cost quarter is the third phase of market life, during which assets reach their highest levels of commoditisation. Differentiation between alternative sources is at its minimum level and competition centres on price. Users should focus on acquisition and switching costs and ensure minimal skill-set dependencies.

Replacement is the final phase of market life, during which assets begin to move towards end of life, usually because they comprise legacy technologies, services or capabilities.

Supply choices and access to available skill sets will be decreasing, leading to rising operational costs. Retirement or upgrade of these assets is essential. User organisations need to monitor operating costs for IT products and services in this disfavoured phase of their market life.
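
The clock metaphor thus reduces to two coordinates per asset: rotation (fraction of useful market life elapsed) and radius (degree of commoditisation). A minimal sketch, using illustrative values rather than Gartner data:

```python
# Each asset: (fraction of useful market life elapsed, commoditisation 0..1).
# The values below are illustrative guesses, not Gartner positions.
assets = {
    "corporate learning systems": (0.20, 0.30),
    "desktop PCs":                (0.60, 0.90),
    "mainframe skills":           (0.95, 0.40),
}

QUARTERS = ["Advantage", "Choice", "Cost", "Replacement"]

for name, (life, commoditisation) in assets.items():
    hour = life * 12                         # rotational position on the dial
    phase = QUARTERS[min(int(life * 4), 3)]  # which quarter of market life
    print(f"{name:28s} {hour:4.1f} o'clock  radius={commoditisation:.2f}  {phase}")
```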

Operating costs rise toward end of market life, highlighting a growing urgency for retirement or replacement. For example, the skills needed to support and maintain mainframes and business applications at end-of-life are in increasingly short supply.

Suppliers and buying organisations can move to offset these issues during the Replacement phase, as has happened in the UK, for example, with leading financial institutions encouraging universities to place Assembler and Cobol (which is now 50 years old) back on their curriculums.

But while such moves can alleviate immediate problems, each initiative to extend useful life typically comes at higher cost.

Moreover, as more companies move off legacy technologies, the burden of responsibility for maintaining associated skill sets falls to a diminishing number of organisations. The marginal costs of continuing to use technologies as they approach the end of their useful lives will increase.

With a holistic decision framework, user organisations will be able to manage their asset portfolios proactively and determine the right time to adopt and deploy emerging or adolescent technology options, establish road map plans for replacement and upgrade of existing technology assets, and perform reviews with suppliers for best saving opportunities.

Although such a framework is focused on technology assets, the same approach could also be extended and applied to any class of business assets.

The final frontier of business advantage
By Alan Cane

Published: November 26 2009 18:04 | Last updated: November 26 2009 18:04

Business intelligence, information intelligence, business analytics: whatever you call it, all the evidence is that finding ways to turn a company’s raw data into information that can be used to improve performance and achieve competitive advantage is the topic du jour in many business leaders’ minds.

A survey carried out this year by the US-based consultancy Forrester Research revealed that of more than 1,000 IT decision makers canvassed in North America and Europe, more than two thirds were considering, piloting, implementing or expanding business intelligence (BI) systems.

“Even in these tough economic times, virtually nobody in our surveys says they are retrenching or reducing their business intelligence initiatives,” says Boris Evelson, a principal analyst for Forrester with more than 30 years’ experience of BI implementation behind him.

What is BI management? It is not about the technical nitty-gritty of data warehousing or cleansing technology. While technologies are important – and most are good and effective, according to Mr Evelson – BI management is about ways of systematically making the most of customer information – what it is and what you can do with it.

More prosaically, it is everything that has to be done to raw data before they can be manipulated to facilitate better decision making.

Dashboard that can give a warning light on overspending

Law firm Clifford Chance has found itself learning about habits it never knew it had since analysing its spending trends through an online service provided by Rosslyn Analytics, a boutique software company based in London, writes Dan Ilett.

“It’s very flexible,” says Julien Cranwell, Clifford Chance’s procurement manager. “You can look at your data to reduce spending. We’ve identified opportunities that we wouldn’t have otherwise seen. It’s made us feel a lot more confident of the data we’ve been using.”

The company, which has 29 offices in 20 countries, used a web-based tool called rapidintel.com. The service works like a dashboard with charts and graphs to give an overview of where money has been spent.

“It aggregates and shares information,” says Charles Clark, chief executive of Rosslyn Analytics. “We extract the data in a few hours and categorise them so they go into certain buckets. We then add other data such as credit card or risk information.

“It’s presented as a ready-to-use report. The data cube never changes but they can see it from so many different angles. It’s one view of all company-wide finance, procurement, accounts payable and spend data.”
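
A sketch of those “buckets” using the pandas library, with invented invoice lines rather than Rosslyn’s service: once spend is categorised into one tidy table, the same cube can be viewed by office, category or supplier.

```python
import pandas as pd

# Hypothetical invoice lines pulled from the firm's finance systems.
spend = pd.DataFrame({
    "office":   ["London", "Paris", "London", "Madrid", "Paris"],
    "supplier": ["AirCo", "CaterCo", "CaterCo", "AirCo", "TaxiCo"],
    "category": ["travel", "catering", "catering", "travel", "travel"],
    "amount":   [12500.0, 3100.0, 4800.0, 9900.0, 760.0],
})

# One view of the cube: spend by office and category, to spot offices
# with unusually high levels of spending.
by_office = spend.pivot_table(index="office", columns="category",
                              values="amount", aggfunc="sum", fill_value=0)
print(by_office)
```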

“Some of the larger areas of spending have been travel, catering and entertainment,” says Mr Cranwell. “It shows where we have varying levels of spending between offices. We are then in a position of power because we know much more about our spending patterns.

“We’ve also looked at a cost recovery programme. Using Rosslyn’s expertise we’re using a module that works on contract management.”

The firm claims to have seen a return on investment of 100 per cent within two months. “The payback period was very fast indeed,” says Mr Cranwell.

It is also about understanding the business and its processes well enough to know what questions should be asked of the data to improve performance.

The basic idea was pioneered more than a decade ago by the US computer manufacturer Teradata, which combined supercomputer performance with sophisticated software to scan and detect trends and patterns in huge volumes of data.

But it was expensive and ahead of its time. Today, high-performance, low-cost computer systems and cheap memory mean that enterprises can and are collecting and storing data in unprecedented amounts.

However, they are struggling to make sense of what they have.

In Mr Evelson’s words: “We have to find the data, we have to extract it, we have to integrate it, we have to map apples to oranges, we have to clean it up, we have to aggregate it, we have to model it and we have to store it in something like a data warehouse.

“We have to understand what kind of metrics we want to track – times, customers, regions and then, and only then, can we start reporting.”
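
A compressed illustration of those steps using the pandas library – invented source tables, not Forrester tooling: two mismatched schemas are mapped to one, cleaned, integrated, and only then reported on.

```python
import pandas as pd

# Two hypothetical sources with mismatched schemas ("apples to oranges").
crm = pd.DataFrame({"cust": ["A01", "B02"], "region": ["UK", "FR"],
                    "revenue": ["1,200", "950"]})
billing = pd.DataFrame({"customer_id": ["A01", "C03"], "area": ["UK", "DE"],
                        "amount": [400.0, 700.0]})

# Map both sources to one schema, then clean the data.
crm = crm.rename(columns={"cust": "customer_id", "region": "area",
                          "revenue": "amount"})
crm["amount"] = crm["amount"].str.replace(",", "").astype(float)

# Integrate and store - a toy stand-in for the data warehouse.
warehouse = pd.concat([crm, billing], ignore_index=True)

# Only then can reporting start: a metric tracked by region.
print(warehouse.groupby("area")["amount"].sum())
```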

Everybody agrees there is nothing simple about these operations. “It is a very complex endeavour,” says Mr Evelson, “and that is why this market is very immature.”

The business opportunity for BI software has not been lost on IT companies and there has already been significant consolidation in the market, with IBM acquiring, among others, Cognos; SAP buying Business Objects; and Oracle purchasing Hyperion to add BI strings to their respective bows.

Microsoft offers BI software called SharePoint Server and there is considerable interest in open source BI software from younger companies such as Pentaho and Jaspersoft.

IBM alone reckons it has spent $12bn and trained 4,000 consultants over the past few years to develop the tools and knowledge that will encourage intelligence management among its customers.

Ambuj Goyal, who leads the company’s information management initiative, argues that it is a new approach that will “turn the world a little bit upside down”.

“Business efficiency over the past 20 years was all about automating a process – enterprise resource planning [ERP] for example. It generated huge efficiencies for businesses but is no longer a [competitive] differentiator.

“In the past two or three years we have started to look at information as a strategic capital asset for the organisation. This will generate 20, 30 or 40 per cent improvements in the way we run businesses as opposed to the 3 or 5 per cent improvements we achieved before.”

But revolutions are rarely pain-free. According to the Forrester survey: “For many large enterprises, BI remains and will continue to be the ‘last frontier’ of competitive differentiation.

“Unfortunately, as the demand for pervasive and comprehensive BI applications continues to increase, the complexity, cost and effort of large-enterprise BI implementations increases as well.

“As a result, the great examples of successful implementations among Forrester’s clients are outnumbered by the volume of underperforming BI environments.”

In fact, more than two thirds of users questioned said they found BI applications hard or very hard to learn, navigate and use.

The business case for BI management is not helped by the difficulty of making a strong case for return on investment.

It is, for example, hard to decide which tools and processes should be included in the assessment – Microsoft’s SharePoint is much more than a BI tool, and separating out which strands are contributing to improved revenues and which are not is a challenge.

As Mr Evelson notes: “The grey boundary lines around which process and tools to include, the multiple BI components that typically need to be customised and integrated, and the frequent unpredictability of BI system integration efforts all make BI business cases an effort not for the faint of heart.”

How, then, should executives think about business intelligence management? Royce Bell, information management specialist with the consultancy Accenture, takes a robustly pragmatic view: “Business is made up of processes. Some of them may interact with the outside world, but there is a definite chain of events.

“All that business intelligence is supposed to inform is any decision along that chain of events. The question an executive should be asking is: ‘At this point in the chain, what information do I need?’.

“Going through each and every one of your processes to be able to ask that question is hard. People are disappointed because they haven’t been able to get wisdom simply by piling all the data in one place.

“That [data warehousing and mining] sounds more exciting and more fun than going through your processes to determine what you need.”

Mr Bell believes that many executives are suspicious of the quality of the information provided by BI software: they think the data are “rubbish”, and there is no doubt that transforming data into intelligence requires clean data.

Roger Llewellyn is chief executive of the UK software group Kognitio, which has responsibility for analysing, among other things, telephone calls made by customers of British Telecom and store purchases that use the Nectar loyalty card of the supermarket chain J Sainsbury.

He says that up to 80 per cent of the price of a new contract can be the cost of cleaning the data – converting, in one case, 15 data types to a single standard.
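
Dates are a typical instance of that proliferation. A minimal normalisation sketch – the format list is illustrative, not the 15 types from the contract Mr Llewellyn mentions:

```python
from datetime import datetime

# A few of the competing formats one might find across source systems.
# Order matters: ambiguous short formats are tried before the %Y variants
# could misread them.
FORMATS = ["%d/%m/%Y", "%m-%d-%y", "%Y-%m-%d", "%d %b %Y"]

def to_iso(raw: str) -> str:
    """Normalise a date string to a single ISO standard, or raise."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"unrecognised date format: {raw!r}")

for raw in ["09/12/2009", "2009-12-09", "9 Dec 2009", "12-09-09"]:
    print(raw, "->", to_iso(raw))   # all emerge as 2009-12-09
```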

The Sainsbury contract involves the analysis of the 20bn items purchased in the chain’s stores every nine months – enough, if typed on paper, to make an in-tray pile almost 17km high.

How can this huge volume of bits and bytes be turned into useful information?

Mr Llewellyn gives the example of skin creams sold to counter stretch marks. These are bought predominantly by women, so if particular stores show high sales volumes, there are likely to be a lot of pregnancies in those areas – an alert for the store manager to stock up on maternity magazines, baby food and clothing.

And if most of the clothing bought is blue, there will be a lot of baby boys in the region: “From buying a jar of stretch cream, I’ve almost got you for life,” Mr Llewellyn beams.

James McGeever, chief financial officer of the US company NetSuite, which markets BI management software, underlines the importance of clean, unambiguous data in breaking down “silos” – data stored in different places and formats within an organisation: “I believe that if the same piece of data exists in two places then one will be wrong.”

The NetSuite answer for its customers is to convert all the data to one consistent type and store it in one repository: “The physical process of loading the data is not as tough as it may sound. It’s actually deciding what data to store there and how to organise your workflows that is the difficult part.”

NetSuite provides executives with tailored “dashboards”, a visual representation of the information important to their jobs.

A well-designed dashboard providing the right amount of pertinent information is a crucial part of BI according to Peter Lumley and Stephen Black of PA Consulting.

They point out that it is often forgotten that managers have limited time to absorb and act on information which, in any case, may be imperfect – if it was perfect, decision making would be no chore at all. A well-designed dashboard can help managers make the best possible decision from incomplete information.

The information, of course, has to be trusted and that is where technology can play an important part – in the automatic roll-up of data to a central repository: “Every time you go through a stage with manual intervention you have the opportunity for time delay and misinterpretation,” Mr Lumley argues.

And these mis-steps are precisely what business intelligence management hopes to avoid.

Resources: Finding a home for all that data
By Stephen Pritchard

Published: November 26 2009 18:04 | Last updated: November 26 2009 18:04

When companies started to build the first enterprise data warehouse and knowledge management systems in the late 1970s, there was little doubt that these were projects that demanded significant investment in both time and resources.

The early data warehouse systems certainly required mainframe resources, and running queries took days, if not weeks.

But advances in computing power, as well as improvements in programming, have done much to reduce the infrastructure demands of business intelligence (BI). It is now quite possible to run small-scale BI queries using little more than a data source, a laptop computer and a spreadsheet program.

Some businesses – especially smaller ones – do indeed manage their data analysis this way.

However, BI experts caution that this approach struggles to scale up to support the larger enterprise, and can raise real difficulties in areas such as data governance and lead to companies having multiple master data sets, or “multiple versions of the truth”.

“Many people start with something small in scope, and there is nothing wrong with that,” says Jeanne Harris, a BI specialist at Accenture’s Institute for High Performance Business.

“But if marketing, and finance, and sales have their own scorecards, based on their own data, it will be a Tower of Babel. Very few organisations have done a good job of creating a single view of their data.”

Nor is the hardware challenge one that chief information officers – or users of business data – can completely ignore.

Although processing power has increased in line with Moore’s Law and data storage has also fallen in price, the growth of business data is faster still. Volumes of data are reckoned to double every 12 to 18 months, twice as fast as just three years ago.
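
The arithmetic of that doubling is worth making explicit; compounded over three years it dwarfs hardware price declines:

```python
# Volume multiplier after three years, at each end of the quoted range.
for months_to_double in (12, 18):
    multiplier = 2 ** (36 / months_to_double)
    print(f"doubling every {months_to_double} months -> x{multiplier:.0f} in 3 years")
# => x8 at the fast end, x4 at the slow end.
```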

Some businesses are reacting by moving to grid-based supercomputers, or by offloading BI processing to private or public “clouds”. Others are deploying solid-state hard drives in their data warehouses, because of the superior data throughput they offer.

But such systems are expensive and large organisations, in particular, are beginning to struggle with the time it takes to load data into a warehouse or a BI system, especially if it comes from multiple sources.

“With data warehousing appliances [dedicated computers for data processing], the bottleneck is not the speed of the box or the quantity of storage but the time it takes to load the information, especially if you are dealing with demographic information,” says Bill Hewitt, president and chief executive of Kalido, a data management company.

“Even at data loading rates of 10 gigabytes an hour, there is one company that is looking at 39 weeks to load its data.”
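
A quick calculation, assuming round-the-clock loading, backs out the data volume those figures imply:

```python
GB_PER_HOUR = 10
WEEKS = 39

hours = WEEKS * 7 * 24
total_gb = GB_PER_HOUR * hours
print(f"{hours:,} hours -> about {total_gb / 1000:.0f} terabytes to load")
# => 6,552 hours -> about 66 terabytes
```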

This is leading some companies to consider alternative approaches to analytics, such as stream-based processing. It is also prompting businesses to look at BI tools, as well as broader-based technologies such as enterprise search, that can examine data in situ, rather than require them to be loaded into a warehouse and then processed.

Such technologies could also help businesses to overcome their reliance on data from operational systems, such as customer relationship management or enterprise resource planning. Such transactional data are almost always historic, and lead to BI acting as a “rear-view mirror” for management, rather than as an accurate predictor of trends.

“Most organisations don’t use external data but rely on [data from] their operational systems to solve specific problems,” explains Earl Atkinson, a BI expert at PA Consulting Group. As a result, the data will only be as good – and as timely – as the information held in those underlying systems.

Before companies can build enterprise-wide knowledge management or BI systems, they also need to work on the quality of the data. Even accurate data can be partial or misleading, especially if they were originally gathered for a different purpose.

“A customer, for example, can exist in multiple IT systems,” points out Tony Young, CIO of Informatica, a data management technology vendor. “You need to have a common agreement on who the customer is, for example, if you want to look at their history.

“If I ask a financial person who the customer is, it is the person you bill. Marketing will say it’s the person who responds to a campaign. For sales it might be the person signing the cheque. These are all correct, but they are not common. You have to agree how you are going to treat that information.”
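
A deliberately simplistic sketch of the reconciliation Mr Young describes, with an invented customer: each department keeps its own reference, but all hang off one agreed master identifier.

```python
import pandas as pd

# The same customer as three departments see it - all "correct", none common.
finance   = {"bill_to": "Smith Ltd",   "account":  "F-881"}
marketing = {"respondent": "J. Smith", "campaign": "Q4-09"}
sales     = {"signatory": "John Smith", "deal":    "D-204"}

# One master record, keyed on an agreed customer ID, cross-referencing
# each departmental view instead of letting three versions drift apart.
master = pd.DataFrame([{
    "customer_id":   "C001",
    "legal_name":    "Smith Ltd",
    "finance_ref":   finance["account"],
    "marketing_ref": marketing["campaign"],
    "sales_ref":     sales["deal"],
}])
print(master)
```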

This, more than hardware assets, network capacity, or even the ability to write complex algorithms to analyse data, goes to the heart of the debate around the resources needed for advanced business intelligence.

Organisations need to decide, early on, which information they are going to use, and be honest about the completeness, or otherwise, of their data sets.

If they do not, the results can be disastrous.

“In the run up to the financial crisis, institutions knew that there were three categories of risk but they only had data for one. So that was the one they thought about,” says Accenture’s Ms Harris. “You need to understand all of the risk variables and how they relate to each other, and this needs different technologies and capabilities in modelling, and in experimental design.”

Organisations also need to consider whether conventional data sources, such as those produced by back-office IT applications, or by more specialist tools, such as a retail point-of-sale system or a supply chain management system, really give the full picture.

Increasingly, companies are looking for ways to mine the information held in “unstructured” data, such as e-mails, presentations and documents, or even video clips or recorded phone calls, to provide a basis for BI, and hence better decision making.

“As much as 80 per cent of the information in a company is unstructured, against just 20 per cent that is structured,” notes Bob Tennant, chief executive at Recommind, a company that specialises in using search technology for information risk management.

“Most business intelligence is focused on that 20 per cent of structured data, as it is pretty high value and easy to deal with. But there are a lot of useful, unstructured data that are not being taken advantage of.”

Tapping into that unstructured information might not be easy. But it is the best, and for some companies, probably the only way to make more use of existing resources, in order to make better business decisions.

Q&A: ING Lease UK

ING Lease UK is part of the ING Group – one of the largest financial companies in the world. In 2004, the company acquired three businesses from Abbey National Group.

With 300 employees and 100,000 customers, the company has to ensure its reporting and market perception are as accurate as they can be.

Dan Ilett, for Digital Business, questioned Chris Stamper, chief executive of ING Lease UK, about how it creates useful intelligence from its information.

Digital Business What did you do to improve internal reporting?

Chris Stamper We turned conventional wisdom on its head. We found a tool that allowed the business to assemble all information from disparate data sources into one platform. This allowed us to make decisions in real time.

We ignored the “start small and learn” approach and took the “start big and understand” approach by focusing on the most fundamental question we needed answering, which was: “where do we make our profit and why?”.

DB What has been your return?

CS As an example, analysis of secondary income opportunity has driven £600,000 ($997,091) of additional annual income.

DB How has using “internal” business intelligence helped?

CS First, it has given us the ability to make decisions based on fact rather than intuition or perception and has provided complete transparency when understanding profit and loss levers.

We have now moved to a “nowhere to hide from the facts” culture: the IT department has been removed from the critical path to information and everyone in the organisation has access to answers. This encourages collaboration and end-to-end thinking.

DB What lessons did you learn from this? What would you tell others to do?

CS That perception-based decision making is a characteristic of sales-led organisations. That culture can be very quickly moved with the right tools and environment.

We now have a strong focus on real data quality.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Displaying the intelligence: Search goes on for a ‘single view of the truth’

Displaying the intelligence: Search goes on for a ‘single view of the truth’
By Ross Tieman

Published: November 26 2009 18:04 | Last updated: November 26 2009 18:04

The idea that you can keep tabs on how an organisation is performing from a desktop display while also focusing on its strategic direction is hugely appealing.

Every day, many of us do precisely this in a car: the dashboard monitors its systems and speed, while helping the driver safely negotiate the obstacles of a journey. Could similar displays not help in running a company, a sales department, or a group of hospitals?

In theory, they can.

Most industrial processes today are run by mouse-clicks – from nuclear power stations to cloth-cutting machines. Corporate systems store every digit of data created, whether by the sales staff logging their calls, the accounts clerks issuing invoices, the machines doing the manufacturing or the purchasing manager placing orders for materials.

Yet these glorious, information-rich data are so often compartmentalised in fragmented systems, each designed to serve a particular business or organisational function. Bolting them together to turn data into information about corporate or organisational performance can be an IT chief’s nightmare.

It might seem as though a few wires and some simple software could enable data to flow seamlessly between systems, enabling the chief executive to see the basics, such as sales, deliveries, and how much cash the business is using, when they log on in the morning.

Yet Bill Fuessler, IBM Global Financial Management Lead for business consulting, says this can prove stunningly difficult. “One of the biggest issues is getting commonality of data definition,” he says. “And that problem will last for several years more.”

Standards, and even digital definitions of commonplace business words, may differ in the sales department from those used in marketing, or finance. Combine the data sets, and the “information” simply doesn’t add up. What chief executive would drive a car whose dashboard said it might – or might not – be overheating?
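A toy example (all figures and field names invented) of why combined data may simply not add up: two feeds both report “sales”, but one includes VAT and one does not. Mapping each to an agreed definition before combining is the unglamorous first step Mr Fuessler describes.

    # Sketch: two departmental feeds that both call a figure "sales" but mean
    # different things. Mapping each to an agreed, tax-exclusive definition
    # before combining is the "commonality of data definition" step.

    VAT_RATE = 0.15  # UK standard rate in 2009

    finance_feed   = {"sales": 1000.0, "includes_vat": False}
    marketing_feed = {"sales": 1150.0, "includes_vat": True}

    def to_canonical(feed: dict) -> float:
        """Return sales on the agreed basis: net of VAT."""
        if feed["includes_vat"]:
            return feed["sales"] / (1 + VAT_RATE)
        return feed["sales"]

    naive_total     = finance_feed["sales"] + marketing_feed["sales"]
    canonical_total = to_canonical(finance_feed) + to_canonical(marketing_feed)

    print(naive_total, round(canonical_total, 2))  # 2150.0 vs 2000.0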

Software companies, however, understand the issues and are working hard on how to extract information from data and reach what Richard Neale, marketing director of SAP BusinessObjects, calls “a single view of the truth”.

For mid-sized companies unencumbered by a long tail of legacy systems and data, or those willing to start again at square one, there are software-as-a-service specialists, such as NetSuite, capable of providing a state-of-the-art system containing every byte of corporate data, fully integrated, on a common set of definitions, accessible at will.

But abstracting information for a corporate, not-for-profit, or even public sector dashboard display is also attainable.

First, you have to discover who wants, or needs, to know what.

In a car there is a speedometer and a fuel gauge, possibly with information on fuel consumption, or distance until you next need to fill the tank. But most of the other dashboard data are displayed only if needed, as an alert – such as when the cooling system fails or a seat-belt is unbuckled.

Business intelligence displays need to follow the same precepts. They have to provide appropriate “mission critical” information for all; to enable users to call up information relevant to their role or task; and to provide appropriate alerts when things go wrong. There is no one-size-fits-all system.

In a car, every driver is engaged in a similar task, but in a company, some users – typically the chief executive or finance chief – need access to a broad range of information, while a departmental head might be interested in particular sub-sets of data. Almost everybody also needs alerts relating to their own areas of responsibility.
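Those precepts can be sketched in a few lines of Python (metric names, roles and thresholds all hypothetical): a shared “mission critical” set everybody sees, role-specific additions on top, and alerts raised only when a threshold is crossed.

    # Sketch: role-based dashboard selection plus threshold alerts (all
    # metric names and thresholds are illustrative).

    metrics = {"cash_gbp": 1.2e6, "sales_today": 480, "it_budget_used": 1.07}

    SHARED = ["sales_today"]                        # "mission critical" for all
    BY_ROLE = {
        "finance_director": ["cash_gbp"],
        "it_director": ["it_budget_used"],
    }
    ALERTS = {"it_budget_used": lambda v: v > 1.0}  # over budget

    def dashboard(role: str) -> dict:
        visible = SHARED + BY_ROLE.get(role, [])
        view = {name: metrics[name] for name in visible}
        view["alerts"] = [n for n, trip in ALERTS.items()
                          if n in visible and trip(metrics[n])]
        return view

    print(dashboard("it_director"))
    # {'sales_today': 480, 'it_budget_used': 1.07, 'alerts': ['it_budget_used']}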

That information, as distinct from data, may have to reach them wherever they are. Mr Neale, at SAP, says that increasingly, dashboards are being delivered not just on desktops, but on mobile devices, including smartphones.

The latest generation of SAP BusinessObjects software enables users to have “widgets” on their desktops that highlight particular features of organisational performance.

It can also deliver a sophisticated alert to a smartphone, as a graphic display that enables the user to “mine” the information, calling up detail to establish the nature and cause of the problem to which they are being alerted. An alert could relate to inventory levels, risk, cash balances or even a cost or time over-run on a project.

That list highlights the importance of delivering relevant information to the responsible individual. To be valuable, it has to contain signals that the recipient may need to act upon. The IT boss may need to know if the system is likely to crash, but it’s the finance director who cares about the cash balances, while the IT department overrunning its budget may matter to both.

The desktop remains the presentation location of choice because the size of its display permits a lot of information to be shown.

Historically, many organisations have relied on Excel spreadsheets or Microsoft Office tools to present business information to users.

Today, using modern software, the information can be displayed in the form of gauges, pie-charts, graphs, thermometers, heat-maps – just about any format the user prefers.

What business intelligence data add is the ability to explore the information easily with mouse clicks to discover what happened, where, and why.

A typical NetSuite display is presented on a series of tabs, with pages that might include a meter, top selling items as a bar chart, key performance indicators that provide pop-up graphs, and comparative sales as a chart with variable time-spans. If you have reliable real-time data, you can sort and display them any way you like.

As IBM’s Mr Fuessler says, if a retail company’s sales fall, it is handy to be able to uncover quickly that it happened because of a holiday in Boston that closed three stores, for example, and is not the start of an alarming trend. Inadequate information can lead to false conclusions.

Nigel Rayner, research vice-president at Gartner, says: “When you get the dashboard in, that is when you start to get awkward questions. The chief executive can see revenue is going down, or up, but doesn’t know why. Dashboards are always about reporting. They don’t help you make decisions.”

By definition, dashboards only present current or historic data. But decision-makers want to be able to predict the future. People running large companies, public-sector organisations and even not-for-profits want the IT equivalent of the forward-looking radar that some car-makers have trialled.

As Mr Rayner says: “You need more performance management applications to help people model options.” This is where a lot of corporate IT investment is now going, he says.

But if you are going to start making decisions about business strategy based upon conclusions drawn from computer software, you need clean data, and answers to current questions, rather than whatever the system was set up to measure five years ago.

“Most organisations have far too many metrics, without being able to plot cause and effect relationships,” Mr Rayner says. “These are pure business problems, and more technology is not the answer.”

So departmental bosses have to sit down together and agree the questions they want answered, and what they want to measure to get them.

To move from mere dashboards to directing the course of an organisation by drawing on all the information squirreled within its systems, Mr Rayner elaborates a four-stage process. Start by monitoring performance, set up an enterprise metric framework, and add analytic and modelling capabilities with performance management applications. Only then, he says, can you go on to develop a pattern-based business strategy.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Wednesday, November 25, 2009

Gartner: The top 10 mobile applications for consumers in 2012

The IT market research and advisory firm Gartner has identified the 10 consumer mobile applications that will be most important in 2012.

“Mobile applications and services for consumers are no longer solely the domain of the mobile network operators,” comments Sandy Shen, research director at Gartner. “Consumers’ growing interest in smartphones, the involvement of the internet players in the mobile space, and the emergence of application stores and cross-industry services are reducing the operators’ dominance. Every market participant influences how an application reaches the customer and how it is perceived, and customers make the final decision with their attention and their purchasing power.”

From the original press release:

The top ten consumer mobile applications in 2012 will include:

No. 1: Money Transfer

This service allows people to send money to others using Short Message Service (SMS). Its lower costs, faster speed and convenience compared with traditional transfer services have strong appeal to users in developing markets, and most services signed up several million users within their first year. However, challenges do exist in both regulatory and operational risks. Because of the fast growth of mobile money transfer, regulators in many markets are piling in to investigate the impact on consumer costs, security, fraud and money laundering. On the operational side, market conditions vary, as do the local resources of service providers, so providers need different market strategies when entering a new territory.

No. 2: Location-Based Services

Location-based services (LBS) form part of context-aware services, an area that Gartner expects will be one of the most disruptive in the next few years. Gartner predicts that the LBS user base will grow globally from 96 million in 2009 to more than 526 million in 2012. LBS is ranked No. 2 in Gartner’s top ten because of its perceived high user value and its influence on user loyalty. Its high user value is the result of its ability to meet a range of needs, from productivity and goal fulfilment to social networking and entertainment.

No. 3: Mobile Search

The ultimate purpose of mobile search is to drive sales and marketing opportunities on the mobile phone. To achieve this, the industry first needs to improve the user experience of mobile search so that people will come back again. Mobile search is ranked No. 3 because of its high impact on technology innovation and industry revenue. Consumers will stay loyal to some search services, but instead of sticking to one or two search providers on the internet, Gartner expects loyalty on the mobile phone to be shared between a few search providers that have unique technologies for mobile search.

No. 4: Mobile Browsing

Mobile browsing is a widely available technology present on more than 60 per cent of handsets shipped in 2009, a percentage Gartner expects to rise to approximately 80 per cent in 2013. Gartner has ranked mobile browsing No. 4 because of its broad appeal to all businesses. Mobile web systems have the potential to offer a good return on investment. They involve much lower development costs than native code, reuse many existing skills and tools, and can be agile - both delivered and updated quickly. Therefore, the mobile web will be a key part of most corporate business-to-consumer (B2C) mobile strategies.

No. 5: Mobile Health Monitoring

Mobile health monitoring is the use of IT and mobile telecommunications to monitor patients remotely, and could help governments, care delivery organisations (CDOs) and healthcare payers reduce costs related to chronic diseases and improve the quality of life of their patients. In developing markets, the mobility aspect is key as mobile network coverage is superior to fixed network in the majority of developing countries. Currently, mobile health monitoring is at an early stage of market maturity and implementation, and project rollouts have so far been limited to pilot projects. In the future, the industry will be able to monetise the service by offering mobile healthcare monitoring products, services and solutions to CDOs.

No. 6: Mobile Payment

Mobile payment usually serves three purposes. First, it is a way of making payment when few alternatives are available. Second, it is an extension of online payment for easy access and convenience. Third, it is an additional factor of authentication for enhanced security. Mobile payment made Gartner’s top ten list because of the number of parties it affects - including mobile carriers, banks, merchants, device vendors, regulators and consumers - and the rising interest from both developing and developed markets. Because of the many choices of technologies and business models, as well as regulatory requirements and local conditions, mobile payment will be a highly fragmented market. There will not be standard practices of deployment, so parties will need to find a working solution on a case-by-case basis.

No. 8: Mobile Advertising

Mobile advertising in all regions is continuing to grow through the economic downturn, driven by interest from advertisers in this new opportunity and by the increased use of smartphones and the wireless Internet. Total spending on mobile advertising in 2008 was $530.2 million, which Gartner expects will grow to $7.5 billion in 2012. Mobile advertising makes the top ten list because it will be an important way to monetise content on the mobile internet, offering free applications and services to end users. The mobile channel will be used as part of larger advertising campaigns in various media, including TV, radio, print and outdoors.

No. 9: Mobile Instant Messaging

Price and usability problems have historically held back adoption of mobile instant messaging (IM), while commercial barriers and uncertain business models have precluded widespread carrier deployment and promotion. Mobile IM is on Gartner’s top ten list because of latent user demand and market conditions that are conducive to its future adoption. It has a particular appeal to users in developing markets that may rely on mobile phones as their only connectivity device. Mobile IM presents an opportunity for mobile advertising and social networking, which have been built into some of the more advanced mobile IM clients.

No. 10: Mobile Music

Mobile music so far has been disappointing - except for ring tones and ring-back tones, which have turned into a multibillion-dollar service. On the other hand, it is unfair to dismiss the value of mobile music, as consumers want music on their phones and to carry it around. We see various players coming up with innovative models, such as device or service bundles, to address pricing and usability issues. iTunes makes people pay for music, which shows that a superior user experience does make a difference.

Further information can be found in the original press release (see above).

25.11.2009, Sabine Minar, Text 100 GmbH

Thursday, November 12, 2009

How texting could transform bank services

How texting could transform bank services
By Peter Tanner, managing director of Boomerang SMS Solutions

Published: November 12 2009 17:48 | Last updated: November 12 2009 17:48

Growing numbers of banks and financial institutions are adopting text messaging as part of a raft of measures designed to improve customer communication, enhance service levels and attain competitive advantage.

However, the constraints of traditional text technology have limited the range of services that can be delivered to customers.

But using an auditable, two-way texting solution will enable banks to transform the relevance and quality of their customer service, from ordering new cheque books to checking transaction patterns in a bid to reduce the impact of fraud.

Critically, I believe that by integrating this solution into core banking applications, workflow can be automated, significantly reducing costs by removing the need for manual intervention.

Financial institutions are looking to transform customer interaction with new innovative services and a wider range of communication options. For these institutions, however, economic pressures dictate that such services must be delivered without big investment or ongoing costs. The delivery method must also be simple and widely available to ensure banks can reach as many customers as possible.

As a result, growing numbers of banks recognise that investing in SMS offers excellent value, while enhancing the quality of the service provided. Quick, simple and used by the vast majority of customers, SMS is a useful tool to update customers on account balance, for example, or raise an alert for unusual transaction patterns.

However, this method of communication is still one dimensional: traditional SMS technologies do not enable a customer’s reply to trigger action. If there is a problem that demands a response from the customer, such as confirming if a transaction is fraudulent, the bank will be burdened by the time and cost associated with manually handling that customer response, whether at the call centre or in branch.

Next generation technology, however, can guarantee that multiple outbound messages are specifically matched with their appropriate response. This is key, as it enables banks to integrate SMS reliably into their workflow processes, transforming the potential range and nature of services available to customers.

Automating the production of texts, just as standard letters are produced today, and triggering database actions on the basis of a customer SMS response eliminates the need for manual intervention at local branches or the call centre, greatly reducing the administrative burden.

For example, a bank sends a text to a customer reporting a suspicious transaction and the customer’s response is automatically recognised by the core software. If the customer responds “Yes” to the question: “Is this transaction genuine?” the system will process the transaction as usual. If the response is “No”, the database will suspend the account and move automatically into its anti-fraud process.

Critically, as long as there is no problem, the bank will need to undertake no manual administrative process: the entire process is handled automatically by the system, providing a quicker, more efficient and less costly means of communicating with the customer.
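A bare-bones, hypothetical Python sketch of the matching mechanism described above: each outbound text carries a reference that ties the customer’s reply back to the transaction in question, so a “Yes” or “No” can trigger the right branch of the workflow automatically. A real deployment would add authentication, retries and audit logging.

    # Sketch: matching inbound SMS replies to the outbound query they answer,
    # then triggering the appropriate workflow step. All identifiers invented.

    pending = {}  # reference id -> transaction id awaiting confirmation

    def send_fraud_query(ref: str, txn_id: str) -> str:
        pending[ref] = txn_id
        return f"[{ref}] Is this transaction genuine? Reply YES or NO."

    def handle_reply(ref: str, body: str) -> str:
        txn_id = pending.pop(ref, None)
        if txn_id is None:
            return "no matching query"       # reply cannot be matched; ignore
        if body.strip().upper() == "YES":
            return f"process {txn_id} as usual"
        return f"suspend account, start anti-fraud process for {txn_id}"

    print(send_fraud_query("A1", "TXN-42"))
    print(handle_reply("A1", "no"))  # suspend account, start anti-fraud ...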

With 80 per cent of texts being received within 60 seconds, this full-circle texting technology provides the fastest way to communicate efficiently with customers.

Critically, these messages are inherently secure; texts are extremely hard to intercept and, in the unlikely event that a phone is stolen, actions such as money transfers can be additionally secured via the use of variable PIN codes.

Fears of mobile phishing can also be allayed through the use of specific text number ranges by the bank and supported by additional personal information.

For customers, the appeal of a two-way text solution is clear. Information from the bank is instantly retrieved irrespective of location and, where a response can be made by SMS, the inconvenience of a lengthy phone call or branch visit is avoided.

The two-way approach also enables customers to access a range of services offered by the banks, starting perhaps with simple options such as a text-based chequebook ordering process.

Indeed, further down the line customers may well be willing to pay for some of these more sophisticated services, such as potential fraud alerts or notification of nearing overdraft limits.

For the customer travelling abroad, the fact that the bank raises a text-based alert of an overseas transaction provides a high level of confidence. The ability to respond via text confirming that the transaction is genuine, in seconds, removes the risk of the account being suspended, which is an inconvenient by-product of today’s transaction tracking technology.

If the transaction is fraudulent, the immediacy of the communication and the automation with core systems to suspend the account boosts customer confidence while also minimising their exposure to financial distress.

Indeed, the provision of real time transaction information via text improves confidence in the quality of service and enables the customer to take control. It can also be applied to a range of financial services. From loan applications to insurance policy renewals, as well as the added value services increasingly being offered by card providers such as booking flights, financial institutions can empower customers to take control of their finances.

Those financial services organisations that have already embraced texting to improve customer services are providing better, more immediate information. But the next generation of texting technology enables banks to transform the quality and immediacy of these services.

Critically, by fully integrating this technology into core applications, this transformation in service and communication can be achieved while also streamlining processes, increasing automation and driving down manual intervention to achieve significant cost benefits.

By closing the loop with two-way SMS communication, tightly integrated with core systems, financial institutions can improve customer service while also driving down administration overheads and reducing the financial and personal impact of fraudulent transactions both on the institution and the customer.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Monday, November 02, 2009

Did IT Work? BPM is finally aligning business and IT

Did IT Work? BPM is finally aligning business and IT
By Stephen Pritchard

Published: November 2 2009 16:44 | Last updated: November 2 2009 16:44

Ensuring that IT is in step with the business is a constant challenge and any tool that allows applications to be developed for the business quickly, using terminology that line of business managers understand, will find a ready market.

One such technology – perhaps the only such technology – is business process management. BPM is not specifically an IT term: rather it is a management practice that sets out to look at how a business runs its processes, improve them, and ensure that the company then keeps running according to that best practice.

IT’s role in BPM is most often associated with a set of development tools that translate business processes or workflows into software. Usually these tools work by modelling the business process visually, so that both IT and non-IT people can see how the application will work.

Once the workflow has been captured, the BPM tool then produces the code for the new software application in a semi-automated way. The idea is to speed up development times, and even allow non-IT specialists to develop quite complex business applications.
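The flavour of the approach can be suggested with a toy Python sketch (steps and rules invented): the process is declared as data, much as a visual BPM tool would capture it, and a small generic engine executes it, so changing the flow means editing the model rather than rewriting the application.

    # Sketch: a business process declared as data (as a visual BPM tool might
    # capture it) and executed by a generic engine. Steps are illustrative.

    def check_credit(order): order["credit_ok"] = order["value"] < 10_000
    def approve(order):      order["status"] = "approved"
    def refer(order):        order["status"] = "referred to manager"

    # The "model": each step names an action and how to choose the next step.
    PROCESS = {
        "start":   (check_credit,
                    lambda o: "approve" if o["credit_ok"] else "refer"),
        "approve": (approve, lambda o: None),
        "refer":   (refer, lambda o: None),
    }

    def run(order, step="start"):
        while step is not None:
            action, choose_next = PROCESS[step]
            action(order)
            step = choose_next(order)
        return order

    print(run({"value": 12_500}))
    # {'value': 12500, 'credit_ok': False, 'status': 'referred to manager'}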

Such is the appetite for business process improvements that companies are continuing to invest in BPM, despite the strictures being placed on other parts of the IT budget.

According to industry analysts Gartner, 37 per cent of companies in North America and western Europe are either thinking about investing in BPM, or have already done so.

One of the attractions of BPM, says Michele Cantara, vice president in Gartner’s business of IT research division, is that projects do not have to be on a very large scale in order to produce a return on investment. “Half of the companies in our BPM awards broke even in the first year,” she explains. “These are not large, intergalactic projects. In terms of project costs, the budget is usually in the range of $400,000 to $600,000.”

Often, BPM projects will be significantly smaller than that. As Ian Gotts, chief executive of Nimbus Partners, a BPM vendor, points out, early stage projects are often in the £30,000 to £50,000 range. This can extend to multi-million pound projects with a two to three year implementation period in industries such as the utilities “where the business case justifies it”.

However, both vendors and analysts agree that early-stage BPM works best where the tool is used to capture a structured business flow with well-defined information. It becomes more difficult to model business flows that depend heavily on human decision-making or judgments, or where information is contained in documents or media files rather than databases.

“Companies focus on process improvements, and so they look [first] at documented or automated processes,” says Ms Cantara. “They don’t necessarily look at human tasks that are part of informal work processes; they don’t necessarily look at processes that are more ‘squishy’, ad hoc or collaborative, that might vary from individual to individual or situation to situation.”

None the less, companies are finding that business process management is enabling them to tackle projects more quickly and efficiently than before.

“Modern BPM is a tool that enables a different conversation with the business. It is a visual tool that lets you build both complex and simple business processes in a very visual way,” says Toby Redshaw, CIO of Aviva, the insurance company.

Aviva currently has 23 live BPM projects. One, the “Joiners, movers and leavers” system, tracks staff across their time with Aviva, from both an HR perspective and an information and systems access perspective. It was built in less than 12 weeks using BPM tools from Lombardi.

“It is an important project from an HR but also a controls perspective,” says Mr Redshaw. “We took a process that is complex and difficult, and we delivered in eight weeks with three weeks testing.”

Mr Redshaw believes that development through BPM is, on average, three times faster than conventional development, and business users are more satisfied with the results.

“They say ‘you IT monkeys finally sent us people who speak our language’,” he says, although he points out that the effectiveness of BPM really comes from the more visual and iterative methods it forces upon both business and IT teams.

BPM was also the route taken by another insurance company, Skandia, when it came to modernising its workflow for handling customer correspondence. Originally, staff would log incoming post into a database, which created work items for distribution to departments. These were then transferred to a number of end user systems.

By using BPM, Skandia was able to centralise its processes into a single workflow system and remove a large amount of laborious administrative tasks, freeing up employees to spend more time with customers.

“Workflow [a BPM product from vendor Tibco] has automated that,” explains Tim Mann, platform development director at Skandia. “The post is scanned in, and the workflow system knows where to send it. It tells a supervisor [which tasks are waiting] and moves on. It has replaced several end-user systems and human supervision.”

Improvements to local workflow methods are saving Skandia £250,000 a year, and rolling the system out to 10 customer services teams equates to £300,000 in productivity savings. Increasing business volumes at the insurer meant there were no job losses, but the business is “doing more with the same resources”.

In addition, Skandia expects to save £150,000 annually by reducing its reliance on paper, bringing lower costs for printing and also transport and storage.

A further benefit, Mr Mann suggests, comes in the form of improved staff satisfaction. “Staff feel more engaged,” he says. “A lot of the work, when it was paper based, was repetitive. Workflow helps them get through it, and allows them to spend more time dealing with policyholders by e-mail or on the phone. Good customer service is about having good people on the phone; that is where we add value.”

Skandia’s experience supports the argument that BPM can work for smaller, more localised projects as well as for larger, business-wide projects. “A lot of it is about streamlining processes, and changing the way you are working, layering technology over the top and making it much more smooth and efficient,” says Mr Mann.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Tuesday, October 27, 2009

The real benefits of outsourcing – value beyond the one-time cost saving

By Sanjiv Gossain, UK managing director of Cognizant Technology Solutions

Published: October 22 2009 11:42 | Last updated: October 22 2009 11:42

From IT maintenance to CRM and business process automation, outsourcing is firmly ingrained in company culture and is central to the smooth operation of the world’s biggest and most renowned businesses.

The benefits are supposedly clear, with cost-reduction typically the number one goal. However, despite vast sums spent on outsourcing contracts each year – more than $42.2bn in 2008, according to Gartner – it appears many companies are failing to keep track of their outsourcing investments and are subsequently missing out on the major benefits of outsourcing.

Cost saving, of course, is not the only desired outcome when entering into an outsourcing relationship. According to Gartner research, organisations still outsource for “efficiency, access to skills, focus on core business, innovation, modernisation and even business transformation”.

Yet the demand for cost reduction remains high and research recently conducted by Cognizant, in partnership with Warwick Business School, finds that a proven return on investment is required in a very short time.

Over half of more than 250 European chief information and chief finance officers surveyed are demanding ROI within the first 12 months of an outsourcing agreement being confirmed. The current economic situation has no doubt intensified this need, with outsourcing providers under increasing pressure to drive more value with their clients and deliver longer-term business benefits.

Whether an outsourcing agreement has saved money over the short term isn’t too difficult to measure; in the simplest terms, it boils down to whether the new supplier can do the task more cheaply than it was done previously in-house or with an alternative outsourcing supplier.

However, given that many of these relationships can stretch over a considerable length of time – the BBC recently extended one of its contracts for a further nine years – companies expect to profit from the additional benefits outlined above.

It goes without saying, therefore, that every business has a solid methodology and auditing process in place to measure the benefits of their outsourcing investment. Or does it?

Our research suggests that business leaders are failing to get to grips with measuring the full financial impact of the outsourcing contracts they commission. Perhaps the most alarming discovery is that fewer than half of CIOs and CFOs have even tried to quantify the financial contribution of outsourcing to their business.

There is a widespread belief that the long-term value of outsourcing cannot actually be measured. More than a third (37 per cent) admit they do not try to measure the return, while a further 20 per cent do not even know whether they have tried.

This is perhaps unsurprising, considering that only 29 per cent believe that the contribution can be properly assessed beyond the one-time cost saving.

So what methods are being used to track and prove the value of these huge investments? The CIOs and CFOs surveyed provided several answers and in some cases, it seems the methods are vague at best.

Some show a degree of methodology, even if they couldn’t quite articulate what it was. Others amount to little more than “back of an envelope” sums. Examples included “Manual calculation”; “You know what it costs but you don’t really know the value”; “The accountants will use some formula for calculating ROI”.

Just 7 per cent of respondents were very confident that they know what they are spending in terms of time and money on their outsourcing arrangements.

Companies undertake outsourcing initiatives for a wide range of disciplines. So while a one-size-fits-all method for measuring value may not make sense, it is imperative to have some method to indicate what has been gained and at what price.

To measure outsourcing’s impact, businesses require a form of Return On Outsourcing methodology that includes benefits along three dimensions: innovation (the basis of future benefits, valued financially), process optimisation (quantified and valued over time) and total cost of ownership (reflected in IT budgets and IT accountability).

Value along all three of these dimensions should be addressed as part of the planning process and tracked through the life of the initiative. This should enable both the client and the vendor to see the business value and cost advantages from the outsourcing investment, understand the operational conditions and best practices that lead to long-term success, and compare projected financial returns with other companies within an industry peer group and beyond.
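Purely as an illustration of the arithmetic (every figure below is invented), a Return On Outsourcing calculation along those three dimensions might look like this:

    # Sketch: a Return On Outsourcing figure combining the three dimensions
    # named above. Every number here is invented for illustration.

    annual_contract_cost = 2_000_000  # what the outsourcing deal costs per year

    benefits = {
        "innovation":           250_000,    # future benefits, valued financially
        "process_optimisation": 600_000,    # quantified savings over the year
        "tco_reduction":        1_550_000,  # reflected in the IT budget
    }

    total_benefit = sum(benefits.values())
    roo = (total_benefit - annual_contract_cost) / annual_contract_cost

    print(f"Return on outsourcing: {roo:.0%}")  # 20% on these invented numbers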

The evolution of an outsourcing project can and should, in many cases, begin with cutting operational costs through labour arbitrage. Over time it should gain operational flexibility, adding and subtracting third-party resources as needed, delivering additional cost savings.

As financial performance improves, the cost savings can be reinvested in strategic initiatives that enable even greater operational efficiency and support future growth initiatives.

This insight into outsourcing performance is crucial in determining future decisions. The research suggests that C-level executives are making such decisions on future business and outsourcing strategies without knowledge of the financial benefits: 78 per cent of those who cut back on outsourcing last year cited “unclear value for money”.

Yet many do not actually have any clear evidence or means to quantify this.

Senior executives, therefore, appear to be making outsourcing decisions based upon short-term cost-cutting – which remains crucial – without measuring outsourcing’s impact beyond the initial labour, skills and cost advantages.

Key business benefits such as innovation and transformation are being ignored by many. Given that outsourcing should be delivering significant operational flexibility and business process improvements, the widespread lack of measurement practices means its true value is clearly being missed.

Without clear ways of measuring and monitoring their outsourcing arrangements, company executives could, in effect, be tying up costs that could be released to drive additional initiatives.

The practice of outsourcing IT and business processes is mature, yet the research suggests that the way in which companies measure the positive impact of these arrangements needs to be addressed.


More on the Cognizant and Warwick Business School research into attitudes to outsourcing can be found at http://www.quantifyingoutsourcingbenefits.com/

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

De-cluttering IT

By Colin Rowland, senior vice president, operations, for the Emea region at OpTier

Published: September 28 2009 10:47 | Last updated: September 28 2009 10:47

An IT department was once relatively simple: a server, a few computers, perhaps some firewalls, an internet connection and a help desk. Staff came to work to write documents, make phone calls and not much more.

Today, work is supported by computing almost every step of the way. In turn, IT departments vary in size, budget and platform but have come to share one striking element – complexity.

As businesses have become ever more reliant on technology, so IT has built an intricate jigsaw puzzle of technologies.

A typical scenario: no business in its right mind is going to install a hugely expensive infrastructure without taking steps to ensure it works properly. So another system has to be installed to ensure the first one is performing.

This layering of solutions and systems to monitor the solutions has spiralled out of control. Our recent research in the UK found that three quarters of businesses admit they are blinded by the complexity of their IT management set-up.

But what surprised us more is the estimated cost. Almost two thirds of respondents admitted that complex and ineffective IT management is costing their company £4.64m each year in downtime and staff time, on average.

So how has it come to this?

It is partly because managers lack the holistic, end-to-end picture of IT that they need: CIOs have been forced to take a segmented approach to performance management by implementing partial solutions that monitor individual technology silos. We found that almost one fifth of companies were using more than five tools to monitor the performance of IT.

This partial approach is financially draining and does not give businesses the support they require.

For example, when a performance issue hits online banking, often the first time the IT department knows about it is when customer complaints flood in. In spite of the five monitoring tools, pinpointing the problem will still be like trying to find a needle in a haystack – or multiple haystacks. Industry analyst group Enterprise Management Associates estimates that more down time (54 per cent) is spent finding problems than fixing them.

In seeking to protect investments and ensure they deliver, IT departments have ended up with information overload that hinders resolution efforts.

What businesses need is for their IT departments to be able to assess quickly where the problems are, and avoid them.

IT is made up of many applications and systems, each performing small tasks to get user transactions completed. By generating visibility into these transactions, IT management can be simplified.

Each transaction from a user “travels” through the system. By capturing and tracking all transactions, across all IT tiers, all the time, organisations can see the impact that transactions have on the business.

But most importantly, each business transaction provides clear evidence of how an application is performing and whether there is trouble on the horizon.

Another advantage is that transactions also tell the cost side of the IT story; they make it easy not only to identify and resolve performance problems swiftly, but also to optimise the cost of performing those transactions.
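A minimal, hypothetical Python sketch of the idea: every user transaction carries a single ID through each tier, and the timings recorded against that ID show where a slow transaction spent its time – the needle-in-a-haystack search becomes a simple lookup.

    import time
    import uuid

    # Sketch: tracking one user transaction across IT tiers by tagging it
    # with a single ID and recording time spent in each tier. Tiers invented.

    trace = []  # (transaction_id, tier, seconds) records

    def traced(tier):
        def wrap(fn):
            def inner(txn_id, *args):
                start = time.perf_counter()
                result = fn(txn_id, *args)
                trace.append((txn_id, tier, time.perf_counter() - start))
                return result
            return inner
        return wrap

    @traced("web")
    def handle_request(txn_id): return query_db(txn_id)

    @traced("database")
    def query_db(txn_id): time.sleep(0.05); return "balance: £1,234"

    txn = uuid.uuid4().hex[:8]
    handle_request(txn)
    for txn_id, tier, secs in trace:
        print(f"{txn_id} {tier:<8} {secs*1000:6.1f} ms")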

An approach that was fit for purpose 10 years ago simply no longer cuts the mustard. Businesses have to be leaner and meaner – they cannot afford to have a reactive technology infrastructure where the systems manage the business rather than the other way around.

Simplifying IT management is, in many ways, akin to clearing out your wardrobe. It might be painful to part with that tan leather jacket from the 1980s but you know it has to be done.

Copyright The Financial Times Limited 2009. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.