
Moving out of recession: Small spending steps can bring big productivity leaps

By Stephen Pritchard

Published: October 27 2010 09:25 | Last updated: October 27 2010 09:25

As businesses emerged from the last recession, following the dotcom bust in 2001, the recovery in IT spending lagged behind.

Companies that had invested heavily during the good years found they had overspent on IT and had more than enough equipment to support their operations. It was 2004 before investment in technology recovered fully. By at least one measure, IT spending also became less effective during the dotcom-induced downturn.

Businesses that had shed staff, or cut back other areas of their operations, found that their per capita IT costs increased. IT, with its large fixed cost base and three-to-four-year project life cycles, was not well placed to respond to relatively rapid changes in the business climate.

Move forward to today, and a tentative economic recovery in most mature markets is once again putting a brake on IT spending. But businesses – as well as public sector organisations – are also being forced to look again at their cost bases, and IT is by no means immune from scrutiny.

At the same time, business leaders have to balance two competing demands: creating a leaner IT operation and creating a leaner business.

Although cutting budgets can produce quick savings, the scope for cuts is limited: most enterprises spend only between 2 and 5 per cent of revenues (turnover) on technology; smaller companies typically invest rather more.

But across the board, a small increase in IT spending can drive far greater gains in overall productivity.

“Steps towards recovery are still tentative,” cautions David Elton, an IT and change management expert at PA Consulting.

“The pressure on IT departments is still about money. There are signs that people are investing but most clients are still concerned about controlling costs.”

Boards remain cautious about a return to unfettered spending, where large sums of money seemingly vanished into long-term IT projects that failed, or failed to deliver the promised results.

This is prompting chief financial officers and chief information officers to look both at newer technologies that can reduce costs, such as cloud computing, and at improved methodologies for delivering IT services. In particular, there is growing interest in applying “lean” processes to IT.

“The CIO’s role is rapidly changing,” says Alexander Peters, a principal analyst at Forrester Research. “The recession accelerated this change but the drivers – social technologies, service oriented applications and the cloud – are strategic and require changes beyond tactical cost-cutting.”

Mr Peters is the co-author of a report that looks at how IT departments can apply “lean” thinking to their operations. In the report, he argues that CIOs can draw on methods developed in fields such as manufacturing, and use them to make IT not only cheaper, but more effective.

Lean thinking includes considering whether an enterprise should build or buy its IT infrastructure and services, moving on to newer, more efficient, platforms and making greater use of standardised processes.

But at its heart, Mr Peters argues, “lean” is about ensuring IT is more closely aligned to the business. This makes for more effective technology, and less waste.

“Best-practice executives view lean as a performance improvement strategy, rather than merely a cost-cutting exercise,” he says.

Bringing IT closer to the business, and ensuring it is more flexible and responsive, are key to lean thinking.

However, it also requires businesses to reconsider the way they run IT, both to cut costs and make it more responsive.

Moving to newer platforms and technologies should also provide businesses with a stronger foundation for a return to growth.

Strategies such as virtualisation – allowing a single computer to host multiple “virtual” machines – and server and storage consolidation, where those machines are run on fewer physical computers, will save money quite quickly, for businesses that have the expertise to implement them.
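To see why consolidation pays back quickly, it helps to sketch the arithmetic. The figures below are illustrative assumptions rather than numbers from the article, but they show how packing lightly loaded machines onto shared hosts translates into savings:

    # Illustrative consolidation arithmetic; every figure here is an
    # assumption chosen for the example, not data from the article.
    import math

    physical_servers = 40           # existing, lightly loaded machines
    avg_utilisation = 0.10          # 10 per cent average load (assumed)
    target_utilisation = 0.60       # safe load per virtualised host (assumed)
    annual_cost_per_server = 4_000  # power, space and upkeep in $ (assumed)

    # Total work, expressed in "fully busy server" equivalents.
    workload = physical_servers * avg_utilisation

    # Hosts needed once workloads share hardware, rounded up.
    hosts_needed = math.ceil(workload / target_utilisation)

    saving = (physical_servers - hosts_needed) * annual_cost_per_server
    print(f"{physical_servers} servers -> {hosts_needed} hosts, "
          f"saving ${saving:,} a year")
    # Prints: 40 servers -> 7 hosts, saving $132,000 a year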

Some steps will require more initial investment. Installing computers and other equipment that draw less power can save significant sums over their lifetime, but businesses need to find the capital budgets for the hardware.

Research by IBM, for example, suggests that power consumption accounts for 75 per cent of data centre operating costs. Power costs are also growing much more rapidly than expenses for staffing, buildings and real estate, or taxes.
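A back-of-the-envelope calculation shows why power dominates. The consumption and tariff figures below are assumptions for illustration only, not figures from IBM’s research:

    # Rough annual electricity cost for a single server.
    # Both input figures are assumptions chosen for the example.
    server_draw_kw = 0.5   # average draw including cooling (assumed)
    price_per_kwh = 0.12   # $ per kWh (assumed)
    hours_per_year = 24 * 365

    annual_kwh = server_draw_kw * hours_per_year
    annual_cost = annual_kwh * price_per_kwh
    print(f"{annual_kwh:,.0f} kWh a year -> ${annual_cost:,.0f} per server")
    # Prints: 4,380 kWh a year -> $526 per server

Multiplied across thousands of machines, even a modest cut in draw per server compounds into the significant lifetime savings described above.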

The cost of buying computer equipment, and of building data centres, is prompting more companies to look at software as a service, outsourcing or cloud computing.

IBM estimates that the construction cost of a 2,000 sq m data centre now runs to between $30m and $50m, putting it out of reach of all but the largest businesses or service providers.

Then there is the challenge of owning and running an asset based on technology that is both complex and quick to become outdated.

A wholesale move to cloud computing might not be appropriate, although some commodity services, such as e-mail, archiving and software test and development, are already being hosted in the cloud for large businesses.

Frank Modruson, CIO of Accenture, the consultancy, points out that businesses with older and more complex IT infrastructures may have to update those before they can outsource the technology itself.

But making such investments is perhaps one of the few ways IT departments can free up cash to support new business initiatives, such as new online sales channels or social networking.

“Coming out of the recession, companies have started to redirect spending to the top line,” says Mark Hillman, vice-president for strategy at Compuware, an IT services company.

“They still have cost reduction initiatives in place, such as server consolidation, but they are limiting spending on the back office, to allow them to invest in areas that give them better connections to partners or customers, or in areas that affect their brand.”

Financial data for the Facebook generation

The financial services companies that buy the data services Thomson Reuters provides may have had a tough couple of years, but they have not become less demanding.

Thomson Reuters provides financial market data to businesses including banks, brokerages and investment houses. The company supplies this information via traditional trading room terminals, but more traffic is being carried over the internet, in a business worth $15bn annually.

According to Kevin Blanco, vice-president of global application support and engineering at Thomson Reuters, ensuring clients receive good service across a worldwide network is a challenge.

As a data provider to fast-moving financial markets, Thomson Reuters has to meet two targets for its services: the availability and the responsiveness of data feeds.

This is especially critical for internet-based services, since it is these that are growing most quickly.

Thomson Reuters sets a target of 99.9 per cent “uptime” for its web-based products and a maximum eight-second response time.
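That headline figure is easier to judge once converted into permitted downtime; the arithmetic is straightforward:

    # What a 99.9 per cent uptime target allows.
    uptime_target = 0.999
    minutes_per_year = 365 * 24 * 60

    allowed_downtime = (1 - uptime_target) * minutes_per_year
    print(f"Permitted downtime: {allowed_downtime:.0f} minutes a year "
          f"(about {allowed_downtime / 12:.0f} minutes a month)")
    # Prints: Permitted downtime: 526 minutes a year (about 44 minutes a month)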

“Connections over dedicated circuits are expensive,” explains Mr Blanco. “There are some large banks that require dedicated circuits and we maintain them. But the majority of our products and of our strategic initiatives will be web based. There will be very few dedicated workstation installs or dedicated circuits in the future.”

But newly cost-conscious bankers want to maintain service levels to customers and this places demands on the services they buy from suppliers such as Thomson Reuters.

For Mr Blanco, this means maintaining or improving service quality levels, while controlling costs.

Financial services companies have come to expect from web-based services the reliability and responsiveness they got from dedicated links, as well as the ease of use associated with sites such as Amazon or even social media sites.

“Our user base is no longer [just] financial professionals in their 40s and 50s. The primary user is a junior banker who also uses Facebook or MySpace. Our interface and speed have to match that demographic,” he says.

Researchers who study consumers’ online behaviour have found that visitors to websites often abandon a transaction and go elsewhere if a page takes more than two seconds to respond.

“We are not seeing [demand for] two seconds now, but it is certainly four to five seconds,” says Mr Blanco. “But I do feel that the demand will continue for response times to compress, especially for transactional services.”

Controlling latency – the delay before trades are completed – and network quality for a company operating global services can be expensive and demand large numbers of skilled staff to diagnose and fix problems.

Like many other IT-dependent businesses, Thomson Reuters is increasingly relying on automation to cut the cost of delivering its technology.

Streamlining systems for updating services or deploying new applications to servers has cut support costs and, vitally, has improved system uptime.

And, Mr Blanco says, Thomson Reuters is making more use of automated monitoring and diagnostic tools to control the quality of its network.
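The shape of such a check can be sketched in a few lines. The endpoint below is hypothetical, and the sketch stands in for the dedicated monitoring tools the article describes, not for any product Thomson Reuters uses:

    # Minimal availability and response-time probe (illustrative sketch).
    import time
    import urllib.request

    URL = "https://example.com/data-feed"   # hypothetical endpoint
    MAX_RESPONSE_SECONDS = 8                # target cited in the article

    def probe(url: str) -> None:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=MAX_RESPONSE_SECONDS) as resp:
                elapsed = time.monotonic() - start
                verdict = "OK" if elapsed <= MAX_RESPONSE_SECONDS else "SLOW"
                print(f"{verdict}: HTTP {resp.status} in {elapsed:.2f}s")
        except OSError as exc:              # covers timeouts and refused connections
            print(f"DOWN: {exc}")

    probe(URL)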

In particular, web performance and monitoring software from specialist vendor Gomez has brought some rapid and significant improvements.

“In our corporate services business, we brought their website availability up to [99.9 per cent] in two months,” says Mr Blanco.

“We’ve also done the same for the rest of the business.”


