Monday, June 23, 2008

How should organisations react to social networking tools? Embrace them, ban them, or …?
By Dick Eve, Change Portfolio Director, for Atkins Group

Published: June 23 2008 14:02 | Last updated: June 23 2008 14:02

A truck driver approaches a bridge that has a weight limit of 5,000kg. He and his truck weigh 4,950kg, so he would be able to cross were it not for his 100kg cargo: a flock of pigeons loose in the back of the truck. He has the bright idea of banging on the side of the truck to scare all the birds into taking flight, then quickly driving across the bridge. Does it work?

After long debate in the New Scientist one reader said: “The practical engineer’s answer is yes, of course the driver could cross! The surplus weight of 50kg translates to a 1 per cent excess over the bridge’s specified load maximum. An additional impulse force due to the truck bumping over a small stone in the road would be much greater than this 1 per cent, and any civil engineer who designs a structure of any sort with a safety margin anywhere near as small as 1 per cent deserves all the professional liability lawsuits he or she gets.”

Why do I raise this? Because it illustrates two things about engineers: they are practical and they are (or have to be) risk-averse. We are the UK’s largest engineering consultancy, involved in some of the world’s most high-profile building, infrastructure and transport projects. The culture this engenders – taking on board every possible eventuality to ensure the products of our endeavours stand the test of time – inevitably puts safety and risk aversion at the heart of what we do.

What has that got to do with social networking? Well, the adoption (or not) of any software technology is more about the culture of the business than the technology itself. You cannot introduce new technology on its own and expect it to be successfully adopted in any business, especially one that is resistant to change and risk.

As an example, Microsoft’s Communicator can be installed on everyone’s computer, but without education and a business change programme this generates many mixed views, ranging from “I don’t know anything about it”, through “I already get too much e-mail – I am not using that too!”, to users like myself who really embrace it. So I use it for communication with a small community of like-minded individuals, but not the whole network; this strengthens the strong ties within my network. The next question is how far that network should extend: should I be allowed to communicate with customers and partners, or should communication be restricted to internal users only?

So why would a business restrict this powerful tool to being an internal capability when more and more business is done with one or many partners?

It is probably the uncertainty of the outcome. Could it have a negative effect on the business, perhaps through someone spending all day chatting with their personal partner rather than their business partner? Or through the risk of litigation when someone commits to something with a business partner that they shouldn’t? Written documents are much more formal: they are reviewed, rewritten and are clear records of events; conversations are transient, and decisions are normally confirmed in writing. Social networking today sits in between. There is no real audit trail. What about records management? Decisions could be made with no record.

Of course, Communicator is just the tip of the iceberg. If we consider a wider range of social networking software (SNS) such as Facebook or MySpace, where I can publish a wide variety of information about my background, interests, skills and activities, there is an opportunity for even more abuse. Innocent or otherwise, the lack of regulation regarding such content could have legal ramifications. So maybe the risk outweighs the benefits.

But there seems to be evidence that weak ties and bridges between networks are increasingly important for innovation and knowledge sharing. A social networking approach can really help to develop these weak ties which will bridge the strong networks and spawn new innovations and approaches.

Our business works largely on strong networks built over years. If you need advice or a resource you can call on a contact you worked with several years ago and be confident they will share their experience to your advantage. But are we missing a trick here? Those contacts will always give us the same answers. Could we use weak ties to find new ideas and innovations? Maybe SNS is a quick way to involve the upcoming stars and help them shape the business.

And last but not least, doesn’t the next generation expect to use these tools in work as well as at home? There will come a day when new stars will not join dinosaurs who don’t offer these techniques.

So, when the business is ready and the culture is right, businesses will have to embrace social networking. Yes, of course we need to be prudent and put in the checks and balances for protection. Social networking solutions hybridised for business use are now available, which should take some of the pain away. But when we do embrace it, let’s not get paranoid: too much restriction will stifle the advantages it can deliver.
Copyright The Financial Times Limited 2008

Thursday, June 19, 2008

Managing your IT portfolio – one size does not fit all

By Anita Chandraker


Financial Times, June 19 2008


Businesses in the financial sector are essentially information businesses, so it is no surprise that they were early adopters of IT. The attraction of technology that could manage and control information was obvious. That early adoption brought significant business advantages from automation to more sophisticated products, as well as an ability to connect with both internal financial and reporting systems and external audiences.

However, while early adopters gain the initial advantages, they can face problems down the line when new developments render the initial technology obsolete or constraining, and, in order to meet ongoing demands from customers and regulators, systems have to be replaced or new functionality bolted on.

As a result, many companies have a proliferation of new applications integrated with heavily modified legacy applications, a situation not helped by extensive business mergers and acquisitions bringing yet more diversity and complexity to the portfolio. The result is that most large companies have a tangled web of IT systems, ranging from 30-year-old mainframe systems to modern browser-based applications.

The responsibility for solving this puzzle lies with the CIO, who faces a continuous stream of complex decisions and challenges, each of which brings financial, business and reputational risks. These decisions range from ensuring that the IT estate supports the ever-growing demands of regulators for compliance data, to finding a way of supporting the next new product without adding yet more complication to an already complex picture. Then there are the resourcing issues around managing ageing systems at a time when certain technology skills are rapidly disappearing. Against this backdrop, according to Harvey Nash’s recent “Strategic Insights Survey: An IT Leadership Perspective”, what boards want technology to deliver is enhanced operational efficiency.

The obvious answer is to simplify the portfolio, but today’s business realities mean that the costs and risks of a big-bang approach are too high for most businesses to contemplate. Instead, the CIO needs a clear strategy that recognises the complexity of the current environment and future business needs, and provides a framework for decision-making that will support the goal of simplifying and modernising the IT portfolio.

So what are the key elements in that strategy? The starting point is to identify which elements of the portfolio are back-room ‘routine’ elements and which are central to the business retaining its competitive edge. The former can be managed through a programme of continuous improvement; the latter are likely to justify significant investment in radical solutions. Both are important, but separating the two brings clarity to the overall strategy and to the drivers of investment decisions.

Within these two segments there are a number of options for improving capability in the short to medium term.

The first is to wrap multiple systems with common front ends or process layers. There are solutions available that can effectively glue existing systems together and, at the same time, create new capabilities to fill the gaps in existing applications. This may simplify things from a user perspective but also adds another layer of complexity over the existing environment.
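To make the wrapping idea concrete, here is a minimal sketch of a common front end gluing two back-end systems together behind one interface. The systems, fields and status codes are hypothetical, invented purely for illustration; the point is that the facade hides which legacy system owns which data, while itself becoming an extra layer to maintain.

```python
class MainframePolicySystem:
    """Stand-in for a 30-year-old system of record."""
    def fetch(self, policy_id: str) -> dict:
        # In reality this would invoke a mainframe transaction.
        return {"POLICY-NO": policy_id, "STATUS-CD": "A"}

class WebClaimsSystem:
    """Stand-in for a modern browser-based application."""
    def get_claims(self, policy_id: str) -> list:
        return [{"claim_id": "C-100", "policy": policy_id, "open": True}]

class CustomerFacade:
    """Single entry point that hides which system owns which data."""
    def __init__(self):
        self.policies = MainframePolicySystem()
        self.claims = WebClaimsSystem()

    def customer_view(self, policy_id: str) -> dict:
        raw = self.policies.fetch(policy_id)
        return {
            "policy_id": raw["POLICY-NO"],
            "active": raw["STATUS-CD"] == "A",  # translate the legacy code
            "open_claims": [c for c in self.claims.get_claims(policy_id)
                            if c["open"]],
        }

print(CustomerFacade().customer_view("P-42"))
```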

The second approach is to augment current legacy functionality with solutions that can be bolted on to existing systems. There are products in the market that offer specific add-on solutions in new areas of need such as fraud detection, compliance and document management and production. The challenge here is to ensure integration across areas such as data and operations.
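As a hedged illustration of that integration challenge, the sketch below bolts a hypothetical fraud-detection add-on onto an existing payment feed. Every name, field and scoring rule is invented; the interesting part is the schema-mapping function, which is where much of the integration effort across data and operations tends to sit.

```python
def legacy_payment_feed():
    # Stand-in for records emitted by the existing system.
    yield {"ACCT": "00123", "AMT-PENCE": 925000, "CTRY": "GB"}

def to_addon_schema(rec: dict) -> dict:
    # The add-on expects different names and units than the legacy feed.
    return {"account": rec["ACCT"].lstrip("0"),
            "amount": rec["AMT-PENCE"] / 100.0,   # pence -> pounds
            "country": rec["CTRY"]}

def fraud_score(txn: dict) -> float:
    # Trivial placeholder rule standing in for the vendor product.
    return 0.9 if txn["amount"] > 5000 else 0.1

for rec in legacy_payment_feed():
    txn = to_addon_schema(rec)
    print(txn, "fraud score:", fraud_score(txn))
```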

A third option lies in porting existing systems to a lower-cost platform. There are service providers who will now move older systems, typically running in a mainframe environment, to a more cost-effective Unix or Windows platform. This can bring significant benefits by simplifying the underlying IT architecture, but may not deliver a more agile environment for adding new products.

These options can deliver simplification for the end user and cost reductions but do not solve the underlying issue of complexity or support major business transformation. For that, more radical approaches are needed.

One of those radical options is to rationalise and replace major functional applications. This is not new, but the industry has a poor record on delivering major systems replacement programmes. In order to succeed, this approach cannot simply be an IT-led initiative but needs the business as a whole to embrace the opportunity for major business transformation.

A second option is to take the service-oriented architecture (SOA) approach. That means looking at what new common services or building blocks will be required in the future, then setting out a plan to develop and integrate these as part of new initiatives and, in this way, gradually transitioning to the new services while decommissioning the old.
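A minimal sketch of that gradual transition, with all names hypothetical: a common service contract has both a legacy-backed and a new implementation, and a routing function shifts callers to the new service one account at a time until the old system can be decommissioned.

```python
from abc import ABC, abstractmethod

class AccountService(ABC):
    """The common 'building block' contract consumed by every channel."""
    @abstractmethod
    def balance(self, account_id: str) -> float: ...

class LegacyAccountService(AccountService):
    def balance(self, account_id: str) -> float:
        return 100.0  # would delegate to the old core system

class NewAccountService(AccountService):
    def balance(self, account_id: str) -> float:
        return 100.0  # would call the newly built service

def service_for(account_id: str, migrated: set) -> AccountService:
    # Routing per account lets traffic move gradually to the new service.
    return NewAccountService() if account_id in migrated else LegacyAccountService()

print(service_for("A1", migrated={"A1"}).balance("A1"))
```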

Clearly these more radical options are higher risk undertakings but can solve the fundamental underlying problems of complexity if there is a business-wide commitment to change.

In the current economic climate, and with the complexity of the puzzle, the CIO’s natural response may well be to focus on short term improvement and low cost, low risk options alone. That would be a mistake. Now, more than ever, financial organisations need a clear strategy for simplifying the IT portfolio. The gap between business need and IT fit will not close by itself and businesses need to ensure they choose the right approach for each problem – recognising that no one size can fit all.


Anita Chandraker is a member of PA Consulting Group's management team.

Monday, June 16, 2008

Draper firm fields document tracker

Program follows files among computers or drives, enforces policies
By Tom Harvey
The Salt Lake Tribune
Article Last Updated: 06/16/2008 11:53:39 PM MDT

Companies are beginning to deploy new document management tools, created by NextPage of Draper, that allow them to track documents even if they are on an employee's computer, have been forwarded by e-mail or are on a USB drive.

NextPage is rolling out software that aims to solve document problems for large businesses that create millions of electronic records annually but have trouble managing them or even knowing which ones exist or where they reside. When sued or facing regulatory action, the companies often must spend millions of dollars just responding to demands that they produce relevant documents.

"Essentially the business problem we're chasing down is that today, in a highly sensitive legal and regulatory environment, the customers that we serve need to have tight control over all the documents they create," said NextPage CEO Darren Lee.

Lee, in a recent interview, cited a study by DuPont of five lawsuits in which it was involved. During the process of responding to requests for information, the company found 75 million documents.

"As they peeled back those 75 million documents, over half were documents that should never have existed," he said. "They should have been deleted or thrown away based on existing policy."

The cost of identifying and classifying those documents was $12.5 million.

Allan Bendall, president of Strategic Discovery Inc., a San Francisco company that helps companies manage documents, said NextPage offers a distinctive response to such concerns.

Most document management systems depend on centralized control, a system where everyone works out of the same server and must check in and check out documents. But such centralization often rankles employees.

"It's very difficult to get people to do that, particularly in professional services," Bendell said, citing "knowledge workers" who find such systems inefficient and tend to ignore protocols and work off the hard drives of their own computers.

Under NextPage's product, "people can continue to manage their documents independently, while offering some of the benefits of centralized control," said Bendall.

NextPage's solution is software that tracks documents into all of those areas where they might be stored.

"We would know of everything on your hard drive," said NextPage's Lee. "We would know of documents on your e-mail. We would know them on a USB. We'd know them if you sent them to someone. We know every single instance of it. We know every version of it."

That knowledge allows a company to enforce retention policies consistently through its ranks.

A study of 108 chief information officers, sponsored by NextPage, found that more than 60 percent said that only half of employees cooperate with company retention policies.
With NextPage's software, companies are able to set up policies that range from the benign – e-mails telling an employee that a document should be destroyed or retained – to the drastic: going into a hard drive and deleting a document.

"You can be as light or as draconian as need be" in enforcing retention policies, said Cyndi Tetro, vice president of marketing.
NextPage was started in 1999, when it created products for internet searches. Those products were sold and the company then began work on its document management system.

tharvey@sltrib.com

Monday, June 09, 2008

Don’t prepare for the world as it is – it’s the future that matters

By Richard Brown of Ernst & Young

Published: June 9 2008 09:42 | Last updated: June 9 2008 09:42

We can all make predictions: the financial services world will continue to change, and there will continue to be rapid technological advances, both of which stimulate further demands and increasing expectations from all stakeholders.

Virtualisation, green IT, privacy and identity management are some of the new(ish) ones on the block, while cost management, sourcing and standards have never gone away. Securing mobile devices continues to be a subject of conversation at any event where CIOs gather.

Regulatory demands are unlikely to decrease. Technologies will continue to evolve, seeking the right adoption point where they move into business-as-usual. And customers will continue to demand convenience, flexibility and privacy all at the same time. Business leaders will expect more for less from IT, a demand fuelled by the desire to free up funds for further growth or simply to reduce costs. And there will undoubtedly be another financial crisis to deal with – the question being not whether, but what and when.

In reality, CIOs in the financial services sector continue to face a multitude of competing demands, from changing business models to offshoring parts of the business and delivering major change programmes, all while keeping the lights on – cost-effectively, of course. On top of this, the financial services sector has had other specific challenges: increasing regulatory demands and expectations, and consumer demands for flexibility and convenience plus security, to name but a few.

Continued high profile instances of loss or compromise of personal data have caused increased interest not only from the regulators but also from business partners and consumers. The credit crunch and sub-prime are taking their toll on the sector, resulting in some rapid business decisions being made on products and services offered, and a variety of cost cutting measures. More than that, there is great uncertainty about the duration and nature of the current economic situation.

How can this mountain of individual predictions be used to create a plan for the future?

A couple of things are clear. If financial businesses continue to look at individual activities, processes and incidents in isolation, then they will face the future unprepared. Organisations also need to start learning from their experiences – even if it’s just on post-project reviews, it’s a start. And finally it is also necessary to take a hard look at people and their capabilities to help you into the future.

Why are these factors so important, when there are so many others?

Each business and IT priority justifies a major programme of activity in its own right. More important, they all bring with them a host of further considerations from risks to skills, governance, timeframe, and globalisation, which need to be looked at more broadly than the main activity itself.

So can you really see the big picture? And if you can, can you see it any time you like or just once a year at planning time? And how big does the wall have to be to hold the picture? The volume of available data and the speed of consolidation make many things possible. This could mean putting a precise value on system downtime at a precise time, on a particular day, or the opportunity to become a very fast follower.

But is IT really helping to join the dots to spot the gaps and the anomalies? Has IT grown up sufficiently that we can move seamlessly from projects to business as usual, and that asking “what if?” and “so what?” becomes a natural part of doing business?

Many organisations purport to carry out post-project reviews, but in reality some brush problems under the carpet, and sometimes even the best rarely take the lessons further than the project team. In order to get better, financial services companies need to adopt a culture that continuously learns from mistakes. If there is a struggle to learn lessons and apply them, what chance is there to learn from broader business and economic situations?

Strategies, plans, processes and governance will only get you so far. People and their capabilities are the key thing. For example, does the IT management team include those who are constantly scanning the horizon? Does it include those with the vision to see potential, and the courage to seize new opportunities without losing sight of commercial reality? Regardless of functions and roles, do your teams have the right balance of cynics and visionaries?

These may be simple questions, but the financial services industry has faced an unprecedented volume of incidents and change resulting in some knee jerk reactions and shaken confidence, leaving a great deal of uncertainty. And IT is now in a unique position where it really can make or break a business. So these questions deserve well considered responses. Will you stand up to the scrutiny?

Richard Brown is the head of technology security and risk services, Northern Europe, Middle East, India and Africa, at Ernst & Young
Copyright The Financial Times Limited 2008

Sunday, June 08, 2008

Information technology

Published: June 8 2008 19:07 | Last updated: June 8 2008 19:07

Who has not, when confronted by the daily exasperation of office technology, questioned the parenthood and purpose of information technology departments? Advocates of computing in “the cloud” hope to make them largely superfluous.

Instead of going to the effort of installing and maintaining computing locally, all those tricksy applications – not to mention storage and data processing – can be provided centrally from shared infrastructure. Merrill Lynch estimates that more efficient management of resources, such as servers, could provide services at a cost five to 10 times lower than the more traditional in-house approach.

The revolution has been a long time coming. Computing on tap as a concept was floated as far back as the 1960s. Sun Microsystems has been actively pushing grid, or utility, computing for almost a decade. What has changed is the rise of viable business models such as software-as-a-service. Salesforce.com is the most high-profile of these companies, but Oracle, Microsoft and SAP are all investing in subscription-based services aimed at small businesses – typically those with fewer than 1,500 employees.

So there are some valuable niches to exploit. On current growth rates, SaaS sales should double between 2006 and 2011. And if subscription services can show real economies of scale in distribution and sales – not something that Salesforce.com has yet demonstrated – sky-high valuations for SaaS companies might be justified.

But the segment’s sales of about $3bn remain a small fraction of a global $270bn software market. The impact of inertia should not be discounted either. Chief executives tend to dislike replacing equipment that still works. Mainframes were superseded by servers decades ago but IBM still makes and maintains them. Important security and regulatory questions have to be answered before large companies will consider the cost of moving any form of critical data into the cloud. To hope for more than slow, if steady, progress over several years is to build castles in the air.
Copyright The Financial Times Limited 2008

Friday, June 06, 2008

Intelligent information comes of age

By Tom Glocer, chief executive of Thomson Reuters

Published: June 6 2008 13:43 | Last updated: June 6 2008 13:43

In his 1977 book “Global Implications of the Information Society”, the technology visionary Marc Uri Porat defined the information economy as occurring when labour related to the creation, processing and dissemination of information exceeds work related to the other three economic sectors – agriculture, industry and service. Based on that definition, Porat tells us that the information economy arrived in the west in 1967, when 53 per cent of labour income in the total workforce was derived from the information sector.

Forty years later, information is at the core of all economic sectors in the developed world, and an increasingly influential element in emerging markets. The amount of information, how we receive and consume it, and whether or not we trust it have all been radically altered by the advent of the internet. This revolution has also created a growing global need for “intelligent” information.

What is intelligent information? It is certainly insightful and well-written text, but it is also dynamic content delivered in electronic formats. It is self-describing, self-organising and action-oriented. As we pass from Web 2.0 to the semantic web envisioned for Web 3.0, intelligent information will be its common language.
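To make “self-describing” less abstract, here is a toy example of what such content might look like: a news item that carries machine-readable structure alongside its text, so software can act on it without human interpretation. The vocabulary is invented for illustration and is not a real semantic web standard.

```python
import json

item = {
    "@type": "CompanyNewsItem",          # the item declares what it is
    "headline": "Acme Corp raises full-year guidance",
    "company": {"name": "Acme Corp", "ticker": "ACME"},
    "event": "guidance_raise",           # machine-actionable classification
    "published": "2008-06-06T13:43:00Z",
}

# A downstream system can route on structure rather than parse prose.
if item["event"] == "guidance_raise":
    print("alert holders of", item["company"]["ticker"])
print(json.dumps(item, indent=2))
```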

Intelligent information has never been more valuable. Professionals will pay for just the right information delivered at just the right time and place in their workflow. In fact, people such as lawyers, doctors, scientists, accountants and those who power the world’s financial markets, will pay to be given less information, but precisely the right information that helps them make better decisions, faster.

None of us would pay for tomorrow’s weather forecast because the information has been commoditised: it is universally available. If you are a provider of consumer-grade weather information, you have little option other than to monetise your content via advertising. But if you provide very accurate long-range hurricane forecasts, businesses such as property and casualty insurers will pay you handsomely for your professional-grade information.

With the number of professionals growing, especially in emerging markets such as China and India, the global demand for intelligent information is booming. The world, in a sense, is professionalising, and doing so in real time. At the same time, physical industries are transforming into information businesses. We saw this in financial markets as currencies went off the gold standard and began trading electronically. A similar transformation is underway in the pharmaceuticals industry. Once a chemical compound discovery and manufacturing business, it is now all about decoding, analysing and manipulating the human genome.

As a result, a huge opportunity has emerged for companies that can provide access to intelligent information. The information majors are already establishing their territories. At Thomson Reuters we are staking our claim to the delivery of the kind of critical intelligent information that professional decision-makers must rely upon to do their jobs.

Businesses and professionals in the financial, legal, tax and accounting, media, scientific and healthcare markets require huge amounts of information, typically delivered in real-time, which can be consumed by machines as well as human beings.

To meet these demands, the new class of “information majors” will need to be global, highly innovative, experts in each sector they seek to serve, and able to invest in huge databases and sophisticated search and data mining capabilities.

Copyright The Financial Times Limited 2008

Thursday, June 05, 2008

Technology and financial services – with power comes responsibility

By Neville Howard of Deloitte

Published: June 5 2008 10:09 | Last updated: June 5 2008 10:09

Financial services institutions were early adopters of technology and have used it to transform their businesses. The earliest areas of IT spend were on accounting software in the banking and insurance sectors where technology could automate highly manual processes, dramatically increase accuracy and reduce costs.

This spending was initially focused on process automation in areas where benefits were clear and easy to define, such as clearing and settlement. The entire middle office was, in effect, removed, resulting in huge savings.

The trend of replacing people with technology gathered pace during the 1990s, and vast technology departments were created, often employing as many staff as the traditional non-IT banking functions.

However, by the end of the 1990s rapid expansion of technology across all areas had led many financial services organisations to create huge IT silos that were not always aligned to core business strategy and were too large to be agile and responsive to rapid changes in market conditions. More and more ways to introduce technology were found and the business case for investment became increasingly blurred.

Fast forward to 2008 and we are now seeing a change in direction. The most progressive financial service institutions are dismantling their IT silos and realigning their IT teams with the business units they serve.

Now the chief information officer will often come from an operations background, bridging the gap between the business and technology. And CIOs are receiving recognition and a place on the board where they can help define business strategy, rather than simply provide a commodity service.

Today, the trend is for technology spend to be focused on business value and clearly aligned to strategy. For example, investment banks are using technology to automate trading where the risk levels can be set and activity monitored. And clients are offered on-line portfolio aggregation where they can move between asset classes at the press of a button and the entire process is completed quickly and accurately using straight-through processing.
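As a rough sketch of the pattern just described, the snippet below shows a pre-trade check in which the business sets the risk limits and every decision can be monitored. The limits, fields and figures are invented for illustration.

```python
LIMITS = {"max_order_value": 1_000_000, "max_position": 5_000_000}

position = 4_000_000  # current exposure in the instrument

def check_order(quantity: int, price: float) -> bool:
    """Accept or reject an automated order against business-set limits."""
    order_value = quantity * price
    if order_value > LIMITS["max_order_value"]:
        print(f"REJECT: order value {order_value:,.0f} exceeds per-order limit")
        return False
    if position + order_value > LIMITS["max_position"]:
        print("REJECT: order would breach the position limit")
        return False
    print(f"ACCEPT: {quantity} @ {price}")
    return True

check_order(10_000, 25.0)   # within limits -> accepted
check_order(10_000, 150.0)  # 1.5m order value -> rejected
```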

Technology has helped the financial services industry become what it is today. Without technology, traders would still wear brightly coloured jackets and leap around energetically, insurance claims would take weeks to settle and banks would not be able to offer products such as offset mortgages. It has also allowed investment banks to exploit tiny arbitrage differences to make vast sums of money, enabled back and middle offices to be dismantled, creating huge cost efficiencies, and enabled the creation of highly complex financial instruments.

Technology can also produce problems: we are going through a banking crisis in which technology played a part. The creation and globalisation of collateralised debt obligations (CDOs) was based on complex packaged debt rapidly sold via cross-border, electronic trading.

To move forward, the financial services industry needs to ensure that the business drives technology and that the right people, with the right skills, are retained so that governance and controls can be put in place. If this is done, technology can remain centre stage and continue to shape, benefit and enhance the financial services industry.

Technology can certainly deliver great benefits without creating dependency, but with it the need for controls and governance grows ever more important. With power comes responsibility. And the financial services industry is now realising the responsibility that the use of technology entails.

Neville Howard is a partner in Deloitte’s consulting practice.
Copyright The Financial Times Limited 2008

Wednesday, June 04, 2008

Alfresco enters OEM partnership with alfa Media

Open source enterprise content management (ECM) vendor Alfresco and the German media services provider alfa Media have entered into an OEM partnership: alfa Media will fully integrate Alfresco’s software into its alfa MediaSuite, a modern multimedia editorial system. With Alfresco, the alfa MediaSuite gains a very powerful data management capability.

Monday, June 02, 2008

IT in Financial Services: Modernising – innovate or hibernate?

By Peter Redshaw, research vice president at Gartner

Published: June 2 2008 12:45 | Last updated: June 2 2008 12:45

The IT department in a financial services institution (FSI) is a difficult place to be right now. Just as it’s being asked to do more than ever – enabling more personalisation, faster time-to-market, greater agility and tighter compliance and risk management – the sub-prime crisis erupts and its IT budget gets slashed. So the emphasis shifts to efficiency and how it can run ever larger volumes of electronic transactions, for less money, on a creaking infrastructure of legacy applications. Inevitably, the prospect of modernising that old IT portfolio is raised again.

The problem is that there is so much of it and it is all tangled together in a Gordian knot of home-built IT. For many years, the conventional approach in this industry has been to chip away at it a little bit at a time. That approach can work in other industries, such as manufacturing, retail or utilities, where they may eventually rationalise and consolidate their IT portfolios.

However, the trouble with an FSI and its intangible assets is that IT is at the very core of everything it does and part of every customer contact. It is what defines its products and processes, the customer experience, the communications and distribution. Increasingly it is how it innovates. Hence, the tendency is for the IT portfolio at a bank to swell faster than any chief information officer (CIO) can chip away at it and the legacy application set remains stubbornly rooted in its operations.

The alternative “big-bang” approach to IT modernisation is usually dismissed as too risky. IT tends to be even more conservative than the trading or asset management activities it supports. But the barriers to modernising IT at an FSI are mostly nothing to do with IT – the barriers are to do with people, culture, politics, bureaucracy, and so forth. It is much more about project management issues, job security and incentivisation – how many CIOs are sufficiently encouraged to take on risk or to break up their own empires?

Making subtle tweaks to the IT portfolio achieves very little and certainly isn’t going to make a trailing bank suddenly competitive. If a CIO is saddled with a risk-averse culture and a heavily regulated industry (that insists on the banking equivalent of emission controls and crash safety tests), is it really worth modernising IT at all? After all, most banks are still making lots of money, even after the impact of the credit crunch.

IT modernisation needs to address two key issues: one is that simply running-the-bank soaks up about 70 per cent of the IT budget at a typical FSI, and the other is that poor IT hits profitability. High operating expenses mean that very little IT budget is left for innovation, and that means differentiation is being eroded. Poor IT means that product margins are also getting squeezed and that customers show less loyalty, which diminishes the bottom line.

What a CIO needs to do to turn this around is to develop much more sophisticated financial models for calculating the economic value added to the FSI, replacing the current models for calculating the return on investment that technology X offers over technology Y. They need to be able to see if a big-bang approach would radically reduce their cost/income ratio or transform their return on equity. Without a radical boost to shareholder value, why change?
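A toy version of such a model, with all figures invented: instead of comparing technology X with technology Y, it compares the status quo, incremental modernisation and a big-bang replacement by their effect on the cost/income ratio, the kind of shareholder-level metric the paragraph argues for.

```python
def cost_income_ratio(income: float, costs: float) -> float:
    return costs / income

income = 500.0                     # annual income, in £m (illustrative)
current_costs = 350.0              # annual operating costs, in £m

scenarios = {
    "today": current_costs,
    "incremental modernisation": current_costs - 5.0,   # small saving
    "big-bang replacement": current_costs - 60.0,       # structural saving
}

for name, costs in scenarios.items():
    print(f"{name:>26}: cost/income = {cost_income_ratio(income, costs):.1%}")
```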

The net result of this would be to polarise the current continuum into two opposite camps – the dichotomy of “innovate or hibernate”. The innovate camp is at the leading edge of technology and embraces the big-bang approach, while the hibernate camp adopts the ultra-conservative approach that avoids IT change until absolutely necessary.

The innovate camp develops accurate cost-benefit models that link IT changes to business metrics, so that it can quantify benefits and justify the radical transformations it encourages. Meanwhile, the hibernate camp concentrates on using the “oil-can”, keeping its systems running for the absolute minimum cost (maybe through offshore outsourcing) while building up a war-chest of cash with the money it saves. Ultimately this must be a short-to-medium-term approach; hibernators must constantly monitor the market and be prepared to buy up small, smart, innovative FSIs (which may have had the luxury of a green-field start) and then use them as incubators. The hibernator can then gradually migrate its customers and data over to its protégé once the local contextualisation and scalability are in place.

These are not easy options – the innovators must find the tools needed for more sophisticated financial modelling of IT and the hibernators must efficiently manage global sourcing and spot their potential acquisitions. The vital thing is to avoid the worst-case scenario which for an FSI is to sit in the middle between these two extremes. The in-betweeners will fritter away their IT budget on incremental modernisation for little gain. Far from being fast followers – as they might like to think of themselves – they will be ditherers and laggards.
Copyright The Financial Times Limited 2008

Sunday, June 01, 2008

SharePoint in the Enterprise

Where does Microsoft's ECM tool fit within the overall context of an organization's ECM strategy?

May/June 2008
— Russ Edelman


It should not surprise you to hear that Microsoft Office SharePoint Server 2007 has proven to be the biggest phenomenon in enterprise content management, bar none. While some may argue that it is not fully baked, organizations around the world are aggressively implementing SharePoint 2007 for global projects, and doing so with large volumes of content.

The observations in this article come from both first-hand experience and conversations with many people who are affected by SharePoint’s unstoppable growth. Evan Richman, Microsoft’s global ECM product manager for SharePoint, comments, “The popularity of SharePoint has even surprised Microsoft. It is clear that SharePoint represents the first truly collaborative ECM platform and, as such, offers a unique value proposition in the marketplace. Traditional ECM solutions are designed to address specific business needs that impact a small percentage of information assets with a low percentage of the workforce participating. In comparison, SharePoint 2007 has been designed to drive broad information worker participation in ECM. It enables organizations to manage previously unmanaged content, thus maximizing the value of an organization’s information assets while supporting broadening compliance challenges.”

Understandably, Richman’s perspective may be interpreted as carrying a slight bit of bias. However, Gartner research vice president Toby Bell reinforces Richman’s comments: “SharePoint has risen to become the top search term across Gartner and is generating huge inquiries from our clients.” Bell added, “What is uniquely different about SharePoint 2007 in comparison to its predecessors and other ECM systems is that the business community has become highly vocal and active in pursuing SharePoint-based solutions.”

As an additional point of validation, Doculabs’ ECM industry guru, Jeetu Patel, has had similar experiences as the Doculabs team counsels global organizations on their ECM strategies. Patel added, “Earlier versions of SharePoint were dismissed as departmental solutions. In comparison, SharePoint 2007 is now being incorporated into the fabric of international organizations’ ECM infrastructure. These strategies are being driven by either pure SharePoint-based solutions or co-existence strategies that allow SharePoint and legacy ECM systems to live harmoniously together.”

Two Fundamental Paths with SharePoint 2007
Patel’s comments suggest the two primary paths that are being followed by most organizations regarding SharePoint. The first path is followed by organizations that have no existing ECM systems in place. The second path is traveled by organizations that do employ existing ECM systems. In each case there are distinct advantages and disadvantages that must be recognized and addressed. However, regardless of the path chosen, there are also a number of common issues that must be considered in order to achieve the desired outcome. These are summarized in the table on page 48.

Path 1 – SharePoint Only
Despite every ECM practitioner’s hopes, ECM technologies are still not employed and actively used in most organizations around the world. However, the introduction of SharePoint 2007 has served as a powerful catalyst of awareness for the benefits of ECM, and this message has been acknowledged by a much broader community than in the past. In fact, despite the competitive concerns it introduces for all of the existing ECM vendors, most would be quick to point out that SharePoint has moved ECM to center stage as a strategic investment.

Given the raised awareness of SharePoint 2007, many companies are now considering or implementing the tool as a first step into the world of ECM. While the move into ECM is applauded, it is important that those organizations have a clear understanding of the challenges as well as the benefits. Beyond the summary in the table, there are a number of important factors that should be taken into account when SharePoint is the only ECM platform deployed.

Strategic Initial Successes – While this point isn’t specific to SharePoint, it is appropriate to highlight because many organizations implement SharePoint in a broad capacity during initial deployment. Such a broad approach creates a higher probability of partial or complete failure. Why? First, SharePoint offers a broad set of capabilities; if the functionality is not clearly understood, users are confused and the value is diminished. Second, SharePoint is very easy to install and configure initially. However, when broader requirements must be considered, more planning is typically necessary to ensure that the system is properly optimized. Third, a strong communications strategy is necessary to ensure that users understand the capabilities, that the project’s success is carefully tracked, and that problems are immediately resolved.

WSS vs. MOSS – Windows SharePoint Services (WSS) is the free version of SharePoint that accompanies every version of Windows Server 2003 and greater. It provides a rich set of capabilities for departments or smaller organizations and, as mentioned above, can be installed and configured with relative ease. In comparison, Microsoft Office SharePoint Server (MOSS) is the paid-for version of SharePoint and offers a much richer set of capabilities for the enterprise. Consequently, understanding the differences and which platform is right for your organization is very important. In some cases, organizations have elected to use a hybrid of WSS and MOSS. When this path is chosen, take the time to understand the implications of a joint approach.

Governance and Standards – SharePoint’s popularity, to a large degree, has been driven by business people who were able to immediately stand up the product and begin using it. While this grassroots movement is a clear indication of the product’s popularity and value, it also has the potential to introduce challenges around standardization and governance. When deploying SharePoint, it will be important to solidify a governance and standards strategy, and that strategy should be designed to evolve as the use of the product evolves.

3rd Party Product Usage – Despite SharePoint’s extensive capabilities, we are regularly reminded that the product does not solve every problem. By design, Microsoft recognized that while it could provide a broad and deep level of functionality, it would not be able to incorporate every feature request into the offering. As a result, the voids that were left open have been, and continue to be, filled by 3rd party vendors. With SharePoint 2007, there is an active community of 3rd party vendors – a benefit for the vendors and their customers. However, careful thought should be given to the use of certified partners and the balance of 3rd party products versus customizations. In some cases, organizations would have to rely upon a large number of 3rd party vendors for discrete pieces of functionality. In those cases, companies have trended towards relying upon a smaller number of 3rd party vendors for substantive functionality (imaging, search, co-existence) and then developing a few customizations for the simpler requests. This lessens the dependency upon a large number of 3rd party vendors.

Path 2 – Coexistence
Given the readership of this publication, most of you will probably need SharePoint to coexist with an existing system. That said, the issues raised regarding Path 1 remain applicable as you stride down the coexistence path – with a few extra issues. The coexistence path actually comprises two intertwined sub-paths. First, there is often a genuine need for SharePoint to coexist with the existing ECM systems. Second, some coexistence strategies are temporary in nature, with the ultimate goal of migrating some or all existing ECM systems to SharePoint. In either case, there is a collection of considerations that should be anticipated when planning an effective coexistence strategy (the items below are common, but do not comprise an exhaustive list).

“Surfacing” Existing ECM Repositories – The notion of “surfacing” an existing ECM repository in SharePoint has become widely accepted as a common approach for allowing users to retrieve files from the existing ECM system through a SharePoint interface. However, surfacing capabilities will vary considerably depending upon the techniques selected. In some cases, most or all of the functionality will be made available in a secured fashion from the existing ECM system. In other cases, surfacing is nothing more than a read-only view into the existing ECM system that allows for the display of search results and the file. Of course, there are variants between the two extremes.
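The sketch below illustrates the read-only end of that spectrum with an entirely hypothetical repository client: the portal surfaces search results and links from the legacy ECM system through a thin adapter, without offering editing or check-in. It is not based on any vendor’s actual API.

```python
class LegacyRepositoryClient:
    """Hypothetical stand-in for the existing ECM system's search API."""
    def search(self, query: str) -> list:
        return [{"id": "DOC-1", "title": "Q2 press release",
                 "url": "ecm://repository/DOC-1"}]

class ReadOnlySurface:
    """Exposes search results and links, but no editing or check-in."""
    def __init__(self, repo: LegacyRepositoryClient):
        self.repo = repo

    def render(self, query: str) -> str:
        rows = self.repo.search(query)
        return "\n".join(f"{r['title']} -> {r['url']}" for r in rows)

print(ReadOnlySurface(LegacyRepositoryClient()).render("press release"))
```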

For example, SeeUnity is dedicated to providing this capability, as well as migration tools that allow content to be migrated from the existing ECM systems into SharePoint. Dan Anderson, co-founder and vice president of SeeUnity, comments, “Our dual-pronged approach has allowed companies to leverage the capabilities of the existing ECM systems without having to rush through the migration. This allows their clients to control a migration to SharePoint, if that is the desired objective.” With prior versions of SharePoint, existing ECM vendors didn’t pay much attention to a coexistence strategy. However, with SharePoint 2007’s unprecedented success, they quickly saw the writing on the wall and either partnered with companies like SeeUnity or developed their own surfacing solutions. Now, many ECM vendors see SharePoint as the front end to their back end, and they do so through surfacing.

Functionality Coexistence between SharePoint and Existing ECM Systems – Two factors are affected by this consideration. First, because SharePoint still has some gaps in its capabilities in comparison to existing ECM products, you must take appropriate steps to address the disparities in functionality. For example, a few ECM systems allow for version-level security, while SharePoint has a broader security model that allows for security at the major and minor version levels. If users have become accustomed to a different security model, you’ll need to ensure both that SharePoint’s security will be sufficient and that users understand the differences between the two models.

Second, content from either SharePoint or the existing ECM system (or both) often needs to be referenced in the other product’s workflow or some other module. For example, a press release in SharePoint may need to rely upon a workflow within the existing ECM system. To accommodate this, the respective products must be integrated to support such requirements. As more of the existing ECM vendors fully recognize the emergence of SharePoint, they are aggressively building these types of integrations. It is also worth noting that one of the common integration points being employed is the use of the existing ECM repository for records management.

Migrations to SharePoint – As referenced earlier, coexistence typically includes some type of migration of content from the existing ECM system to SharePoint. For those unfamiliar with migrating between ECM systems, be prepared for an experience that is more involved than you might imagine. Factors that need to be considered include the timing and sequencing of the migration, metadata mapping when there are variances between the metadata models, and the stratification of content in SharePoint to accommodate site structures as compared to one or a few centralized repositories. This does not even take into account the testing and training that accompany migrations.
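As a small illustration of the metadata-mapping factor, the sketch below renames, defaults and flags legacy fields on their way to hypothetical target columns; the unmapped-field log is exactly the kind of variance that makes migrations more involved than expected. All field names are invented.

```python
FIELD_MAP = {"DocTitle": "Title", "Dept": "Department", "RevDate": "Modified"}
DEFAULTS = {"ContentType": "Document"}

def map_metadata(legacy: dict) -> dict:
    """Translate one legacy record's metadata to the target schema."""
    target = dict(DEFAULTS)
    for old, new in FIELD_MAP.items():
        if old in legacy:
            target[new] = legacy[old]
    unmapped = sorted(set(legacy) - set(FIELD_MAP))
    if unmapped:
        # Variances between metadata models need a human decision;
        # log them rather than silently dropping values.
        target["MigrationNotes"] = f"unmapped fields: {unmapped}"
    return target

print(map_metadata({"DocTitle": "Policy", "Dept": "Legal", "Owner": "JS"}))
```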

To wrap up, it will be important to discern fact from fiction as you speak to people about SharePoint 2007. With SharePoint’s popularity, many new people have entered the fray. In general, this is good. However, it also introduces a challenge or two. First, if the only exposure they’ve had to ECM is SharePoint, they may be out of touch with the many nuances and issues associated with a successful ECM project. Part of this problem is that they look at SharePoint as pure infrastructure as compared to a complete ECM solution. Additionally, everyone is claiming to be a SharePoint expert these days; with certifications and without. Make sure you go through a diligent qualification process to understand the extent of their capabilities and to determine if they have development expertise against the SharePoint object model as compared to administration/configuration experience. It makes a difference!

Russ Edelman (russ.edelman@corridorconsulting.com) is president and CEO of Corridor Consulting (www.corridorconsulting.com). A frequent speaker and industry expert, Russ is also chair of AIIM’s Emerging Technology Advisory Group.