Wednesday, May 26, 2010

The Sustainability Imperative - Harvard Business Review

by David A. Lubin and Daniel C. Esty

Most executives know that how they respond to the challenge of sustainability will profoundly affect the competitiveness—and perhaps even the survival—of their organizations. Yet most are flailing around, launching a hodgepodge of initiatives without any overarching vision or plan. That’s not because they don’t see sustainability as a strategic issue. Rather, it’s because they think they’re facing an unprecedented journey for which there is no road map.

But there is a road map. Our research into the forces that have shaped the competitive landscape in recent decades reveals that “business megatrends” have features and trajectories in common. Sustainability is an emerging megatrend, and thus its course is to some extent predictable. Understanding how firms won in prior megatrends can help executives craft the strategies and systems they’ll need to gain advantage in this one.

The concept of megatrends is not new, of course. Businessman and author John Naisbitt popularized the term in his 1982 best seller of the same name, referring to incipient societal and economic shifts such as globalization, the rise of the information society, and the move from hierarchical organizations to networks.

Our focus is on business megatrends, which force fundamental and persistent shifts in how companies compete. Such transformations arise from technological innovation or from new ways of doing business, and many factors can launch or magnify the process of change. Business megatrends may emerge from or be accelerated by financial crises, shifts in the social realities that define the marketplace, or the threat of conflict over resources. The geopolitics of the Cold War, for example, drove the innovations that launched both the space race and rapid developments in the field of microelectronics—ultimately unleashing the information technology megatrend. Electrification, the rise of mass production, and globalization were also megatrends, as was the quality movement of the 1970s and 1980s. The common thread among them is that they presented inescapable strategic imperatives for corporate leaders.

Why do we think sustainability qualifies as an emerging megatrend? Over the past 10 years, environmental issues have steadily encroached on businesses’ capacity to create value for customers, shareholders, and other stakeholders. Globalized workforces and supply chains have created environmental pressures and attendant business liabilities. The rise of new world powers, notably China and India, has intensified competition for natural resources (especially oil) and added a geopolitical dimension to sustainability. “Externalities” such as carbon dioxide emissions and water use are fast becoming material—meaning that investors consider them central to a firm’s performance and stakeholders expect companies to share information about them.

These forces are magnified by escalating public and governmental concern about climate change, industrial pollution, food safety, and natural resource depletion, among other issues. Consumers in many countries are seeking out sustainable products and services or leaning on companies to improve the sustainability of traditional ones. Governments are interceding with unprecedented levels of new regulation—from the recent SEC ruling that climate risk is material to investors to the EPA’s mandate that greenhouse gases be regulated as a pollutant.

Further fueling this megatrend, thousands of companies are placing strategic bets on innovation in energy efficiency, renewable power, resource productivity, and pollution control. (See the sidebar “Fueling the Megatrend.”) What this all adds up to is that managers can no longer afford to ignore sustainability as a central factor in their companies’ long-term competitiveness.

Fueling the Megatrend
Venture investing in clean tech reached a nearly $9 billion annual run rate in 2008 and shows signs of growing again after a slowdown in 2009. The flow of private-sector investment into the clean tech marketplace has been estimated at more than $200 billion a year—with fast growth not just in the United States and Europe but in China, India, and the developing world. And G20 governments have earmarked some $400 billion of their $2.6 trillion in stimulus funds for clean tech and sustainability programs.

Learning from the Past: Quality and IT

Megatrends require businesses to adapt and innovate or be swept aside. So what can businesses learn from previous megatrends? Consider the quality movement. The quality revolution was about innovation in the core set of tools and methods that companies used to manage much of what they do. Quality as a central element of strategy, rather than a tactical tool, smashed previous cost versus fitness-for-use barriers, which meant the table stakes were dramatically raised for all companies. The information technology revolution was about tangible technology breakthroughs that fundamentally altered business capabilities and redefined how companies do much of what they do. Digital technologies deeply penetrated corporations in the 1980s and 1990s, and the trend accelerated as IT made its way into the daily lives of workers and consumers with the advent of desktop computing and the internet.

In both the IT and quality business megatrends—as in others we’ve studied—the market leaders evolved through four principal stages of value creation: First, they focused on reducing cost, risks, and waste and delivering proof-of-value. Second, they redesigned selected products, processes, or business functions to optimize their performance—in essence, progressing from doing old things in new ways to doing new things in new ways. Third, they drove revenue growth by integrating innovative approaches into their core strategies. Fourth, they differentiated their value propositions through new business models that used these innovations to enhance corporate culture, brand leadership, and other intangibles to secure durable competitive advantage.

The quality story.
The economic downturn of the late 1970s, coupled with the 1979 oil shock, drove a dramatic shift in consumer preferences toward efficiency. Many industries were transformed, perhaps none more dramatically than the automotive sector. Of course, the seeds of change had been planted earlier. In the years after World War II, Japan had rebuilt its industrial infrastructure on a model of high-volume, low-cost factories that mass-produced goods of questionable durability and quality. “Made in Japan” was not considered a brand asset. By the mid-1970s, however, Japanese government and business leaders had seized upon the ideas of W. Edwards Deming and others who stressed quality as a core value. This incremental, process-oriented approach to systematic improvement fit well with Japanese executives’ views on how to drive change to compete effectively in the global market. Leading firms including Toyota and Honda embraced Total Quality Management (TQM) methods, fundamentally shifting their value propositions. Quality methods called into question the assumptions managers had relied on for decades, namely that high quality and affordability were mutually exclusive.

The focus on quality—initially adopted as a means of reducing defects—delivered a greater advantage to companies that took a holistic view and drove changes across their business operations. The famed Toyota Way applied quality methods to every stage of value creation from concept to customer—and ultimately to intangibles such as brand, reputation, and corporate culture. The reputational harm Toyota is experiencing as a result of the recent recalls underscores how important quality continues to be to the firm’s central value proposition. Toyota’s current troubles also highlight the need for firms to align core elements of strategy. In this case, the dissonance between its long-term quality strategy and a more recent topline growth strategy has seriously undermined Toyota’s model for value creation.

Rey Moore, the former chief quality officer at Motorola, describes a similar evolutionary process at the communications giant. Like most firms, Motorola first used quality methods to improve fault and error detection and thus reduce cost, waste, and risk. As those methods proved valuable, the company began to redesign manufacturing processes and product development functions to proactively reduce risks of product failures, functional inadequacies, and other inefficiencies rather than simply detect them. As quality’s potential business impact grew, Motorola developed Six Sigma methods and a standardized tool kit including items like Pareto charts and root-cause analysis models to take quality to scale. Eventually, quality became a defining attribute of Motorola’s brand and culture and a source of competitive advantage. The same story unfolded at firms in all industry sectors as leading companies rode the quality wave to enhanced growth and profitability—delivering a clear quality premium for their shareholders.
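Pareto analysis, one staple of the standardized quality tool kit mentioned above, is simple to sketch: rank defect causes by frequency and track the cumulative share of defects each rank explains, so the "vital few" causes stand out. The defect data below are invented purely for illustration.

```python
from collections import Counter

def pareto(defects):
    """Rank defect causes by frequency and compute the cumulative
    percentage of total defects each rank accounts for."""
    counts = Counter(defects).most_common()          # sorted, descending
    total = sum(n for _, n in counts)
    running, rows = 0, []
    for cause, n in counts:
        running += n
        rows.append((cause, n, round(100 * running / total, 1)))
    return rows

# Illustrative data only: a handful of causes explain most defects.
log = ["solder"] * 48 + ["misalign"] * 27 + ["crack"] * 15 + ["other"] * 10
for cause, n, cum in pareto(log):
    print(f"{cause:9s} {n:3d}  {cum:5.1f}%")
```

Here the top two causes account for three-quarters of all defects, which is exactly the prioritization argument a Pareto chart makes visually.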

Capturing the Eco-Premium
http://hbr.org/2010/05/the-sustainability-imperative/sb2

The IT story.
When the recession of 1982 hit, pressure mounted at many companies to increase productivity, particularly by using emerging information technology innovations to drive cost savings. The early returns on these efforts were mixed. As with quality, skeptics described IT as a black hole into which firms poured money with little return. But some corporate leaders saw that the strategic application of IT could drive growth and provide decisive advantage. American Airlines, a classic example, captured more than 40% of all U.S. airline transactions thanks to its innovative Sabre reservations system.

A lesser known case is American Hospital Supply’s deployment of a revolutionary online purchasing system, which allowed hospitals to order medical supplies electronically, reducing costs, time, and errors for both the company and its customers. Over the next decade, the Analytic Systems Automatic Purchasing system—better known as ASAP—transformed how AHS delivered value to its customers.

Building on its success improving efficiency and reducing inventory risk, the firm developed service innovations that enabled it to deliver any product from any manufacturer at any time from any desktop computer to any hospital supply room. In the process, AHS amassed an extensive product and price database that gave it a clear advantage over less nimble competitors. Finally, AHS used IT to evolve its business model. The company, which had been a single-source materials provider to its hospital clients, began taking over their inventory management and procurement processes. This IT-driven innovation established the AHS brand as the leader in its business, with a competitive edge based originally on price and later on service, and helped the company grow earnings from $42 million in 1974 to $237 million in 1984.

The IT and quality megatrends show us that firms seeking to gain advantage in sustainability will have to solve two problems simultaneously: formulating a vision for value creation and executing on it. In other words, they must rethink what they do in order to capture this evolving source of value; and they must recast how they operate, expanding their capacity to execute with new management structures, methods, executive roles, and processes tailored to sustainability’s demands.

Tuesday, May 18, 2010

Successful mergers rely on taming the systems

By Alan Cane

Published: May 18 2010 16:39 | Last updated: May 18 2010 16:39

As any business school student knows, the classic motives for a merger or acquisition are economies of scale or scope, synergies and market share.

But information technology is simultaneously the biggest enabler of those aims and the biggest constraint for most large businesses. As Vimi Grewel-Carr of Deloitte, the consultancy, points out: “Successfully merging the technology systems of two organisations is an imperative for the delivery of the benefits.”

Too few companies, however, take this into account when carrying out due diligence: it means the merger process can limp on for longer than is commercially acceptable while incompatible systems are persuaded to talk to each other – with the risk that they might even collapse entirely.

Accenture, the consultancy, reports that almost half the top executives involved in M&A point to a weakness in combining IT operations as the main reason for integration failure.

Andrew Morlet, head of IT strategy and M&A at Accenture, argues that bringing IT into the planning process at an early stage is critical.

Neil Louw, European chief technology officer for the IT services group Dimension Data, points out that as IT is now the production engine for most organisations, a thorough understanding is required – how it works, what its limitations are and where the differences lie between the two organisations. Without this “the acquisition could easily fall flat on its face”.

But he says that in the majority of cases, the IT team is only brought in at the planning or even post-acquisition stage: “Having chief information officers and their teams involved from the due diligence stage of any acquisition is more important than having a tight timetable during the merger process.

“Involvement from the IT department at this stage enables a focus on the right technical questions that other members of the M&A team ignore.”

Giles Nelson, chief technology strategist for the computer services group Progress Software, says companies are increasingly hampered by a lack of an M&A process, over-integration, loss of IT staff and a failure to realise assets and economies of scale.

“Working to specific timetables and quick decisions are of paramount importance. IT people are good at vacillating,” he says.

“Leadership and objectives are vital to ensure that things move forward apace. Quickly identify the top three areas of integration – for example, giving customer service access to all customer information systems – and get something done quickly.”

Success depends on a clear vision and a determined management. One battle-scarred veteran of the very public merger of two UK companies recalls the difficulties: “We would arrange a meeting between the two IT teams. We would go through the plans for integration, making sure everybody understood and agreed what they were being asked to do. Then they would go off and carry on just as before.”

Hugely frustrated, he left to pursue other objectives – and still prefers to remain anonymous.

Mark Nutt, general manager for computing services company Morse, warns that too many companies have too little knowledge of what IT assets they possess and how they work together: a first step, therefore, is a full IT audit before settling on a platform.

He recommends an agile and scalable architecture, with software development taking place in short phases with continuous feedback and modification.

“The problem is that this is still relatively rare,” he says. “Often, organisations have a diverse range of legacy platforms that have been in place for a number of years, resulting in an overly complex environment. In the event of a great upheaval such as a merger, the complexity and potential for increased downtime present a significant challenge.”

Lack of knowledge of the software assets – that is, the software licences – each party holds can also spell trouble.

Martin Mutch, chief executive of the Rocela consultancy, which specialises in Oracle software, says most companies fail to understand or even consider the implications of M&A on their licensing of enterprise software. Yet it constitutes a large part of the IT budget, he says.

A number of technology options are available to companies seeking to integrate incompatible systems.

There are, for example, commercially available architectures such as Avaya’s “Aura”.

Michael Bayer, president of field operations for the company in the EMEA region, describes it as “an open-systems based architecture designed to integrate applications and devices across multi-vendor, multi-location and multi-modal businesses”.

“By taking an open-standards approach,” he says, “existing systems can be left in place and will interoperate easily with the new partner systems, meaning that systems do not clash and there is no need for a complete IT overhaul.”

David Davies of Corizon, which describes itself as the “enterprise mashup company”, not surprisingly advocates this technology, which unites two or more external sets of data or functionality to create a new, web-based service.

It “allows companies quickly and simply to combine different systems into a single, virtual desktop that makes it easy for employees to carry out their tasks”, he says.

Cloud computing – internet-based, on-demand computing – offers yet another approach.

Ewen Anderson, managing director of Centralis, the computing services group, says: “If neither organisation has a clear best of breed advantage (in IT), moving to a shared service or externally hosted cloud solution may well be a better option than trying to co-habit or merge.”

Mike Altendorf, European director for EMC Consulting, agrees, saying a private cloud can be the perfect bridge: “It facilitates the secure movement of information both between and within organisations.

“It centralises and consolidates the infrastructure from both companies in a time-frame the merger requires. We have even seen the cloud acting as a holding function to enable companies to migrate their systems.”

But even if one party’s systems are clearly superior, there can be dangers. Philip Keown, partner at Grant Thornton, the chartered accountant, warns that each head of IT will champion their own systems, “just as every parent will say their baby is beautiful”.

“In any merger situation, the integration into a single culture is key and this is as important for IT people as it is for the rest of the firm.

“Businesses need to look at the resources, technical and managerial skills needed in the merged organisation and then decide which people best fit the roles.”

Yet another approach is advocated by Colin Rowland of OpTier, the computing services group.

He argues that management should see IT from a business transaction perspective, which, he says, enables managers to see the impact of every action the business takes and gain early warning of problems.

“Rather than stumbling around in the dark, you are creating operational intelligence to get the job done right and ensure IT is underpinning the success of the M&A, not undermining it.”

But returning to the initial question: how important is IT in the merger process?

Mr Keown of Grant Thornton says it depends on how important IT is to the business as a whole. For some companies, IT is the business; for others, getting it wrong might be an irritant rather than a calamity.

He adds: “If a business reached a position, say three years down the line, in which two supposedly merged companies are both still using their legacy systems, then you have to question why they merged in the first place.”

Copyright The Financial Times Limited 2010. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Technology takes a lead in cutting carbon

By Jessica Twentyman

Published: May 18 2010 16:39 | Last updated: May 18 2010 16:39


Compelling viewing: in an otherwise sluggish year for the IT sector, the worldwide market for videoconferencing technology grew 16.7% in 2009


IT teams have been battling to overturn the data centre’s reputation as a vast and inefficient contributor to the corporate energy bill and to its carbon footprint.

Widespread adoption of virtualisation technology – running multiple systems on each piece of hardware – has allowed big reductions in the number of power-hungry and under-utilised servers, which are replaced by fewer, larger machines capable of processing several workloads and operating closer to full capacity.

This has allowed many companies to report significant reductions in the amount of power and cooling needed to keep their data centres running.
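The consolidation effect can be illustrated with a simple linear power model, in which a server's draw scales between idle and peak with utilisation. Every number below is an illustrative assumption, not measured data from any vendor.

```python
def consolidation_power(n_servers, idle_w, peak_w, util):
    """Approximate total draw of a server fleet, assuming each machine's
    power scales linearly between idle and peak with utilisation."""
    return n_servers * (idle_w + (peak_w - idle_w) * util)

# Illustrative assumptions: 100 under-utilised legacy servers at 10%
# utilisation versus 20 larger virtualised hosts doing the same work at 60%.
before = consolidation_power(100, idle_w=150, peak_w=300, util=0.10)
after = consolidation_power(20, idle_w=200, peak_w=450, util=0.60)
print(f"before {before / 1000:.1f} kW, after {after / 1000:.1f} kW")
```

The point of the sketch is that idle servers still draw most of their peak power, so running fewer machines closer to capacity cuts the bill even though each host individually draws more.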

But the transformation of IT from sinner to saint still has some way to go, according to Gary Hird, technical strategy manager at the John Lewis Partnership, the UK retailer.

As he and his colleagues have met – and even exceeded – their goals for data centre efficiency, they have started looking to other areas where technology can help reduce the company’s environmental impact.

For example, IT staff are working on transport optimisation and fuel monitoring, and are making improvements to the company’s demand forecasting system, with the aim of reducing food waste.

In a recent blog, Doug Washburn, an analyst with Forrester Research, an IT market research company, described how the scope of green IT is expanding: “While the industry’s initial and continued focus is on the data centre, organisations are realising they have bigger opportunities in distributed IT and, even more so, outside IT altogether.”

He pointed to Forrester data that show about 55 per cent of the IT department’s power use is consumed by other assets, such as PCs, monitors, printers and phones.

More importantly, he says, the IT industry is only responsible for about 3 per cent of the world’s greenhouse gas emissions – making the case for using technology to reduce environmental impact across broader business operations compelling.

Forrester now distinguishes between “green for IT” (the effort to reduce the environmental impact of IT operations) and “IT for green” (the use of technology to drive sustainability beyond the IT department).

Videoconferencing is one early example of using “IT for green”. Visual collaboration – whether conducted via dedicated, state-of-the-art telepresence suites or simple desktop PCs or laptops equipped with webcams – has become commonplace, reducing travel budgets and miles travelled.

In an otherwise sluggish year for the IT sector, the worldwide market for videoconferencing technologies achieved 16.7 per cent growth in 2009 and is expected to grow from $1.9bn last year to more than $8.7bn in 2014, according to IDC, the analysis company.
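IDC's figures imply a steep compound growth rate, which can be checked directly from the start and end values:

```python
def cagr(start, end, years):
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

# IDC's projection: $1.9bn in 2009 growing to $8.7bn in 2014.
rate = cagr(1.9, 8.7, years=5)
print(f"implied growth: {rate:.1%} per year")   # roughly 35.6% a year
```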

“The videoconferencing market is in the midst of a transition – from meeting over video as an option of last resort, to an alternative that’s preferred over travelling,” says Jonathan Edwards, an IDC analyst.

Events such as the travel chaos in northern Europe, caused by volcanic ash from Iceland, highlight the benefits.

Danske Bank, for example, avoided drastic upheaval in spite of having a team from its Danica life assurance group stranded with its CIO in Bangalore.

Using the newest of the company’s 17 telepresence suites across 10 countries, they were able to work “just as if they were back in Denmark”, says Tom Soderholm, Danske Bank’s head of collaborative user technologies.

The company began rolling out Cisco telepresence suites in May 2008, supplemented by PC-based e-meeting technology from Microsoft, as part of its wider goal to become carbon-neutral in time for the COP15 United Nations climate conference, held in its hometown of Copenhagen last December.

Replacing business trips with telepresence sessions, he says, has led to a 15 per cent reduction in greenhouse gas emissions from air miles since 2008.

Each month, Mr Soderholm meets colleagues in Danske Bank’s travel management department to calculate how many miles have been replaced by videoconferencing sessions – and then with colleagues in the social responsibility department to calculate the CO2 reductions achieved as a result.
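Danske Bank's actual conversion method is not disclosed in the article; a minimal sketch of the miles-to-CO2 step would look like the following, where the emission factor is an illustrative assumption rather than the bank's real figure.

```python
# Illustrative factor only; Danske Bank's actual conversion
# factors are not published in the article.
KG_CO2_PER_AIR_MILE = 0.2

def avoided_emissions(miles_replaced):
    """Estimated kg of CO2 avoided when flights are replaced
    by videoconferencing sessions."""
    return miles_replaced * KG_CO2_PER_AIR_MILE

# e.g. a month in which telepresence replaced 50,000 air miles
print(f"{avoided_emissions(50_000):,.0f} kg CO2 avoided")
```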

“We’re now discussing replacing our corporate travel policy with a corporate meetings policy, where travel is only one option. Travel should be the last resort,” he says.

For other companies, moving goods, rather than people, is the priority.

At Kimberly-Clark, Peter Surtees, European supply chain director, has been working with the company’s IT department on a roll-out of transport management software to reduce the miles travelled by its haulage contractors delivering tissues, paper towels, nappies and other products from its manufacturing plants to retailers.

The system, from i2 Technologies, a US supply chain management software specialist acquired by JDA Software last November, has enabled the company to optimise allocation of delivery contracts to a wider range of smaller and niche operators.

“The more carriers we have, the more likely we’ll find a contractor with trucks scheduled to return empty from deliveries. If we can fill those returning trucks, there’s less ‘empty running’, and so fewer carbon emissions,” says Mr Surtees.

The company estimates that it is saving £1m annually in transport costs and securing more competitive deals from its expanded list of haulage contractors. It is reducing distances travelled on its behalf by 380,000 miles a year, with an annual saving of 540,000kg of CO2.
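The "empty running" idea Mr Surtees describes can be sketched as a greedy matching of pending loads to trucks returning empty on the same leg. The place names and the naive greedy strategy below are purely illustrative and are not i2's actual optimisation algorithm.

```python
def fill_backhauls(return_trips, loads):
    """Greedy sketch: pair each truck returning empty on a (from, to)
    leg with a pending load on that same leg, cutting 'empty running'."""
    pending = {}
    for leg in loads:
        pending[leg] = pending.get(leg, 0) + 1
    matched = empty = 0
    for leg in return_trips:
        if pending.get(leg, 0) > 0:       # a load exists on this return leg
            pending[leg] -= 1
            matched += 1
        else:                             # truck still runs back empty
            empty += 1
    return matched, empty

# Hypothetical legs: two trucks returning Leeds->Kent, one Hull->Bath.
trips = [("Leeds", "Kent"), ("Leeds", "Kent"), ("Hull", "Bath")]
loads = [("Leeds", "Kent"), ("York", "Bath")]
matched, empty = fill_backhauls(trips, loads)
print(matched, empty)   # 1 leg filled, 2 still running empty
```

More carriers mean more candidate return legs, which is exactly why Kimberly-Clark's wider contractor pool raises the match rate.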

Analytical tools are also being used by IT teams. These enable them to measure energy consumption and greenhouse gas emissions – a task that has so far been accomplished by manual effort and spreadsheets.

Big IT vendors are accelerating this process by adding environmental modules to their enterprise resource planning (ERP) software. SAP, for example, acquired carbon monitoring tools specialist Clear Standards in May last year; Oracle has teamed up with IBM to offer its own carbon monitoring product; and Microsoft last year launched its Environmental Sustainability Dashboard for users of its Microsoft Dynamics ERP suite.

“With the dashboard, we wanted to make it easy for companies to extract environmental intelligence from information that, in many cases, they already collect,” says Jennifer Pollard, senior product manager for Microsoft Dynamics.

Data from electricity bills, including units, quantity and price, might be fed into the dashboard directly from financial accounting applications. The dashboard is implemented on clients’ behalf by Microsoft’s global partner network.
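The dashboard's real schema is not described in the article; a hypothetical sketch of turning one electricity-bill line into a cost-and-emissions record might look like this, with the grid emission factor an assumed figure that a real deployment would take from the local utility.

```python
# Assumed grid-average factor; a real dashboard would use the
# local utility's published conversion figure.
GRID_KG_CO2_PER_KWH = 0.5

def emissions_from_bill(bill):
    """Turn an electricity-bill line (units in kWh, unit price) into
    a cost and estimated-emissions record for a dashboard feed."""
    kwh = bill["units_kwh"]
    return {
        "site": bill["site"],
        "cost": round(kwh * bill["price_per_kwh"], 2),
        "kg_co2": round(kwh * GRID_KG_CO2_PER_KWH, 1),
    }

# Hypothetical bill line fed in from a financial accounting system.
bill = {"site": "plant-1", "units_kwh": 12_000, "price_per_kwh": 0.11}
print(emissions_from_bill(bill))
```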

In France, for example, André Krief, a manufacturer and distributor of kosher meat products, is working with Prodware, a local implementation specialist, to measure the water and electricity consumed in its manufacturing processes and its resulting carbon emissions.

Gilles Krief, the company’s managing director, says the aim is to use the dashboard to simulate the impact that using different machinery in its processing plants would have on overall emissions.

Other suppliers are suggesting software-as-a-service (SaaS) as a quick and easy way for companies to manage carbon. Cloud Apps, for example, was launched in April 2010 by Simon Wheeldon, the company’s director for the EMEA region and a former Salesforce.com executive.

While the company’s cloud applications aim to manage carbon right across the business – including IT, buildings, business travel, freight transport and so on – he sees an important role for IT in collecting data from different systems and ensuring its accuracy.

“Some of the data that companies will want to feed into Cloud Apps are readily available, in the form of half-hourly meter readings from utilities companies. But some of it is hidden away in core business systems in HR, finance and facilities management departments.

“Close integration and the help of IT staff will be needed to gather it all, in order to get the most accurate picture possible of a company’s overall footprint,” he says.

Edmond Cunningham, an IT and sustainability expert at PA Consulting Group, agrees: “IT leaders are in a unique position to reinvent themselves as green advocates or visionaries, and not just within their own departments.

“The knowledge around how to make green decisions is still not readily available in most companies and IT can play a role in providing the information and data required.”

Copyright The Financial Times Limited 2010. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Monday, May 17, 2010

Outsourcing procurement: the how, what and why - Financieel Management


Outsourcing procurement: the how, what and why
Posted on 04-02-2010 by Maarten Smits van Oyen


Outsourcing and "back to the core" are one and the same: a process that has been under way for decades. Companies that have made far-reaching choices here, and become very successful as a result, have focused on outsourcing production, logistics, IT and facilities. Outsourcing of procurement, by contrast, is still an unfamiliar phenomenon. In the US and the UK it is a fast-growing sector, despite, or perhaps thanks to, the economic situation. Everest Research indicates that procurement outsourcing has five times the impact of outsourcing other business processes.


Procurement outsourcing is the fastest-growing back-office segment of BPO (business process outsourcing). In the US this market was already worth $700 million in 2008, and growth to around $1.1 billion was expected for 2009. The Black Book of Outsourcing 2009 predicts a further acceleration of growth in the period to 2012. A recent Everest Research study in the US shows that almost half of companies are considering outsourcing part of their procurement within the next three years.


PROCESS IMPROVEMENT
Procurement outsourcing does not involve transferring large numbers of employees. The savings must therefore come from process improvement, better supplier management and technological innovation. The economic downturn forced companies to cut costs and raise performance. Now that markets are reviving somewhat, the need to keep costs firmly under control remains: the competition, after all, is also in "survival mode". Procurement is without doubt an important lever for achieving this goal.

Companies spend roughly 50 cents of every euro they earn on suppliers. Many business managers doubt whether their organisation really has the skills, expertise and infrastructure to manage procurement well across all categories. Developing these capabilities in-house can prove too time-consuming and/or too expensive. If nothing is done, much of the potential for improvement, both in underperforming operational processes and in excessive spending, remains untapped.
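With roughly half of every euro earned going to suppliers, procurement savings carry outsized profit leverage, which a hedged back-of-envelope comparison with revenue growth makes concrete. All numbers below are illustrative assumptions.

```python
def equivalent_revenue_growth(revenue, spend_ratio, margin, saving):
    """Extra revenue needed to match the profit impact of a purchasing
    saving, assuming the net margin on new sales stays constant."""
    extra_profit = revenue * spend_ratio * saving   # saving drops straight to profit
    return extra_profit / margin

# Illustrative: €100m revenue, 50 cents of every euro spent on suppliers,
# an assumed 8% net margin, and a 5% saving on purchased goods and services.
extra = equivalent_revenue_growth(100, spend_ratio=0.5, margin=0.08, saving=0.05)
print(f"equivalent to roughly €{extra:.0f}m of new revenue")
```

Under these assumptions a 5% purchasing saving delivers the same profit as winning about €31m of new sales, which is why procurement is described as such a powerful lever.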

In response, companies are increasingly beginning to outsource their suboptimal procurement processes, particularly for those products and/or services that are not managed well enough. In early 2009, the Aberdeen Group asked 750 senior procurement, supply chain, finance and operations managers in the US about their views on and experience with procurement outsourcing. The study revealed three key findings.

1. For the majority of companies, procurement outsourcing is already part of their strategy and makes an important contribution to business and shareholder value.

2. Companies that already outsource (parts of) their procurement have achieved a rapid and readily measurable reduction in their cost structure through better use of synergies, better supplier management and operational process improvements.

3. Attitudes towards, and the benefits gained from, procurement outsourcing were determined mainly by the structure and maturity of a company's own procurement organisation, and hardly at all by industry or company size.


WAAROM UITBESTEDEN?
Uitbesteding van inkoop betreft vooral de ondersteunende inkoop. Een belangrijke reden om dit onderdeel door derden te laten uitvoeren is dat het net zo gefragmenteerd is als de veelheid aan ondersteunende disciplines. Hierdoor valt moeilijk kennis van zaken op te bouwen over iedere discipline apart, waardoor inkoop pas bij het proces wordt betrokken als er besteld gaat worden of bij de contractondertekening. Hier zit verbeterpotentieel. In de praktijk blijkt de belangrijkste reden om deze vorm van BPO toe te passen de snelle ROI te zijn. Maar kostenverlaging mag niet het enige doel zijn.

Qualitative objectives must also underpin the decision. Outsourcing also means that no further investment (in IT and people) is needed. Considerations for outsourcing may include: attracting and retaining good people is a struggle, which leads to a lack of specific knowledge and capacity; or support costs need to become less fixed and more variable. The goal may also be to create a governance organisation for support disciplines (for example IT or facilities) that ensures outsourced tasks are delivered by external parties at pre-agreed performance levels, with that organisation (the provider) being paid only when the suppliers actually deliver on those performance levels. Finally, a company may consider outsourcing support functions in general and use the outsourcing of indirect procurement to gain its first experience with this.


WHAT IS OUTSOURCED?
Procurement outsourcing is, of course, not a black-and-white decision. There are many options, depending on how much responsibility the organisation wants to retain in-house or transfer. Outsourcing usually focuses on the procurement of those product or service categories that are currently insufficiently managed. It can cover any or all of the following sub-processes, for one or more products, services or entire categories:

• supplier selection and contracting
• the operational procurement process: ordering, call-offs, invoice receipt and payment
• contract and supplier management

The benefits of procurement outsourcing include:
• financial – outsourcing makes it possible to reduce costs structurally while simultaneously realising savings within the procurement organisation itself through process improvement, optimal use of best practices and economies of scale. The freed-up resources can be invested in the strategic parts of the organisation;
• qualitative – outsourcing improves the quality of the procurement function, of procurement services and of supplier performance, thereby improving business results;
• strategic – outsourcing makes it possible to devote more time (of scarce procurement talent) and resources to tasks that strengthen competitiveness;
• operational – simplifying and automating operational procurement processes (call-offs, ordering, invoice payment), for instance by using modern solutions, combines high ease of use with lower costs and better service.


DEUTSCHE BANK
Despite a workforce of 65,000 employees, procurement at Deutsche Bank was almost entirely decentralised. Department managers bought from a multitude of suppliers without any mutual coordination. "The bank realised that its energy went mainly into customer-facing and primary processes, not into streamlining back-office functions such as procurement."

Something had to be done to bring the procurement organisation and processes in line with the rest of the company. Various solutions were considered. At first the emphasis was on technological solutions aimed at automating the operational procure-to-pay process. Building an IT solution in-house, or having an external party build one, was considered, but it was ultimately decided that this would take too long. The purchase of a standard off-the-shelf application was then considered.

But the conclusion was soon reached that the problem did not really lie in the technology, and that an external party could probably act as a catalyst to connect and transform the decentralised processes. The feeling grew that outsourcing the complete procure-to-pay process would deliver results fastest. It quickly became clear that the risks would have to be shared with the provider in order to achieve the objectives: generating savings while avoiding heavy investments in technology and infrastructure.

Through its expertise, the provider was expected to set in motion a process of continuous improvement in technology and processes. After selecting the provider, Deutsche Bank decided to hand over all existing processes, as they were at that moment, directly to the provider. The handover was carried out in about six weeks across the eight regions in which the bank operates. In each region a small team of two to four of the bank's own employees remained in place to manage the relationship with the provider during and after the transition.

It was a radical approach, because Deutsche Bank wanted to procure in a fundamentally different way. The aim was not to reduce headcount, although that was a consequence of the choice, but above all to change the way of working drastically. Through this collaboration the provider became deeply embedded in the bank's organisation. In the opinion of the bank's VP Global Procurement, the results were positive.

"The expected savings were achieved, and a wealth of management information is now available that enables us to make decisions focused on suppliers and customers. We also have much better insight into our own processes and into how we can operate more successfully." The key to success in working with the service provider is 'collaboration', according to Deutsche Bank's VP Global Procurement.

"You have to be willing to find solutions to problems together. And you will encounter surprises and challenges in the collaboration. That also requires the right people on your own side: people who can build the relationship and keep morale high even in difficult times."


CASE: UPS
In 2002, UPS in the US decided to outsource its indirect (support) procurement because of a lack of in-house expertise. The decentralised organisation of support procurement meant it was seen mainly as finding a supplier for a given request, without real knowledge of the supply market, the company's own organisation or the requirement itself. Decentralisation also meant there was insufficient scale to invest in high-quality solutions, resources and people.

Within twelve months UPS outsourced its indirect procurement. The savings target of 60 million dollars was exceeded by 20 per cent. With new sourcing and procurement technology in place and good new supplier contracts implemented, contract utilisation of almost 100 per cent was achieved. A smooth transfer of employees was also important, allowing the company to focus even more on its core business and growth.
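The figures above imply roughly 72 million dollars in realised savings; a quick check of the arithmetic:

```python
target = 60_000_000        # stated savings target, in dollars
overshoot = 0.20           # target exceeded by 20 per cent

realised = target * (1 + overshoot)
print(realised)  # 72000000.0
```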


DIFFERENTIATING FACTOR
Between 1990 and 2008 Neil Deverill was the first chief procurement officer (CPO) at Electrolux in Sweden, after which he became head of UK Government Purchasing at the UK Treasury. He then served as CPO at Philips, responsible for procurement worldwide, and finally as CPO at Anglo American Mining in Johannesburg. In his view, an economic recession offers procurement professionals a once-in-a-lifetime opportunity to stand out and demonstrate their value to the entire company. But if they cannot handle that responsibility, it can become an uncomfortable time for them.

"I love economically difficult times, and recessions even more. That is when procurement decisions are made more on rational grounds and the professionals distinguish themselves from the runners-up. Every generation of procurement professionals should experience such situations once; it is a pity they occur so rarely," says Deverill. The effect of a financially difficult business situation is that working capital is reduced and costs are adjusted to what the market can and will pay.

During such a period, it must be ensured that the most important suppliers can continue to deliver. Regrettably, these values are applied only under duress, though they should hold in all economic conditions, in good times as well as bad; we would then be better able to cope with economically difficult times. Thanks to pressure from the top of the organisation, procurement always manages to find extra savings and extract more value from supplier relationships in difficult periods.

But why does procurement prove unable to embed this in its way of working, regardless of the state of the economy? If procurement trumpets its results in hard times, the business will quite reasonably keep asking why procurement does not show the same results-orientation in every season.


» This article can be found online at http://www.financieel-management.nl/content/view/13859

Monday, May 10, 2010

Understanding the data mountain means getting smarter

By Stephen Leonard, IBM’s chief executive in the UK and Ireland

Published: May 10 2010 17:28 | Last updated: May 10 2010 17:28

Modern information is unlike any that has gone before – it is voluminous, extremely fast and widely varied in format.

It comes structured and unstructured; from within a company and without; it arrives on a daily, hourly and real-time basis.

At the same time, powerful tools using advanced mathematics combined with vast computing power can start to integrate financial data with information about customers, supply chain, and workforce capabilities.

And in a world of intelligent objects, greater granularity is making information even harder to fathom. Containers and pallets are tagged for traceability – as well as medicine bottles, poultry and fruit, adding more detail to the information ecosystem.

This makes using information a daunting task. But it means the business of making decisions is shifting from intuitive and experiential to fact-based, which should lead to better decision-making.

To make progress, however, each enterprise needs to assess how prepared it is to integrate, standardise and analyse the information flooding in – a genuine potential problem.

Working harder and longer is not the answer. The key is working smarter, and that means having the right information and insight to drive smarter business outcomes.

Working smarter means front line business leaders know where to find the new revenue opportunities and which product or service offerings are most likely to address each market requirement.

It means business analysts can quickly access the right data points to evaluate key performance and revenue indicators in building successful corporate growth strategies. And, it means corporate risk and compliance units can recognise regulatory, reputational and operational risks before they become a problem.

Until now, acquiring, configuring and fine-tuning a system to analyse information to solve problems and uncover breakthrough insights has required technical skills out of reach for many companies.

Now, many organisations are embracing analytics technology to gain business advantage and better serve their clients. Our research shows that one in three business leaders frequently makes critical decisions without the information they need; 53 per cent do not have access to the information across the organisation that they need to do their jobs.

There are, of course, challenges in undertaking analytics-driven transformations. First, and most critical, is data quality.

While 100 per cent accurate data is impossible in enterprises, it should be remembered that analytics is a journey, with successive iterations of the analytical cycle providing the gradual improvements required.

Enterprises too often assume analytics is a highly technical or a quantitative subject that should be left entirely to technology teams.

In fact, successful analytical transformations are driven by the core business leadership. The chief executive or chief operating officer should be driving analytics initiatives across the enterprise. They should ensure that every tool, technology or statistical technique becomes part of achieving the larger business goal.

A further challenge comes from trying to meet every business challenge using analytics. It is important to ensure the business does not lose confidence in the initiative by keeping the rewards incremental and demonstrating the impact of analytics at every stage. Once the basic framework is in place, value can be delivered incrementally and periodically.

For the intelligent enterprise, the future means knowing, not guessing. It means taking information and turning it into insight that can enable business leaders to make real decisions, rather than simply hope for the best.

Analytics has applications that range from helping financial markets to perform better and recognise fraudulent behaviour, to helping doctors make better diagnosis and treatment decisions.

It can also help police in crime reduction, by aggregating street level information across myriad sources. Wherever there are huge amounts of data, analytics can determine the best course of action.

Forward thinkers have, for some time, been drawing on the potential of the technology that is proliferating in the physical world and being embedded in all kinds of systems, interconnected and infused with intelligence.

In the coming decade, these "smart" pioneers will have at their disposal new analytics technology that will become an increasingly vital source of intelligence, influence planning and take informed decision-making to new levels of sophistication.

Leaders are also beginning to acknowledge that smarter systems sometimes require change in economic and social mindsets. For example, societies may need to shift long-held views on citizen privacy, as systems generate (and share) far more personal data than before.

Societies need to ask themselves whether the benefits of smarter systems outweigh perceived civil infringements, changes in lifestyle or even up-front investment. Welcome to the “decade of smart”.

Copyright The Financial Times Limited 2010. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Sunday, May 09, 2010

Respondents choose Microsoft's CRM system · Marketingfacts

Microsoft's website is rated best by respondents who orient themselves online for a Customer Relationship Management (CRM) system. This emerges from the Web Performance Scan that research agency WUA! carried out in March. Microsoft.com finished ahead of Decrmspecialist.nl and CRM2care.nl. In total, the respondents gave 205 individual assessments of 87 different websites.

Revenue
At the end of April it was announced that revenue from consultancy work on customer relationship management amounted to some 545 million euros in 2009: growth of 9% on the previous year, whereas 2008 still saw growth of 17%. 'The reported growth of 9 per cent is considerably smaller than the 17 per cent reported in 2008, but is still remarkably positive given current economic conditions,' said director Wil Wurtz of the CRM Association.

The study: Microsoft
Microsoft is found by only 22.5% of the respondents, but then manages to convince. On first impression Microsoft.com takes first place, with the result that almost 90% of visitors decide to click through on the website. Microsoft also scores well on trust, with a 7.4. Ultimately 10% of the respondents would choose a Microsoft CRM system on the basis of their search, which amounts to roughly 44% of the respondents who actually visited the website. On Monday, Microsoft also announced that the online software-as-a-service (SaaS) version of Microsoft Dynamics CRM will become available in Dutch in the second half of this year.
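The 44% figure follows from relating the share of respondents who ultimately chose Microsoft to the share who found the site at all; a quick check:

```python
# Share of all respondents who found Microsoft.com during their search
found = 0.225
# Share of all respondents who ultimately chose Microsoft's CRM system
chose = 0.10

# Conversion among respondents who actually visited the site
conversion = chose / found
print(f"{conversion:.1%}")  # 44.4%
```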

[Figure: the ranking compiled by WUA! – the differences between the scores are small]


Breathing down its neck
As the ranking above shows, Microsoft wins the study by a tiny margin and feels several competitors breathing down its neck. Second place in the WUA! ranking goes to Decrmspecialist.nl. With findability of just 15%, the website immediately misses out on many potential customers, especially given that visitors rate it well. Decrmspecialist.nl takes first place for user-friendliness, and is also appreciated for its design and information value. The website is named as favourite by 7.5% of the respondents.
CRM2care.nl finishes third, despite findability of only 12.5%. The website scores well on quality and perceived trustworthiness, and thereby keeps several competitors behind it in a sector where the differences are very small.

Research design
With the Web Performance Scan, WUA! maps the online orientation process of the consumer. For this study WUA! invited forty respondents, all between 20 and 48 years old and experienced in researching products and services online. The participants were given the following scenario: 'You are looking for a Customer Relationship Management (CRM) system for your company of more than 150 employees. Search the internet for a provider.' The respondents were free to set their own preferences for factors such as price and software features. They were given two hours to orient themselves online, completing detailed questionnaires about the websites they encountered. After the search, a group discussion was held with some of the respondents and recorded on video. From all the collected data, WUA! finally calculated scores for sub-aspects such as user-friendliness and information value, arriving at the final Web Performance Scores.

Conclusion
Although the choice of a CRM system will not be made on the basis of an exploratory internet search alone, WUA!'s studies have shown that a strong online profile is of crucial importance for organisations. The good scores of Microsoft and Decrmspecialist.nl for first impression and user-friendliness respectively support this view. A well-functioning, attractive website has the power to win over potential customers in any sector. Microsoft currently does this best, but its victory can hardly be called convincing. A new measurement using the same scenario will be carried out in September; we look forward to seeing how the rankings shift.

Friday, May 07, 2010

Companies make money from information – and so do the criminals

By Marcus Whittington, SentryBay’s chief operating officer

Published: May 7 2010 12:46 | Last updated: May 7 2010 12:46

Within 24 hours of the earthquake in Haiti, criminal gangs had switched from standard online scams to sending out e-mails purporting to be from aid organisations seeking donations for relief funds.

The most shocking thing about this is not its cynical opportunism, but the speed with which criminals can switch their activities to an unfolding human tragedy.

Online fraudulent activity is now sophisticated and able to exploit almost any situation in hours. The level of verisimilitude that online criminals have developed is astonishing.

Consider the theft of data from individuals via phishing. This involves the sending of fake e-mails to a user purporting to be from a legitimate organisation and soliciting sensitive information. Phishing relies on "social engineering", and as the Haiti example demonstrates, phishers continually adapt their approaches to snare unsuspecting users.

There are several key reasons for the success and growth of phishing:

● Phishing sites can be set up in minutes by copying a real web page and sending out mass e-mails to a pre-purchased list.

● Phishing sites often take days or weeks to be closed down by a company (or relevant ISP).

● Until recently, there has been no effective solution that can protect web-based applications from phishing.

● And even experts can have a hard time differentiating between a real site and its phishing twin.
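One common giveaway behind that last point is a link whose visible text names one domain while the underlying URL points somewhere else. A minimal heuristic sketch of that check (the function name and logic here are illustrative assumptions, not a production filter):

```python
from urllib.parse import urlparse

def looks_like_phish(display_text: str, href: str) -> bool:
    """Flag a link whose visible text looks like a domain that does not
    match the domain the href actually points to."""
    shown = display_text.strip().lower()
    target = urlparse(href).hostname or ""
    # Only applicable when the visible text itself resembles a domain
    if "." not in shown or " " in shown:
        return False
    shown_host = urlparse(shown if "//" in shown else "//" + shown).hostname or shown
    # A mismatch between shown and actual domain is a classic phishing signal
    return not (target == shown_host or target.endswith("." + shown_host))

print(looks_like_phish("www.mybank.com", "http://evil.example.net/login"))  # True
print(looks_like_phish("www.mybank.com", "http://www.mybank.com/login"))    # False
```

A real filter would also normalise lookalike characters and check registrable domains against a reputation list; this sketch only captures the basic mismatch idea.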

Another, potentially more damaging example of this increased sophistication is key logging, where criminals use spyware to silently monitor users' activity and steal data. Key logging is one of the most dangerous computing threats, because users and companies are almost always unaware that information has been stolen.

This sophistication is being driven by changes in the way the online criminals operate: hackers and their ilk are becoming more patient and persistent when looking for ways to steal personal data which they then use to set up fraudulent identities to apply for bank accounts, credit cards and loans.

A report from Lucid Intelligence, which tracks and records vulnerable organisations and attacks on individuals and businesses, says that the incidence of personal data being stolen and sold on the internet is rising dramatically and the prices asked for stolen personal data and passwords have increased. What was once the goal for hackers has become the first step in a much larger plan.

At the end of 2009, Lucid's database held 138m instances of personal data being sold on the internet. This data is either used to enable large-scale fraud or, more likely, developed and sold on – creating a black-market supply chain in stolen identities.

What does this mean for businesses?

We know that some of the world’s largest organisations are already vulnerable: staff casually surfing the internet or responding to phishing or scamming e-mails at work can unwittingly open a door that criminals can step through.

In particular, we believe that smaller financial institutions will increasingly fall prey to fraudsters because they lack the more stringent controls and policies that larger businesses must adopt.

This is, however, preventable. There are three key steps every organisation could take straight away:

1. Audit processes internally to ascertain how employees are allowed to access online systems. Are security checks stringent? Do staff understand what phishing is and how to guard against attacks?

2. Implement anti-phishing and anti-spying software that guards against attacks in real time.

3. Set policy and enforce it. This may be via technology, educating the workforce or stronger measures but an unenforced policy is a waste of time – and a dangerous one.

Businesses too often take an old, out-of-date view of their enemy online. They still believe it is a bored teenager trying to crack anything marked "confidential", or at worst, a disgruntled ex-employee with access to the company website because passwords have not been changed.

The truth is very different. If a business makes money from information, so can criminals. The greater the potential for profit, the more attractive that company becomes as a target.

Don’t expect to be immediately able to identify the threats these criminals can pose. And don’t think they move too slowly to strike at your operations. Such thoughts are tantamount to wearing a sandwich board proclaiming yourself invincible and daring anyone to a round in the ring.


Wednesday, May 05, 2010

The best way to run a contact centre is not to have a contact centre

By Dave Paulding, Interactive Intelligence’s regional sales director for the UK, Middle East and Africa

Published: May 5 2010 10:21 | Last updated: May 5 2010 10:21

When a customer needs to contact a company or a government body, the overriding demand is that the query should be answered fully, accurately and promptly. The agent answering the phone should respond instantly, be friendly but clear, and know all there is to know about the organisation and its products and services.

The agent in the contact centre – who quite clearly cannot know everything about the organisation and its users – appreciates that instant access to data is the difference between a swiftly completed call with a happy customer and yet another dissatisfied tirade.

So it is in everyone’s best interests to resource the contact centre with the best possible technology and the best possible staff.

And I want to propose a radical solution: the best way to run a contact centre may be not to have a contact centre. With universal access to broadband you can put the information anywhere, and with voice over IP telephone technology you can route calls seamlessly. So why not have your contact centre staff work from home?

The advantages to the staff are clear. Commuting time and stress are eliminated, as indeed are the costs (and for the company the reduced emissions look good on the green audit). Staff working unsociable hours no longer have to worry about travelling to otherwise deserted industrial areas in the middle of the night.

Eliminating commuting also makes them immune to transport disruption.

More important, they can plan their hours around their lifestyles. People can return to work after career breaks, and may be happy to work split shifts to cover peaks, something that is virtually impossible when people have to “go” to work.

Research suggests that an employer who can offer home working attracts staff with a wider pool of skills; the ability to work flexibly attracts staff who are better educated. The retention rate for home-based staff is boosted too: typically call centres see a 20 per cent annual turnover, which is halved when you allow staff to work from home.

Does this work?

A successful US company called VIP Desk offers customer care for premium brands. Its client companies sell to affluent, educated customers who expect to deal with similarly well-informed staff if they need support. It has no call centre: all its staff work from home. It attracts the best staff and saves the cost of a bricks-and-mortar call centre.

What are the downsides?

Managers are inevitably going to be worried about what their staff are doing. Are they meeting their hours? Are they being productive? In short, can they be trusted when they are out of sight, out of mind?

In reality, supervisors do not manage staff in call centres by standing over them with stop watches. Built into the technology platform are sophisticated algorithms for presence management which provide detailed metrics on staff performance. Provided the technology platform is extended to the home worker the same metrics will apply.
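The same metrics travel with the platform wherever the agent sits. As an illustrative sketch (the data layout and figures here are assumptions, not any specific vendor's system), two staple contact-centre measures can be derived from nothing more than timestamped call records:

```python
from datetime import datetime

# Hypothetical call records for one home-based agent: (start, end) pairs
calls = [
    (datetime(2010, 5, 5, 9, 0), datetime(2010, 5, 5, 9, 6)),
    (datetime(2010, 5, 5, 9, 10), datetime(2010, 5, 5, 9, 14)),
    (datetime(2010, 5, 5, 9, 20), datetime(2010, 5, 5, 9, 30)),
]
shift_minutes = 60  # minutes the agent was logged in over the same period

# Average handle time: mean minutes spent per call
talk_minutes = sum((end - start).total_seconds() / 60 for start, end in calls)
average_handle_time = talk_minutes / len(calls)

# Occupancy: share of logged-in time spent handling calls
occupancy = talk_minutes / shift_minutes

print(f"AHT: {average_handle_time:.1f} min, occupancy: {occupancy:.0%}")
```

Because the inputs are just events from the telephony platform, the calculation is identical whether the agent is on the contact-centre floor or at home.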

For some staff the inverse worry may apply. “If the manager cannot see how hard I am working,” goes the theory, “how will I be recommended for promotion?” Confidence in the presence management system has to work both ways.

In all of this, the technology is actually the easy part. Unified communications can link workers, wherever they are, over a single connection – in an office or contact centre that would be through the local ethernet; for remote workers a single broadband connection carries both the voice and the data.

Security can be layered on to the connection without impairing performance: we have clients working on sensitive military projects, for example.

That means the same technology can be used for mobile staff as well as home-based contact agents. Salesmen or consultants on the road can connect into the corporate network from a client's premises, a hotel room, an airport lounge or a Starbucks. Given a broadband connection they can work just as if they were in their home office.

Flexible working is good for businesses, good for staff and good for the environment. The technology is ready: all that is needed is the trust and conviction of good management.


Monday, May 03, 2010

ITcommercie - Microsoft Dynamics CRM Online definitively available after the summer

Monday 3 May 2010, 11:39
Last week, at the Microsoft Convergence event in the US, it was definitively announced that the out-of-the-box online (SaaS) version of Microsoft Dynamics CRM will become available in Dutch in the second half of this year. A number of portal extensions in the form of accelerators were also launched. An exact date has not yet been announced.


The Microsoft team that develops its CRM solutions had been ready to launch the online version much earlier. The change in cloud strategy that Microsoft adopted some time ago delayed the further roll-out: Windows Azure and the standardisation of processes via the cloud were given priority. Dynamics CRM Online was already available to a few customers in the United States, and from the second half of this year it will finally arrive in the Netherlands. Microsoft thereby definitively confirms the statement that Brad Wilson, responsible for CRM at Microsoft, made to ITcommercie at the end of last year. With this solution, Microsoft takes on Salesforce.com and SugarCRM directly.



Microsoft now claims 22,000 customers and 1.1 million users. These customers have also received a number of new extensions, including improved integration with the ERP solution Dynamics GP. In addition, several new portal accelerators are available, including event management, eService and partner relationship management (PRM). According to Microsoft, these allow customers to better meet their requirements and integrate their processes via the internet.


Author: ITcommercie editorial team
Source: ITcommercie