Thursday, November 11, 2010

Social software and Microsoft SharePoint 2010

Since Microsoft SharePoint 2010 became available in the spring of this year, a fair number of implementations have already been carried out. While setting up the first SharePoint 2010 intranet environments, it has become clear that more and more customers are asking about the possibilities for 'knowledge (and social) networking' in SharePoint 2010. This article discusses the rise of 'social software' within organisations and the role SharePoint 2010 can play in it.

It can be quite difficult to put the concept of 'social software' into words. The main reason is that social software comes in many forms and is surrounded by buzzwords with vague definitions. In addition, the average citizen already has personal experience with social software on the internet, or knows it from friends or family. As a result, a one-sided picture has often formed around what is in fact a dynamic and many-sided phenomenon.

History

The foundations of social software were laid around the turn of the century, with internet standards such as XML, RSS and web services. These formed the basis for the 'second generation of the internet' (referred to in recent years by the fashionable term 'Web 2.0'). Some eight years ago (up to 2002), the internet was still mainly a place where people (then well over half a billion of them) consumed information; only a very small share of users also published content. Around 2003, two years after the dotcom bubble burst, several developments came together. The average speed of a home internet connection rose quickly, and the number of people online doubled within three years to one billion users. At the same time, platforms such as Blogger (weblogs) and MySpace (social networks) matured rapidly.
In the years that followed, the use of websites for easily sharing content (text, images, video, and so on) grew enormously; witness the success of online platforms such as Facebook, Twitter and YouTube. In recent years these platforms have increasingly been used for business purposes, and more and more companies have started to investigate how these techniques can be applied within the organisation.

The power of social software

What makes social software valuable, and what are the benefits for organisations? The term 'social' (and services such as Twitter and Facebook) still carries a negative association for some people. This can have various causes, such as past experiences or prejudices about 'pointless communication (chatting)'. It also does not help that many self-appointed gurus have rebranded themselves as 'social media experts' and broadcast a great deal of noise in the media.
The power of social software is fairly easy to pinpoint, and it lies chiefly in the low threshold for sharing. Social software makes it possible to post a message readable by a large audience, or to add a video to a (wiki) page, within a minute. Someone else can then easily post a reaction or add a photo, so that content is created jointly, without any technical knowledge. This is where the power lies, and with SharePoint 2010 that power is also available for internal use.
In recent years, many organisations have tried in vain to pursue an active knowledge management strategy. Only in a few cases have those efforts lived up to the initial expectations. If there is one area where social software can make a major contribution, it is knowledge management.
"Social media has produced an explosion of valuable information on the internet in recent years, and with the right approach the same is possible within organisations."
Social software offers a platform centred on the individual: the employee. With 'Intranet 1.0' software, by contrast, the point was initially to give a platform to the Marketing and Communications department. Over the years, other departments were also given the ability to add content to the intranet using content management systems. And although these systems (such as SharePoint) added the ability to publish documents, genuinely low-threshold information sharing by employees remained out of reach for all but a few companies. SharePoint 2010 adds a broad range of capabilities that finally make filling and enriching an intranet (or extranet) with information genuinely simple and accessible. A well-thought-out information architecture and implementation remain, of course, an important precondition.
And yes, we should pause for the sceptics. A low threshold also means that people can start reporting what they had for lunch, possibly accompanied by a photo of the daily menu and the entire canteen staff. For most people in the organisation, such information will have no direct value. Fortunately, there are many ways to filter data, for instance by topic or department, so that everyone can compose their own information feed. In addition, there are powerful tools such as tagging, rating and recommendations that help surface valuable content. Finally, a social intranet tends to develop its own 'social norms' as everyone learns the etiquette of internal social networks and those conventions become part of the company culture.
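Filtering by topic or department, as described above, can be illustrated with a small sketch. This is a hypothetical model for illustration only, not SharePoint's actual feed API; all class and field names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    department: str
    text: str
    tags: set = field(default_factory=set)

def build_feed(posts, followed_tags=None, followed_departments=None):
    """Return only the posts matching a reader's interests.

    A post is kept if it carries at least one followed tag or
    comes from a followed department."""
    followed_tags = followed_tags or set()
    followed_departments = followed_departments or set()
    return [p for p in posts
            if p.tags & followed_tags or p.department in followed_departments]

posts = [
    Post("anna", "Sales", "Closed the Q3 deal", {"sales", "crm"}),
    Post("bob", "Canteen", "Today's lunch menu", {"lunch"}),
    Post("carol", "IT", "New wiki page on tagging", {"sharepoint", "wiki"}),
]

# A reader who follows the 'sharepoint' tag and the Sales department
# never sees the lunch post.
feed = build_feed(posts, followed_tags={"sharepoint"},
                  followed_departments={"Sales"})
```

The point of the sketch is that filtering happens on the reader's side: nothing stops the lunch post from being published, but each employee's feed only shows what is relevant to them.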
Research firm Gartner published a report in early 2010 entitled "Social Software is an Enterprise Reality". In it, Gartner predicts that by around 2014, social software will have replaced e-mail as the primary means of communication for 20 per cent of professionals. The general expectation is that most companies will adopt social networks in the coming years. Most experts expect internal social networks to be more effective than e-mail for certain business uses, such as communicating status updates within projects and locating expertise.

SharePoint 2010

Purchasing SharePoint 2010 brings in a toolbox containing many valuable social software components. SharePoint 2010 offers improved capabilities for wikis, blogs and social bookmarks. The features around user profiles have also been strengthened considerably, so that functionality familiar from Facebook and LinkedIn (such as note boards and news feeds) can be deployed within the organisation. One of the most powerful additions to SharePoint 2010 is the Term Store, with which organisations can set up a SharePoint-wide taxonomy and/or folksonomy.
A taxonomy (a hierarchical classification method) will be familiar to many people, but the folksonomy is a relatively new concept. The folksonomy emerged during the Web 2.0 revolution and is a contraction of the words 'folk' and 'taxonomy'. It is a form of ordering based on consensus among the people (here, the employees). The attraction of a folksonomy is that it is a bottom-up approach to building a company-wide taxonomy. In practice, such an approach can be realised many times faster than a top-down one (and can save many hours of meetings). Once it is well populated, a folksonomy can still be centrally managed and cleaned up within SharePoint. Within SharePoint, anything that has a URL (sites, pages, documents and other objects) can be tagged, and each tag can be traced back to a person.
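At its core, a folksonomy of this kind is a set of tags, each linked to both a URL and the person who applied it; that traceability is what makes central cleanup (such as merging synonym tags) possible. The sketch below is a simplified illustration in Python, not the SharePoint Term Store API; all names are invented.

```python
from collections import defaultdict

class Folksonomy:
    """Bottom-up tagging: anyone may tag anything that has a URL,
    and every tag remains traceable to the person who applied it."""

    def __init__(self):
        # tag -> list of (url, user) pairs
        self._tags = defaultdict(list)

    def tag(self, url, user, tag):
        # Tags are case-insensitive, as users will not type consistently.
        self._tags[tag.lower()].append((url, user))

    def items_for(self, tag):
        return [url for url, _ in self._tags[tag.lower()]]

    def taggers_of(self, tag):
        return {user for _, user in self._tags[tag.lower()]}

    def merge(self, synonym, canonical):
        """Central cleanup: fold a synonym tag into the canonical term."""
        self._tags[canonical.lower()].extend(
            self._tags.pop(synonym.lower(), []))

f = Folksonomy()
f.tag("http://intranet/pages/project-x", "anna", "SharePoint")
f.tag("http://intranet/docs/plan.docx", "bob", "MOSS")
f.merge("MOSS", "SharePoint")  # editors consolidate old terminology
```

The `merge` step mirrors the central management mentioned above: the vocabulary grows bottom-up, but an administrator can later tidy it without losing who tagged what.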
The social features of SharePoint 2010 can bring organisations important benefits, first of all in transparency and knowledge sharing. Further gains can be made through faster and more efficient access to expertise within the organisation, lower communication costs and better (cross-departmental) collaboration. All in all, things most organisations have been waiting for.


Read more: http://www.computable.nl/artikel/ict_topics/ecm/3618782/1277020/sociale-software-en-microsoft-sharepoint-2010.html#ixzz14d2tbcEx

Moving out of recession: Small spending steps can bring big productivity leaps

By Stephen Pritchard

Published: October 27 2010 09:25 | Last updated: October 27 2010 09:25

As businesses emerged from the last recession, following the dotcom bust in 2001, the recovery in IT spending lagged behind.

Companies that had invested heavily during the good years found they had overspent on IT and had more than enough equipment to support their operations. It was 2004 before investment in technology recovered fully. By at least one measure, IT spending also became less effective during the dotcom-induced downturn.

Businesses that had shed staff, or cut back other areas of their operations, found that their per capita IT costs increased.

Move forward to today, and a tentative economic recovery in most mature markets is once again putting a brake on IT spending. But businesses – as well as public sector organisations – are also being forced to look again at their cost bases, and IT is by no means immune from scrutiny.

At the same time, business leaders have to balance two competing demands: creating a leaner IT operation and creating a leaner business.

Although cutting budgets can produce quick savings, most enterprises spend only between 2 and 5 per cent of revenues (turnover) on technology; smaller companies, typically, will invest rather more.

But across the board, a small increase in IT spending can drive far greater gains in overall productivity.

“Steps towards recovery are still tentative,” cautions David Elton, an IT and change management expert at PA Consulting.

“The pressure on IT departments is still about money. There are signs that people are investing but most clients are still concerned about controlling costs.”

Boards remain cautious about a return to unfettered spending, where large sums of money seemingly vanished into long-term IT projects that failed, or failed to deliver the promised results.

This is prompting chief financial officers and chief information officers to look both at newer technologies, such as cloud computing, which can be deployed to reduce costs – and at improved methodologies for delivering IT services. In particular, there is growing interest in applying “lean” processes to IT.

“The CIO’s role is rapidly changing,” says Alexander Peters, a principal analyst at Forrester Research. “The recession accelerated this change but the drivers – social technologies, service oriented applications and the cloud – are strategic and require changes beyond tactical cost-cutting.”

Mr Peters is the co-author of a report that looks at how IT departments can apply “lean” thinking to their operations. In the report, he argues that CIOs can draw on methods developed in fields such as manufacturing, and use them to make IT not only cheaper, but more effective.

Lean thinking includes considering whether an enterprise should build or buy its IT infrastructure and services, moving on to newer, more efficient, platforms and making greater use of standardised processes.

But at its heart, Mr Peters argues, “lean” is about ensuring IT is more closely aligned to the business. This makes for more effective technology, and less waste.

“Best-practice executives view lean as a performance improvement strategy, rather than merely a cost-cutting exercise,” he says.

Bringing IT closer to the business, and ensuring it is more flexible and responsive, are key to lean thinking.

However, it also requires businesses to reconsider the way they run IT, both to cut costs and make it more responsive.

Moving to newer platforms and technologies should also provide businesses with a stronger foundation for a return to growth.

Strategies such as virtualisation – allowing a single computer to host multiple “virtual” machines – and server and storage consolidation, where those machines are run on fewer physical computers, will save money quite quickly, for businesses that have the expertise to implement them.
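The savings logic behind consolidation can be made concrete with some back-of-the-envelope arithmetic. The figures below are purely illustrative assumptions, not data from the article, and the model deliberately ignores licensing, cooling and migration costs.

```python
def consolidation_estimate(physical_servers, vms_per_host,
                           watts_per_server, price_per_kwh):
    """Rough yearly power-cost saving from virtualising a server estate.

    Assumes each old server becomes one VM and that a host draws roughly
    the same power as one of the machines it replaces (a simplification)."""
    hosts_needed = -(-physical_servers // vms_per_host)  # ceiling division
    servers_removed = physical_servers - hosts_needed
    kwh_saved_per_year = servers_removed * watts_per_server * 24 * 365 / 1000
    return hosts_needed, kwh_saved_per_year * price_per_kwh

# Illustrative only: 100 servers, 10 VMs per host, 400 W each, EUR 0.15/kWh
hosts, saving = consolidation_estimate(100, 10, 400, 0.15)
```

Even under these crude assumptions, retiring 90 of 100 machines removes most of the estate's power draw, which is why consolidation tends to pay back quickly when the in-house expertise exists.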

Some steps will require more initial investment. Installing computer and other equipment that draws less power can save significant sums over its lifetime, but businesses need to find the capital budgets for the hardware.

Research by IBM, for example, suggests that power consumption accounts for 75 per cent of data centre operating costs. Power costs are also growing much more rapidly than staffing, building or real estate expense, or taxes.

The cost of buying computer equipment, and of building data centres, is prompting more companies to look either at software as a service, outsourcing, or cloud computing.

IBM estimates that the construction cost of a 2,000 sq m data centre now runs to between $30m and $50m, putting it out of reach of all but the largest businesses or service providers.

Then there is the challenge of owning and running an asset based on technology that is both complex, and that rapidly becomes out of date.

A wholesale move to cloud computing might not be appropriate, although some commodity services, such as e-mail, archiving and software test and development, are already being hosted in the cloud for large businesses.

Frank Modruson, CIO of Accenture, the consultancy, points out that businesses with older and more complex IT infrastructures may have to update those before they can outsource the technology itself.

But making such investments is perhaps one of the few ways IT departments can free up cash to support new business initiatives, such as new online sales channels or social networking.

“Coming out of the recession, companies have started to redirect spending to the top line,” says Mark Hillman, vice-president for strategy at Compuware, an IT services company.

“They still have cost reduction initiatives in place, such as server consolidation, but they are limiting spending on the back office, to allow them to invest in areas that give them better connections to partners or customers, or in areas that affect their brand.”

Financial data for the Facebook generation
The financial services companies that buy the data services Thomson Reuters provides may have had a tough couple of years but they have not become less demanding.

Thomson Reuters provides financial market data to businesses including banks, brokerages and investment houses. The company supplies this information via traditional trading room terminals, but more traffic is being carried over the internet, in a business worth $15bn annually.

According to Kevin Blanco, vice-president of global application support and engineering at Thomson Reuters, ensuring clients receive good service across a worldwide network is a challenge.

As a data provider to fast-moving financial markets, Thomson Reuters has to meet two targets for its services: the availability and the responsiveness of data feeds.

This is especially critical for internet-based services, since it is these that are growing most quickly.

Thomson Reuters sets a target of 99.9 per cent “uptime” for its web-based products and a maximum eight-second response time.
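It is worth translating such a target into concrete numbers. A 99.9 per cent availability figure sounds close to perfect, but it still permits a measurable downtime budget; the helper below, an illustrative sketch, converts the percentage into minutes:

```python
def allowed_downtime_minutes(uptime_pct, period_hours):
    """Downtime budget (in minutes) implied by an availability target."""
    return period_hours * 60 * (1 - uptime_pct / 100)

# A 99.9% target over a 30-day month:
monthly = allowed_downtime_minutes(99.9, 30 * 24)
# ...and over a full 365-day year:
yearly = allowed_downtime_minutes(99.9, 365 * 24)
```

At 99.9 per cent, the monthly budget works out to a little over 43 minutes; each extra "nine" cuts that budget by a factor of ten, which is why availability targets drive costs so sharply.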

“Connections over dedicated circuits are expensive,” explains Mr Blanco. “There are some large banks that require dedicated circuits and we maintain them. But the majority of our products and of our strategic initiatives will be web based. There will be very few dedicated workstation installs or dedicated circuits in the future.”

But newly cost-conscious bankers want to maintain service levels to customers and this places demands on the services they buy from suppliers such as Thomson Reuters.

For Mr Blanco, this means maintaining or improving service quality levels, while controlling costs.

Financial services companies have come to expect from web-based services the reliability and responsiveness they got from dedicated links, as well as the ease of use associated with sites such as Amazon or even social media sites.

“Our user base is no longer [just] financial professionals in their 40s and 50s. The primary user is a junior banker who also uses Facebook or MySpace. Our interface and speed have to match that demographic.”

Researchers who study consumers’ online behaviour have found that visitors to websites often abandon a transaction and go elsewhere if a page takes more than two seconds to respond.

“We are not seeing [demand for] two seconds now, but it is certainly four to five seconds,” says Mr Blanco. “But I do feel that the demand will continue for response times to compress, especially for transactional services.”

Controlling latency – the delay before trades can be completed – and network quality for a company operating global services can be expensive and demand large numbers of skilled staff to diagnose and fix problems.

Like many other IT-dependent businesses, Thomson Reuters is increasingly relying on automation to cut the cost of delivering its technology.

Streamlining systems for updating services or deploying new applications to servers has cut support costs and, vitally, has improved system uptime.

And, Mr Blanco says, Thomson Reuters is making more use of automated monitoring and diagnostic tools to control the quality of its network.

In particular, web performance and monitoring software from specialist vendor Gomez has brought some rapid and significant improvements.

“In our corporate services business, we brought their website availability up to [99.9 per cent] in two months,” says Mr Blanco.

“We’ve also done the same for the rest of the business.”


Copyright The Financial Times Limited 2010. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Cloud computing in businesses

By Richard Waters in San Francisco

Published: November 1 2010 00:15 | Last updated: November 1 2010 00:15

Cloud computing may be one of the most talked-about IT trends of recent years, but it has yet to make much of a mark inside big business. Like many new tech trends, the hype has far outweighed the business realities.

If that is to change, then it could well be projects such as the one recently undertaken by the tax division of ADP, the big US payroll processing company, that explain why.

Extracting data from its customers’ individual systems to prepare employee tax returns has been an expensive proposition, requiring separate engineering in the case of each customer to create the interface with ADP’s own systems.

As a result, it has only been economic to sell the tax filing service to large companies, typically with more than 1,000 employees, says Lori Schreiber, general manager of ADP's tax services division. But inserting a computing service delivered from the "cloud" into the middle of this process has now changed the economics of the business.

In ADP's case, the cloud service in question, from IBM, is a standardised way of "mapping" information from client systems so that it can be "read" by ADP's own systems.

As a result, says Ms Schreiber, ADP can now sell the tax filing service to medium-sized companies it could not profitably reach before. It has also been able to change the way it prices its service, potentially making it more attractive.

"It allowed us to promote it as more of a standard model, rather than charging for it as a professional service where we bill by the hour," Ms Schreiber says.

If cloud computing is to become more than an empty promise, it is this type of new business potential that will account for the shift.

IBM, which has just revamped its cloud computing strategy to base it around services like the one sold to ADP, says this highlights the way the new technology is likely to be felt in the day-to-day business world.

"Taking the operating cost out of service delivery" is one of the big opportunities for companies in many industries, says Mike Daniels, head of IBM's services division. The key, adds Erich Clementi, head of strategy for the company's cloud business, is the "extreme standardisation" made possible by the central delivery of a service. By streamlining individual processes like this, businesses will be able to create more flexible services, and at a lower standard cost, he says.

As the ADP case suggests, this could open up new business opportunities. For companies in industries like telecommunications, financial services and media and entertainment, pushing some parts of their processes into the cloud will make it possible to "reach markets that weren't reachable before," says Frank Gens, an analyst at IDC. "It will become a fundamental part of the model for all companies trying to reach emerging markets."

Until now, most of the attention in cloud computing has been on the so-called "public clouds" run by companies like Amazon.com and Salesforce.com - centralised services where companies can buy computing resources in much the way they buy electricity.

Services like these have mainly appealed to start-up companies or those looking to create new businesses from scratch. Starting with a blank sheet of paper, designing a company's processes with no "on-premise" systems can be highly appealing.

But for most companies - with large sunk investments in IT systems and an understandable aversion to handing over control of their most important corporate data - this is too big a step to take.

Much of the focus of the big tech companies is now on refining these services to make them appeal to established companies. Mr Daniels compares it to the emergence of e-business in the early days of the internet: after a brief flurry of excitement over the potential of pure-play dotcoms to topple business leaders in many industries, the new technology was applied to enhance the operations of established businesses. It was Walmart, not Pets.com, that won the day, he says.

"The belief is, the money will really be in the enterprise loads, and no one has really untapped that yet," adds Paul Maritz, chief executive officer of VMware, which makes some of the key software for data centres that deliver cloud services.

The key to unlocking this potential are what the tech industry calls "hybrid clouds" - combinations of on-premise and remote, third-party systems that can be combined to create a service, much as ADP found with its tax-filing service.

To make this work, companies need to isolate individual processes that they can outsource, and accept a much higher level of standardisation in these areas, Mr Daniels says. He compares it to the standardisation that has already been imposed on many service functions inside companies, like human resources.

The same constraints are now being placed on the IT departments’ application programmers, he says. They will lose some choice in the platforms they build on and will have to choose from a narrower "catalogue" of IT services, but with significant benefits to their companies in terms of operating flexibility and cost.

These standardised services, in turn, will evolve to suit the needs of particular industries, bringing what IBM says will be a new addition to the IT lexicon: "industry clouds."

This is all a long way from the model of fully outsourced "public clouds" that first drove interest in the new technology architecture. To the tech purists, it will smack of compromise, surrendering some of the scale benefits promised by fully centralised computing.

"There's no question, you lose a lot of the economies of the public cloud," says Mr Gens. "As soon as you say 'private', you're talking a higher price point."

Long term, that makes the full cloud computing model an appealing one. But for the foreseeable future, the gains seen by most businesses will come from more modest and achievable goals.
