Wednesday, March 24, 2010

Business technology: a mess – or a thing of beauty?
By Stephen Pritchard

Published: March 24 2010 13:00 | Last updated: March 24 2010 13:00

Business technology all too often divides opinion: does it bring value to the business, and therefore need tending, or is it out of control and in need of taming?

Those who use IT in their daily jobs – today, almost all white collar workers – complain that it is slow and restrictive. Complex workplace systems certainly lack the simplicity, and often the power, of consumer applications such as Amazon, eBay, Google or iTunes.

Business management, for its part, frequently sees IT as a necessary evil. Even managers who favour investment in IT might have little hard understanding of how IT works in their business.

“Our surveys say that CEOs see technology as valuable,” says Mark Raskino, vice president and Fellow at Gartner, the industry research firm. “But when you ask them what IT is doing, you get vague and mushy answers.”

He argues that business leaders spend too much time looking backwards at the last generation of technology, such as customer relationship management or business process management, rather than emerging trends such as social computing, mobility and sustainability.

At the same time, management teams are often fearful of IT and see change or investment as a risk. As a result, he points out, companies’ core business systems may be several decades old.

IT systems have grown through additions, patches, mid-life upgrades and modifications, and the result is often a sprawl of interconnected applications, with duplication and inefficiency – and possibly systems that no one uses.

“The current IT wave started 50 years ago with Cobol, but we have yet to have a proper refresh [of many systems],” says Mr Raskino. “Companies change their HQ or factories or even their locations but a lot of core IT has yet to go through its main refresh. It has been in the same data centre for 30 years.”

Over the next few years, some companies will have to tear down their old systems and move to an entirely new IT set-up. Others, especially newer or fast-growing companies, will move their systems online, through cloud computing.

Mr Raskino likens the process of managing IT to gardening. Too often, the metaphor for IT is engineering or architecture, and this suggests a degree of design and permanence that is unrealistic.

“IT won’t remain orderly,” he says. “Architecture gives you one insight but if you leave IT it gets out of shape very fast. It is more an organic thing that needs constant renewal and refresh,” he suggests.

How businesses go about this – without undue risk or cost but also in a way that delivers the capabilities the business will need tomorrow – is a matter for debate.

PA Consulting, for example, has helped its clients set up “guerrilla” IT teams to look at problems within business units. These teams aim to solve a business problem quickly. If a task cannot be completed in three months, the teams will not take it on.

Such teams will not fix all of a business’s IT problems, concedes Karl Boone, an IT change management specialist and a member of the firm’s management team. “It is not a long-term fix,” he says. “It will add complexity, but [some] complexity is inevitable.”

The key point is for IT to be able to show that it understands the business, hence “embedding” IT staff in the business unit, and that it can deliver incremental improvements at a time when there is little appetite for large, monolithic IT upgrades.

Large projects, though, have not gone away and for some businesses they will be the only way either to clear out older IT systems, or move to new ways of working that bring genuine competitive advantage. As the financial climate improves, the challenge will be to manage the transition process.

“Almost every business has enormous exposure in operations, production and service delivery to the health of their core IT systems,” says Gary Curtis, co-head of Accenture Technology Consulting.

“Today IT is far more than simply the back office: it is embedded either in the product, or the delivery of the product. But the focus on the health of core IT systems has pushed management focus away from optimising business processes. And that will only continue as IT becomes even more critical to the delivery of products.”

Yet businesses, Mr Curtis suggests, spend far less time planning and managing IT projects than they do on capital investments in plant, machinery or property of a similar value. This lack of management involvement and due diligence leads both to IT project delays and failures, and to complexity and IT “mess”.

“If you take a manufacturing business that is capital intensive, such as auto makers or aero engines, it costs $300m-$400m to build a new plant. That business case is done rigorously and is pressure tested and examined by everyone. But the same companies are not good at doing that with IT. You don’t have the same level of scrutiny, whether it is a desktop refit, or a new financial or ERP system.”


Tuesday, March 23, 2010

The Problem with the Data-Information-Knowledge-Wisdom Hierarchy – The Conversation, Harvard Business Review

9:00 AM Tuesday February 2, 2010
By David Weinberger

The data-information-knowledge-wisdom hierarchy seemed like a really great idea when it was first proposed. But its rapid acceptance was in fact a sign of how worried we were about the real value of the information systems we had built at such great expense. What looks like a logical progression is actually a desperate cry for help.

The DIKW hierarchy (as it came to be known) was brought to prominence by Russell Ackoff in his address accepting the presidency of the International Society for General Systems Research in 1989. But the actual first recorded instance of it was in 1934:

Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?

Those lines come from the poem "The Rock" by T.S. Eliot. (And for now we can skip over the 1979 reference in the song "Packard Goose" by Frank Zappa.) The sequence seems to have been reinvented in the late 1980s, independent of these poetic invocations.

The DIKW sequence made immediate sense because it extends what every Computer Science 101 class learns: information is a refinement of mere data. Information thus is the value we extract from data. But once the idea of information overload started taking root (popularized in Alvin Toffler's 1970 Future Shock), we needed a way to characterize the value we extract from information. So we looked for something that would do to information what information did to data. Ackoff suggested knowledge as the value of information, and we collectively nodded our heads.

But, the info-to-knowledge move is far more problematic than the data-to-info one. Ask someone outside of the circle of information scientists what "information" means and you'll find that it's a hollow term. It thus was available for redefinition. But "knowledge" is one of the most important words in our culture, with a long and profound history. In the DIKW hierarchy "knowledge" slips its mooring, and that matters.

So, what is "knowledge" in the DIKW pyramid? For Ackoff, knowledge transforms "information into instructions." Milan Zeleny, who came up with the hierarchy a couple of years before Ackoff, says that knowledge is like the recipe that lets you make bread out of the information-ingredients of flour and yeast (with data as the atoms of the ingredients). The European Committee for Standardization's official "Guide to Good Practice in Knowledge Management" says: "Knowledge is the combination of data and information, to which is added expert opinion, skills and experience, to result in a valuable asset which can be used to aid decision making."

The emphasis in all these cases is on knowledge being "actionable" because of the business context, and on knowledge being a refinement of information because that's how we extracted value from data. That may be a useful way of thinking about the value of information, but it's pretty far from what knowledge has been during its 2,500-year history. Throughout that period, Plato's definition has basically held: Knowledge has been something like the set of beliefs that are true and that we are justified in believing. Indeed, we've thought that knowledge is not a mere agglomeration of true beliefs but that it reflects the systematic and even organic nature of the universe. The pieces go together and make something true and beautiful. More, knowledge has been the distinctly human project, the exercise of the highest and defining capabilities of humans, a fulfillment of our nature, a transgenerational treasure that it is each person's duty and honor to enhance.

But, nah, we needed a word to explain what good comes from our massive investment in computers, so we grabbed ahold of "knowledge" and redefined it as we had to. Then we threw "wisdom" into the mix. Bah.

And humbug. The real problem isn't the DIKW's hijacking of the word "knowledge" but its implication that knowledge derives from filtering information. It doesn't. We can learn some facts by combing through databases. We can see some true correlations by running sophisticated algorithms over massive amounts of information. All that's good.

But knowledge is not a result merely of filtering or algorithms. It results from a far more complex process that is social, goal-driven, contextual, and culturally-bound. We get to knowledge — especially "actionable" knowledge — by having desires and curiosity, through plotting and play, by being wrong more often than right, by talking with others and forming social bonds, by applying methods and then backing away from them, by calculation and serendipity, by rationality and intuition, by institutional processes and social roles. Most important in this regard, where the decisions are tough and knowledge is hard to come by, knowledge is not determined by information, for it is the knowing process that first decides which information is relevant, and how it is to be used.

The real problem with the DIKW pyramid is that it's a pyramid. The image it paints – that knowledge (much less wisdom) results from applying finer-grained filters at each level – is the wrong picture. That view is natural to the Information Age, which has been all about filtering noise, reducing the flow to what is clean, clear and manageable. Knowledge is more creative, messier, harder won, and far more discontinuous.

Harnessing Your Staff’s Informal Networks – Harvard Business Review

If your smartest employees are getting together to solve problems and develop new ideas on their own, the best thing to do is to stay out of their way, right? Workers can easily share insights electronically, and they often don’t want or appreciate executive oversight. Well, think again. Though in-house networks of experts—or “communities of practice”—were once entirely unofficial, today they are increasingly integrated into companies’ formal management structures.

Independent, off-the-grid communities have proliferated in recent years, and many companies have counted on them to deliver creative solutions to challenges that bridge functional gaps. But in the past few years, outside forces—technological advances, globalization, increased demands on employees’ time—have begun to undermine communities’ success. Consider the rise and fall of an informal group of experts at a large water-engineering company located just outside London. Starting in the early 1990s, they began meeting weekly to discuss strategies for designing new water-treatment facilities. The gatherings were so lively and informative that they actually drew crowds of onlookers. (The company can’t be named for reasons of confidentiality.)

The community initially thrived because it operated so informally. United by a common professional passion, participants would huddle around conference tables and compare data, trade insights, and argue over which designs would work best with local water systems. And the community achieved results: Participants found ways to significantly cut the time and cost involved in system design by increasing the pool of experience that they could draw upon, tapping insights from different disciplines, and recycling design ideas from other projects.

Too much attention from management, went the thinking, would crush the group’s collaborative nature. But the very informality of this community eventually rendered it obsolete. What happened to it was typical: The members gained access to more sophisticated design tools and to vast amounts of data via the internet. Increased global connectivity drew more people into the community and into individual projects. Soon the engineers were spending more time at their desks, gathering and organizing data, sorting through multiple versions of designs, and managing remote contacts. The community started to feel less intimate, and its members, less obligated to their peers. Swamped, the engineers found it difficult to justify time for voluntary meetings. Today the community in effect has dissolved—along with the hopes that it would continue generating high-impact ideas.

Our research has shown that many other communities failed for similar reasons. Nevertheless, communities of practice aren’t dead. Many are thriving—you’ll find them developing global processes, resolving troubled implementations, and guiding operational efforts. But they differ from their forebears in some important respects. Today they’re an actively managed part of the organization, with specific goals, explicit accountability, and clear executive oversight. To get experts to dedicate time to them, companies have to make sure that communities contribute meaningfully to the organization and operate efficiently.

We’ve observed this shift in our consulting work and in our research. This research was conducted with the Knowledge and Innovation Network at Warwick Business School and funded by the Warwick Innovative Manufacturing Research Centre and by Schlumberger, an oil-field services company. To examine the health and impact of communities, we did a quantitative study of 52 communities in 10 industries, and a qualitative assessment of more than 140 communities in a dozen organizations, consisting of interviews with support staff, leaders, community members, and senior management.

The communities at construction and engineering giant Fluor illustrate the extent of the change. Global communities have replaced the company’s distributed functional structure. While project teams remain the primary organizational unit, 44 discipline- and industry-focused communities, with 24,000 active members, support the teams. The communities provide all functional services—creating guidelines for work practices and procedures; publishing technical documents; and offering career development, access to expert advice, and help with technical questions. They are the first and best source for technical knowledge at Fluor.

Here’s one example of how this works: Not long ago, a Fluor nuclear-cleanup project team had to install a soil barrier over a drainage field once used to dispose of radioactive wastewater. But environmental regulators mandated that Fluor first locate and seal a 30-year-old well, now covered over, to prevent contamination of the groundwater table. Poor historical data made it impossible to tell if the well really existed, and ground-penetrating radar also failed to discover it. Simply removing the contaminated soil to find the well would have been costly and risky for workers.

Monday, March 22, 2010

Healthcare dips a toe into the digital age
By Geoff Nairn

Published: March 22 2010 16:51 | Last updated: March 22 2010 16:51

Healthcare is beginning to shake off its Cinderella image after lagging behind other sectors both in levels of IT investment and in its perception of IT as a strategic tool.

Governments are now looking to IT to rein in their soaring healthcare costs and improve care quality as healthcare decision makers belatedly realise that technology is one of the few weapons that can make an impact on the sector’s chronic problems.

Healthcare has always suffered from limited resources and conflicting demands, and that is especially true for technology, with IT managers having to counter the view that money would be better spent on “front-line” medical technology such as CAT scanners – particularly if the latter generate new revenue.

In addition, a risk-averse culture is deeply ingrained. While pockets of innovation exist, usually at well-funded university medical centres, healthcare providers are reluctant to allocate scarce funds for improvements. This leaves IT staff spending more time maintaining old systems than developing new ones.

According to Forrester Research, healthcare enterprises in North America spend just 22 per cent of their IT budgets on new IT initiatives, compared with 28 per cent for businesses in other sectors.

But in spite of this unpromising diagnosis, the outlook for healthcare IT is looking brighter. “Across the OECD, there is a universal acceptance that we should be investing more to digitalise healthcare,” says Andy Mullins, head of health at PA Consulting.

Decision-makers also now recognise that IT investments must be co-ordinated and focused on key technologies that can be deployed across a whole healthcare system.

That is good news for IT vendors, who have long complained that the limited investment available for healthcare IT is spread across a patchwork of unco-ordinated pilot programmes.

“We often joke that healthcare IT suffers from ‘pilotitis’,” says Neil Jordan, managing director of Microsoft’s worldwide health group. “Historically, there has been very little industrialisation and a lack of long-termism so it has been very hard to get technology to work at scale.”

He says the key to creating a modern information-based healthcare system is to break down the barriers that have traditionally separated IT systems. Data integration technologies play an important if unglamorous role in making “joined-up” healthcare a reality.

During the Sars respiratory disease epidemic in Hong Kong in 2003, for example, data integration technology from Informatica was used to bring together data on 6m patients from 43 hospitals and 121 outpatient clinics.

Microsoft is also pursuing the data integration opportunity with Amalga, a data aggregation platform that integrates clinical, administrative and financial data from disparate systems.

The issues of scalability and integration are now coming to the fore as more governments set national or regional targets for care outcomes and cost savings.

For example, President Barack Obama has identified nationwide electronic healthcare records as one of the key technologies to help contain the staggering cost of healthcare in the US. This one initiative, if adopted by 90 per cent of doctors and hospitals, could save $77bn a year, according to a study by Rand Corporation.

The target is for most Americans to have electronic health records by 2014. But in 2009 only 5 per cent had an advanced EHR system capable of looking up medical histories and ordering tests electronically.

Because many physicians are expected to drag their feet on the EHR issue, federal funds are available to subsidise the cost of equipping practices with EHR systems. And those who still refuse to buy an EHR system will see their Medicare and Medicaid payments reduced from 2015.

Other countries have embarked on initiatives to create an information-based healthcare system but several have been hit by delays, cost overruns and political bickering.

In 2006, Germany began to field test a nationwide eHealth card scheme which would ultimately embrace 80m users, 300 health insurance companies, 2,200 hospitals and 188,000 physicians and dentists.

As well as containing patient data and insurance details, the smartcard can hold electronic prescriptions and, more controversially, be used to access a patient’s health history online.

The project was meant to be running nationwide in 2009 but ran into opposition from many GPs who could not see any advantage in this new way of working. Security concerns were also raised.

After spending €1.7bn on the initiative, the German government has now decided to put the project on hold. The cards will be used only in a few pilot areas, as simple identification cards.

In the UK, the National Programme for IT has run into a barrage of criticism, not just for its huge cost – £12.7bn since 2002 – but for the delays and the lack of consultation with suppliers and medical professionals. Some IT suppliers have abandoned their involvement.

One of the key planks of NPfIT is the summary care record, a simplified EHR, which is supposed to be running nationwide by early 2011. But as of December 2009, only 152 GP practices and two of the 168 hospital trusts were using summary care records.

“The summary care record is a good idea, but I am not sure that having all medical records online at all times is really necessary. A lot of physicians are not qualified to interpret specialist data and there is a risk that the GPs get information overload,” says Robin Hughes, founder of Ascensus, a UK-based vendor of practice management software.

This highlights a new area of opportunity for healthcare IT, namely using business intelligence and other analytical technologies to help clinical staff and other decision-makers extract useful information from vast quantities of health-related data.

“I would say that business intelligence is a more critical technology than electronic patient records, which are nothing more than a very expensive way of replacing patient notes,” says Adrian Downing, healthcare director at Concentra, a UK-based BI and consulting firm.

One of the welcome side effects of the introduction of nationwide EHR systems is that they will allow decision makers to tap into a rich source of aggregated population data to support operational and clinical research.

This would, for example, allow researchers to look at longitudinal patient data in the search for early indicators of those in the population most at risk of developing serious medical conditions.

Early intervention not only saves the health system money – fewer emergency admissions – but also spares the patient the stress and risk of an operation.

“If you can spot people with hypertension before it becomes a stroke, you can actually improve the life of the patient and knock a zero off the cost of treatment,” says Mr Hughes.

Data analysis technologies are also playing an increasingly important role in improving the operational efficiency of healthcare systems.

“We need to give clinicians more information about how to spend resources more effectively,” says Mr Mullins of PA Consulting.

Healthcare providers have embarked on initiatives to cut the spiralling cost by boosting efficiency, benchmarking performance and generally making practitioners more accountable. But these initiatives need to be monitored and analysed to ensure they are achieving their goals.

Mr Downing gives the example of a UK private hospital which wanted to reduce the high cost of hip operations by reducing the average hospital stay from 10 to seven nights. The reduction in stays was achieved but not the expected cost savings.

Deeper analysis of the data with business intelligence software revealed why: patients who were sent home early needed much more physiotherapy and that offset the money saved on beds.
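To make the arithmetic concrete, here is a minimal sketch of the kind of comparison such an analysis would surface. All of the figures are invented for illustration; the article gives no actual costs.

```python
# Hypothetical cost model for a hip-operation episode. Every figure here
# is an assumption for illustration; the article quotes no actual costs.

BED_NIGHT_COST = 400      # assumed cost per hospital bed night, in pounds
PHYSIO_SESSION_COST = 60  # assumed cost per physiotherapy session, in pounds

def episode_cost(nights: int, physio_sessions: int) -> int:
    """Total cost of one episode: bed nights plus physiotherapy sessions."""
    return nights * BED_NIGHT_COST + physio_sessions * PHYSIO_SESSION_COST

# A ten-night stay with a little follow-up physiotherapy...
before = episode_cost(nights=10, physio_sessions=5)
# ...versus a seven-night stay where early discharge needs extra sessions.
after = episode_cost(nights=7, physio_sessions=25)

print(before, after)  # 4300 4300 – the physiotherapy offsets the bed saving
```

On these assumed numbers the shorter stay saves nothing at all, which is exactly the pattern the hospital’s data revealed.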

Another critical application for data analysis technologies, at least in the private sector, is fraud detection. Healthcare fraud costs the US more than $140bn a year, but only 10 per cent of fraudulent claims are detected and only 10 per cent of the frauds detected are ever recovered – in other words, barely 1 per cent of the total, or roughly $1.4bn, is ever recouped.

Some fraud is blatant – submitting claims from dead doctors, for example – but other forms are more subtle, such as a physician misrepresenting a patient’s condition to obtain higher reimbursements.

“The problem is that you cannot just look at claims data to spot fraud, you have to also look at demographics, the age and sex of the patients, the kind of hospital and so on,” says Richard Ingraham, head of healthcare marketing at Teradata, the data warehousing company.

“Traditionally, this data would be kept in silos so you have to bring it together to get the complete picture.”
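To illustrate Mr Ingraham’s point, here is a toy sketch of joining those silos: a claim that looks unremarkable on its own stands out once patient and provider data are attached. All records, field names and rules below are invented.

```python
# Invented claims, patient and provider records held in separate "silos".
claims = [
    {"claim_id": 1, "patient_id": "p1", "provider_id": "h1",
     "procedure": "cardiac", "amount": 9000},
    {"claim_id": 2, "patient_id": "p2", "provider_id": "h2",
     "procedure": "cardiac", "amount": 9500},
]
patients = {"p1": {"age": 34}, "p2": {"age": 71}}
providers = {"h1": {"type": "dermatology clinic"},
             "h2": {"type": "cardiac hospital"}}

def flag_suspicious(claims, patients, providers):
    """Join the three silos and flag claims that look implausible in context."""
    flagged = []
    for c in claims:
        age = patients[c["patient_id"]]["age"]
        provider_type = providers[c["provider_id"]]["type"]
        # Each claim looks fine in isolation; combined with the other data,
        # a cardiac procedure billed by a dermatology clinic stands out.
        if c["procedure"] == "cardiac" and "cardiac" not in provider_type:
            flagged.append((c["claim_id"], "procedure/provider mismatch"))
        elif c["procedure"] == "cardiac" and age < 40:
            flagged.append((c["claim_id"], "unusual for patient age"))
    return flagged

print(flag_suspicious(claims, patients, providers))
# [(1, 'procedure/provider mismatch')]
```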

The final big challenge is developing new cost-effective ways of delivering healthcare. Telecare technologies make it possible for many elderly or chronically ill patients to stay in their homes while being regularly monitored for warning signs.

Telemedicine has long been used to bring healthcare to people in remote areas. Today its use is expanding to allow more people to access scarce skills.

At first sight, San Diego might not seem to be an obvious choice for a telemedicine project. “Irrespective of whether it’s an urban or rural area, you have access issues with healthcare,” says Kaveh Safavi, vice president of Cisco’s Global Healthcare Practice.

Cisco’s system, which is a variant of its TelePresence high-definition video-conferencing technology, is being used to deliver primary and speciality care to two community health centres in San Diego.

The HealthPresence system has already been tested in a pilot project in Scotland, where it is used to provide remote consultations for patients in outlying community hospitals rather than requiring them to travel to the main accident and emergency hospital in Aberdeen.


Thursday, March 18, 2010

Google Sites vs SharePoint

Google Sites


Google Sites is built for collaborating as a team. But because it is not integrated with Google Docs, you are nowhere near as flexible as you are in SharePoint (you need at least two applications to set up a collaboration – just take a look at this Google site).

Google Sites gives you roughly the same as SharePoint team sites, with lists, pages and search. If you are looking for anything beyond that, SharePoint is the right choice.

A quick standard comparison:

In terms of availability, Google is set up in a few minutes. With SharePoint this takes somewhat longer, via the hosted offering known as BPOS, which moreover is not free, whereas Google is. It is also possible to install SharePoint on your own servers, but that means more work and requires administrative control. So if price is what really matters to you, Google is the winner here.

For a feature-by-feature comparison, see:
http://watissharepoint.nl/

If Google can be hacked, is anyone safe?
By Seth Berman and Lam Nguyen of Stroz Friedberg

Published: March 5 2010 16:34 | Last updated: March 5 2010 16:34

Businesses pour millions of dollars into a never-ending “virus-antivirus” arms race, all the while wondering: “If the technology titans can be hacked, what are the chances that my own data is secure?”

IT sophistication means we can now watch individual data packets as they enter and exit systems; we can scan files for known viruses, as well as those yet to be written; we can examine corporate networks to see who’s online, what they are doing and how they are doing it. Yet we are still vulnerable.

At the heart of this insecurity is the “zero-day exploit”. The name refers to the number of days the software’s developers have had to fix the flaw: zero, because it is exploited before a patch exists.

Thus, a zero-day exploit takes advantage of the window of time between when developers are made aware of a problem and when the complete software fix can be developed and distributed.

Zero-day exploits, by definition, are vulnerabilities that have not been addressed by hardware and software manufacturers. Thus, there are no virus signatures to be downloaded or software patches to be updated, leaving the bad guys with the upper hand.

Add to this the complexity and sophistication of today’s attacks and it becomes easier to understand why industry giants such as Google can be hacked.

Recent reports indicate the Google attacks started on social networking sites. The attackers watched key Google employees to identify their friends and associates and hacked these accounts.

Then they used the information gained to contact other employees and, appearing legitimate, lured unsuspecting victims to nefarious websites, which provided the doorway through the company’s firewall.

These attacks have become known as “chained exploits” – a series of vulnerabilities and weaknesses that, when used in tandem, can break even the most secure systems.

In another example, a corporation detected a calamitous virus infection which plagued more than 500 computers in its network. It thought it had dealt with it successfully but six months later the company identified suspicious traffic indicating the presence of another virus.

After forensic analysis, it was discovered the “new” malware had been installed during the earlier attack. Once the second virus was in place, it didn’t matter to the hackers that the first virus had been destroyed.

Chained exploits not only create new vulnerabilities, they can lead to a false sense of reassurance by allowing the first virus to serve as a decoy, leaving the impression that efforts to destroy the first virus have solved the problem.

So what can you do to mitigate this risk? First, assume that no matter how good your firewalls, infections will happen from time to time. Maintaining diligent and timely patch management of applications, operating systems and network devices is a must – but not sufficient.

Where possible, restrict access to sensitive information to as few people in the company as possible – that way a breach of one person’s computer won’t hand over the keys to the kingdom.

You also need an emergency response plan in place before a virus attack, to assess whether the incident is the sort that can be dealt with using commercially available virus detection software (which will be true in most cases), or whether the infection is systemic or affects an especially sensitive system, constituting a breach of your central infrastructure.

In that case you might need to decompile the virus’s code to understand exactly what it did, how it operated, and seek expert advice on finding and containing the damage.
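A rough sketch of that triage decision, with invented categories and thresholds, might look like the following; a real response plan would of course encode its own criteria.

```python
# Toy triage for a malware incident: routine infections go to commodity
# anti-virus, systemic or sensitive ones are escalated. The 5 per cent
# spread threshold and the category names are assumptions, not doctrine.

def triage(infected_hosts: int, total_hosts: int,
           touches_sensitive_system: bool, known_signature: bool) -> str:
    """Return a response level for a detected infection."""
    if touches_sensitive_system:
        # Possible breach of central infrastructure: decompile the sample
        # and bring in expert help; assume chained exploits until ruled out.
        return "escalate: forensic analysis"
    if not known_signature or infected_hosts / total_hosts > 0.05:
        # Unknown malware, or a systemic spread across the estate.
        return "escalate: contain and investigate"
    return "routine: clean with commercial anti-virus software"

print(triage(infected_hosts=3, total_hosts=2000,
             touches_sensitive_system=False, known_signature=True))
# routine: clean with commercial anti-virus software
```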

The recent cyber attacks also reveal a changing motive – the hackers wanted to steal intellectual property or corporate secrets. Indeed, some recent hacks involved viruses that automatically copied every e-mail sent or received by key individuals to a shadow address, giving the hackers a clear view of company secrets.

In short, this new wave of hacking is corporate espionage. The implication is clear: previously, the financial risk of hacking was primarily of damage to a network and perhaps reputation. Now the risk is far greater – the new target is the business information upon which a company relies.


Stroz Friedberg provides digital forensics, incident response and electronic disclosure in the UK and the US. Seth Berman is a managing director in its London office; Lam Nguyen is a director of digital forensics in its Boston office.

Join the revolution – no more servers
By Brian Thomson, managing director at Rackspace for the EMEA region

Published: March 8 2010 16:46 | Last updated: March 8 2010 16:46

This is going to be a year of change as businesses look for more efficient ways to procure and manage IT, freeing up the time and money needed to focus on innovation.

Organisations are tired of managing technology in-house, particularly servers, leading to a shift towards IT outsourcing. Two recent reports from Gartner highlight that “the future of IT lies outside the IT department”, as business demand for IT-driven growth and innovation drives the need for a different skill set, and that “virtualisation and cloud computing will transform IT in 2010”.

In line with this demand for reducing IT costs and increasing innovation that will ultimately drive market share, there is a growing trend towards Computing-as-a-Service models, such as software-as-a-service, cloud computing and hosting.

This enables companies of all sizes to buy computing capacity on a pay-as-you-go basis, being charged only for what they use, without up-front capital cost, and freeing them from the shackle of the server.

Gartner estimates that 44m servers are in use worldwide and the great majority of these are deployed internally. New business models mean that instead of purchasing, maintaining, and upgrading these servers, businesses can simply consume computing resources from service providers.

A new era of computing

The business case for this consists of zero capital expenditure and minimal incremental headcount. It also takes away much of the risk from projects, since the responsibility for delivering the overall solution is given to service providers.
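A back-of-envelope comparison shows why the argument appeals; the numbers below are all assumptions for illustration, not quoted prices.

```python
# Owning a server (capital cost plus running costs) versus renting
# capacity by the hour. Every figure is an invented assumption.

SERVER_CAPEX = 5000           # assumed up-front cost of one server, pounds
SERVER_ANNUAL_RUNNING = 1500  # assumed power, space and admin per year
HOSTED_RATE_PER_HOUR = 0.20   # assumed pay-as-you-go price per server-hour

def owned_cost(years: int) -> int:
    return SERVER_CAPEX + years * SERVER_ANNUAL_RUNNING

def hosted_cost(years: int, hours_used_per_day: float) -> float:
    return years * 365 * hours_used_per_day * HOSTED_RATE_PER_HOUR

print(owned_cost(3))        # 9500 – paid whether the machine is busy or not
print(hosted_cost(3, 8))    # 1752.0 – a workload busy eight hours a day
print(hosted_cost(3, 24))   # 5256.0 – run flat out, the gap narrows
```

The pay-as-you-go case is strongest for intermittent workloads; a machine that is genuinely busy around the clock erodes much of the advantage.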

Computing-as-a-Service is going to become the mainstream delivery architecture for corporate computing in the next two years, transforming IT departments by bringing flexibility, security, performance and resilience to a function that has traditionally acted as an inflexible monopoly.

Cloud computing, an element of this “no more servers” approach, is currently being used to deliver both new and existing applications. As more solutions are delivered “cloud ready”, buying and deploying servers in-house no longer makes business sense. For revenue-generating applications, risk can be reduced still further by building the IT infrastructure wholly or partially in the cloud.

Computing-as-a-Service can be used in many different ways – it can be a combination of a cloud infrastructure and dedicated servers, enabling companies to benefit from the flexibility and cost savings of the cloud, but also the increased security and stability of managed hosting, depending on the needs of the business.

Innovate to get ahead

Since the mainframe, innovations have continued to extend the availability of computing capacity to more businesses. Each development also increases the amount of computing resource available to business, while driving down costs.

But adoption of “new” approaches is often slow. In part this is due to businesses demanding a return on their past investments before adopting a new approach. The larger the firm, the larger the sunk cost, which means the later the adoption.

This is why the first beneficiaries of each new computing innovation are typically smaller companies because they have less legacy equipment or bureaucracy stalling progress.

And as larger enterprises outsource more, there will be fewer systems to depreciate, decommission, or cling to in the hope of ROI, thus fuelling the process.

This year will see businesses demand more from their service providers, as a “No More Servers” approach must come hand-in-hand with excellent customer service and support.

Those joining the revolution, from small businesses to enterprises, will be empowered to achieve unprecedented productivity at minimal costs. In an ever-increasingly competitive market, what business couldn’t benefit from this?

What it means to be smart in the age of information
By Peter Siggins, mobile business specialist at PA Consulting Group

Published: March 11 2010 17:11 | Last updated: March 11 2010 17:11

Walking along Constitution Avenue in Washington DC, I was debating with my son whether the metro or the bus would be quicker at getting us to Silver Spring, just north of the District.

The debate was short. In 30 seconds he said: “Dad, take the bus”.

He had checked the times on his iPhone and communicated with a friend on Facebook as we walked – a simple and clear demonstration of how mobile technology has come of age, giving ready access to intelligent information, every day and everywhere.

The rate of adoption of mobile internet applications has surprised most market commentators. In the mid-1990s, AOL and Netscape reached 17m subscribers in just over two years, and this was considered a dramatic pace of increase.

In a similarly short time the mobile internet has reached 57m subscribers through the iPod touch and iPhone, driven by the remorseless rise in mobile phone subscriptions. These subscriptions are expected to reach 5bn this year, equivalent to three-quarters of the world’s population.

This is empowering people to carry out small tasks at any time, anywhere and it is changing the way they behave as consumers.

The changes are far-reaching. Mobile applications are not only creating another channel but also providing the opportunity to launch new products and services, create more intimate relationships with the consumer, and enable significant productivity improvements.

On the flip side, new mobile-enabled business models have significant disruptive power, representing a new threat to established business. Amazon’s Kindle is a case in point for the publishing industry.

The business sectors which are embracing the opportunities that mobile presents range from healthcare to retail banking to energy.

Healthcare providers are using mobile to reduce expensive infrastructure by giving patients with complex diseases more involvement in the management of their conditions without having to visit hospital. In the UK, the Telehealth project uses monitors to track and transmit the patient’s vital signs from home to clinicians remotely, reducing costs by a projected £1bn.
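A toy sketch of the idea behind such monitoring: readings taken at home are checked against thresholds and only the exceptions are raised to a clinician. The vital-sign names and limits below are illustrative, not clinical guidance.

```python
# Invented alert thresholds: (low, high) bounds per vital sign.
THRESHOLDS = {"systolic_bp": (90, 160), "heart_rate": (50, 110)}

def check_reading(vital: str, value: float) -> str:
    """Compare one transmitted reading against its alert band."""
    low, high = THRESHOLDS[vital]
    if value < low or value > high:
        return f"ALERT: {vital} = {value}, outside {low}-{high}"
    return f"ok: {vital} = {value}"

# Readings transmitted from a patient's home monitor:
for vital, value in [("systolic_bp", 172), ("heart_rate", 76)]:
    print(check_reading(vital, value))
# ALERT: systolic_bp = 172, outside 90-160
# ok: heart_rate = 76
```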

This is just the start. As the pressure on healthcare providers grows and reimbursement models, by which companies agree to refund the costs of unsuccessful treatment, become more prevalent, mobile monitoring will play a key role in checking whether drugs have been taken correctly.

In retail banking there is a slew of mobile products and services designed to improve the customer experience. These include simple two-way SMS services that offer instant information and reduce the cost of customer service.
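As an illustration of how simple such a service can be, here is a minimal sketch of a two-way SMS handler. The keywords, reply formats and account store are all invented rather than any particular bank’s system.

```python
# Invented account store keyed by the customer's registered mobile number.
ACCOUNTS = {"+447700900123": {"balance": 412.16,
                              "last_tx": "-12.50 COFFEE CO"}}

def handle_sms(sender: str, body: str) -> str:
    """Map an inbound text message to an automated reply."""
    account = ACCOUNTS.get(sender)
    if account is None:
        return "Number not registered. Reply HELP for assistance."
    keyword = body.strip().upper()
    if keyword == "BAL":
        return f"Your balance is {account['balance']:.2f} GBP."
    if keyword == "LAST":
        return f"Last transaction: {account['last_tx']}."
    return "Unknown request. Text BAL for balance or LAST for last transaction."

print(handle_sms("+447700900123", "bal"))
# Your balance is 412.16 GBP.
```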

More advanced mobile services using mobile web and downloadable applications not only replicate the online banking experience but are also capturing new revenues from payments and other added value services.

At the same time, new entrants such as Mint.com and Square are providing financial services unique to the mobile device in ways that threaten to disrupt the traditional banking models.

The energy sector is investing heavily in mobile, helped by governments. Under the US administration’s stimulus package, Congress has authorised investment of about $4.5bn to propel smart-grid development. This allows for remote monitoring of power usage and more efficient management of the grid.

A number of companies have seen the potential of this approach, with Pepco Holdings recently unlocking $168m for its smart grid projects. Utilities in California and Texas alone are spending $6bn on advanced digital meters and related systems, a key building block in a smart electric grid.

For any business there are many obstacles to overcome before the full benefits of mobile business can be realised. Implementing a mobile business strategy requires considerable innovation and experimentation, working in partnership with a complex ecosystem of network operators, device manufacturers, application developers, as well as internal functions such as IT.

This also presents significant leadership challenges, first and foremost in managing risk and reputation, but also in helping the organisation embrace the potential of mobile through providing top-down sponsorship and leadership of its implementation.

In many ways this is not new: the availability of information for consumers has long been a driver of business development. Benjamin Brandreth (of Brandreth Vegetable Pills) pioneered the use of the giant billboard 150 years ago. Huge billboards screamed “buy my pills” at the public from the tops of buildings and from the roadside, and changed the way consumers found out about products.

Today’s smart information age offers the opportunity to achieve a similar transformation. It is not too fanciful to imagine products being sold through an interactive poster, with people browsing, buying and paying while they ride the metro.

The prizes for getting mobile right are significant. Mobile business is starting to deliver on expectations, and offers a wealth of future possibilities, as demographics change and technology-savvy kids grow to become the consumers of tomorrow. Smart business leaders are the ones seizing the opportunities mobile offers.

Selling via social media brings inconsistency dangers
By Mark Thorpe, UK managing director for Stibo Systems

Published: March 15 2010 11:44 | Last updated: March 15 2010 11:44

Social media has caused an explosion in new ways to market and sell products, bringing with it complexities in managing product information – and new ways for everything to go wrong.

Many businesses have been quick to use Facebook, YouTube and bloggers for their huge potential as marketing channels, helping to increase online sales by targeting a large user group.

But organisations should be aware of the dangers. Keeping data consistent between existing channels such as catalogues, stores and websites can already cause problems, so how do they begin to deal with Facebook, Twitter and YouTube as well?

Making the most of social media

Dell, for example, seems to be getting it right, having made more than $3m in sales via Twitter alone. Dell regularly tweets its latest promotions to consumers that follow the brand online.

This is a great way of conducting “passive” sales, after consumers have already expressed an interest in the brand, decided to engage with it, and are therefore happy to be targeted with special offers.

Another good example is GEMaffair.com, a US retailer which recently held a contest on Twitter asking followers to enter by re-tweeting [forwarding, or re-broadcasting] a promotional message, which contained a coupon. A viral bonanza of re-tweets ensued.

One lucky tweeter won a $100 jewellery set. The online jeweller received 18 orders that day associated with the contest, which more than covered its costs.

The dangers of social media marketing

However, as well as promoting benefits to consumers, social networks also exacerbate company mistakes by spreading news fast. Connected users can update anywhere, anytime, and a negative story can spread at the touch of a button.

Best Buy, for example, mistakenly advertised a Samsung flat screen TV for less than $10 back in August. Word spread through the online retail community via social media at an alarming rate, leaving Best Buy trying to limit the damage.

Multi-channel mix

Shopping habits are adapting to the new channels available: many consumers see a product in a store and then buy it later – and sometimes cheaper – online. The opposite is also true when shoppers see something online, which encourages them to make their way to the store to see it in “real life” before parting with their cash.

As increasing numbers of people move between online, catalogue and in-store shopping, the task of keeping product data consistent across the board grows ever more complex.

There is no point tweeting a link to a product, or posting on a fan group on Facebook, if that same product is available at a lower price in a store or via a catalogue.

How to ensure consistent product information

Organisations are becoming more strategic in their data management processes. Many are setting up data governance teams, led by business staff rather than those in IT, to interact with all departments.

This is a positive step, as it is only when senior business people start to take ownership of master data issues that a genuine improvement in the field will occur.

Business leaders are recognising the importance of centralising data and following a standardised process. Putting product information in one place, where all departments can easily access it, is essential and saves time and money.
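A sketch of the single-source idea: one master record per product, from which each channel view is derived, so a price change propagates everywhere at once. The field names and channels are invented.

```python
# One master record per product – the single place the data lives.
master = {"sku-901": {"name": "Garden chair", "price_gbp": 24.99}}

def website_listing(sku: str) -> str:
    p = master[sku]
    return f"{p['name']} – £{p['price_gbp']:.2f}"

def promo_tweet(sku: str) -> str:
    p = master[sku]
    return f"Offer: {p['name']} now £{p['price_gbp']:.2f}!"

master["sku-901"]["price_gbp"] = 19.99   # one update at the source...
print(website_listing("sku-901"))        # ...reflected on the website
print(promo_tweet("sku-901"))            # ...and in the tweet, consistently
```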

Final thoughts

It is important for organisations to get the basics right before launching themselves into the world of social media.

Social media is always evolving, but its potential for driving sales will remain. Businesses need to be 100 per cent on the mark when it comes to what they are saying about their products – wherever they happen to be saying it.

Options open up for the minimalist business
By Stephen Pritchard

Published: March 17 2010 15:17 | Last updated: March 17 2010 15:17

Twenty years ago, a laptop computer represented a significant investment for any business. Today, it costs little to buy an entry-level laptop capable of running day-to-day business applications, reasonably demanding graphics or computational software, and even entertainment applications.

According to Eilert Hanoa, chief executive of Mamut, a Norwegian software vendor specialising in small and mid-sized businesses, the cost of providing an employee with a laptop and a phone is now as low as one pound a day – an historically low figure for equipping a member of staff with the tools of their trade.

As a result, smaller businesses have also benefited from the way both the internet and mobile networks have changed their communications.

Companies can set up their telephone links, as well as links to the internet, using inexpensive mobile phones and cellular data “dongles”.

For more permanent set-ups, technologies such as voice over internet protocol (VoIP) have largely replaced conventional phone lines for a growing number of businesses. According to Forrester, the industry analyst firm, 14 per cent of all companies have deployed VoIP to the desktop, and a further 34 per cent are considering doing so.

Cheap, ubiquitous and increasingly quick internet access opens up a further set of options to businesses when it comes to running their IT: moving to “cloud” services.

Rather than hosting back office applications on their own computer hardware, businesses can rent space on remote servers, or even rent applications on a per-user basis from companies such as Salesforce.com and NetSuite.

Even traditional desktop applications such as word processing are moving towards the cloud, through services such as Google Docs and Microsoft’s Office Web suite. It might not be going too far to suggest that a functioning business needs little more than some bandwidth, a web browser, and a way to print documents.

“It’s becoming rather like ordering a dish off an a la carte menu,” says Rob Lovell, chief executive and founder of ThinkGrid, a company that provides hosted desktop applications to SMEs over the internet. “You can have your e-mail, phone and desktop applications anywhere you want.”

But the flexibility and productivity gains offered by information technology have to be balanced against the time, skills and money needed to manage the technologies.

As Andy Mulholland, global chief technology officer at Capgemini, the consultants, points out, businesses now spend, on average, just 5 per cent of their time on back office processes, mostly as a result of efficiencies brought by IT.

But for smaller companies, and owner-managed businesses in particular, IT is also seen as a burden. ThinkGrid’s Mr Lovell says: “They are spending more time fixing it when it is broken. They don’t want to install or manage it.”

This is leading to a new generation of companies taking a radically different approach to technology, seeing it less as an asset to be bought and managed, and more as a service to be rented or paid for as it is used.

Companies such as ThinkGrid, Salesforce.com and NetSuite base their business models on the premise that companies want IT specialists to run their IT, so they can focus on running the business. Larger enterprises have long turned to outsourcing service providers to do exactly that.

The cloud, though, breaks down IT services into bite-sized chunks that make them affordable to smaller businesses.

“All applications will move to the cloud,” predicts Zach Nelson, NetSuite’s chief executive. Businesses will move away from software applications that force them to buy, run and maintain expensive operating systems, Mr Nelson suggests.

But not all businesses are moving at the same pace: “We do see traditional businesses migrate from on premise software to cloud, but we also meet entrepreneurs who would never run their own software,” he says. “They play games over the net, so they ask why they should run their own ERP (enterprise resource planning).”

Independent analysts agree that today’s technologies allow businesses, especially smaller companies, to do away with a whole tier of IT infrastructure.

“If the question is: ‘is it feasible to have a laptop with a net connection on tap and a range of services delivered over the internet?’, then that is the case,” suggests Joslyn Faust, an analyst in Gartner’s small to midsize business research group.

At the same time, it is not yet the case that a business can move into an office with a broadband connection, plug into an Ethernet socket or connect up to a mobile broadband service and find all their applications and services on tap.

One reason is that, according to Karl Boone, a Boston-based IT delivery expert at PA Consulting, it still requires a high degree of skill to pick the right cloud-based services. “There is still a view that IT is something that runs on boxes but what businesses need are applications. But they do need help with which applications to run,” he explains.

Businesses might also need guidance on how to connect a range of cloud-based applications together – in itself a challenging task, and one that cloud service providers have only recently started to address, through hosted applications that can mediate between programs.
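To illustrate what such mediation involves at its simplest, here is a sketch of translating a record between two applications that do not share a schema. Both record formats and the field mapping are invented.

```python
# A record as one (invented) CRM application exposes it.
crm_record = {"ContactName": "A. Smith",
              "ContactEmail": "a.smith@example.com"}

# How the CRM's fields map onto an (equally invented) accounting app.
FIELD_MAP = {"ContactName": "customer_name",
             "ContactEmail": "customer_email"}

def mediate(record: dict, field_map: dict) -> dict:
    """Translate a record from one application's schema to another's."""
    return {field_map[k]: v for k, v in record.items() if k in field_map}

print(mediate(crm_record, FIELD_MAP))
# {'customer_name': 'A. Smith', 'customer_email': 'a.smith@example.com'}
```

Real integrations also have to handle authentication, retries and fields with no counterpart, which is why hosted mediation services exist at all.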

Organisations also need to think about where their data is stored, and how well protected it is against virus attack or data theft, especially if they are handling personal or customer data.

Then there is the question of data ownership, and what would happen if a cloud-based software provider were to fail, or ceased trading. Simply having a local copy of the data on a PC or laptop is of limited use, if the business is unable to access the application itself.

A further concern is that access to a broadband connection cannot be guaranteed. While Mr Lovell at ThinkGrid points out that five or even 10 users could connect to his service over a conventional domestic or small business broadband link, even advocates of a minimalist approach to IT concede that businesses need a robust broadband service.

This might need to include SDSL (symmetrical digital subscriber line) or redundant ADSL connections, especially if they also use their broadband lines for streaming media or telecommunications.

As a result, some businesses might prefer to continue to run some applications themselves, or look at applications that work offline.

“That’s why we’ve adopted a hybrid software and services model, with the application running locally but 100 per cent in sync,” explains Mr Hanoa of Mamut. “To play both horses you have to be able to access the software offline, but also still access it remotely.”
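A toy sketch of that hybrid pattern – write locally, queue changes while offline, replay them when the connection returns. The store and queue are deliberately simplified and are not Mamut’s actual design.

```python
class HybridStore:
    """Local-first store that mirrors writes to a remote copy when online."""

    def __init__(self):
        self.local = {}      # always available, even offline
        self.remote = {}     # stand-in for the hosted copy
        self.pending = []    # changes made while offline
        self.online = False

    def write(self, key, value):
        self.local[key] = value                # local write always succeeds
        if self.online:
            self.remote[key] = value           # mirror immediately
        else:
            self.pending.append((key, value))  # replay later

    def reconnect(self):
        self.online = True
        for key, value in self.pending:        # drain the offline queue
            self.remote[key] = value
        self.pending.clear()

store = HybridStore()
store.write("invoice-42", "draft")   # offline: queued locally
store.reconnect()                    # back online: queue drained
assert store.remote["invoice-42"] == "draft"
```

A production system would also need to resolve conflicts when the same record changes in both places; this sketch simply replays offline writes in order.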

None the less, the pressure on companies – especially smaller companies – to stretch their budgets further and reduce both the cost and complexity of IT, is set to continue through 2010.

“Small business has two problems: a lack of access to capital, and a lack of access to skills,” says Sean Poulley, vice president of online collaboration services at IBM. “Anything you can do to restrict these two issues is welcome. And smaller businesses want to align their [IT] costs more closely with their business needs.”

Sorting out the printing device ‘zoo’
By Jessica Twentyman

Published: March 18 2010 16:30 | Last updated: March 18 2010 16:30

Straight-talking James De Watteville doesn’t mince his words when it comes to describing his company’s sprawling estate of printing devices. “It’s a zoo,” he says, “and it needs sorting out.”

That “zoo” is populated by around 1,000 printers, serving 8,000 people across 22 different sites in the UK. Some of the devices are ancient, some are unreliable. A complex mesh of maintenance agreements and in-house resources is dedicated to their care and attention. It is all far too expensive.

Mr De Watteville is chief information officer of insurance company RSA (formerly Royal Sun Alliance), and he describes a situation that will sound familiar to many IT bosses, according to Louella Fernandes, an analyst with IT market research company Quocirca.

“Many print environments still have a tangled mix of old and new equipment, both over and underutilised, leading to high costs due to the variety of consumables (both storage and procurement) as well as the financial and environmental costs associated with paper, ink, toner and energy consumption,” she says.

“Often, management and procurement of the printing infrastructure is distributed across locations and geographies, with no centralised tracking system to monitor and analyse costs.”

The true cost of unmanaged printing can startle businesses, she says. Industry estimates suggest companies spend between 1 and 3 per cent of their annual revenues on maintaining printer estates. Analysts at Gartner, meanwhile, estimate that using managed print services [MPS] can reduce that by between 10 and 30 per cent.
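Working those percentages through for a hypothetical business shows the scale involved; the £100m revenue figure is arbitrary.

```python
# Print spend of 1-3 per cent of revenue; MPS savings of 10-30 per cent
# of that spend, per the estimates quoted above. Revenue is an assumption.

revenue = 100_000_000  # example: a GBP 100m business

for spend_pct in (0.01, 0.03):
    spend = revenue * spend_pct
    low, high = spend * 0.10, spend * 0.30
    print(f"Print spend GBP {spend:,.0f}: "
          f"saving GBP {low:,.0f} to GBP {high:,.0f} a year")
# Print spend GBP 1,000,000: saving GBP 100,000 to GBP 300,000 a year
# Print spend GBP 3,000,000: saving GBP 300,000 to GBP 900,000 a year
```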

“Under MPS, a service provider takes primary responsibility for meeting the customer’s office printing needs, including the printing equipment, the supplies, the service and overall management of the printer fleet,” Gartner’s report explains.

“The main components provided are a needs assessment, selective or general replacement of hardware, and the services and supplies needed to operate the new and/or existing hardware.”

MPS services also extend to the kinds of high-volume printing requirements typically provided by an internal reprographics department or, alternatively, a third-party printing provider, says Carsten Bruhn, vice president of Ricoh Global Service Europe.

In a recent conversation with a large organisation in Finland, he was told that materials such as marketing documents were typically outsourced to a third party, which insisted on minimum print volumes for each project. Under the company’s new MPS deal, Ricoh instead prints as many – or as few – copies as needed, allowing the company to customise materials for customers in different markets.

Mr De Watteville at RSA hopes that a newly signed MPS contract will alleviate many of the headaches he and his team face. During the first quarter of 2010, printer manufacturer Kyocera Mita will begin replacing RSA’s printer estate with 375 multi-function devices, reducing the ratio of printers to people from 1:8 to 1:21.

Kyocera Mita will also take over the management of the refreshed estate, relieving RSA staff of the burden. Mr De Watteville says that, as a result, the company’s spend on printing over the next five years will shrink from £7.5m to £2.5m.

At the same time, the new MPS contract will help RSA meet key environmental targets, says Paul Pritchard, RSA’s UK head of corporate responsibility. “Fewer printers mean reduced energy consumption – but it will help us drive some important behavioural changes in the area of paper consumption as well.”

The company’s efforts to encourage staff to print on both sides of the paper have been only partially successful, he explains. Some of the current machines do not even allow it, but the new multi-function devices will have double-sided printing as the default option.

According to Gartner, MPS has so far penetrated only between 5 and 20 per cent of the target market. Very large organisations are the ones likely to gain most from MPS, the firm says, “because they probably have a widely dispersed printing environment with big potential for print optimisation”. But all can benefit.

However, companies appear to struggle to choose the right MPS provider, says Gartner analyst Ken Weilerstein. Definitions of the services on offer vary, he says, and MPS providers’ websites typically give only vague indications of what the vendors do. “In part [this is] because MPS tends to be more customised than print hardware.”

The problem, he says, is getting worse as new kinds of MPS providers emerge, including most of the printer/copier manufacturers and dealers and even office supplies wholesalers.

To emphasise its role in the market, Hewlett-Packard has launched a global business unit called Managed Enterprise Services (MES), with a view to boosting its MPS offering. This will be led by Bruce Dahlgren, an HP senior vice president, and draws heavily on resources acquired in HP’s purchase of EDS.

HP has also introduced a “Payback Guarantee” for its MPS customers. Mr Dahlgren explains: “In conversations with customers, we’re hearing that a lack of visibility means that the big costs associated with printers are not being addressed.” Under the scheme, any company that does not make the cost savings that HP’s consultants project for them within 12 months will be refunded the shortfall.

Other companies make similar offers. Xerox, for example, guarantees total cost of ownership reductions within year one of deployment. Unlike other MPS providers, Xerox does not force a refresh of the printer fleet, but will take over management of other manufacturers’ printers if the customer requests it, according to Andy Jones, director and general manager of Xerox Global Services.

US-based manufacturer Ingersoll Rand has recently signed a nine-year MPS contract with Xerox to deal with its complex estate of printers, which fall into three categories, according to John Kalka, vice president of deployment in Ingersoll Rand’s Office of the CIO. Some printers are owned by Ingersoll Rand; some are leased; and some are already under an MPS contract with Xerox.

Under the new Xerox contract, printers in the first and last categories will be replaced only on an as-needed basis, while those that are leased will be decommissioned. “We needed a strategic partner we could make responsible for both quality and cost in a truly variable cost model based on usage,” says Mr Kalka.

A further benefit of bringing a printer estate under MPS is that no document appears in a printer tray until the person who requested it is at the device and has authorised it. At RSA, for example, staff will swipe their employee badge on the printer’s reader. Other companies use PIN codes to achieve the same thing.
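
The mechanism, often called pull printing, is simple enough to sketch in Python. All names below are invented for illustration and belong to no vendor’s API:

# Minimal sketch of pull printing: jobs are held centrally and
# released only when the owner authenticates at the device with a
# badge or PIN. All names here are invented for illustration.
from collections import defaultdict

class PullPrintQueue:
    def __init__(self):
        self._held = defaultdict(list)   # user id -> pending documents

    def submit(self, user_id, document):
        """Hold the job server-side; nothing reaches a tray yet."""
        self._held[user_id].append(document)

    def release(self, user_id, badge_id):
        """Print held jobs only once the badge matches the user."""
        if badge_id != user_id:          # stand-in for a directory lookup
            raise PermissionError("badge not recognised for this user")
        jobs, self._held[user_id] = self._held[user_id], []
        for doc in jobs:
            print(f"printing {doc!r} for {user_id}")

queue = PullPrintQueue()
queue.submit("jsmith", "renewal-quotes.pdf")
queue.release("jsmith", badge_id="jsmith")   # swipe at the device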

That kind of service addresses an issue that many businesses continue to overlook, says Graham Long, vice president of Samsung’s European printing operation. Unsecured printing environments, he claims, pose serious risks for European companies that could result in legal action, lost clients and lost revenue – and yet few businesses are aware of the reputational and financial risks their printing infrastructure poses.

In research recently conducted by the company among 4,500 workers across Europe, more than half of respondents (56 per cent) said that they see confidential company documents, printed by someone else, sitting on the printer at least once a month. One in five claimed to see such documents every day.


Wednesday, March 17, 2010

Vakwereld :: Nieuws

Vakwereld :: Nieuws: "Mindjet voor visuele weergave Microsoft SharePoint-gegevens"

Mindjet, a supplier of collaboration and individual productivity tools that visually connect ideas, information and people, is introducing MindManager for SharePoint. This new visualisation solution enables individuals and teams to display tasks, documents and other lists in Microsoft Office SharePoint Server 2007 more interactively, making the data easier to find, structure and edit. As a result, projects and business processes move faster.

MindManager for SharePoint lets users build intuitive, user-friendly views of a wide variety of SharePoint data. Through MindManager's standard functionality, data from different Office SharePoint Server 2007 sites and lists can easily be found, filtered and updated. Users can choose whichever view works best for them and specify which tasks, documents, issues, calendar items, images and links are shown.

Displaying SharePoint Server 2007 information more visually makes better use of existing knowledge, gives faster access to information and makes it easier to reach new insights across multiple projects. Users can also see the relationships between different projects and processes spread across multiple SharePoint sites and lists.

Interactive information display
The integration between MindManager and SharePoint Server 2007 works in both directions. Users can create dashboard maps that display tasks, documents and calendar items for one or more sites. In addition, all available SharePoint Server 2007 content becomes easier to search: results are laid out visually, and the data shown can be accessed directly within MindManager. Users can thus collaborate dynamically from their desktops, and information can be viewed and edited through MindManager's browser.
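
For the curious, the round trip this implies can be sketched against the SOAP web services that SharePoint Server 2007 exposes. The snippet below is a generic illustration of such a query, not Mindjet's implementation; the site URL, list name and credentials are placeholders:

# Sketch of the kind of two-way access MOSS 2007 exposes: its
# Lists.asmx SOAP service returns list items (tasks, documents,
# calendar entries) that a client such as MindManager could render
# visually. Site URL, list name and credentials are placeholders.
import requests  # third-party HTTP library

SITE = "http://sharepoint.example.com/sites/projects"  # placeholder
ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetListItems xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <listName>Tasks</listName>
    </GetListItems>
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    SITE + "/_vti_bin/Lists.asmx",
    data=ENVELOPE,
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "http://schemas.microsoft.com/sharepoint/soap/GetListItems",
    },
    auth=("DOMAIN\\user", "password"),  # real servers typically need NTLM
)
print(response.status_code)
print(response.text[:300])  # raw XML of the returned list items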

Microsoft Office and Outlook integration
MindManager for SharePoint captures information visually and organises it so that it can also be used in the strategic planning, presentation and elaboration of a project. Users can export plans to Microsoft Project 2007, lay out a project's design and elaboration in Microsoft Office Word, and then turn their ideas into Microsoft Office PowerPoint presentations.

"De huidige CIO's moeten meer rendement behalen op hun huidige technologie-investeringen en zakelijke activa op een betere wijze benutten", zegt Scott Raskin, CEO van Mindjet. "Het is geen eenvoudige taak, vooral omdat bedrijven steeds meer worden verdeeld. MindManager for SharePoint vormt een aanvulling op de sterke collaboration-software van Microsoft door informatie op een visuele manier weer te geven. Dit verbetert projectmanagement en teamsamenwerking, stimuleert de innovatie en levert dus betere resultaten op."

"Microsoft Office SharePoint Server 2007 is wereldwijd het meestgebruikte collaboration-platform dat het delen van informatie binnen organisaties vergemakkelijkt", zegt Owen Allen, Sr. product manager bij Microsoft. "Partners zoals Mindjet zijn in staat de out-of-the-box mogelijkheden van SharePoint Server 2007 uit te breiden en gebruikers te helpen bij het besparen van tijd, het sneller oplossen van zakelijke problemen en processen en innovatie te versnellen."

MindManager for SharePoint is available immediately in three languages: English, German and French.

Date: 25 February 2010
Category: General
Source: Mindjet

Persberichten.com - Current press releases on ICT, telecoms and internet business

Persberichten.com - Current press releases on ICT, telecoms and internet business: "e-Spirit extends FirstSpirit 4.2 with a new release version"

New functions for editors and developers

Dortmund, 17 March 2010 – With "FirstSpirit Version 4.2 Release 2" (4.2R2), e-Spirit is releasing a new version of its content management system. It gives editors, interface developers and administrators numerous new functions and improvements within the already released 4.2 minor-version line. Highlights include Dutch-language support for the FirstSpirit applications and support for Internet Explorer in JavaClient's integrated preview. Updating a project from 4.2 to 4.2R2 involves no migration costs.

The new features of FirstSpirit 4.2R2 at a glance:

•Dutch language support
The FirstSpirit applications JavaClient, WebClient, server and project configuration, and server monitoring are now also available in Dutch (alongside German, English, French, Spanish, Italian and Russian). The language for context menus, labels and dialogues can be selected on the start page.
•Microsoft Internet Explorer for the integrated preview
On Windows systems, Internet Explorer can now be used alongside Mozilla Firefox to render the preview in FirstSpirit JavaClient. Editors can switch between the two browsers at will to check how their websites display in each.
•Full-text search in the FirstSpirit online documentation and Access API
4.2R2 includes a full-text search that also covers the contents of the online documentation and the FirstSpirit Access API. Results can be filtered by target group (administrators, users, developers), and hits for developers can additionally be filtered by topic.
•Use of FirstSpirit under Mac OS X
With 4.2R2, e-Spirit implements key features and optimisations for the release of FirstSpirit on Mac OS X planned for future versions. The look and feel of JavaClient and of the server and project configuration tools has been adapted to familiar Mac menu handling. In place of the Windows keyboard shortcuts used until now, the Mac shortcuts using the Command ("apple") key can be applied. The integrated preview under Mac OS has also been extended to 64-bit systems.
•Filtering, searching and limiting of displayed data
To ease the work of editors who handle large volumes of data, Release 2 introduces restricted and filtered display of data.

New e-Spirit release management

With the introduction of this release version, e-Spirit is changing its previous release-management practice. The "build versions" customary up to FirstSpirit 4.1 mixed bug fixes with smaller new functions; since version 4.2, two clearly distinguished types of software version exist within a minor-version line: bugfix versions contain only fixes, while release versions also contain a set of new functionality. The rationale is that release versions let e-Spirit announce new features at fixed points within the minor-version lines while preserving the stability of the software.

Further information about e-Spirit and FirstSpirit can be found at www.e-Spirit.com.

End of press release

About e-Spirit and FirstSpirit™
As the maker of the high-end content management system FirstSpirit™, e-Spirit is a technology leader in the European market. The company was founded in 1999 by former members of Fraunhofer ISST in co-operation with the IT service provider Adesso. Alongside licences and FirstSpirit™ integration, e-Spirit offers comprehensive support for complex internet and intranet projects. Customers benefit from the consultancy and implementation expertise of partner companies.

FirstSpirit™ is a high-end content management solution that can be integrated with complex system environments and portals. Customers use the content management system (CMS) to publish a broad selection of content through virtually every available channel (internet, intranet and extranet, PDF, e-mail newsletters, portals, DTP, technical documentation). Users can expect a high degree of usability, performance, integration capability and investment security. Beyond its extensive out-of-the-box functionality, e-Spirit also integrates innovative solutions from its technology partners into FirstSpirit™'s modular base structure.

Press contact
Sandra Högemann/Oliver Jäger
e-Spirit AG
Barcelonaweg 14, 44269 Dortmund, Germany
+49 (0)231 28661-73 / -66
Presse@e-Spirit.com
www.e-Spirit.com

FT.com / Technology - Facebook becomes bigger hit than Google

FT.com / Technology - Facebook becomes bigger hit than Google: "Facebook becomes bigger hit than Google"

Facebook becomes bigger hit than Google
By Chris Nuttall and David Gelles in San Francisco

Published: March 16 2010 13:15 | Last updated: March 17 2010 00:10

Social networking website Facebook has capped a year of phenomenal growth by overtaking Google in popularity among US internet users, with industry data showing that its home page scored more visits last week than the search engine’s.

It is the first time that Facebook.com has enjoyed a weekly lead over Google.com. The lead may be slim, but it caps a rapid rise: a year ago, Facebook accounted for just over 2 per cent of visits. Heather Dougherty of Hitwise, the internet traffic measurement firm, said that Facebook had “reached an important milestone” with the weekly figures.

Facebook’s membership has more than doubled in the past year, passing the 200m mark last April and 400m in February.

“The true value of Facebook and social networks is just becoming clear to marketers,” said Augie Ray, analyst at Forrester Research.

Although Facebook is enjoying rapid growth, it is only beginning to cash in on its success. Revenues at the social media company are estimated to be in the range of $1bn to $1.5bn this year, while Google took in $23.7bn last year.

Google responded to the ascendancy of the social networking site by launching its own Buzz service last month. Buzz allows users to add status updates, friends, pictures, videos, location information, comments and links to other networking sites. Buzz, though, has struggled with privacy concerns, just as Facebook has been criticised for encouraging members to reveal personal data to search engines.

The Hitwise figures only cover visits to the Google.com site, meaning that services such as Gmail, YouTube, Google Maps and searches carried out in a box in a browser toolbar are excluded. Taking all Google properties into account, the internet company accounted for 11.03 per cent of US website visits last week, compared with 10.98 per cent for Yahoo properties and 7.07 per cent for Facebook, according to Hitwise.

Facebook’s trajectory suggests that it will soar ahead of Google.com in the coming months. However, social networking sites have fallen from favour in the past: Google.com had led since September 2007, when it overtook News Corp’s MySpace.com.

Internet users worldwide spent more than five-and-a-half hours a month on social networking sites such as Facebook and Twitter in December 2009, an 82 per cent increase over the previous year, according to the Nielsen Company research firm.

US users spent nearly six-and-a-half hours on Facebook compared with fewer than two-and-a-half hours on Google.

FT.com / Technology - SAP aims to dispel its old school image

FT.com / Technology - SAP aims to dispel its old school image: "SAP aims to dispel its old school image
By Richard Waters in San Francisco"

Published: March 17 2010 03:13 | Last updated: March 17 2010 03:13

Joint approach: co-chief executives Jim Snabe (left) and Bill McDermott

If the new bosses of SAP are to put Europe’s biggest software company back on track, they will need to pull off nothing less than a cultural revolution.

Reeling from a botched fee increase that brought a rebellion from customers and precipitated the abrupt departure last month of Léo Apotheker as chief executive officer, SAP is now looking to force rapid internal change.

“I don’t think we did a lot of things wrong in the past – we just have to do them a lot faster,” Jim Snabe, co-chief executive officer, said during a visit to Silicon Valley this week.

“Speed” and “innovation” are now the words that come first from the lips of the new management team, which includes Bill McDermott, the American software salesman who shares the chief executive title, and Vishal Sikka, the chief technology officer who has just been elevated to a board position.

For a company whose name is a byword for stolid reliability, and whose systems are known for their rigidity and inflexible user interfaces, this points to a much-needed cultural shift.

Some of SAP’s biggest technology bets of recent years have failed to yield the results it hoped for. One of the main ones – a new generation of more flexible software based on so-called “service-oriented architecture” – has delivered little so far, because corporate customers have not seen enough value in retro-fitting their old systems to adopt it, admits Mr Sikka.

Meanwhile, SAP’s first big push into “cloud computing” – an on-demand system for medium-sized businesses known as Business By Design – has been bedevilled by delays. “They’ve been working on this for many years, and whatever they’ve been doing, it hasn’t produced much,” says Michael Cusumano, a professor at the Massachusetts Institute of Technology and an expert in software development techniques.

With technology change accelerating and corporate users wanting to get technology into the hands of a wider group of workers much more quickly, SAP is under pressure to change its ways.

The clearest symbol of this internal upheaval has been the adoption of a new approach to software development.

According to Mr Snabe, SAP first started dabbling a year ago with so-called “agile” development techniques, which eschew the rigid, up-front project planning the company has traditionally used. Instead, the agile method relies on programmers working with customers on small elements of a project, using rapid trial-and-error to reach a solution before moving on to the next part.

“They’re very late with this,” says Mr Cusumano. “The old approach isn’t working – it probably shows they have reached a crisis.”

Even analysts who claim SAP’s new methods are already bringing it closer to its customers concede that this is only the start of a bigger overhaul. “SAP has been saddled by the perception that it’s old-school,” said Josh Greenbaum at Enterprise Applications Consulting.

As SAP’s new chiefs try to turn around its methods, meanwhile, they are facing twin technology upheavals that could open up big new markets for the company – if it can find a better way to respond than it has so far.

The rise of cloud computing and software delivered over the internet has so far found SAP wanting, due to the delay in Business By Design, which the company says will finally launch later this year.

Yet most of its big customers are expected to move to the cloud only gradually, combining their existing systems with services delivered over the internet. That could play to SAP’s strengths, since it should be better positioned than online-only companies to handle and integrate all aspects of a customer’s data.

A second upheaval is the advent of “in-memory” databases – the hardware advances that are pushing corporate data much closer to the processors that need to work on it. Without the need to call on separate database systems to retrieve data, corporate systems could work far more efficiently and quickly, says Mr Sikka. Eventually, he adds, that could prompt the renewal of SAP’s core applications, representing a new generation of technology as important as the company’s original breakthrough resource planning systems.
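
The principle is easy to show in miniature. In the toy Python sketch below (an illustration of the general idea, not SAP’s technology), sqlite3 stands in for a separate database engine, while a plain in-memory structure plays the part of data held next to the processor:

# Toy illustration of the in-memory argument above (not SAP's
# technology): once rows already sit in the application's memory,
# an aggregate needs no round trip through a separate database
# engine. sqlite3 here stands in for that separate engine.
import sqlite3
import time

rows = [(f"cust{i}", i % 100) for i in range(1_000_000)]

# Conventional path: hand the data to a database engine and query it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
db.executemany("INSERT INTO orders VALUES (?, ?)", rows)

t0 = time.perf_counter()
total_db = db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
t1 = time.perf_counter()

# In-memory path: the data is already in the process's own structures.
t2 = time.perf_counter()
total_mem = sum(amount for _, amount in rows)
t3 = time.perf_counter()

assert total_db == total_mem
print(f"database query: {t1 - t0:.4f}s, in-memory sum: {t3 - t2:.4f}s")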

Big technology visions such as these are nothing new for SAP. But as its new leaders are well aware, they now need to prove they can make them real much faster.


Monday, March 15, 2010

Aspen Institute report: SAP leads the way in social media

Rapport Aspen Institute: SAP loopt voorop op gebied van social media: "'s Hertogenbosch - 11 maart 2010 - Een nieuw rapport van het Aspen Institute, getiteld 'Leveraging the Talent-Driven Organization' biedt inzicht in hoe een aantal bedrijven gebruikmaken van social networking tools om open manieren van leren, communiceren en internationale samenwerking te verbeteren. Daarnaast zetten ze deze middelen in voor de ontwikkeling van nieuwe producten en real-time oplossingen voor klanten. Met commentaar van leiders van internationale bedrijven, waaronder Mark Yolton, Senior Vice President van SAP, betiteld het rapport het SAP Community Network (SCN) als het hoogstwaarschijnlijk meest gebruikte sociale media platform tot nu toe door een onderneming."

An early look at SharePoint 2010 | Windows - InfoWorld

An early look at SharePoint 2010 | Windows - InfoWorld: "Microsoft SharePoint 2010"

Microsoft SharePoint 2010 is a tremendous improvement over previous versions for both developers and IT professionals, enabling the next generation of collaboration

Wednesday, March 03, 2010

Wissensmanagement 3.0

Wissensmanagement 3.0: "BarCamps"

The economic developments of the past two years have shown that companies find themselves in an increasingly unmanageable and therefore harder-to-plan environment. Complexity theory can help here, on the one hand to assess one's own situation better and, on the other, as a metaphor for adapting leadership behaviour so that creativity and readiness for change find optimal conditions within the company. At the same time, this is understood as a new paradigm for knowledge management, whose forerunners are emerging in such "complex" and "chaotic" learning arenas as KnowledgeCamps.