Sunday, December 31, 2006

EContentMag.com: Content Technologies, Circa 2007


Large mergers this year at the top of the content technologies marketplace (IBM swallowing FileNet, Open Text buying Hummingbird, Autonomy taking Verity) have led some to opine that these markets are finally maturing. I'm not so sure.

Document Collaboration


Tuesday, December 05, 2006

// NUCLEUS RESEARCH //


Nucleus Research Predicts Top 10 IT Trends for 2007

Expect a greater focus on enabling end users, stronger threats to traditional IT vendors

Wellesley, Mass.—December 5, 2006—Nucleus Research today announced its Top 10 IT Predictions for 2007. The annual report has accurately predicted major IT trends for enterprise end users and vendors for the past three years. Nucleus predictions are based on analysis of both vendors and thousands of corporate end-user case studies.

“We see an exciting new year ahead – especially for companies embracing new technologies – with two larger trends emerging from our top 10 predictions. First, we see the empowerment of the enterprise end user as on-demand offerings and SOA drive more functionality down to the end user. Second, we see a growing challenge to traditional IT vendors through new technologies and models, including data visualization, the integration of data mining tools and the ongoing evolution of IP and wireless technologies,” said Rebecca Wettemann of Nucleus Research.

Nucleus predicts the following for 2007:

1. Adoption of on-demand solutions and expansion of existing on-demand solution environments will continue as IT increasingly embraces on-demand as a way to reduce ongoing management hassles and accelerate project delivery.

2. Broader adoption of service-oriented architectures (SOA) will drive greater ROI and change how IT infrastructure is evaluated.

3. Data visualization and analysis tools for business users will drive greater adoption and more rapid decision making, while challenging the traditional business intelligence approach.

4. Content management consolidation will drive down prices and make rapid deployment, usability, and ongoing TCO key decision factors for migration efforts.

5. More and more organizations will look to integrated data mining to achieve ROI by reducing marketing costs and better targeting customer interactions.

6. Agent, filtering, and collaboration technologies (Knowledge Management) will become more prevalent in CRM, ERP, and other applications – as will other features designed to make them more intuitive and easier to use.

7. To win, cement, and expand customer relationships, successful vendors will enhance their services offerings to provide blueprints, templates, and other prescriptive guidance – free of charge.

8. As technology options become more accessible and affordable for organizations, we’ll see a rise in the adoption of low-cost, quick-return solutions like predictive analytics and on-demand in the public and nonprofit sectors – driving greater demands for accountability in government IT spending.

9. BPO 2.0 will leverage on-demand technology and secure remote network access to employ a distributed labor force, enabling more flexible, higher-quality, more agile, and more focused customer interaction on a global basis.

10. Continued adoption of IP telephony and free tools like Skype will challenge traditional carriers’ pricing models and drive public broadband efforts.

Nucleus analysts are available to discuss any of the IT predictions listed above. The full report is available at www.NucleusResearch.com.

About Nucleus Research

Nucleus Research is a global provider of IT advisory and research services that provides CFOs, CIOs and their staffs with the real-world information they need to maximize the business returns from their technology investments. For more information, visit www.NucleusResearch.com.

Contentmanager.days 2007 and further ECM-WORLD events announced for 2007


FT.com / Technology - Highlights and predictions: farewell to Bill Gates; hello to Second Life


FT.com / Companies / IT - Microsoft ‘may need a rethink’ on Zune


Thursday, November 09, 2006

Nvidia Strengthens Mobile Focus With Deal to Buy PortalPlayer

This strategic agreement focuses on growth opportunities beyond the PC. PortalPlayer's proficiency in audio combined with Nvidia's mobile video expertise will position Nvidia for video-focused, portable multimedia devices.


Tuesday, November 07, 2006

Microsoft may launch Office tools online (FT)

Bill Gates hinted that Microsoft would launch simple online versions of some of its Office desktop software tools as he hit back at suggestions that new internet services from Google and others might start to eat into one of his company’s core businesses.

His comments followed a rash of new services from Google and other internet companies to handle things such as word processing, spreadsheets and online calendars.

Google also changed course last month by starting to combine some of these applications into an integrated package, echoing the “suite” approach that helped to turn Microsoft’s Office into the dominant desktop software application.

Mr Gates told the Financial Times that Microsoft would itself match services such as those offered by Google. He argued strongly, though, that this would only ever represent a small part of the market.

Asked whether Microsoft planned to launch online “productivity” tools like those in Office, he said: “We’re going to cover 100 per cent of the productivity needs – our track record is to keep innovating.”

Mr Gates said the present generation of e-tools were still at a very basic level and broadly comparable to Works, a collection of simple applications that Microsoft sells for PC users, and so did not threaten the core Office business.

“The web-based things aren’t an advance over what Works has been for a long, long time,” he said. “We don’t think the market will shift to a Works-like level.” Works is only estimated to account for a small percentage of Microsoft’s “information worker” business, which produced overall sales of $12.4bn last year.

Until now, Microsoft has steered clear of launching web-based productivity applications in direct competition with Google.

Its “Office Live” service, which is due to come out of its test phase later this month, is a collection of tools for small companies to create and run their own websites, rather than being related to the desktop software suite whose name it bears. While maintaining that most office workers would continue to use full versions of the Office desktop software, Mr Gates suggested that they would in future find it easier to access their work from any machine – one of the advantages of the online services.

“There’s a difference between actually running an application on a server versus letting a document be found on a server,” he said. “We’re going to make a push to let you keep documents on a server.”

Since most office workers use full-functioning PCs, it made sense to take advantage of that local computing power with applications such as those in Office, even if their documents are held centrally, he added.

Waiting for the new web revolution (FT)

When it comes to corporate technology, the internet services revolution cannot be rushed.

That, at least, is the message from Bill Gates. A year ago he and Ray Ozzie, now Microsoft’s chief software architect, laid out a radical new vision for Microsoft, one that was based increasingly on delivering software-enabled services rather than traditional software packages.

Yet for much of the software that companies use – from desktop “productivity” programs such as Office to the back-office applications that underpin their marketing departments, supply chain operations and other parts of their business – this is still a long way off, according to Mr Gates.

Speaking this week to the Financial Times, Mr Gates sketched out a vision for corporate software that looks less radical than that promoted by many others in the software and internet industries.

The danger, if he is wrong, is that Microsoft risks squandering its entrenched position in corporate desktop software, as well as its ambition to turn back-office applications into its next big growth business.

“What Microsoft faces today is what happened to the mainframe,” warns Joe Wilcox, an analyst at Jupiter Research – that it will be replaced by a lower-cost, more efficient model of computing, this time based on the internet.

It is represented by companies such as Salesforce.com, whose back-office applications are delivered online, and by Google, whose suite of productivity and collaboration tools for office workers has been multiplying.

According to Mr Gates, tech companies have made the mistake before of believing in overnight transformations. At the beginning of the decade, for instance, all the talk was of “application service providers”, companies that would deliver services online as if they were water or electricity. “Intel was going to build all these datacentres, there were tons of start-ups,” he says. Most foundered.

When it comes to back-office “enterprise resource planning” applications, he adds: “We’ll have some things on-premise, some things published out on the web. We think few companies will be purely on-premise, or purely on the web.”

Getting the balance right will be key to success for one of Microsoft’s most important new businesses. Its business applications business, started through acquisition five years ago and now generating revenues of about $1bn a year, is “something that will grow faster than the rest of the business for many, many years”, Mr Gates says.

“Like every Microsoft business in the first few years, we’re learning, we’re putting the pieces together.”

So far in this area, Mr Gates is following a classic Microsoft game plan. Part of it involves tying the applications more closely to Microsoft’s main asset – its desktop software. By using the Office suite of desktop tools as a way to access back-office applications, Microsoft hopes to stimulate wider usage, particularly among the smaller businesses that are its main target.

Microsoft is also playing the same “fast-follower” role it has in other markets where other companies have set the early pace.

For instance, when it comes to Office Live – a collection of online services for small businesses – “it looks like they came up with many of the ideas after they looked at the Salesforce.com website”, says Bruce Richardson, an analyst at AMR Research. Yet this me-too approach has worked for Microsoft in other markets before.

The shape of this services-and-software vision for corporate software has yet to come fully into focus, though Microsoft continues to inch forward. Earlier this week, for instance, it announced plans to make its customer relationship management software available as a service in the second half of next year, with other software to follow.

It also laid out more ideas for how small businesses would be able to combine both on-premise software and internet services (supplied by Microsoft) to create new composite applications – or “mash-ups”, in the jargon of the Web 2.0 internet movement.

For instance, a marketing executive, reviewing details of a client relationship on his Microsoft software, might link directly into a hosted “collaboration” service to start a conversation with colleagues, then connect to Microsoft’s online keyword advertising system to launch a campaign.

“There are a lot of pieces in motion that haven’t landed yet,” Mr Wilcox says. Mr Richardson adds: “Everything is still in a state of flux.”

Microsoft’s growing range of software and services for smaller businesses is “all aimed at the same desktop, and it gets a little confused”. He says, though: “That’s the way Microsoft likes it.”

In other areas, Microsoft has shown a similar desire to attack on all fronts at once. In its assault on the digital living room, for instance, it has spread its bets across the Xbox 360 games console, set-top boxes and special “media centre” PCs.

As the shape of corporate software and technology services evolves, that may prove a smart strategy – though it risks leaving Microsoft at a disadvantage to pure internet-based companies in at least one respect. “The Salesforce.com story is pure and unencumbered,” Mr Richardson says.

Copyright The Financial Times Limited 2006

Beauty parade for Web 2.0 start-ups (FT)


By Richard Waters in San Francisco
Published: November 5 2006 20:27 Last updated: November 5 2006 20:27

A “crowdsourcing” company that lets software developers vote on which product they will create next, a “social sharing” start-up that promises to get to “the very end of the Long Tail”, a maker of online Post-it notes.

These may sound like parodies of new internet companies emerging from Silicon Valley’s latest bout of internet euphoria. In fact, they are all start-ups that will be paraded this week at the Web 2.0 conference in San Francisco, an annual event that has turned into a celebration of the Valley’s recovery from its post-dotcom slump.

While promoters of the new wave of internet start-ups claim this is not turning into another bubble, it is reminiscent of the last boom in at least one respect. “There is a great deal of hype,” says Mitchell Kertzman, a partner at Hummer Winblad, a Valley venture capital firm.

And where there is hype, opportunism flourishes. Many of the companies emerging from this start-up wave, like the last, look as if they were created with an eye to being sold on quickly. But this time the aim is not to “flip” them to Wall Street investors, but to sell them to a Google or Yahoo.

With the mania in full swing, the amount of venture capital money finding its way into US internet companies has jumped to levels not seen since the boom.

Defining exactly what it is that characterises this new wave of internet euphoria, however, is not easy. “Web 2.0 means so many things to so many people,” says Steve Ballmer, chief executive of Microsoft. “There’s a technology aspect, a community phenomenon, an advertising business model.”

The new internet companies are built on low-cost technologies such as open source software and cheap commodity hardware. Many – such as photo-sharing site Flickr – employ tools designed to stimulate online community behaviour. Also, thanks to the rise of online advertising networks, the new start-ups often have a way to generate revenue immediately.

Young internet companies once rushed to see how much cash they could raise, much of it to be spent on advertising. But the new entrepreneurs boast instead about how little they need. In spite of that, the sheer number of new arrivals suggests there will be many casualties. “For every YouTube, there have probably been 20 or 30 companies funded that won’t be worth anything,” Mr Kertzman says. “If a company doesn’t take off virally and get ‘hot’ on its own, the only tool you have is consumer marketing, which is very expensive.”

Meanwhile, the cash flooding back into consumer internet start-ups has had an inevitable effect. Geoff Yang, a venture capitalist at Redpoint – which backed MySpace – estimates that valuations of private internet companies have risen 30-40 per cent in the past six months.

However, the public markets have not experienced similar upswings, and the dearth of initial public offerings in the US suggests that few of these new companies will ever make it to Wall Street. Once Google and Yahoo tire of acquisitions, the Web 2.0 hangover could be acute.

Copyright The Financial Times Limited 2006

Thursday, November 02, 2006

My Own Private Google (Line56)

Google offers ways to personalize the search experience; part of a recent pattern of experimenting with search

Wednesday, November 01, 2006

Google Boosts Stake in Web Collaboration With JotSpot Wiki Buy (Gartner)

JotSpot's technology and organization fit well with Google by providing a key collaboration piece for Google's Web 2.0 workplace strategy. But JotSpot users should prepare for some disruption during the next 12 months.


Google Acquires JotSpot (Line56)

Google acquires wiki application company; product fits into collaboration portfolio, but model isn't necessarily proven

Google has acquired JotSpot, the wiki software company, for an undisclosed sum.

While Google's recent blockbuster acquisition of YouTube focused attention on video search and sharing, the JotSpot acquisition puts the focus on another part of Google's strategy: collaboration or, as Google calls the larger category, "communicate, show, and share." This category already includes tools for blogs, calendaring, document processing (including spreadsheets), Gmail, photo sharing, and more.

Wednesday, October 25, 2006

Google launches a build-your-own search engine (contentmanager.de)

Google's new customer-specific search engine platform, 'Custom Search Engine', has been online since yesterday...

Tuesday, October 24, 2006

Autonomy signals share buy-back (FT)

Autonomy, the search software company, is likely to spend surplus cash on a share buy-back for want of suitable takeover targets.

Friday, October 20, 2006

Microsoft IE7 Is a Strong Response to the Firefox Challenge (Gartner)

Microsoft is often at its best when facing a strong competitor. With Internet Explorer 7, Microsoft is a "fast follower" of competing browsers like Firefox, but it also offers several innovations.


Monday, October 16, 2006

Wikipedia founder plans rival (FT)

By Richard Waters in San Francisco
Published: October 16 2006 22:08 Last updated: October 16 2006 22:08

One of the founders of Wikipedia is days away from launching a rival to the collaborative internet encyclopaedia, in an attempt to bring a more orderly approach to organising knowledge online.

Wikipedia – which is available to be written and edited by anyone on the internet – is one of the most visible successes of mass collaboration on the web, with many of its 1.4m articles appearing high in search results.

However, its openness has also drawn charges of unreliability and left it vulnerable to disputes between people with opposing views, particularly on politically sensitive topics.

The latest venture from Larry Sanger, who helped create Wikipedia in 2001, is intended to bring more order to this creative chaos by drawing on traditional measures of authority. Though the site will still be open to submissions from anyone, the power to authorise articles will be given to editors who can prove their expertise, as well as to a group of volunteer “constables” charged with keeping the peace between warring interests.

Accusing Wikipedia of failing to control its writers and editors, he said: “The latest articles don't represent a consensus view – they tend to become what the most persistent ‘posters’ say.”

Mr Sanger said he had financial backing from an unidentified foundation for his new venture, while a web hosting company was providing its services free. He said he became frustrated with Wikipedia's failure to build expertise into its editing process and left after its first year.

Since then, the encyclopedia's other founder, Jimmy Wales, has taken some steps to bring more order to the Wikipedia approach, although he has avoided using authority figures such as editors.

Asked in an e-mail exchange how such disagreements should be resolved, Mr Wales replied: “With strong support for individual rights, and respect for reason.” His e-mail went on: “It is the fundamental responsibility of every individual to think, to judge, to decide. We must never abdicate that responsibility, not to the collective, not to Britannica, not to Wikipedia, not to anyone.”

Mr Sanger said volunteers would be able to become editors of his encyclopedia, called Citizendium, if they can show “minimum levels of qualification, based on real-world measures.”

This would be an “imperfect but effective” test based on “degrees, professional society memberships, things like that”.

Citizendium will be open “within the next few days” to a limited number of invited editors and members of the public who apply, and will be made generally available by the end of the year, said Mr Sanger.

It is likely to take Citizendium some time to prove whether it can create a better online encyclopedia. It will begin by simply taking over all of the existing entries from Wikipedia, then start the laborious job of having them filtered by expert editors – a job Mr Sanger called “a clean-out of the Augean stables”.

Copyright The Financial Times Limited 2006

Friday, October 13, 2006

Axel Springer: Largest European newspaper publisher opts for Getronics (contentmanager.de)

The workspace ICT services company Getronics has announced the signing of a contract with Europe's largest newspaper publisher, Axel Springer AG (newspaper group Welt/Berliner Morgenpost). The contract covers the development of a new content management system (Escenic Media System) for the websites of the newspapers "Die Welt" and "Welt am Sonntag".

Axel Springer AG publishes, among other titles, "Die Welt", "Berliner Morgenpost", "Bild", and "Autobild". The online edition of "Die Welt" is also one of the best-known news sites in Germany. As part of its "Online First" strategy, Axel Springer is currently implementing a software program intended to create a single integrated newsroom for the many online and print editorial teams of the Zeitungsgruppe Berlin. This development, together with changed user requirements and the expanded possibilities of the internet, was decisive in the choice to implement a modern, state-of-the-art content management system.

After a careful selection process, Axel Springer AG chose Getronics and the Escenic CMS. The decisive factors were above all the comprehensive know-how and extensive experience both companies can demonstrate in the media sector. In addition, Getronics and Escenic offered a fully developed CMS that had already been implemented successfully for other customers. A further reason was the confidence, based on Getronics' extensive experience and successfully completed projects, that it could roll out such an undertaking internationally within a tight timeframe without problems, while also giving management the necessary support during the introduction of the CMS.

The Escenic content management system has already proven its capabilities in a large number of media-sector projects. These have shown that the system is not only well suited to companies that must support a high level of interactivity across many different websites; the Escenic solution also flexibly meets the demands of a constantly changing market. With this new project, Getronics strengthens its leading position in the media market segment and is increasingly becoming an attractive partner for the growing number of media companies pursuing an internationalization strategy.

Thursday, October 12, 2006

Google Will Face Challenges in Wake of YouTube Acquisition (Gartner)

The purchase of YouTube presents Google with an opportunity to tap into the lucrative video brand advertising market. But copyright hurdles must be cleared before that can happen.

Friday, October 06, 2006

Magic Quadrant for Information Access Technology, 2006


No new Leaders emerged in this year's iteration of the Magic Quadrant for information access technology. Acquisitions and vision improvements have nevertheless forced significant changes in positioning throughout.

This Magic Quadrant includes vendors with capabilities that go beyond enterprise search to encompass a collection of technologies, including: search; content classification, categorization and clustering; fact and entity extraction; taxonomy creation and management; information presentation (for example, visualization) to support analysis and understanding; and desktop (or personal knowledge) search to address user-controlled repositories to locate and invoke documents, data, e-mail and intelligence.

We consider all enterprise search vendors to be information access technology vendors; however, those that only offer search capabilities (frequently called "keyword search") are inherently not Visionaries or candidates for the Leaders quadrant. Finding information, and acting on it intelligently, demands increasingly sophisticated and innovative strategies.
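To illustrate the "keyword search" baseline the report distinguishes from richer information access, here is a minimal inverted-index sketch in Python (the toy corpus and function names are illustrative, not from the Magic Quadrant). This is essentially all a keyword-only engine does: no classification, entity extraction, or taxonomy layered on top.

```python
from collections import defaultdict

# Toy corpus: document id -> text (illustrative headlines)
docs = {
    1: "Open Text completes acquisition of Hummingbird",
    2: "Autonomy acquires Verity search technology",
    3: "IBM buys FileNet for content management",
}

# Build a simple inverted index: term -> set of document ids
index = defaultdict(set)
for doc_id, text in docs.items():
    for term in text.lower().split():
        index[term].add(doc_id)

def keyword_search(query):
    """Return ids of documents containing every query term (AND semantics)."""
    terms = query.lower().split()
    if not terms:
        return set()
    result = index.get(terms[0], set()).copy()
    for term in terms[1:]:
        result &= index.get(term, set())
    return result

print(sorted(keyword_search("acquires verity")))  # [2]
```

The vendors Gartner positions as Visionaries or Leaders go well beyond this kind of term matching, but the sketch shows why "finding information" by keywords alone leaves the work of acting on it intelligently to the user.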

We now recommend that Global 2000 enterprises at least select a platform vendor for the majority of future projects. Platform vendors offer modular architectures, wide varieties of relevance modeling, multiple vertical applications and significant customizability. Enterprises should also typically have a tactical vendor to increase the agility for short-term and quick-start projects. Such tactical vendors may lack architectural sophistication and customizability, but they are quicker to deploy and easier to understand. Enterprises must also recognize the need to explore more specialized products for important and specific projects, such as customer interaction hubs, e-commerce search or research science support.

Thursday, October 05, 2006

After the Hummingbird takeover: Open Text announces the cornerstones of its future corporate strategy (contentmanager.de)

Open Text (Nasdaq: OTEX, TSX: OTC) announced that the completion of the Hummingbird acquisition, made public on October 2, strengthens the company's position as the world's largest independent vendor of enterprise content management (ECM) software. Under the Open Text name, the company unites the ECM expertise, solutions, and partners that help customers solve the problems of managing information in large organizations.

Wednesday, October 04, 2006

Web 2.0 applications are delivering on the internet's original promise (contentmanager.de)

BVDW experts from various fields see long-standing hopes and promises of the internet economy being fulfilled as the reach of the various Web 2.0 applications grows. Weblogs, video blogs, RSS feeds, and the like, currently dismissed by many as a phenomenon confined to insiders and heavy users, will, in the view of BVDW board member Andrea Schulz and Jörg Rensmann, deputy chairman of the BVDW's Services & Innovations group, gradually find their way into companies' communication strategies. The various applications can be used as instruments in marketing as well as in internal communication. According to the experts at the "Chance Web 2.0" congress, the decisive building blocks for deploying these technologies successfully are credibility, openness to criticism, authenticity, relevance, and the enabling of genuine participation.


Content Integration - Part 2 (contentmanager.de)

Apart from the systems from SAP and ADP (Paisy), the market for business software offers no product with a published, certifiable "standard" interface to an enterprise content management system. Even the "standard" integrations that ECM vendors offer for common business applications have weaknesses. Project work is unavoidable, but which technological approach and which overall architecture should one choose, and how can an ECM system be evaluated for its integration capabilities? The second part of the "Content Integration" article answers these questions...

Content Integration - Part 1 (contentmanager.de)

Although archive and document management solutions have been deployed in companies for almost 20 years, connecting business applications to an enterprise content management system (ECM system) remains a challenge in almost every project, and is frequently a source of unplanned, high project costs.

Tuesday, October 03, 2006

Why Portals Exist (Line56)

Don't let meta-functions obscure the main point of the portal interface; thinking about Netflix and Starbucks

In Search Of The Collaborative Structure (Optimize Mag)

If you want to get your arms around collaborative technology, you'd better have an amazing wingspan. Forrester noted in a recent report that it logged some 400 client inquiries on collaboration in 2005 and the first half of 2006, encompassing everything from messaging and Web conferencing to document management and blogs. IT staffs are stuck trying to gain control of all these deployments in the name of collaboration and productivity. Fortunately, vendors are working to address the question of integrating the vast pool of content, portal, office-productivity, and other technologies.

Monday, October 02, 2006

WebEx Extends Its Offerings to Provide a Web Platform (Gartner)

With WebEx Connect, WebEx Communications seeks to move beyond conferencing and become a general platform for enterprise Web 2.0 applications. This platform will add more legitimacy to the "software as a service" model.

Line56: A New Direction (Line56)

A heads-up on our changing approach to the e-business marketplace, and what it means for the Line56 community

Going forward, in addition to coverage of events of general importance in e-business, Line56 will be focusing coverage in three areas:
1. Portals (enterprise portals in particular, but we will also be focusing on the human interface aspects of enterprise applications like ERP, SCM, KM, procurement, e-learning, and so forth).
2. On Demand CRM
3. Middleware (including application infrastructure, databases, and radio frequency identification technology).
Our approach to the portals category is different than the current market understanding. A portal is, of course, a standalone technology: the enterprise portal. But, as we understand it, a portal is also any online interface (relying on components like dashboards, portlets, and open standards) that gives human users access to enterprise information. Line56 is increasingly interested in the human interface to enterprise technology, wherever it resides on the technology continuum, and this is what we will be featuring going forward.
We have also decided to break out on demand customer relationship management (CRM) into a category of its own. Despite the fact that many CRM deployments remain traditional, we respect the potential of the on demand model and would like to give it expanded coverage here.
Finally, no e-business technology or process is viable without the glue that is middleware.
What does this mean for readers? The easiest explanation is that it is the depth, rather than the breadth, of our coverage that will be changing. We will be going more deeply into the portal as human interface, on demand CRM, and middleware; our coverage will provide more specific details about products, deployments, strategies, and market conditions as we move from being a general news source to a targeted provider of information to the technology buyer, particularly in the small and medium-sized business (SMB) arena.
What does this mean for vendors and PR agencies? Line56 will be more interested in product demonstrations and interviews with customers. We want to see products in action, and talk to customers who are using them.
In trying to get on our calendar, please be aware of the Line56 Yahoo Group: http://finance.groups.yahoo.com/group/line56/. This is where you can directly schedule briefings with our Managing Editor, Demir Barlas, and check out his schedule several weeks in advance.
If you have not already done so, you can e-mail Demir at delikurt AT yahoo DOT com in order to receive an invitation to this group.
It is worth repeating that we will still cover news of general importance, and news that does not fit into the categories above. We will be more alert, however, to the focus areas above.
Thanks for your readership!
Sign up for Line56 Newsletters to receive the latest e-business news, blogs, viewpoints, and analysis via e-mail.
About From The Editor: This is part of a series of high-level discussions of e-business issues that, while grounded as far as possible in data and fact, also incorporates a modicum of speculative thinking. -- Editor
"From The Editor" is an Op-ed series intended to foster critical thinking and discussion of current issues. Opinions stated do not necessarily reflect the views of Line56 Media as a whole.

Friday, September 29, 2006

Gartner Announces New Portals, Content & Collaboration Summit 2006 (Contentmanager.net)

Gartner today announced it will launch an annual European Summit on the topic of Portals, Content & Collaboration. The new summit will be held on 2-3 October 2006 in London, United Kingdom.

Businesses everywhere are under pressure to accelerate performance and bring about better results, and according to Gartner, most need a fundamental shift in the way they use Information Technology (IT). The biggest revenue impact will come from IT projects that enable business growth by augmenting the behaviour of key knowledge workers and making them more innovative, creative and productive. To achieve this, capturing, managing and exploiting the information that individuals and organisations possess has become a strategic priority. At this inaugural European Summit, Gartner analysts and industry experts will tackle the most critical issues that are key to achieving this change and ultimately becoming what Gartner terms a ‘high-performance workplace’.

Debra Logan, conference chair and research vice president at Gartner, said: "Today it’s all about high performance. The most successful organisations use superior information management to fuel creativity and innovation, translating into marketplace success. Effective business performance depends on the integration of people, processes and technology. Information, and the insight it provides, are the key ingredients to the biggest long-term success. To move forward companies need portals, content and collaboration tools, along with strategic vision and best practice insight."

During the Summit Gartner analysts will explore:
- Choosing and deploying state-of-the-art portal, content and collaboration technologies
- Managing content to minimize risk and exploit value
- Using technology to get everyone to work together
- Practices and behaviors to accelerate people's performance
- Measuring business impact as you bring systems, technologies and management best practices together
- The impact and role of emerging technologies, such as collective intelligence, mashups, Web 2.0, folksonomies, e-discovery and Ajax
- How people’s work behaviors and perspectives will change over the next 10 years
- How mobile and wireless technologies and working practices will support collaboration
- Whether Knowledge Management is still relevant

Key Gartner presentations at the Summit include:
- High performance Workplace scenario: Top five actions to capitalise on change
- SharePoint and Google: Right information to the right people at the right time?
- Portal Product Marketplace: The impact of consolidation
- Future of work in Europe: How people, processes and technology work together
- The collaboration scenario: Creating value and competitive advantage
- Enterprise Information Management: Getting business value from information assets
- The Future of Search: Where Information Access takes us all
- Compliance and e-discovery: What you Need to Know

Gartner thought leadership will be complemented by two external keynote speakers:
- Will Hutton, Chief Executive, The Work Foundation
- Ken Douglas, Technology Director, Chief Technology Office, BP International Ltd

Delegates will also learn from real-life experiences presented in a number of end-user case studies; six tutorials; a best-practices session, Best Practices in Portal, Content Management, and Collaboration Development, Deployment and Management; and a Gartner panel, Powerhouse vendors in the high performance workplace: Ladies and gentlemen, place your bets. The panel will combine audience questions about the powerhouse vendors with comments from Gartner’s global analysts. Other networking opportunities have been scheduled during the conference so that delegates can discuss experiences, challenges and successes.

Gartner organises 12 Summits in Europe each year on a wide range of IT industry topics. Each event features Gartner’s latest research and provides in-depth commentary by Gartner analysts, on-stage interviews with industry leaders, front-line case studies from across Europe and an opportunity to network with peers from across the region. In 2005, Gartner Summits in Europe attracted more than 3,300 delegates. For more information on Gartner events in the region please visit www.europe.gartner.com/events.
29.09.2006, Dorothee Stommel

Thursday, September 28, 2006

Apostles of the blogosphere (FT)

A few weeks ago I mentioned to a friend, who works in the “new media”, that I was to start a blog for FT.com. He was not impressed. “Blogging is over,” he informed me coldly.

I shrugged off the rebuke. After all blogs – personal online journals – are proliferating. According to Technorati, a firm that monitors such things, more than 50m blogs had been created by last month – and the number is doubling every six months.

My doubts returned, however, when I saw an ominous message on the website of Britain’s main opposition party: “Conservative Party enters the blogosphere”. It announced that David Cameron, Tory leader, had started a blog. When the world’s least fashionable political party discovers a social trend, it is surely a sign that it is peaking.

Mr Cameron is far from alone. Over the summer a strange array of politicians started blogging. They included Hillary Clinton, who hopes to be the next president of America; Lionel Jospin, who hopes to be the next president of France; and Mahmoud Ahmadi-Nejad, who is already president of Iran.

Political advisers around the world are clearly giving the same advice to their bosses. Blogging is meant to let politicians communicate directly with voters in a folksy style. In practice it makes aspiring statesmen sound like Mr Pooter, the character from Victorian fiction whose Diary of a Nobody was famous for its banality.

Mr Cameron’s entries from his recent visit to India have cheery little headlines, such as: “Going green in a Delhi tuk-tuk”. The Tory leader is shown around by a tour guide who is “a real character”; he sees the Delhi metro and pronounces it “amazing”. This kind of deadly dull stuff crosses the political divide. David Miliband, Britain’s clean-cut environment minister, got blogging earlier this year – claiming that this might help bridge “the growing and potentially dangerous gap between politicians and the public”. One of his most recent entries has the scintillating headline: “Three cheers for Brighton library”.

Mrs Clinton and Mr Jospin are saved from Pooterisms by their inability even to attempt chatty informality. By contrast, Mr Ahmadi-Nejad’s first blog was full of strange personal details. He notes, for example, that he did very well in his university entrance exams, in spite of suffering from a nosebleed. But after a promising debut in August, he has fallen silent – perhaps distracted by other tasks, such as governing the country and building a nuclear bomb.

Ferenc Gyurcsany, prime minister of Hungary, is more conscientious. He posts new comments on his blog most days – sometimes twice a day. He also has a dangerous frankness, making him a natural for the blogosphere. In a recent speech – now posted on his blog – he confessed to lying constantly to get elected; a revelation that prompted riots in Budapest.

Mr Gyurcsany’s blog is apparently a good read – if you have mastered Hungarian. But it is not clear that it has worked to his political advantage. In fact – for all the interest that consultants are showing in blogging – there is only one politician’s blog that has clearly had a real impact.
In France, Segolene Royal, who is likely to win the French Socialist party nomination to stand for the presidency next year, has been running a website and blog that has generated lots of interest and new support. Ms Royal puts essays on topics such as unemployment or immigration on her site and invites readers to post responses. She claims that she will then incorporate the best ideas into her platform for the presidency. It may be a gimmick, but it has helped her appear modern and in touch with the people – qualities in short supply in French politics.

The Royal experiment will certainly be watched with great interest by other politicians. But so far it seems to be a one-off.

That will hardly surprise the apostles of the blogosphere, however. They have always argued that blogging is politically significant, precisely because it is not a tool of the elite. Bloggers are, as a book on the phenomenon, An Army of Davids by Glenn Reynolds, puts it, holding the Goliaths of the media and the political world to account.

In the US, bloggers are claimed to have played a key role in forcing the resignation of Trent Lott as Senate majority leader in 2002, after he made comments that seemed to express nostalgia for the South in the days of segregation. It is argued that blogs kept the issue alive when the mainstream media was prepared to let it drop. The blogosphere is also said to have been crucial in mobilising support for Ned Lamont, an anti-war candidate, who defeated Senator Joe Lieberman in Connecticut’s Democratic primary in August.

In reality, it is hard to measure the precise impact of bloggers on such events. But the idea of an insurgent grass-roots movement, energised by folk tapping away at their computers, appeals to the romantic, anti-elitist strain in US politics. Many politicians in America and elsewhere clearly feel the need to pay their respects to the blogosphere – if only as a precaution.

It is not self-evident, however, that the blogosphere’s influence on politics is all for the good. A political consultant once complained that his bosses’ reliance on focus groups handed power to people who were prepared to sit around for hours talking about politics with strangers, in return for a free sandwich. Similarly, if politics is increasingly shaped by the blogosphere, it will mean more power and influence for a sub-section of the population willing to waste hours trawling through dross on the internet.

Blogging as a medium has virtues: speed, spontaneity, interactivity and the vast array of information and expertise that millions of bloggers can bring together. But it also has its vices. The archetypal political blog favours instant response over reflection; commentary over original research; and stream-of-consciousness over structure.

Was that last judgment fair? Does it really follow logically from the rest of the argument? I am not sure and I have no time to think about it further. I have to get back to my blog.
gideon.rachman@ft.com

Copyright The Financial Times Limited 2006

Brussels attacked over Microsoft delay risk (FT)

The European Commission on Thursday came under attack over its antitrust battle with Microsoft, when members of the European Parliament and retailers warned that delays to the launch of the group’s new operating system would harm businesses.

Wednesday, September 27, 2006

Wolters puts educational arm up for review (FT)

Wolters Kluwer could reap about €600m ($760m) from selling its education business, analysts said on Wednesday as the Dutch publisher put the division up for review and unveiled new growth goals to round off a three-year restructuring process.

Information management on demand! (Contentmanager.de)

In recent months one thing has become clear: nobody can resist the software-on-demand trend. Even software houses that were sceptical until recently, such as SAP and Microsoft, now offer this business model. Mainz-based TNCS GmbH & Co. KG recognised the promise of the on-demand concept early on and, together with its business partner aeveo it GmbH of Erlangen, launched the on-demand portal www.office4business.de - true to the motto of TNCS managing director Thomas Hahner: "While others are still dreaming, office4business is already on its way."

With software on demand, an application is operated by a service company, the application service provider (ASP), and delivered to the customer over public networks such as the internet. What makes the model interesting is the shift in business risk: because the software is rented over the network as needed rather than purchased, the customer's investment risk is minimised. Often such outsourcing covers not just individual applications but - as with the TNCS information management system Xbs-Client - several related areas such as document, workflow and contact management.

Using ASP services, companies can thus outsource the software of entire administrative departments. The service provider handles all administration, including software maintenance, upgrades, licences and, optionally, user support.
27.09.2006, TNCS GmbH & Co. KG

Monday, September 25, 2006

The digital democracy's emerging elites (FT)

There are no prizes for guessing the most popular (and sought-after) types of internet enterprise at the moment. Anything that can be labelled Web 2.0 - social networks such as MySpace and Facebook, news aggregators such as Digg and Reddit and user-generated sites such as Wikipedia and Flickr - is the new new media.
Facebook is the latest such company to think of selling itself, with companies such as Yahoo and Viacom being asked to cough up $1bn (£526m). With News Corporation's purchase of MySpace last year for $580m now being regarded by Wall Street as a master stroke, other media companies are trawling for their own Web 2.0 acquisitions to transform themselves in the eyes of investors.
The hoo-hah over Web 2.0 companies is more than a matter of financial credibility. These companies, unlike most newspapers, magazines or television operations, do not employ professional writers, editors and producers to create material for their audience. Instead, they encourage their users both to contribute content and to select the most interesting things to display to others.
Old media "gatekeepers" (such as the people who edit this column) are out of fashion and what Jay Adelson, chief executive of Digg, calls "collective wisdom" is in. As Rupert Murdoch said last year of young internet users: "They don't want to rely on a god-like figure from above to tell them what's important . . . They want control over their media, instead of being controlled by it."
But such democratic rhetoric (what one critic has dubbed "digital Maoism") ignores one awkward fact. While anyone is free to launch a blog, contribute to Wikipedia or publish photographs on Flickr, a relatively small number of activists often dominate proceedings on Web 2.0 sites. Although they are unpaid, they can nonetheless achieve an elite status reminiscent of the old media's professional gatekeepers.

An illuminating spat occurred last month at Digg, which encourages its 500,000 registered users to submit news stories from around the internet and vote for (or "digg") the most interesting. The most popular rise up its rankings and are displayed on its home page. It is the kind of thing that might have pleased the original Diggers, a 17th-century English sect that believed people should form self-governing communes.

After protests that a group of about 20 Digg activists were promoting the stories they liked by supporting each other's choices, the site changed the algorithm that helps to rank stories. Kevin Rose, Digg's founder, said it would give more weight to "the unique digging diversity of the individuals digging the story" - in other words, make it harder for a cabal to distort the results.
That created outrage from the other side, with its activists complaining that they were being accused unfairly of cheating. Its top user, ranked by success in promoting stories to the home page, threatened to abandon the site. "I bequeath my measly number one position to whoever wants to reign," wrote the user known as P9. He was persuaded back and is still ranked at number two.
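The weighting change Mr Rose describes can be illustrated with a toy scoring function. Everything here is hypothetical - Digg never published its actual algorithm - but the idea of discounting votes from users who habitually vote together is simple to sketch:

```python
def diversity_weighted_score(votes, co_vote_counts):
    """Score a story by its votes, discounting voters who often vote
    together. Illustrative only: Digg's real ranking was never public.

    votes: list of user ids who dugg the story.
    co_vote_counts: dict mapping user id -> how many of this story's
        other voters that user has frequently co-voted with.
    """
    score = 0.0
    for user in votes:
        overlap = co_vote_counts.get(user, 0)
        # A voter who overlaps heavily with fellow voters contributes
        # less, so a tight cabal counts for less than independent users.
        score += 1.0 / (1 + overlap)
    return score

# A clique of three users who always vote together...
clique = diversity_weighted_score(["a", "b", "c"], {"a": 2, "b": 2, "c": 2})
# ...is outweighed by three unrelated voters.
independent = diversity_weighted_score(["x", "y", "z"], {})
```

Under such a scheme the clique's three votes carry only as much weight as one independent vote, which is exactly the distortion-damping effect Digg was after.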
Mr Adelson insists that Digg is more democratic than some other sites because it is easy for anyone to contribute: users simply click to vote. Others, such as Wikipedia, demand more effort and have narrower participation. Jimmy Wales, the latter's founder, estimates that 70 per cent of editing is done by less than 2 per cent of registered users.
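Mr Wales's figure is easy to check for any community whose per-user edit counts are available. A minimal sketch, using invented numbers:

```python
def edit_concentration(edits_per_user, top_fraction=0.02):
    """Fraction of all edits made by the most active `top_fraction` of
    users - a rough way to test claims like '70 per cent of editing is
    done by less than 2 per cent of users'. Data below is invented."""
    counts = sorted(edits_per_user, reverse=True)
    top_n = max(1, int(len(counts) * top_fraction))
    return sum(counts[:top_n]) / sum(counts)

# 100 registered users: two heavy editors, the rest occasional.
edits = [400, 300] + [1] * 98
share = edit_concentration(edits)  # top 2% (2 users) made 700 of 798 edits
```

Even this crude toy shows how a handful of dedicated editors can account for the overwhelming majority of activity.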
At one level, the fact that an elite often emerges within Web 2.0 sites is neither surprising nor sinister. The same thing can be seen in physical communities such as political parties. Relatively few have the patience or inclination to attend meetings and work on projects. The result is that groups of like-minded people who are particularly dedicated to the cause gradually gain dominance.
All the other slackers (or lurkers, as people who browse community sites for news and information without themselves contributing are known) gain a free ride at the expense of not controlling the agenda. "Things will always be done by the people who most want to do them. I don't think we will ever be shielded from that," says Clay Shirky, a consultant and academic.
But it does, as Nicholas Carr, a technology writer, says, "contradict a lot of the assumptions promulgated about the great egalitarianism of the web". There is not much of a logical distinction between someone who edits stories for money and someone who does so for recognition and social status. Indeed, Netscape has lured away some of the most active Digg users by paying them to submit stories to its site instead.
These are early days for Web 2.0 sites so it is difficult to predict the degree to which new media will come to look like old, with small groups of people filtering content for mass audiences. The optimistic view is that technology will make it so easy to switch among filters that gatekeepers will have less power. Digg already allows people to see stories that have been recommended by their friends rather than all of its users.
Still, the fact that there is an "A-list" of bloggers who garner a large proportion of internet links and traffic indicates that just because the web is an open medium it is not necessarily an egalitarian one. This generation of consumers has learnt to be sceptical about how information and entertainment is edited and filtered by groups of professionals. It ought to remain on its guard in the Web 2.0 world as well.
Copyright The Financial Times Limited 2006

Tuesday, September 19, 2006

Microsoft's Open-Source Promise Is All About the Future (Gartner)

Microsoft's Open Specification Promise will help clarify the use of certain Microsoft intellectual property in open-source initiatives. But it will have minimal immediate impact on the enterprise market.

Grid computing and virtualisation - are they money savers? (FT)

Five years ago it was a laboratory wonder, a new-fangled way of data processing that only boffins and rocket scientists understood or could use.

Today, grid computing is making its way steadily into the mainstream as senior managements seek new ways of extracting more and better value from their computing resources.

Its progress is being smoothed by a string of positive examples of the way it can boost efficiency and cut costs. Higo Bank in Japan, for example, was concerned that its loan processing system was taking an inordinately long time to service current and potential customers.

The answer was to integrate three important databases – risk assessment, customer credit scoring and customer profile – using grid technology. The result was a 50 per cent reduction in the number of steps, the amount of time and the volume of paperwork needed to process a loan.
The consequence? Instant competitive advantage compared with rival lenders.

A company in Europe was able to improve one of its business processes as well as its overall systems efficiency as a consequence of the grid phenomenon.

The company, Magna Steyr, a leading European automobile parts supplier, built an application called “Clash”, a three-dimensional simulator it uses in the design process to ensure that a new part does not interfere physically with existing fittings.

It took 72 hours to run, however, and was therefore slotted in at the end of the design process. If a problem was found, the designers had to go back to the beginning and start again.

Run on a grid system, it took four hours. “By reducing the time to four hours,” says Ken King, IBM’s head of grid computing, “the company was able to run the application nightly, changing the nature of the process from serial to iterative: it was able to make changes to designs on the fly, saving time and money.”

Charles Schwab, the US financial services group and a pioneer in the use of grid, had a portfolio management application that its customer service representatives used when customers phoned up.

It ran an algorithm capable of spotting changes in the market and predicting the likely impact and risks. It was running on a Sun computer but not running fast enough. Customers could be left on the phone for four minutes or more – an unacceptable period in banking terms.

Run in a Linux-based grid environment, the system was providing answers in 15 seconds. As a consequence, Schwab was able to provide better customer service leading to better customer retention.

These examples of grid in action, all developed by IBM, illustrate the power of grid to improve the utilisation of computing resources, to accelerate response rates and give users better insights into the meaning of their data. IBM claims to have built between 300 and 500 grid systems.

Oracle, Sun and Dell are among other hardware and software manufacturers to have espoused grid principles. Grid computing, therefore, looks like the remedy par excellence for the computing ills of the 21st century.

But is it silver bullet or snake oil? How and why is it growing in popularity?

Thirty years ago, grid would have been described as “distributed computing”: the notion of computers and storage systems of different sizes and manufacture linked together to solve computing problems collaboratively.

At that time, neither hardware nor software were up to the task and so distributed computing remained an unrealised ideal. The advent of the internet, falling hardware costs and software advances laid the foundation for grid computing in the 1990s.

It first achieved success in tackling massive computational problems that were defeating conventional supercomputers – protein folding, financial modelling, earthquake simulation and the like.
But as pressures on data processing budgets grew through the 1990s and early part of this decade, it began to be seen as a way of enabling businesses to maximise flexibility while minimising hardware and software costs.

Companies today often own a motley collection of computing hardware and software: when budgets were looser it was not unusual to find companies buying a new computer simply to run a new, discrete application. In consequence, many companies today possess vast amounts of under-utilised computer power and storage capability. Some estimates suggest average utilisation is no greater than 10 to 15 per cent. A lot of companies have no idea how little they use the power of their computer systems.

This is expensive in capital utilisation, in efficiency and in power. Computation requires power; keeping machines on standby requires power; and keeping the machines cool requires even more power. Clive Longbottom, of the IT consultancy Quocirca, points out that some years ago, a large company might have 100 servers (the modern equivalent of mainframe computers).

“Today the average is 1,200 and some companies have 12,000,” he says. “When the power failed and all you had was 100 servers it was hard enough trying to find an uninterruptible power supply which would keep you going for 15 minutes until the generator kicked in.”

“Now with 12,000 servers you can’t keep them all alive. There’s no generator big enough unless you are next door to Sizewell B [the UK’s most modern nuclear power station].”

Mr Longbottom argues that the answer is to run the business on 5,000 servers, keep another 5,000 on standby and close the rest down.

This sets the rationale for grid: in simple terms, a company links all or some of its computers together using the internet or similar network so that it appears to the user as a single machine.

Specialised and highly complex software breaks applications down into units that are processed on the most suitable parts of what has become a “virtual” computer.

The company therefore keeps what resources it has and makes the best use of them.
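At its core the scheduling idea is greedy load balancing: always hand the next unit of work to the least-loaded machine in the "virtual" computer. The sketch below is a deliberately naive stand-in for the real grid middleware the article mentions; the names and costs are invented:

```python
import heapq

def distribute(work_units, machines):
    """Toy grid scheduler: split an application into units and assign
    each to the currently least-loaded machine. Real middleware from
    the likes of Platform Computing handles far more (data locality,
    failures, priorities); this only shows the balancing idea.

    work_units: list of (name, cost) pairs.
    machines: list of machine names.
    """
    heap = [(0.0, m) for m in machines]  # (current load, machine)
    heapq.heapify(heap)
    assignment = {m: [] for m in machines}
    for name, cost in work_units:
        load, m = heapq.heappop(heap)   # least-loaded machine
        assignment[m].append(name)
        heapq.heappush(heap, (load + cost, m))
    return assignment

# One heavy unit and two light ones spread across two servers.
plan = distribute([("u1", 4), ("u2", 2), ("u3", 2)], ["serverA", "serverB"])
```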
It sounds simple. But in practice the software – developed by companies such as Platform Computing and DataSynapse – is complex and there are serious data management issues, especially where large grids are concerned.

And while the grid concept is understood more widely than a few years ago, there are still questions about the level of its acceptance.

This year, the pan-European systems integrator Morse published a survey among UK IT directors that suggested most firms have no plans to try grid computing, claiming the technology is too costly, too complicated and too insecure. Quocirca, however, which has been following the growth of grid since 2003, argued in an analysis of the technology this year that: “We are seeing grid coming through its first incarnation as a high-performance computing platform for scientific and research areas, through highly specific computer grids for number crunching, to an acceptance by businesses that grid can be an architecture for business flexibility.”

Quocirca makes the important point that knowledge of Service Oriented Architectures (SOA), which many see as the answer to the increasing complexity of software creation, is poor among business computer users, while grid-type technologies are critical to the success of SOAs: “Without driving knowledge of SOA to a much higher level,” it argues, “we do not believe that enterprise grid computing can take off to the extent we believe it could.”

Today’s grids need not be overly complicated. Ken King of IBM pours cold water on the notion that a grid warrants the name only if different kinds of computer are involved and if open standards are employed throughout: “That’s a vision of where grid is going,” he scoffs.

“You can implement a simple grid as long as you take application workloads, and these can be single applications or multiple applications, and distribute them across multiple resources. These could be multiple blade nodes [blades are self-contained computer circuit boards that slot into servers] or multiple heterogeneous systems.”

“The workloads have to be scheduled according to your business requirements and your computing resources have to be adequately provisioned. You have continually to check to be sure you have the right resources to achieve the service level agreement associated with that workload. Processing a workload balanced across multiple resources is what I define as a grid,” he says.
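Mr King's continual check against service level agreements can be expressed schematically. This is a sketch under invented assumptions (the field names and the simple headroom rule are mine, not IBM's):

```python
def provisioning_needed(workloads, capacity):
    """Flag workloads whose provisioned capacity risks missing their
    service level agreement. Purely illustrative: real provisioning
    tools apply far richer policies than this single headroom rule.

    workloads: dict name -> {"demand": units, "sla_headroom": fraction
        of extra capacity the SLA requires beyond current demand}
    capacity: dict name -> provisioned units
    """
    at_risk = []
    for name, w in workloads.items():
        required = w["demand"] * (1 + w["sla_headroom"])
        if capacity.get(name, 0) < required:
            at_risk.append(name)  # needs re-provisioning
    return at_risk

workloads = {
    "clash_simulation": {"demand": 100, "sla_headroom": 0.2},
    "loan_processing": {"demand": 50, "sla_headroom": 0.1},
}
capacity = {"clash_simulation": 110, "loan_processing": 60}
at_risk = provisioning_needed(workloads, capacity)
```

Run periodically, a check like this is what lets an orchestrator re-provision resources before an SLA is actually breached.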

To meet all these demands, IBM marshals a battery of highly specialised software, much of it underpinned by Platform Computing and derived from its purchase of Tivoli Systems.

These include Tivoli Provisioning Manager, Tivoli Intelligent Orchestrator, Tivoli Workload Scheduler and the eWorkload manager, which provides end-to-end management and control.

Of course, none of this should be visible to the customer. But Mr King says grid automation is still a way off: “We are only in the first stages of customers getting comfortable with autonomic computing,” he says wryly.

“It is going to take two, three, four years before they are willing and able to yield up their data centre decision-making to the intelligence of the grid environment. But the more enterprises that implement grid and create competitive advantage from it, the more it will create a domino effect for other companies who will see they have to do the same thing. We are just starting to see that roll out.”

Virtualisation can bring an end to ‘server sprawl’

Virtualisation is, in principle, a simple concept. It is another way of getting multiple benefits from new technology: power-saving, efficiency, smaller physical footprint, flexibility.

It means taking advantage of the power of modern computers to run a number of operating systems – or multiple images of the same operating system – and the applications associated with them separately and securely.

Ask a virtualisation specialist for a definition, however, and you’ll get something like this: “It’s a base layer of capability that allows you to separate the hardware from the software. The idea is to be able to start to view servers and networking and storage as computing capacity, communications capacity and storage capacity. It’s the core underpinning of technology necessary to build any real utility computing environment.”
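That "capacity, not boxes" view can be caricatured in a few lines of code - an illustrative abstraction, not how any real hypervisor works:

```python
from dataclasses import dataclass

@dataclass
class Pool:
    """A rack of physical servers seen as one pool of capacity,
    as in the specialist's definition above. Illustrative only."""
    cpu_ghz: float
    ram_gb: float

def place_vm(pools, cpu_ghz, ram_gb):
    """Carve a virtual machine out of the first pool with room,
    hiding which physical box actually hosts it."""
    for i, pool in enumerate(pools):
        if pool.cpu_ghz >= cpu_ghz and pool.ram_gb >= ram_gb:
            pool.cpu_ghz -= cpu_ghz
            pool.ram_gb -= ram_gb
            return i  # a pool index, not a physical server identity
    return None  # no pool has enough spare capacity

pools = [Pool(cpu_ghz=10, ram_gb=32), Pool(cpu_ghz=20, ram_gb=64)]
idx = place_vm(pools, cpu_ghz=12, ram_gb=16)
```

The caller asks for capacity and gets it; where that capacity physically lives is the pool's concern, which is the whole point of the definition.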

Even Wikipedia, the internet encyclopaedia, makes a slippery fist of it: “The process of presenting computer resources in ways that users and applications can easily get value out of them, rather than presenting them in a way dictated by their implementation, geographic location or physical packaging.”

It is accurate enough, but is it clear?

To cut through the jargon that seems to cling to this topic like runny honey, here is an example of virtualisation at work.

Standard Life, the financial services company that floated on the London stock market this year, had been, over a 20-year period, adding to its battery of Intel-based servers in the time-honoured way. Every time a new application was required, a server was purchased.

By the beginning of 2005, according to Ewan Ferguson, the company’s technical project manager, it was running 370 physical Intel servers, each running a separate, individual application. Most of the servers were under-utilised; while a variety of operating systems were in use, including Linux, it was predominantly a Microsoft house – Windows 2000, 2003 and XP Desktop.

The company decided to go the virtualisation route using software from VMware, a wholly owned (but very independent) subsidiary of EMC Corporation, the world’s largest storage system vendor. VMware, with its headquarters in Palo Alto, California, virtually (if you’ll excuse the pun) pioneered the concept. As one competitor conceded: “VMware built the virtualisation market place.”

By January 2006, Standard Life had increased the number of applications running on its systems to 550: the number of physical servers, however, had decreased by 20 to 350.
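Because the article notes that Standard Life had previously bought one server per application, the consolidation can be sanity-checked with quick arithmetic. This is only a back-of-envelope sketch using the figures quoted above:

```python
# Standard Life's figures as quoted: one application per physical server
# before virtualisation (370/370), then 550 applications on 350 servers.
servers_before, apps_before = 370, 370
servers_after, apps_after = 350, 550

density_before = apps_before / servers_before  # 1.0 application per server
density_after = apps_after / servers_after     # ~1.57 applications per server

# Roughly a 57% improvement in applications hosted per physical box.
print(round(density_after / density_before, 2))  # -> 1.57
```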

But why use virtualisation? Why not simply load up the underutilised machines?

Mr Ferguson explains: “If you are running a business-critical application and you introduce a second application on the same physical machine there are potential co-existence issues. Both applications may want full access to the processor at the same time. They may not have been programmed to avoid using the same memory space so they could crash the machine.

“What virtualisation enabled us to do was to make the best use of the physical hardware but without the technology headache of co-existing applications.”

And the benefits? Mr Ferguson points to faster delivery of service – a virtual machine is already in place when a new application is requested – better disaster recovery capability and less need for manual control of the systems: “By default now, any new application we install will be a virtual machine unless there is a very good reason why it has to be on dedicated hardware,” Mr Ferguson says.

While adoption of virtual solutions is still at an early stage, manufacturers of all levels of data processing equipment are increasingly placing their bets on the technology.

AMD, for example, the US-based processor manufacturer fighting to take market share from Intel, the market leader, has built virtualisation features into its next generation of “Opteron” processor chips.

Margaret Lewis, an AMD director, explains: “We have added some new instructions to the x86 instruction set [the hardwired commands built into the industry standard microprocessors] specifically for virtualisation software. And we have made some modifications to the underlying memory-handling system that make it more efficient. Virtualisation is very memory intensive. We’re tuning the x86 to be a very effective virtualisation processor.”

Intel, of course, has its own virtualisation technology that enables PCs to run multiple operating systems in separate “containers”.
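On Linux, the presence of these chip-level extensions shows up in the CPU flags: “svm” marks AMD’s virtualisation instructions and “vmx” marks Intel’s. The helper below is an illustrative sketch (the function name and sample text are ours for the example; the flag names and the /proc/cpuinfo path are standard):

```python
# Hypothetical sketch: detecting hardware virtualisation support on Linux.
# "vmx" indicates Intel VT; "svm" indicates AMD's Opteron-era extensions.

def virtualisation_flags(cpuinfo_text):
    """Return the set of virtualisation-related CPU flags found."""
    found = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # The flags line looks like "flags : fpu vme ... svm ..."
            found.update({"vmx", "svm"} & set(line.split(":", 1)[1].split()))
    return found

# In practice you would read the real file:
#     flags = virtualisation_flags(open("/proc/cpuinfo").read())
sample = "processor : 0\nflags : fpu vme de pse tsc msr svm lahf_lm\n"
print(virtualisation_flags(sample))  # -> {'svm'}: an AMD-V capable core
```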

And virtualisation is not limited to the idea of running multiple operating systems on a single physical machine. SWsoft, an eight-year-old software house with headquarters in Herndon, Virginia, and 520 development staff in Russia, has developed a system it calls “Virtuozzo” that virtualises the operating system.

This means that within a single physical server the system creates a number of identical virtual operating systems: “It’s a way of curbing operating system ‘sprawl’,” says Colin Wright, SWsoft enterprise director, comparing it with “server sprawl”, which is one of the targets of VMware.

Worldwide, 100,000 physical servers are running 400,000 virtual operating systems under Virtuozzo. Each of the virtual operating systems behaves like a stand-alone server.

Mr Wright points out that with hardware virtualisation, a separate licence has to be bought for each operating system. With Virtuozzo, it seems only a single licence need be bought.

This does raise questions about licensing, especially where proprietary software such as Windows is involved. Mr Wright complains that clarification from Microsoft is slow in coming. “It’s a grey area,” he says. “The licensing bodies are dragging their heels.”

In fact, the growth of virtualisation seems certain to open can after can of legal worms. Hard experience shows vendors are likely to blame each other for the failure of a multi-vendor project.

So who takes responsibility when applications are running on a virtual operating system in a virtual environment? The big fear is that it will be virtually no one.
Copyright The Financial Times Limited 2006

They are the future – and they’re coming to a workplace near you (FT)

By Lee Rainie
Published: September 19 2006 18:20 Last updated: September 19 2006 18:20

As consultant Marc Prensky calculates it, the life arc of a typical 21-year-old entering the workforce today has, on average, included 5,000 hours of video game playing, the exchange of 250,000 e-mails, instant messages and phone text messages, and 10,000 hours of mobile phone use. To that you can add 3,500 hours of time online.

Friday, September 08, 2006

Open Text: Swimming With the Big Fish (AMR)

Open Text’s final results were in line with its July 5 preannouncement. Total 4Q06 revenue fell 3.8% to $105.2M year to year, and earnings per share were 16 cents on a GAAP basis. Open Text is among the top four enterprise content management (ECM) players, alongside EMC Documentum, FileNet, and IBM.

Wednesday, September 06, 2006

New version of BEA WebLogic Real Time available – high-performance Java for the financial world (contentmanager.de)

BEA Systems has released WebLogic Real Time Core Edition 1.1, a product that enables real-time Java applications. Predictable response times are now three times faster, and in the benchmark application the solution delivers a latency of at most 30 milliseconds. This makes Java programs suitable for live environments. The release is available for download at http://www.bea.com/realtime.

Tuesday, September 05, 2006

Community platforms and content management – part 1/3 (contentmanager.de)

Part 1: Communities and user-generated content. The buzzword “Web 2.0” stands for technical as well as editorial trends. “Communities” and “user-generated content” (UGC) are a challenge not only for portal operators but also for vendors of CMS and community software: the task is to forge both systems into a homogeneous whole. Using the Swedish portal http://www.contentmanager.de/_tools/urltracker.php?url=www.expressen.se as an example, this three-part series shows how communities and UGC affect content management.

Friday, September 01, 2006

Hummingbird Enterprise and RedDot XCMS named “Trend-Setting Products of 2006” (contentmanager.de)

For the fourth year in a row, analysts, users and editors of KMWorld Magazine have voted Hummingbird onto the winners’ list of best products.

Tuesday, August 29, 2006

Enterprise Applications Offer a Glimpse of Google's Ambitions (Gartner)

Google will introduce communications applications intended for use within enterprises. Service-level agreements, security and support will determine whether these applications will catch on within their target market.


Monday, August 28, 2006

Friday, August 25, 2006

IBM expands its storage systems portfolio (contentmanager.de)

New turbo models in the IBM System Storage DS8000 series and the enterprise-class N series / higher performance, simplified system management and attractively priced solutions

Monday, August 21, 2006

Web Content Management: Tridion Named A Global Content Management Leader By Globalization Research Firm (ECM Connection)

Tridion, a leading global provider of web content management solutions, today announced that Common Sense Advisory Inc., an independent business globalization research and consulting firm, has named it a category leader among web content management (WCM) platforms for global and multilingual support. The report, entitled "Global Content Management Technology: A Buyer’s Guide," listed Tridion as one of the leading global content management technology vendors.

The research firm interviews 16 content management suppliers, provides a SWOT assessment for each, and offers best practices, case studies, and technology recommendations for GCM.

Wednesday, August 16, 2006

Monday, August 14, 2006

IBM Looks to FileNet Buy for Content, Process and Compliance (Gartner)

By buying its rival in the enterprise content management market, IBM gains content-centric business process management capabilities. We believe IBM's recent acquisitions point to a deeper strategy focused on service-oriented architecture.

The Hidden Flaw in Web 2.0 (FTD)

New-generation web-driven applications make it easy for users to build sites - but also for virus writers to attack. While the technology and tools may bring new freedoms, they also represent virgin territory for virus writers and identity thieves. A dark side to the shiny new world of Web 2.0 is being exposed by virus writers and the internet security companies that counter them. The linked-up, sharing, live-updating melting pot of web technologies that has been dubbed the second version of the web is proving fertile ground for infiltrators seeking to inject malicious code into the mix.

Friday, August 11, 2006

Open Text/Hummingbird Deal May Cause Concern Among Customers (Gartner)

Hummingbird has accepted a takeover bid from its rival Open Text. The deal will change the competitive landscape in the enterprise content management market. Hummingbird customers could face decisions on migration.

Weblog marketing: are you giving away potential? (contentmanager.de)

Are you exploiting the full online-marketing potential of your blog? (Good) weblogs offer the best preconditions for strong search-engine rankings, since they are linked to by other bloggers and updated frequently. Add to this search-engine-friendly HTML code: weblog layouts are usually implemented with CSS, so layout markup that means nothing to search engines is “outsourced” from the HTML. To achieve lasting top positions in hotly contested keyword areas, however, a little more is required ...

Thursday, August 10, 2006

IBM Snaps Up FileNet for $1.6B (AMR)

IBM has announced plans to acquire FileNet in an all cash deal valued at $1.6B. The deal is expected to close sometime in the fourth quarter of 2006.

FileNet is one of the leaders in the enterprise content management (ECM) market. In business since 1984, FileNet predated the Internet boom and has been vital in helping companies digitize their document-intensive processes. Over the course of time, FileNet has expanded its core document repository capabilities, most prominently promoting business process management (BPM) in recent years.

In the ECM space, IBM has been one of FileNet’s most frequent competitors. While it has made numerous acquisitions related to content management in the past, including a July 2003 acquisition of web content management (WCM) vendor Aptrix and an August 2004 acquisition of content integrator Venetica (see “When Little Acquisitions Mean a Lot: IBM Acquires Venetica”), this is by far IBM’s most significant acquisition. In fact, it’s the most significant event in ECM since EMC acquired Documentum in October 2003 (see “Hindsight on Documentum’s Content Management Leadership; Foresight on EMC’s Direction”).

In IBM’s view, content management is becoming more vital as a component of an enterprise IT strategy to support decision making, information analysis and response, and to address issues like compliance, which encompasses every form of information and documentation an organization employs.

The FileNet side

To some extent, this is a far longer story than people think—one about FileNet finding a buyer—sparked by EMC’s Documentum acquisition.

Like its traditional competitors, FileNet has been under pressure to differentiate itself in a market under serious threat of commoditization, especially with the encroachment of the infrastructure heavyweights, Oracle, Microsoft, and of course, IBM, on its market. While its competitors gobbled each other up by amassing loosely related content management categories, FileNet’s growth has been conservative and organic. With a couple of minor distractions, FileNet kept close to its core document management capability, building its own BPM and records management systems to suit customers’ developing needs.

But the constant pressure to differentiate its products to suit specific customer and industry needs against its behemoth competitors may have been forcing FileNet into less fruitful niches. Undoubtedly, FileNet also found it had to embrace rather than work against new competitors like Microsoft, Oracle, and IBM itself, meaning it had to support various infrastructure platforms.

The IBM side

Despite FileNet’s worthy product developments over the past few years, the acquisition is hardly about technology. With the exception of some industry-specific applications built on the FileNet platform, IBM already has all the technology pieces. The motivation for the move is more likely acquiring customers and gaining a better competitive position relative to the ECM market and the enterprise IT market as a whole.

FileNet has a very strong position among its 4,700 customers, treated as a system of record almost as central to the success of document-intensive organizations and industries—banking, insurance, financial services, government, telecommunications, and utilities—as the ERP backbone. IBM will get these valuable customers.

As for the competitive factors within the ECM market, IBM has increasingly found itself vying for deals against FileNet, EMC, and Open Text on the content management front. Taking on FileNet means taking out one of the competitors and gaining an immediate market share boost—to No. 1 by a considerable distance over EMC and Open Text. Moreover, more customers are establishing company-wide ECM standards, most often settling on one of the several providers they already use in house. So, IBM ensures that it’s called to the table far more often when such a standards decision happens.

As for the broader market, IBM’s competitive focus is on Oracle and Microsoft, both of which have expressed late interest in the ECM market. Simply put, they all realize that their success depends on being able to manage more of your information, with more of it needing to be managed because of compliance regulations and other concerns, and that more of it is not in databases. Moreover, being treated as a central source of information allows IBM to sell additional products and services, to remain entrenched in the business, and to adapt and respond to market demand as it occurs.

Oracle’s approach to content management was only recently rearticulated when it announced, along with numerous complementary content management partners, its newly positioned Oracle Content Database and Oracle Records Database products. While it’s a more definitive and realistic position in the ECM market than Oracle has ever had before, it won’t likely be aggressive enough in light of IBM’s move. An acquisition—think Open Text, Interwoven, or Vignette—is an even more distinct possibility.

While IBM and Oracle have been battling it out at the database or repository level, Microsoft is working the desktop. Introducing ECM to the masses through the Microsoft Office System, and specifically SharePoint, could be the most compelling angle of all.

The customer side

The most loyal and longstanding FileNet customers may feel a sense of relief, or they may feel a sense of fear. Those most fully invested in FileNet as the backbone for vital business processes, and those that have gone through considerable effort to upgrade to FileNet’s modernized P8 architecture, will undoubtedly be worried about the prospect of a rip and replace. IBM has so far been mum on the topic of product rationalization and consolidation, and we won’t hear about specific plans until after the acquisition is complete.

But to speculate on that topic: despite the considerable overlap, IBM won’t likely try to merge the product lines for the foreseeable future. A development effort to combine the products would be too expensive for IBM and too disruptive for customers. In the meantime, both companies’ conformance to emerging content management standards and IBM’s Venetica acquisition (remember—“content integration”) should make integration and extension, rather than rip and replace, the direction for the future.

IBM buys FileNet for $1.6bn (FT)

IBM threw down a challenge to database rival Oracle on Thursday with a $1.6bn acquisition that marks its biggest software purchase in three years.

The acquisition of FileNet, a company that sells the content management systems that companies use to capture and manage many different types of digital documents, will also push IBM deeper into the software business at a time when its bigger hardware and services divisions have been struggling to grow.

The deal is set to double IBM’s share of the content management business and put it ahead of current industry leader Documentum, which was bought three years ago by storage company EMC.

Between them, IBM and FileNet accounted for about 18 per cent of the $3.2bn content management business last year, compared with 11 per cent for EMC, according to figures compiled by IDC.
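In dollar terms those IDC shares work out roughly as follows. This is only a back-of-envelope sketch using the percentages and market size quoted above:

```python
# IDC figures as cited: a $3.2bn content management market, with
# 18% for IBM plus FileNet combined and 11% for EMC.
market = 3.2e9

ibm_filenet = 0.18 * market  # about $576m of content management revenue
emc = 0.11 * market          # about $352m

print(round(ibm_filenet - emc))  # a lead of roughly $224m
```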

That will put further pressure on Oracle to respond with an acquisition of its own, said Jim Murphy, a research director at AMR Research.

By helping to organise the many different types of “unstructured” data that companies amass, companies such as FileNet have become a critical part of the broader information management software that companies like IBM and Oracle are looking to sell. Shares in Open Text, the largest remaining independent company in the sector, climbed 3 per cent on news of the FileNet acquisition.

While FileNet “does not have any capabilities they don’t already have themselves,” the purchase of a company that last year had total revenues of $423m and net income of $40m will bulk up IBM’s presence in a corner of the software business that is likely to continue to see solid growth, said Mr Murphy.

The acquisition comes at a time when Oracle threatens to overtake IBM as the world’s second-largest software company, thanks to faster growth in its core database business and a string of big acquisitions of its own.

Last year Oracle’s revenues jumped to $14.4bn, while IBM’s revenues from software edged up by 4.4 per cent, to $15.7bn.

Sam Palmisano, IBM’s chief executive, has recently accelerated the company’s push into the infrastructure software and middleware on which large corporate IT systems are built. IBM has made more than 50 software acquisitions in the past 10 years.

By comparison other big hardware companies, moving later than IBM into a business with greater profit and growth potential than their traditional businesses, have responded with more sizeable deals.

HP’s planned acquisition of Mercury Interactive, announced last month, will double its software revenues to around $2bn, putting it among the top dozen software companies, alongside EMC, which has also mounted a series of big purchases.

In the most recent quarter, IBM’s software division accounted for 19 per cent of the group’s overall revenues but 40 per cent of its gross profits.
Copyright The Financial Times Limited 2006

Wednesday, August 09, 2006

Stellent's Acquisitions Seek to Boost Content Security (Gartner)

SealedMedia and Bitform will provide complementary technologies for Stellent's content management platform. To succeed, Stellent must quickly integrate these with its compliance and Web content and records management applications.

Monday, August 07, 2006

Forbes sells stake to Bono buy-out fund (FT)

The Forbes family has agreed to sell a large minority stake in its media empire to Elevation, the buy-out fund of rock star Bono, in a deal worth about $300m. The move highlights how private investors are seeking bargains in the struggling business of print journalism.

Friday, August 04, 2006

Stellent Makes Acquisitions (Line56)

Addressing metadata security and DRM with acquisitions of SealedMedia and Bitform

Enterprise content management (ECM) company Stellent has acquired SealedMedia and Bitform. SealedMedia's purchase price was $10 million while Bitform went for $1.2 million in cash and $1.3 million in potential incentive-based payments.

Wednesday, August 02, 2006

Interwoven WorkSite replaces existing document management solution (contentmanager.de)

Interwoven, Inc. (Nasdaq: IWOV), a provider of enterprise content management (ECM) solutions, counts the renowned German law firm Gleiss Lutz among its new customers. The firm has purchased the Interwoven WorkSite document management solution for its seven offices across Europe, to further increase the efficiency of its day-to-day client work. The new customer is one of Germany’s leading internationally active law firms. With more than 220 lawyers, it advises companies and corporate groups in Germany and abroad, as well as public-law bodies and institutions.

Its practice covers all areas of business law. The firm has offices in Berlin, Frankfurt, Munich and Stuttgart, among others. Since 2000, Gleiss Lutz has had an alliance with the top firms Herbert Smith LLP in England and Stibbe in the Netherlands/Belgium. With a total of 1,500 lawyers in more than 20 offices across Europe, Asia and the US, the firms work together on cross-border mandates, above all international transactions, banking and finance work, and international arbitration and litigation.

Monday, July 31, 2006

Accessible Web 2.0 (contentmanager.de)

What exactly was Web 1.0, now that everyone is talking and writing about Web 2.0? Presumably we are already at Web 9.4. Let’s be pragmatic about it: Web 2.0 is a kind of umbrella term for the renewed surge of new media, so to speak a second run-up with partly new technologies and old ideas.

The spearhead is probably AJAX, which by now is a familiar term to many people, though mostly when pinned to practical examples such as Google Maps. Many people will hardly notice a real difference from a conventional web application, and yet AJAX is set to reform the internet lastingly: the range of these dynamic helpers runs from auto-completion while typing search terms, and the correction of typos along the lines of “Did you mean ...”, all the way to Google Maps and WYSIWYG editors with AJAX support.

Knowledge sharing with wikis: just get started (contentmanager.de)

Knowledge bases can be built not only with expensive tools but also with wikis. The most important advantages of such implementations are cost-free administration and simple linking. In technical documentation, too, wikis are the best solution for many an application.

Wiki platforms

Wiki applications