Wednesday, April 30, 2008

FT.com / Technology / Digital Business - Personal view: 3D brings challenges for the world wild web


Personal view: 3D brings challenges for the world wild web
By David Wortley

Published: April 30 2008 04:12 | Last updated: April 30 2008 04:12

The future for IT is 3D. The onscreen desktop will be replaced by a doorway to walk through, the typical corporate website by worlds to explore.

There is nothing fanciful or futuristic about that. As the limits of technology fall away, IT portals can become more intuitive, and resemble more closely the physical world we are used to dealing with.

Serious virtual worlds have the potential to make conventional websites seem about as effective a business tool as a leaflet.

Of course, there is the added depth of functionality and the potential uses of the virtual world – but it goes further.

Virtual worlds have the ability to get to the crux of the issue of customer contact, offering a halfway house between the flat efficiencies of the website or call centre and the high costs of face-to-face interaction and the physical and branded location of an office or retail outlet.

The virtual world offers a sense of place and genuine interaction for large audiences at an affordable price.

How many people click out of a website because they cannot find what they want or immediately see what is relevant to them? To overcome this, the office of PA Consulting in the Second Life virtual world is staffed constantly by a team of Second Life PA avatars in locations around the world.

It means the organisation has trained “greeters” who can find out more about visitors, what kinds of services they are looking for and offer them what they need.

A little human charm changes the nature of the relationship between the user and the software, encourages greater interest, more thought. In a virtual world, relationships with customers can be developed through meetings across a desk with an avatar, using an audio or video conversation.

BP is trialling the idea of using Second Life as a place for employees to meet a counsellor or manager to talk about issues they might feel are too sensitive to discuss face-to-face.

In general, virtual worlds are ideal for hosting events that can bring together customers, experts and star “draws” internationally, involving speakers who can take part from home with a PC and a microphone. This leads to what Cisco’s Christian Renaud has been calling “serendipitous meetings”, the kind of unlikely meetings between people to exchange ideas and talk about partnerships that would not otherwise happen.

There is the opportunity for promotion, building customer loyalty and viral marketing through offering virtual objects. The average Second Life user is acquisitive, keen to have a distinctive appearance, and – in a world where every detail has to be created from nothing – keen on any kind of “stuff” that can be shared with others.

The experience of virtual worlds is “real” enough to ensure people maintain a strong sense of self-awareness. Recently, we created avatars for a couple of MPs visiting us at the Serious Games Institute, giving them the chance to speak and interact with an audience in a virtual world. The standard uniform for avatars being jeans and T-shirts, I had to do some shopping to find the appropriate suits and ties they could wear.

But big business needs to be cautious about the growing market for virtual objects. When Nissan wanted to launch its latest sports car, it had the idea of a huge vending machine in Second Life which would give away models of the car for people to drive around in.

Rather than being regarded as a treat, this giveaway upset the spirit of Second Life. A number of the population had managed to build little businesses from creating and selling virtual vehicles to other users. A business giant coming in and giving away sports cars for nothing became the subject of gossip and led to a boycott of the Nissan island.

The processing power required to facilitate the shift to 3D IT is an issue for the moment, but it’s only a temporary one. The biggest challenges concern interoperability and security.

Just as the standard HTML language was the making of the web, so a universal avatar, capable of slipping effortlessly between one virtual world and another, will be the making of 3D environments.

A consortium of organisations in the US is pushing for a recognised standard that will allow this to happen. As previous attempts to create standards have shown, however, it is not going to happen without friction between the commercial organisations building their proprietary environments and virtual customer bases.
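
As a thought experiment, a "universal avatar" amounts to a world-neutral serialisation of identity, appearance and inventory that any virtual world could import. The sketch below is purely hypothetical: none of the field names come from any actual interoperability proposal.

```python
import json
from dataclasses import dataclass, asdict

# A hypothetical portable-avatar profile. The fields are invented for
# illustration, not taken from any real standard.
@dataclass
class AvatarProfile:
    display_name: str
    appearance: dict   # shape and clothing parameters a world can render
    inventory: list    # identifiers of owned virtual objects

def export_profile(p: AvatarProfile) -> str:
    """Serialise to a world-neutral wire format (here, plain JSON)."""
    return json.dumps(asdict(p), sort_keys=True)

def import_profile(blob: str) -> AvatarProfile:
    """Reconstruct the profile on the receiving world's side."""
    return AvatarProfile(**json.loads(blob))
```

The friction the article anticipates lives precisely in the `appearance` field: each proprietary world renders avatars differently, so agreeing on those parameters is where standardisation efforts tend to stall.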

For commercial operations to settle with confidence into virtual worlds, far more work is going to be needed on security. It is virtually impossible to find out the real identity of people behind the avatars – meaning they have no responsibility for what they do.

Web visitors to company sites are similarly anonymous, but they do not have the same opportunity to abuse staff, band together to organise protest raids, or generally upset other visitors.

Some form of digital signature will be needed to ensure avatars are held to account for their actions, just as they would be in the real world.
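
One way to picture such accountability: each action an avatar takes is tagged so that the operator can later verify who did what. The sketch below is a toy stand-in, using an HMAC with a registry-held secret rather than the public-key signatures a real scheme would need; all names are invented.

```python
import hmac
import hashlib
import json

def sign_action(secret: bytes, avatar_id: str, action: str) -> dict:
    """Attach a verifiable tag to an avatar's action record."""
    record = {"avatar": avatar_id, "action": action}
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return record

def verify_action(secret: bytes, record: dict) -> bool:
    """Recompute the tag over everything except the signature itself
    and compare in constant time."""
    claimed = record.get("sig", "")
    payload = json.dumps(
        {k: v for k, v in record.items() if k != "sig"}, sort_keys=True
    ).encode()
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(claimed, expected)
```

Any tampering with the record after the fact breaks verification, which is the property an accountability scheme needs.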

In many ways, the growth of virtual worlds is like the frontier towns of the Wild West, where new social forms were worked out messily and in public. In the same way, new codes of behaviour will eventually be adopted.

David Wortley is director of the Serious Games Institute, Coventry University.
Copyright The Financial Times Limited 2008

ECM = integration | Opinion | ECM | Computable.nl


ECM = integration

Enterprise content management is often marketed as a separate product. According to these products, content management should take place as much as possible within the product itself. This is really an odd way of thinking. In principle, ECM should be much more a vision than a product. Because content is everywhere: in the product database, in the CRM package, on the website, in Word and Excel documents, in Exchange/Outlook, in the wiki, on the intranet.

What a truly overarching ECM product should be is one big integration application: a product that pulls content in from everywhere, converts it into a uniform whole, makes it editable and manageable, but also writes it back to the content silo it came from. A mapping, in other words, between content in the various applications and content in the ECM package.
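
The mapping idea can be sketched in miniature: adapters pull items out of each silo into one uniform record, edits happen centrally, and changes are written back to the silo of origin. Everything here, the class names and the toy in-memory silos, is invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical uniform record an ECM layer could map silo content onto.
@dataclass
class ContentItem:
    silo: str       # where the item lives (e.g. "crm", "wiki")
    native_id: str  # identifier inside that silo
    title: str
    body: str

class SiloAdapter:
    """Pulls items out of one content silo and writes changes back."""
    def __init__(self, name, store):
        self.name, self.store = name, store

    def pull(self):
        return [ContentItem(self.name, k, v["title"], v["body"])
                for k, v in self.store.items()]

    def push(self, item):
        self.store[item.native_id] = {"title": item.title, "body": item.body}

# Two toy dict-backed silos standing in for a CRM system and a wiki.
crm = {"c1": {"title": "Account X", "body": "old text"}}
wiki = {"w1": {"title": "HowTo", "body": "steps"}}
adapters = [SiloAdapter("crm", crm), SiloAdapter("wiki", wiki)]

# Aggregate everything into one uniform view...
catalogue = [item for a in adapters for item in a.pull()]

# ...edit centrally, then write back to the silo of origin.
for item in catalogue:
    if item.silo == "crm":
        item.body = "updated text"
        adapters[0].push(item)
```

The point of the sketch is the round trip: content is never trapped in the ECM layer, it only passes through it.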

Instead, the ECM market is currently dominated by very large software packages that mainly do their best to keep content IN those packages. That looks like a simple solution, but it is an oversimplification of reality and in the long run produces an information management nightmare. What organisations really want above all is integration, aggregation and syndication of content.

It is not that nothing is happening. Several initiatives have been started to improve content exchange. The Java Community Process, the standards organisation for Java technology, launched JSR-170 and its successor JSR-283, a Java standard that standardises the protocol for accessing content. The OASIS standards organisation has developed dozens of open standards for structured information. ISO's Dublin Core standard offers guidance for defining metadata. The World Wide Web Consortium (W3C) maintains dozens of very well-known standards such as XHTML, HTTP, XSLT, CSS and SOAP.

Although some of these initiatives are very successful, it remains striking how the ECM industry takes the easy way out: not adopting standards but defining its own proprietary formats yet again, and above all keeping content as much as possible inside its own information silo instead of pulling it from the existing content silos. A missed opportunity?

Friday, April 25, 2008

FT.com / Home UK / UK - ‘Massive deal’ from Autonomy


‘Massive deal’ from Autonomy
By Tom Braithwaite and Philip Stafford

Published: April 24 2008 22:21 | Last updated: April 24 2008 22:21

Shares in Autonomy tumbled yesterday despite record first-quarter results and the announcement of a “massive deal” with Deutsche Bank, as the search software company benefited from the subprime crisis.

Shares in the Cambridge-based company dropped 146p, or nearly 15 per cent, to 844p after it met analysts’ forecasts but failed to trigger the raft of immediate upgrades that some had expected.

Pre-tax profit in the three months to March 31 rose 47 per cent to $23.6m (£12m) on revenue up from $65.5m to $105.1m as it attracted new customers including Barclays Capital and Michelin. Earnings per share rose from 7 cents to 10 cents.

Autonomy’s products’ ability to sort through unstructured data, from text to phone calls, has attracted customers ranging from governments’ intelligence agencies to banks looking to beef up compliance procedures and prepare for lawsuits related to the credit squeeze.

Autonomy also announced a deal for compliance software with Deutsche Bank, although Autonomy itself did not name the bank. Mike Lynch, chief executive, said it was “expected to be one of the most significant contracts Autonomy has won” and was worth at least $20m over two years.

“Various sectors shifted spending from general IT to regulatory and litigation- related purchases, making the direct effect of the subprime crisis a net positive for our business,” said Mr Lynch. He added Autonomy would maintain its “conservative view” on prospects.

It flagged that “some customers delay[ed] payments until immediately after quarter end” but said cash collection had recovered. Cash balances rose by $2.9m from the end of the previous quarter to $95.5m; the company is debt free and Mr Lynch said he would consider returning cash to shareholders with $100m “probably the magic number”.

FT Comment
● Following a strong run, yesterday’s 15 per cent drop should bring a more sober aspect to Autonomy’s share price. A prospective p/e ratio of about 30 times earnings was priced to perfection, given its low earnings visibility. In the face of an economic slowdown, upgrades may not appear this year. Until further guidance comes through, shares may be range bound from here.
Copyright The Financial Times Limited 2008

FT.com / Companies / By region - Autonomy gets credit crisis boost


Autonomy gets credit crisis boost
By Tom Braithwaite

Published: April 24 2008 09:18 | Last updated: April 24 2008 09:18

Autonomy produced record first quarter results on Thursday and announced a “massive deal” with an investment bank as the search software company benefited from new customers and the subprime crisis.

But shares in the Cambridge-based company fell 66p, or 6.7 per cent, to 924p in early trading - capping a strong run - after the company met analysts’ forecasts but failed to trigger the raft of immediate upgrades that some had expected.

Pre-tax profit in the three months to March 31 rose by 47 per cent to $23.6m on revenue that rose from $65.5m to $105.1m as Autonomy attracted new customers including Barclays Capital and Michelin. Earnings per share increased from $0.07 to $0.10.

Autonomy’s products’ ability to sort through unstructured data, from text to phone calls, has attracted customers ranging from banks, looking to beef up compliance procedures and prepare for lawsuits related to the credit squeeze, to governments’ intelligence agencies.

“Various sectors shifted spending from general IT to regulatory and litigation-related purchases, making the direct effect of the subprime crisis a net positive for our business,” said Mike Lynch, chief executive. “We have decided to maintain our conservative view on prospects, which we will review if, as expected, current strength continues.”

Mr Lynch also announced a “massive deal” with an unnamed investment banking client related to litigation. He said it was “expected to be one of the most significant contracts Autonomy has won”.

Autonomy flagged that “some customers delay[ed] payments until immediately after quarter end” but said cash collection had since recovered. Cash balances increased by $2.9m from the end of the previous quarter to $95.5m; the company is debt free and Mr Lynch said he would consider returning cash to shareholders with cash of about $100m - “probably the magic number”, he said.

“A solid set of results combined with a cautiously optimistic outlook,” said Derek Brown, analyst at Seymour Pierce.

Copyright The Financial Times Limited 2008

Wednesday, April 23, 2008

contentXXL ASP.NET CMS - Online Consulting is enthusiastic about the content relationship management / knowledge management of the contentXXL CMS


The Swiss IT service provider Online Consulting AG of Wil, Switzerland, is enthusiastic about contentXXL's content relationship management. This is one reason it uses the Microsoft .NET-based business content management system (CMS) as the platform for its recently relaunched website. Content relationship management is a standout feature of contentXXL and makes it possible to relate individual pieces of content to one another. Related content objects (manuals, contact data, further links, alternative products) are linked as they are entered and then presented together, in whatever context makes sense, on different pages of the site. These objects can be reused elsewhere or in other languages as the occasion demands, without having to be maintained more than once. Online Consulting uses content relationship management, for example, to offer visitors to the products section of the website references, relevant articles, further links or documents at the same time.

FT.com / Companies / IT - Microsoft unveils hybrid computing platform


Microsoft unveils hybrid computing platform
By Richard Waters in San Francisco

Published: April 23 2008 03:55 | Last updated: April 23 2008 03:55

Microsoft unveiled on Wednesday the first important piece of a new hybrid computing platform intended to ease the transition of its core software business to the web.

The move comes two-and-a-half years after Bill Gates, chairman, warned that the rise of internet computing could one day threaten Microsoft’s desktop software business.

It is the clearest evidence so far of the influence of Ray Ozzie, who took over from Mr Gates as the company’s chief software architect in 2005.

The new technology, known as Live Mesh, is designed to free a consumer’s data from the PC or other device where it resides and place copies of it automatically on any other internet-connected gadget, or make it available through a web browser.

Microsoft executives said the Mesh could make it possible for people to access digital music stored on their home PC from any device or computer, work on documents that were entered on other computers or share their photos and other media automatically with friends over the internet.

The technology is being launched in an early test version, with a full trial scheduled for later this year, Microsoft said. The group did not say when it expected consumer services based on the idea to be available.

By giving users a way to copy data easily to Microsoft’s servers and then work on it in a “virtual desktop” through a browser, the idea echoes the so-called “cloud computing” strategies of companies such as Google.

Microsoft said it would guarantee at least five gigabytes of storage free of charge.

However, Microsoft’s plan adds a further element, making it possible to “sync” information automatically between a user’s computers and other digital devices, creating what it called a personal “device mesh”.

Users who first register all their devices on a Microsoft website will be able to copy information between them simply by “right clicking” on the relevant folders, Microsoft said.

This peer-to-peer system, though using the internet as a hub, leaves data and the programs needed to manipulate it on the “client” devices – an idea that ties in with Microsoft’s argument that the internet is not yet ready to replace all client-based computing, and Mr Ozzie’s own long-running work in his earlier companies on similar peer-to-peer technologies.
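
Stripped of Microsoft's service layer, the core of such device-to-device sync is simple: compare each file's state on both sides and copy whichever copy is newer. A minimal sketch, assuming modification times are a reliable tiebreak; real sync engines track far richer metadata and handle conflicts explicitly.

```python
import shutil
from pathlib import Path

def sync_pair(a: Path, b: Path) -> None:
    """Two-way sync between two folders: copy each file to the side
    where it is missing or older. Modification time is a toy stand-in
    for the change metadata a real mesh service would keep."""
    for src, dst in ((a, b), (b, a)):
        for f in src.rglob("*"):
            if not f.is_file():
                continue
            target = dst / f.relative_to(src)
            if (not target.exists()
                    or target.stat().st_mtime < f.stat().st_mtime):
                target.parent.mkdir(parents=True, exist_ok=True)
                # copy2 preserves the timestamp, so the reverse pass
                # does not copy the same file straight back.
                shutil.copy2(f, target)
```

After one pass each side holds the union of both folders, which is the user-visible effect the "device mesh" description promises.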

“It has the potential to be visionary – there is something more powerful than the cloud, it’s the cloud plus the device mesh,” said David Smith, an analyst at Gartner.

As a transitional technology that does not rely on a complete shift to the web, the plan is designed to protect Microsoft’s own earlier technology investments, as well as those of customers who rely on its software, he added.

Adoption of the technology would depend on how closely Microsoft integrates the Mesh idea into its other software, and whether other developers adopted it and built applications around it, Mr Smith said. Reflecting Microsoft’s increasing move away from a Windows-only computing world, the company said it would soon make versions of the technology available that run on a wide range of internet browsers as well as Apple’s Mac operating system.

Mr Ozzie said in an internal memo that the aim was to link all internet-enabled devices, “not just PCs and phones but TVs, game consoles, digital picture frames, DVRs, media players, cameras and camcorders, home servers . . . our car’s entertainment and navigation systems, and more”.
Copyright The Financial Times Limited 2008

Tuesday, April 22, 2008

'Workspace is changing fundamentally' | News | Internet | Computable.nl


The way companies operate and communicate with staff and customers will change fundamentally in the coming years. New web technologies are making work more efficient and less tied to one place. So says researcher Forrester on the eve of a Web 2.0 conference.

Friday, April 18, 2008

One platform for all online magazines: contentXXL CMS


Publishing houses today run online editions alongside their print publications, but often fail to exploit the synergies the internet offers them, such as linking content or sharing applications. The Swiss publisher KünzlerBachmann Verlag of St. Gallen works differently. After relaunching the online presence of the children's and youth magazine "Spick.net" in 2007 with the help of the content management system (CMS) contentXXL, it has now launched the modern internet family magazine "swissfamily.ch" on the same platform. The clear advantage of this approach is that all the contentXXL modules in use, as well as the portals' content, can be shared between both (and further) internet portals. With a few clicks, the editor decides which content appears at which navigation point under which URL.

As with "Spick", the project is in the hands of the publisher's long-standing internet partner, Online Consulting AG of Wil, Switzerland.

KünzlerBachmann Verlag manages several print and online products centred on the young family. Last year, with its decision in favour of the Microsoft .NET-based CMS contentXXL, it launched an optimisation drive for all its online magazines and portals: first "spick.net", and this year the overarching family portal "swissfamily.ch".

Enterprise Content Management for SMBs


Most vendors have given SMBs pretty short shrift. That's got to change--content management is a good place to start.

Long thought of as the second-class citizens of IT, small-to-medium businesses (SMBs) drew the focus of major and mid-range IT vendors alike in the late 1990s for a number of reasons. First, the era of big, enterprise installed applications such as enterprise resource planning (ERP) seemed about over. SAP and other vendors had saturated the high end of the customer base and needed to go downmarket. Second, the SMB market was supposed to dramatically outgrow the Global 2000 market in the next 10 years. Third, in order for those new SMBs to compete with G2000s, they would need to adopt technologies that would level the playing field—specifically, Internet and Web content management technologies that allow the creation of virtual storefronts with customer service.

Major players like Microsoft launched new divisions supposedly devoted to serving SMBs. Others, like IBM with its Express line, debuted mid-range versions of older enterprise products for SMBs. Still others acquired mid-range offerings, as EMC/Documentum did with its OTG acquisition. Application service providers (ASPs) sprang up all over the place, purportedly to provide SMBs with enterprise applications online. Even small vendors chanted the “small is beautiful” mantra. Well, it all sounded good, but most SMBs found that these players paid them little more than lip service. If, in fact, their booming numbers did compensate in volume of smaller individual sales for the thinner margins they required, then IT vendors had their bases covered. But they weren’t going to promote and seed a market that might never materialize.

The fact is, says Tom Eid, principal analyst, Content, Communication and Collaboration, Gartner, “vendors pay more attention to larger companies in terms of marketing, providing advanced releases, and other benefits.” That’s because “SMBs do not buy technology—instead, they tend to buy solutions that solve their business problems,” says Sanjeev Aggarwal, senior analyst, Small and Medium Business Strategies, Yankee Group. In other words, SMBs don’t madly pursue the state of the art because they generally lack the funds and/or IT resources. They buy solutions that do things like help them be more profitable, cut costs, and be more efficient.

FT.com / Home UK / UK - Notebook computers to go


Notebook computers to go
By Paul Taylor

Published: April 18 2008 03:00 | Last updated: April 18 2008 03:00

Those with long memories and large coat pockets may recall the "sub-notebook" PCs of the 1990s and early 2000s, such as Psion's Series 7, Toshiba's Libretto, Compaq's Contura Aero and Olivetti's Quaderno.

The concept is making a comeback as a high-performance, "ultra-portable" device that slots neatly in between full-size laptops and hybrid devices such as the HTC Shift ( www.htc.com ) with its tilting 7in touch screen, slide-out qwerty keyboard and built-in 3G cellular data connection.

Asus, the Taiwanese PC-maker, has scored an unexpected hit with the Asus Eee PC ultra-portable notebook, with prices starting at £183 in the UK. The Eee PC, based on Intel's ClassmatePC project, is designed to run internet-based applications and was targeted at the education market. But it has been snapped up by mobile professionals looking for a low-cost portable device with a reasonable, albeit somewhat cramped, qwerty keyboard.

The basic Eee PC, which went on sale last October, was built round a Linux operating system, with solid-state flash memory storage rather than a spinning hard drive, Wi-Fi networking and a bright 7in screen - all weighing less than 2lb (0.9kg). (Asus has now expanded the range to include a Windows XP machine and others with greater storage.)

But what differentiates the Eee PC from tablet-size "ultra-mobile" PCs such as Samsung's U1 and from mobile internet devices such as OQO's Model O1 are its traditional clamshell design, touch-pad pointing device and near-full-size qwerty keyboard. With Asus expecting to sell up to 5m Eee PCs this year, other PC-makers have begun to take notice.

I have been testing one of the first direct rivals, Hewlett-Packard's HP 2133 Mini-Note PC, launched in the US and elsewhere 10 days ago. Like the Eee PC, the Mini-Note ( www.hp.com ) offers a choice of operating system including Linux and Windows Vista Business. It costs from £388 in the UK for the basic version with 512MB of Ram, a 1GHz Via processor and SuSE Linux.

My $599 test model came with Windows Vista Home Basic, a 1.2GHz Via C7-M microprocessor, 1GB of Ram and an 8.9in widescreen display. Unlike the Eee PC, its 120GB hard drive is standard.

The Mini-Note is designed to appeal equally to students and cost-conscious business travellers tired of carrying a heavy laptop. It measures 10.04in wide by 6.5in deep by 1.05in thick (255 x 165 x 27mm) - smaller than most hardback novels - and weighs about 2.8lb (1.27kg) with the standard three-cell battery. An optional six-cell battery doubles the battery life to about four hours and elevates the back of the PC to create a convenient angle for typing. This raises the weight to about 3.3lb.

The Mini-Note's size is determined by the cleverly designed qwerty keyboard with big keys, and the LCD (liquid crystal display) panel bordered by stereo speakers and a 1.3 megapixel webcam. With an elegant and durable brushed aluminium case, the Mini-Note has the look and feel of a machine several times its price. This is a device almost anyone - male or female - would feel comfortable carrying in one hand.

Its sturdiness is more than skin-deep. HP has built the Mini-Note around a magnesium alloy support structure and included the company's HP 3D DriveGuard to help protect the hard drive and its data.

Other features include a mini-touch pad with "scroll zone" for navigation, although the mouse buttons on either side of the pad are awkward. The Mini-Note has all the standard ports and connectivity options including integrated Wi-Fi, optional Bluetooth and the option to add a broadband wireless data card in an express card slot.

The low-power Via processor and integrated Via graphics sub-system do a good job driving the system, but generate a lot of heat that can make the Mini-Note uncomfortable to hold on a lap for long periods.

Most users should find the 120GB hard drive adequate, although HP offers a 160GB option as well as a 4GB flash memory for Linux-based systems and a 64GB SSD (solid state drive) version for those who need faster, more reliable storage.

Overall, despite a few niggles, I am impressed by the HP Mini-Note. It provides value for money and is a worthy competitor for Asus.

I have also been looking at the HTC Shift. At 800g, it is smaller and lighter than the Eee PC and the Mini-Note, and may be ideally suited to mobile internet access.

The model I have been testing is powered by an 800MHz Intel processor. It has 1GB of Ram and a 40GB hard drive, comes with a 7in 800 by 480 pixel touch-sensitive display and is running Windows Vista Business. It also has a built-in 3G wireless data card, WiFi networking and Bluetooth connectivity.

Stand-out features include the clever way its screen slides up and tilts to reveal a mini-qwerty keyboard. Its SnapVue technology provides quick, easy access to e-mail and SMS text messaging without the need to fire up Windows.

I am not sure I would feel comfortable leaving my laptop at home and taking the Shift on a long business trip, but I found it great for my daily commute. The Mini-Note is a far more rounded machine that should please most users, including students and business people.

When a full-size laptop is just too heavy

Q. I want an ultra-mobile device that I can use while travelling. What are my options?

You could consider a qwerty-based smartphone, provided you do not intend to do too much typing. Alternatively, take a look at a larger device such as the HTC Shift that comes with a reasonably large touch screen, a qwerty keyboard suitable for two-finger "hunt-and-peck" typing and Windows Vista.

Q. How about a sub-notebook device such as the Asus Eee PC or HP 2133 Mini-Note?

Both provide a good alternative to lugging around a full-size laptop. They are a fraction of the price and ideally suited for running basic office productivity software and web applications. The Mini-Note is particularly attractive, especially when paired with a plug-in wireless broadband card.

Q. What about a lightweight laptop with a full-size screen?

If you plan on doing heavy-duty office work, gaming or running processor-intensive multimedia applications, ultra-light laptops such as the ThinkPad X300 are probably the best bet. But be prepared for a relatively hefty price tag.

Paul Taylor (paul.taylor@ft.com) tackles your high-tech problems and queries at www.ft.com/gadgetguru
Copyright The Financial Times Limited 2008

Thursday, April 17, 2008

ERP/SCM: FT.com / Companies / IT - US fears weigh on Sage shares


US fears weigh on Sage shares
By Tim Bradshaw

Published: April 15 2008 03:00 | Last updated: April 15 2008 03:00

Sage shares fell yesterday despite the software group stating that first-half results would be in line with market expectations.

Analysts on average expect sales of £617m for the six months to March 31, with earnings before interest, tax and amortisation of £145m.

Sage shares fell 6.1p to 192.4p, partly because of concerns about the prospects for a recovery in its US healthcare business and fears that a slowing US economy will hold back IT spending.

But analysts at Merrill Lynch said the "in-line" statement was encouraging given weaker results from Intuit, Sage's US rival, in February. About 70 per cent of Sage's revenues are classified as recurring, which analysts said should provide some insulation from broader economic fluctuations.
Copyright The Financial Times Limited 2008

Monday, April 14, 2008

KMWorld.com: What’s the New Face of Knowledge Management?


Here’s a shocker: There was a time when knowledge management wasn’t very well accepted. The early proponents—self-described "global, big-picture" thinkers—made a critical strategic error. By overloading the significance of KM with visions of utopian "transparent organizations" and "corporate agility," they gave the reigning executives of the day the perfect exit route. Had they simply asked for technology support for certain broken business processes (as many did, but not all), they probably would have gained a fair share of executive buy-in. But instead they insisted on weighing down the conversation with talk of "the sharing organization." To which the typical executive simply replied: "We already have sharing technology. We have networks, and file shares. We have email. We have meetings. Why should I spend more money to do something we are already doing?"

FT.com / Services & tools / Search


The future of search: It's how, not where, you look
By Alan Cane, FT.com site
Published: Mar 28, 2007


The time staff waste searching for "stuff" - the information necessary to do their jobs more effectively - has become legendary. Accenture, the consultancy, polled more than 1,000 executives in the US and UK and found that managers were on average spending up to two hours - a quarter of their working day - searching for stuff.

When they found it, moreover, at least 50 per cent was useless: irrelevant, out-of-date or just wrong.

Concerned that its intranet was becoming overburdened, BAE Systems, the aerospace group, carried out its own survey and discovered that four out of five employees on the network were wasting an average of 30 minutes a day retrieving information while 60 per cent were spending an hour or more duplicating the work of others.

The solution was a system from Autonomy, a UK company which, with 16,000 customers worldwide, leads the market for what is known as "enterprise search", a family of technologies that make it possible to extract information quickly from both structured and unstructured sources. With the Autonomy system in place, BAE estimates that time spent finding information is down by more than 90 per cent.

Another example: lawyers with the US firm Morrison & Foerster found they were drowning in information scattered through their systems: client histories were stored in accounting and customer relationship management systems, documents were stored in a document management system, communications in e-mail servers and so on.

The firm drew up a specification for an ideal solution, which it called AnswerBase, and commissioned a system from Recommind, a legal search vendor. Searches which had previously taken hours could be completed in seconds using AnswerBase; those taking days were reduced to minutes.

As Craig Carpenter, Recommind's head of marketing and business development, puts it, the days when enterprise search was a non-essential novelty are past; now the future lies with search technologies which will home in on concepts rather than keywords.

Enterprise search is a comparatively recent phenomenon, forced on companies by the internet, e-mail, company intranets and the 20bn gigabytes of new data now being created by businesses each year.

Google currently leads the world in conventional internet search but as Mike Lynch, Autonomy chief executive, emphasises, enterprise search is different: "Unlike the internet, enterprise information is in different formats. A large company might support 300 different information formats scattered through 5,000 separate repositories.

"An enterprise search engine has to be able to understand all those formats and talk to all those repositories. And most staff are not allowed to see all the information a company has stored away. In a large group, for example, an individual might be allowed to see only one in every 10,000 documents. Each repository has its own set of complex rules governing who is allowed to see what and it is changing all the time."

So Autonomy uses "spiders" and "ants" - intelligent software - to roam the intranet, indexing all the material available for a search: in that sense, even unstructured data has a structure of sorts. Ants are self-learning and capable of appreciating that particular pieces of information are frequently requested or that some categories of information change rapidly. Mr Lynch says attempts to create search tools without overall indexing - known as "federated search" - are unworkable: "They glow red hot and melt."
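The indexing Mr Lynch describes, a single overall index whose results are filtered by per-document access rules, can be illustrated with a toy inverted index. The documents, groups and permissions below are invented for illustration and bear no relation to Autonomy's actual implementation.

```python
# Toy inverted index over an "intranet": maps each term to the documents
# containing it, then filters results by per-document access rules.
# All documents, users and permissions here are invented for illustration.
from collections import defaultdict

documents = {
    "doc1": ("quarterly revenue report", {"finance"}),
    "doc2": ("engine maintenance manual", {"engineering", "finance"}),
    "doc3": ("revenue forecast model", {"finance"}),
}

index = defaultdict(set)
for doc_id, (text, _groups) in documents.items():
    for term in text.split():
        index[term].add(doc_id)

def search(term, user_groups):
    """Return matching documents the user is allowed to see."""
    hits = index.get(term, set())
    return sorted(d for d in hits if documents[d][1] & user_groups)

print(search("revenue", {"finance"}))      # finance sees both revenue documents
print(search("revenue", {"engineering"}))  # engineering sees neither
```

The filtering step is the point Mr Lynch stresses: the index itself may cover everything, but each query result must respect who is allowed to see what.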

Tamara Alairys, global leader for search at Accenture, points out that using Google to search a word like "Turkey" will return thousands of hits but it will not distinguish between the country and the bird: "The challenge for people searching their intranets has been to get better search relevancy and to retrieve data that can help them make a better decision."

She argues that search technologies have improved "by leaps and bounds" in the past two years: "Early capabilities were limited: a user could only perform basic keyword searches and sort the results using parameters such as the date of creation. Much more is possible today. Structured and unstructured data can be searched. And natural language processing enables the search engine to understand the intent behind a user's query and give a meaningful response."

The cost of failing to retrieve relevant data can be high. Zia Zaman, in charge of strategic market development for Fast, a search company based in Oslo, Norway, recalls a pharmaceuticals company that entered into a strategic relationship with a drug delivery group: "The two companies invested years and millions of dollars in trying to figure out how they could work together but in the end they had to pull the plug on the deal. Then the pharmaceuticals company found a document in its own files which detailed how the drug delivery mechanism could never work. They had been making decisions in a fog."

Changes in the legal environment in the US are driving interest in enterprise search. The latest revision of the Federal Rules of Civil Procedure, the code for civil legal action, published last December, gives companies involved in a lawsuit 99 days to produce relevant information stored electronically, compared with three years or so previously.

Mike Lynch comments: "This will be impossible for a big drug company or manufacturer unless they already have a system in place. Companies which have some experience of lawsuits have already realised how important this is. Others are just waking up to it. Later this year I would expect to see the first prosecutions resulting from a failure to comply with the requirements." It is expected that other countries will follow the US lead in principle.

Over the past 18 to 24 months there have been significant improvements in search technology, and the number of vendors of enterprise search systems has grown. IBM, Microsoft and Google have offerings aimed at business. The top end of the market is dominated by Autonomy, Convera, Fast and Open Text while specialist players include Endeca, InQuira, Siderean Software and Vivisimo.

The result, as Jerome Pesenti of Vivisimo writes, is that the search market is fragmented and confusing, but that should not stop companies experimenting: customers don't know what to ask, what features are needed and which vendors to look at, he notes, going on to argue that search should be seen as a long-term application, deployed quickly and improved in phases based on end-user feedback: "There is no limit as to how good and useful a search can be," he claims, "but modest goals, early rewards and, especially, valuable user feedback can be obtained through quick deployment."

The BBC has difficult information retrieval needs. It is awash with information: core business systems as well as financial information about programmes, approvals processes, e-mails, audio and video.

Keith Little, BBC chief information officer, says: "We have lots of information that is unsearchable - valuable information that nobody can access. We have systems with search facilities but these are silos and then there are e-mails and other repositories of unstructured data that go right across the organisation."

The BBC uses several search tools - Autonomy, Microsoft Sharepoint and OpenText Livelink among them. Mr Little says: "At the top level, our search strategy is to create a framework for plugging in, in a service-oriented manner, legacy and future systems.

"We need the ability to put those together to meet the search requirements from the business and we then have to think about how we provide access to our real audience - the people who pay our licence fees." "Infax", a simple programme search tool, was made available to the public last year.

What lies ahead for enterprise search? Ms Alairys of Accenture sees four developments. First, advanced analytics and monitoring which will make it possible to tap information in real time and provide rapid responses. Second, sentiment analysis which uses textual analysis to gauge the tone of a document - whether results show a company in a positive or negative light, for example. Third, multimedia search across textual, video and audio sources. And fourth, guided information discovery - exploring information without a specific query.
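The second of those developments, sentiment analysis, can be sketched with a toy lexicon-based scorer that counts positive and negative cue words to gauge a document's tone. The word lists below are invented for illustration and are far cruder than commercial textual analysis.

```python
# Minimal lexicon-based sentiment scoring: count positive and negative
# cue words to gauge whether a text shows a company in a positive or
# negative light. The word lists are illustrative only.
POSITIVE = {"growth", "profit", "strong", "improved"}
NEGATIVE = {"loss", "decline", "weak", "lawsuit"}

def sentiment(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("Strong growth and improved profit"))  # positive
print(sentiment("Weak quarter marked by decline"))     # negative
```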

So in future, even if you don't know what you want or where to find it, enterprise search will guide you to the right answer.

Friday, April 11, 2008

FT.com / In depth / Yahoo merger bid

Web giants take sides in battle for Yahoo
Yahoo ratcheted up its efforts to improve its negotiating position in the face of an unsolicited takeover bid from Microsoft amid signs that it was edging towards a three-way alliance with Google and AOL - Apr 10 2008

Yahoo digs in for final battle
Yahoo has positioned itself for the endgame in its battle with Microsoft by issuing its strongest rejection to date of its rival’s $42bn takeover offer - Apr 7 2008

Yahoo seeks boost from ad sales system
New platform aims to simplify significantly the process of buying and selling adverts online and improve Yahoo’s value in the face of Microsoft’s renewed takeover efforts - Apr 7 2008

Yahoo board huddles over Microsoft bid
Directors were in discussions about Microsoft’s latest gambit in its unsolicited takeover approach, amid signs that Yahoo was preparing to dig in deeper against its suitor’s current bid - Apr 6 2008

Advertisers welcome prospect of Google rival
Google executives are quietly rehearsing their arguments against Microsoft’s approach to Yahoo in internal discussions which could indicate the search group’s lobbying strategy once a bid comes before regulators - Mar 16 2008

Thursday, April 10, 2008

PC World - Business Center: Gartner: Enterprise Search Pervasive by 2012

Thursday, April 10, 2008 2:20 PM PDT
Enterprise search is set to become a pervasive and demanding force in IT over the next several years, according to Gartner analyst Whit Andrews.

"Information access technology will locate and analyze more than 90 percent of data in more than 50 percent of Global 2000 enterprises" by the end of 2012, according to materials from a presentation Andrews gave at Gartner's Symposium ITxpo conference in Las Vegas this week. Gartner refers to enterprise search by the more general phrase "information access technology."

Some observers saw Microsoft's recent move to acquire enterprise search vendor FAST Search & Transfer as a validation of the market. Microsoft will compete with a range of large vendors, such as Autonomy and FAST, along with a series of smaller companies, including Recommind and X1 Technologies.

"All the infrastructure vendors need to respond in some way to the need for effective search technology in their products," Andrews said in an interview on Thursday.

But it's unclear whether the market will see a rush of major consolidation, given the high cost of buying a top independent player, he said. (Autonomy has a market capitalization of US$4 billion, according to its Web site.) "At this point, what I've seen is that this is well short of a gold rush. ... It's not easy to see where this is going to go."

As for smaller players, Andrews said he is "fairly confident they're all looking to be acquired at this point. I think if they don't get bought, then they have to go highly specific."

Whatever path companies choose regarding enterprise search, they will face major challenges, chiefly the high expectations of users, Andrews noted in his presentation: "End-users of information access technology do not recognize, respect and treat as reasonable the divisions that application architecture have forced on information access strategy."

Information itself will need to be organized and augmented to a greater degree, he states. "Critical issues include flexibility of indexing, incorporation of security down to the document level and, in rare cases, the subdocument level, and the flexibility to access APIs in business applications."

The general methodology for collecting search results will change as well, Andrews predicts.


"The classic model for information access technology is the spider, which travels around the threads of a document Web and returns with a picture of its structure ... However, this model demands that the data be somewhat stale, and it is not acceptable for transaction-sensitive business applications or the databases they feed and by which they are fed," he writes.

An emerging model, in Andrews' words, can be described as an ant: "Rather than traverse a document set and store a pattern of what it finds, the ant travels on well-known pathways to discover the 'freshest' morsel of data and return it to the colony where it can be merged with other such morsels for the good of the whole."
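The contrast Andrews draws can be sketched in a few lines: rather than re-crawling an entire document set, each "ant" pass follows known pathways to the most overdue source and merges the freshest result into a shared view. The sources and fetch stub below are invented for illustration.

```python
# Sketch of the "ant" pattern Andrews describes: each pass revisits
# well-known pathways, refreshes only the stalest source, and merges
# the result into a shared view (the "colony"). Sources and the fetch
# function are stand-ins invented for illustration.
sources = {"crm": 0.0, "email": 0.0, "docs": 0.0}  # source -> last refresh time
colony = {}                                         # merged, shared view

def fetch(source):
    return f"latest data from {source}"  # stand-in for a real connector

def ant_pass(now):
    stalest = min(sources, key=sources.get)  # follow the most overdue pathway
    colony[stalest] = fetch(stalest)
    sources[stalest] = now
    return stalest

for t in (1.0, 2.0, 3.0):
    ant_pass(t)

print(colony)  # every source refreshed, most-overdue first
```

A spider, by contrast, would rebuild its whole picture in one batch pass, which is why the data it returns is "somewhat stale".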

Andrews also predicted that:

Through the end of 2012, no Global 2000 enterprise will have standardized "absolutely" on a particular information access platform.

Business intelligence will work in concert with enterprise search at 90 percent of Global 2000 companies in 2012. "Absolute convergence between business intelligence and search applications will, typically, not occur, but business intelligence and search will work in tandem," he wrote. "However, a few exceptional vendors will emerge to provide a combined business intelligence and information access platform."

FT.com / In depth - Web giants take sides in battle for Yahoo

Web giants take sides in battle for Yahoo
By Richard Waters and Chris Nuttall in San Francisco and Joshua Chaffin in New York

Published: April 9 2008 21:23 | Last updated: April 10 2008 05:43

Yahoo ratcheted up its efforts on Wednesday to improve its negotiating position in the face of an unsolicited takeover bid from Microsoft, amid signs that it was edging towards a three-way alliance with Google and AOL that might protect its independence or at least force Microsoft to pay more.

There were also reports that News Corp was in talks with Microsoft about joining in that company’s bid for Yahoo. The talks involve News Corp combining MySpace, its social networking service, with the Microsoft and Yahoo internet businesses, according to the reports. Both News Corp and Microsoft refused to comment.

The flurry of activity came as Yahoo and Microsoft positioned themselves for the end-game in the takeover battle, which began at the end of January. Microsoft tried to turn up the heat on Yahoo over the weekend by threatening to take its cash-and-stock offer, currently worth $42bn, directly to the embattled company’s shareholders and hinting that it might even cut the value of its offer. Yahoo countered by repeating that the offer price was too low and that while it did not rule out a deal with Microsoft at a higher price, it was pursuing alternatives.

Yahoo gave the first public sign of one possible alternative on Wednesday when it announced the test of a potential advertising alliance with Google. The two-week experiment, due to start next week, will involve Google supplying relevant adverts alongside a small sample of Yahoo search results.

The test suggested that the two sides were once again discussing an alliance that would involve Yahoo closing down its own search advertising system and outsourcing the work to Google. The idea was discussed last year and again after Microsoft made its unsolicited bid, but Google had appeared to cool on the idea amid concerns that it would be blocked by anti-trust regulators.

The idea of Yahoo abandoning its own search advertising system and adopting Google’s has long been promoted by several Wall Street analysts. They see it as a way for Yahoo to cut costs and boost revenues, with Google yielding 30 to 40 per cent more revenue per search than Yahoo.

“What they’re doing now is testing revenue assumptions about what they could expect” from a search advertising alliance, said one person who is familiar with the situation.

Microsoft was quick to raise the anti-trust flag on Wednesday. “Any definitive agreement between Yahoo and Google would consolidate over 90 per cent of the search advertising market in Google’s hands; this would make the market far less competitive,” said Brad Smith, Microsoft general counsel.

Some analysts were also sceptical that the relationship could expand beyond a trial.

“We do not think a broader or longer-term Yahoo/Google search partnership would pass regulatory muster,” said Scott Kessler, Standard & Poor’s internet services analyst, in a note. Even some people close to the situation warned that the chances were small that the advertising test would eventually lead to a full-blown partnership.

Meanwhile, talks have been continuing over a separate deal involving Yahoo and AOL, according to people familiar with the situation. Accounts differed on Wednesday over how close the two sides were to an agreement. The two have for several weeks been discussing a deal that would involve Time Warner injecting its AOL division into Yahoo in return for a stake in the company.

One person close to the situation described the talks as “fluid” and said the two sides were still some way from any deal, though another person said that considerable headway had been made and an agreement could come as early as next week.

A deal with AOL alone would not create enough value for Yahoo shareholders to justify turning down the big takeover premium offered by Microsoft, according to one Yahoo investor. It has been seen instead as part of a three-way transaction also involving Google, since outsourcing search advertising would have a far bigger and more immediate impact on Yahoo’s earnings.
Copyright The Financial Times Limited 2008

Wednesday, April 09, 2008

Gartner Identifies Seven Grand Challenges Facing IT - Government Technology

Gartner Identifies Seven Grand Challenges Facing IT
Apr 9, 2008, News Report


Many of the emerging technologies that will enter the market by 2033 are already known in some form in 2008, according to Gartner. Many of the innovations that will unfold over the next 25 years can be found today in research papers, patents or production prototypes.

These long-term innovations, expected to arrive in five to 20 years, go beyond the range of the typical IT project portfolio planning cycle and are classified as "IT Grand Challenges." Gartner defines an IT Grand Challenge as a fundamental issue to be overcome within the field of IT whose resolution will have broad and extremely beneficial economic, scientific or societal effects on all aspects of our lives.

Analysts said IT leaders must be more active in researching and identifying emerging technologies that will bring about benefits not realized today.

"IT leaders should always be looking ahead for the emerging technologies that will have a dramatic impact on their business, and information on many of these future innovations are already in some public domain," said Ken McGee, vice president and Gartner Fellow. "Today, CIOs should identify which Gartner IT Grand Challenges will be most meaningful for their enterprise. Then within the next 12 months, review patents for additional IT Grand Challenge candidates. Apply logical conclusions to Gartner emerging technologies, business and societal trends research to identify IT Grand Challenges. Lastly, identify preferred sites to monitor developing academic, government or corporate research on chosen Grand Challenges. There are technologies on the horizon that will completely transform your business."

Gartner has identified seven IT Grand Challenges. They include:
Never having to manually recharge devices: Today, the ubiquity of portable computing and communications devices powered by batteries means that many people would find it highly desirable either to have their batteries charged remotely or to have their devices powered by a remote source, bypassing batteries altogether. Despite more than 100 years of research since the invention of the Tesla coil in the late nineteenth century, the most notable progress to date was achieved by the Massachusetts Institute of Technology (MIT) in July 2007 in its experiment in nonradiative power transfer. By that measure, any commercial application of wireless powering still seems a long way off.

Parallel Programming: Rather than simply creating faster single-core processors to perform tasks serially, another way to meet the constant demand for faster processing is to develop multiple, slower processors that perform tasks in parallel. Simulations, modeling, entertainment and massive data mining would all benefit from advances in parallel computing. However, a challenge with parallel computing is to create applications that fully exploit a "multi-core" architecture by dividing a problem into smaller individual problems addressed by individual processors. To overcome this, key issues will need to be addressed, including effectively breaking up processes into specific sub-processes, determining which tasks can be handled simultaneously by multiple processors, scheduling tasks to be processed simultaneously and designing the architecture of the parallel processing environment.
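The decomposition described above can be sketched with Python's standard process pool: split the data into chunks, hand each chunk to a worker, and combine the partial results. The chunking scheme and the sum-of-squares task are simple illustrative choices, not a prescription.

```python
# Sketch of the multi-core decomposition the challenge describes:
# divide a problem into sub-problems, process them in parallel, combine.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    size = max(1, len(data) // workers)          # break the problem up
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))  # schedule and recombine

if __name__ == "__main__":
    # Same answer as the serial version, computed across several cores.
    print(parallel_sum_of_squares(list(range(1000))))
```

The hard part in practice is exactly what the challenge names: finding decompositions where the sub-problems really are independent, so workers are not left waiting on each other.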

Non-Tactile, Natural Computing Interface: The idea of interacting with computers without any mechanical interface has long been a desirable goal in computing. Some of the many challenges that remain in this area include detecting gestures, developing a gesture dictionary and delivering real-time processing. Another set of challenges relates to natural language processing, including speech synthesis, speech recognition, natural language understanding, natural language generation and machine translation from one natural language into another.

Automated Speech Translation: Once the many hurdles of natural language processing are overcome to yield human-to-computer communication in one language, the complexity extends further when the output must be translated into a target language understandable to a human. Some rudimentary systems have already been created to accomplish basic speech translation, such as one-way and two-way translations.

Persistent and Reliable Long-Term Storage: Current technologies are hard-pressed to preserve perfectly what Dr Francine Berman in 2006 estimated at 161 exabytes (an exabyte is 10^18 bytes) of digital information on digital media for more than 20 years. The barriers to long-term archiving (in excess of 100 years) that must be overcome include format, hardware, software, metadata and information retrieval, to mention just a few.

Increase Programmer Productivity 100-fold: As business and society's demand for software development increases, and the apparent decline in students pursuing software engineering and computer science degrees intensifies, future demand will have to be met by increasing the output, or productivity, per programmer. While the exploration and development of tools to enhance productivity continues to capture attention, effectively and efficiently exploiting reusable code appears to be one of the most encouraging rays of hope for more output per programmer. But many challenges exist there as well, among them minimizing the time required to find the right software module and avoiding the need to modify reusable software.

Identifying the Financial Consequences of IT Investing: One of the most perplexing challenges faced by IT leaders has been to convey the business value of IT in terms readily understandable by business executives. As a discipline that conveys the business performance and results to internal executives and personnel only, management accounting could offer business advice and recommendations that would quantify the consequences of a particular IT deployment. Unlike financial accounting measurements which are standard across public companies, the particular management accounting metrics could be different for each company. This Grand Challenge would be considered conquered when a request for an IT project was argued with the following certainty: "If you invest in our IT proposal, you will see an additional $0.03 earnings per share directly attributable to this project by the third quarter of next year."
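The closing claim is, mechanically, simple arithmetic: divide a project's incremental net income by shares outstanding to get the earnings-per-share impact. All figures below are hypothetical and chosen only to reproduce the $0.03 example.

```python
# Worked version of the article's closing claim: translating a project's
# projected earnings into an earnings-per-share figure.
# All figures here are hypothetical.
def eps_impact(incremental_net_income, shares_outstanding):
    """EPS contribution of a project = extra net income / shares outstanding."""
    return incremental_net_income / shares_outstanding

# A project expected to add $15m of net income at a company with
# 500m shares outstanding contributes $0.03 per share.
print(round(eps_impact(15_000_000, 500_000_000), 2))
```

The Grand Challenge, of course, is not the division but producing the numerator with any certainty: attributing a specific slice of future earnings to one IT project.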

Tuesday, April 08, 2008

Microsoft Office SharePoint Server (MOSS)

Stony Brook University’s Division of Information Technology recently issued its second edition of DoIT News. It is a newsletter geared towards faculty and staff that is currently being published once a semester. The big news item this spring was the announcement of Microsoft Office SharePoint Server (MOSS) being available on campus. For those who did not see the newsletter, here is the article along with related links. If you are a member of the Stony Brook University community, give SharePoint a try and let us know what you think!

Microsoft Office SharePoint Server Now Available

Improved team productivity, tools for collaboration and a means to keep people connected regardless of geographic location are three big reasons to try out Microsoft Office SharePoint Server.

Collaboration is a whole lot easier at Stony Brook University. Microsoft Office SharePoint Server technology provides freedom to create and manage Internet sites that allow for document sharing, team collaboration, Web-site authoring, content management and social networking.

To get started using SharePoint all you need is a Web browser (preferably Microsoft’s Internet Explorer) and your Stony Brook NetID.

Research teams, committees and student organizations are among the groups most likely to benefit from SharePoint. It works seamlessly with Microsoft Office products such as Word, Excel and Access.

My Site is a personal Web portal that allows individuals to create a profile and a personal Web site that can be shared with colleagues.

There is a calendar tool, a place to post announcements, task lists and relevant links. Templates are available for creating blogs and wikis.

SharePoint is a great tool for managing documents. Libraries can be created and multiple authoring is supported. You can be alerted if someone in your team modifies a shared document. A check-in and check-out system can be enabled to ensure that just one person is working on a document at any given time to prevent overwriting. Different versions of a document can be saved and restored.
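The check-out/check-in and versioning behaviour described above follows a general pattern that can be sketched in a few lines. This is an illustration of the concept only, not SharePoint's actual API; the class and method names are invented.

```python
# Sketch of the general check-out/check-in pattern: one editor at a
# time (to prevent overwriting), with every check-in kept as a
# restorable version. Illustrative only; not SharePoint's actual API.
class Document:
    def __init__(self, content=""):
        self.versions = [content]      # every saved version, in order
        self.checked_out_by = None     # who holds the edit lock, if anyone

    def check_out(self, user):
        if self.checked_out_by is not None:
            raise RuntimeError(f"locked by {self.checked_out_by}")
        self.checked_out_by = user

    def check_in(self, user, new_content):
        if self.checked_out_by != user:
            raise RuntimeError("check the document out first")
        self.versions.append(new_content)  # keep the new version
        self.checked_out_by = None         # release the lock

    def restore(self, version_index):
        # Restoring an old version saves it again as the newest one.
        self.versions.append(self.versions[version_index])

doc = Document("draft v1")
doc.check_out("alice")
doc.check_in("alice", "draft v2")
print(doc.versions)  # ['draft v1', 'draft v2']
```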

As a Web site content management system, SharePoint helps you separate the site design from the content authoring. Permissions can be set, giving you control over who has access to your site, a particular page or even a page element. In addition, a feature called audiencing provides a means for filtering specific parts of a page to a targeted audience.

Build your own My Site right now by logging into https://mysite.stonybrook.edu and entering your NetID and NetID password. Your NetID will need to be entered in the following format: sunysb.edu\"Your NetID".

Additional SharePoint sites can be requested by filling out the form provided at https://web.stonybrook.edu/sharepoint.

Related Links:
http://www.stonybrook.edu/it/sharepoint1.shtml
http://office.microsoft.com/en-us/sharepointserver/HA101087481033.aspx

Thursday, April 03, 2008

FT.com / Companies / IT - IBM ventures into 3D virtual world

IBM ventures into 3D virtual world
By Chris Nuttall in San Francisco

Published: April 3 2008 01:06 | Last updated: April 3 2008 01:06

IBM and Linden Lab, the creator of Second Life, are to develop an enterprise-class virtual world to convince companies of the value of communicating through avatars and 3D environments.

The corporate world has been put off virtual worlds such as Second Life by a lack of security, control and stability. There have been sporadic incidents of residents staging terrorist-style attacks on in-world retailers and realtors with virtual bombs and flying genitalia.

The joint project between Linden Lab and IBM will put parts of Second Life’s “Grid” platform behind IBM’s company firewall for greater security and will host it on IBM Bladecenter servers for increased operational scale and stability.

This “first” for Second Life is aimed at creating a packaged product that businesses can deploy quickly. IBM and Linden’s efforts are an attempt to capitalise on growing corporate demand for more bespoke virtual worlds and put them in competition with developers including 500 Mirrors, Qwaq and Multiverse.

“The internet, not just Second Life, is the Wild West once you get beyond the corporate firewall,” says Neil Katz, chief technology officer of IBM’s Digital Convergence business.

“We are responding to what our customers have been asking for: they see the value in Second Life but are very concerned about security and want [internal] conversations to stay behind the firewall.”

Ginsu Yoon, head of business affairs at Linden Lab, says: “This is going to be a great test of the Second Life infrastructure in terms of its commercial deployment.”

IBM has staged meetings and demonstrations in Second Life for some time and Sam Palmisano, chief executive, has his own avatar. Last September, Italian IBM workers staged a virtual strike in Second Life over a pay cut.
Copyright The Financial Times Limited 2008

Wednesday, April 02, 2008

Enterprise tool halves Eversheds search time | 2 Apr 2008 | ComputerWeekly.com

International law firm Eversheds is rolling out an enterprise search tool across its offices in the UK, Europe and Asia to around 4,000 users to save between 50% and 75% of time spent looking for information across internal and external sources.

FT.com / Technology - Translation technology: Language subtleties make full automation a myth

Translation technology: Language subtleties make full automation a myth
By Alan Cane

Published: April 2 2008 02:23 | Last updated: April 2 2008 02:23

Machine translation has come a long way since the heady, early days of the digital computer, when it seemed only a matter of time before nation would speak unto nation through the intermediary of binary digits.

That this is taking longer to come about than originally hoped is a consequence of the complexity of natural language and the limited power of the machinery.

Even today, the most powerful supercomputers cannot guarantee a perfect translation of technical documents – and nobody is foolhardy enough to trust the intricacies of a novel or a poem to a machine.

“The myth of fully automated translation is just that – a myth,” says Mark Lancaster, chief executive of the UK company SDL, a leader in commercial technical translation. “Languages are just too complex for us to be able to automate the whole process.”

“Automated translation works well enough if you simply want to get an understanding of a document,” says Eric Blassin, head of technical development for Lionbridge, a US company that claims to be the world’s largest commercial translation services group.

This is described as a “gisting” system. It will give you the gist of a document, although at the risk of significant errors or loss of sense.

Automated translation can, however, save a translator time – he or she acts as a reviewer, correcting errors and mistranslations rather than working on the whole text from scratch.

There is a wide range of systems of varying degrees of power and flexibility available today to tackle technical translation.

At one end of the spectrum, there are gadgets such as a hand-held device from the US company Franklin, priced at a modest £150 ($297), that will translate between 12 languages using 450,000 words and 12,000 phrases packed into its memory.

At the other end are the substantial enterprise systems from SDL or Lionbridge, capable of processing tens of millions of words every month for international customers. These two companies, leaders in their respective fields, present an interesting comparison in terms of business models.

Then there are mid-range systems from companies such as Systran – one of the pioneers of automated translation – that are suited to small and medium sized organisations. Systran technology powers online translation services such as Google Translate and Alta Vista’s Babelfish.

Gisting systems have to be treated with caution, however, as an unfortunate party of Israeli journalists found out last year. Invited to The Netherlands to interview the Dutch foreign minister, they were asked to submit their questions in advance in English.

The task, however, was left to the only non-English speaker among them, who used the online translation site http://www.babylon.com uncritically to create the text. The resulting nonsense came close to sparking an international incident and the visit was cancelled.

Why are translation services important? Companies hoping to succeed in global markets have, necessarily, to localise their products – that is, adapt them to local conditions and customs.

They can also contribute to matters of life and death. IBM, a leading researcher into translation systems, last year provided the coalition forces in Iraq with automatic translation devices and special software to recognise and translate more than 50,000 English and 100,000 Iraqi Arabic words.

The aim was to use the devices in hospitals and in training. The technology, known as Mastor, allows users to converse naturally, producing audible and text translations of spoken words.

Mark Lancaster of SDL began his career as a software engineer working for the US companies Lotus Development Corporation and Ashton-Tate, now part of the Borland group.

“We were always having difficulty in getting our products into the global market quickly and efficiently. It took months or even years for Japanese double-byte type projects.” Double byte refers to the number of bytes required to code for a Japanese, Chinese or Korean character – English is a single-byte language.
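The single-byte versus double-byte distinction Mr Lancaster mentions can be seen directly in Python; this is an illustrative aside using legacy and modern encodings, not part of SDL's tooling:

```python
# Legacy CJK encodings such as Shift-JIS use two bytes per character,
# while ASCII text needs only one -- the "double-byte" issue described
# above. (Modern UTF-8 uses three bytes for most CJK characters.)
ascii_text = "translate"
japanese_text = "日本語"  # "Japanese language"

print(len(ascii_text.encode("ascii")))         # 9 bytes for 9 characters
print(len(japanese_text.encode("shift_jis")))  # 6 bytes for 3 characters
print(len(japanese_text.encode("utf-8")))      # 9 bytes for 3 characters
```

Software written with the one-byte-per-character assumption baked in had to be substantially reworked before it could handle such text, which is why those localisation projects took months or years.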

Hardened by this experience, Mr Lancaster formed SDL in 1992 to provide consultancy to US businesses hoping to break into global markets. He swiftly saw a need for technology to help the translation process: “We created technology for translators. It was a sort of word processor: it helped with the translation in a word-processor-like environment.”

The company was providing both translation and localisation services. “We helped to project-manage the product. We would say to clients: ‘If you want to get this into 20 languages, you probably need to re-architect the product, so it supports accented characters, sort sequences and date formats’ – things specific to the markets they wanted to enter.”

Today, SDL is a quoted company providing translation services to customers using its in-house team of 700 translators working from 30 offices worldwide and up to 10,000 freelances.

Its principal activity, however, is licensing its translation management software to large corporate users such as Hewlett-Packard, Philips, Dell and Bosch.

There are two parts to the software: the translator’s workstation and the central language repository which grows continually as words and phrases are added: “The more they use it, the more they can leverage content that has previously been translated,” Mr Lancaster says.

He describes a typical translation process: “Someone writes, for example, a new product summary on a website. It is likely this will be created in a content management system which will alert our translation management system that there is new text on the website.

“It will take that piece of text and compare it with text in the multi-lingual repository. It will look for any text that has already been translated or is similar – something we call fuzzy matching – so that can be used again.

“It will never be completely automatic. For existing customers, we think that 50 per cent of new text can be matched to existing text, 10 to 20 per cent gives a partial match and the remaining 30 per cent has to be done from scratch.

“The trick is that we believe about 90 per cent of the professional translation community use our desktop technology. The joy is that when customers buy our enterprise technology, they know the translators, whoever they may be, can plug in seamlessly to the management system. This smooths the supply chain and saves everybody a lot of money.”
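The matching step described above can be sketched with a toy translation memory. Here difflib's similarity ratio stands in for whatever proprietary fuzzy-matching algorithm SDL actually uses, and the memory contents and thresholds are invented for illustration:

```python
import difflib

# Toy translation memory: previously translated English -> French segments.
# (Hypothetical data; SDL's real repository and matching are proprietary.)
memory = {
    "The printer is out of paper.": "L'imprimante n'a plus de papier.",
    "Press the power button to restart.":
        "Appuyez sur le bouton d'alimentation pour redémarrer.",
}

def match(segment, memory, full=0.95, partial=0.7):
    """Classify a new segment as reusable, a fuzzy match, or new text."""
    best_src, best_score = None, 0.0
    for src in memory:
        score = difflib.SequenceMatcher(None, segment, src).ratio()
        if score > best_score:
            best_src, best_score = src, score
    if best_score >= full:
        return "reuse", memory[best_src]   # existing translation reused as-is
    if best_score >= partial:
        return "fuzzy", memory[best_src]   # translator corrects this draft
    return "new", None                     # translated from scratch

print(match("The printer is out of paper.", memory))
print(match("The printer has run out of paper.", memory))
print(match("Delete the temporary files.", memory))
```

The three outcomes correspond to the exact, partial and from-scratch categories in Mr Lancaster's 50/20/30 breakdown.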

He quotes the example of the investment bank Morgan Stanley. SDL translates its research into several languages. Before the use of the technology it used to take up to five Morgan Stanley translators several months to convert a single document. Now the same document can be translated into four languages within 24 hours.

Lionbridge, now 12 years old and a spin-off from R.R. Donnelley, the large commercial printer, operates a different business model. It has developed central translation management software like SDL but for its own use: it provides only translation services.

Mr Blassin says: “We have developed a core platform [software system] that we use across all our vertical markets and all our customers. This enables us to recycle everything that has previously been translated.

“This platform is probably unique in that we were the first to introduce a pure, internet-based platform in this industry. Translators around the world are connected to the platform and translate online. They have only a small amount of software on their desktop machines and all the data stays in Lionbridge’s central system.

“This makes for consistency when many translators are working on the same project. The platform is partitioned and each customer has its own area.” Last year, 2,000 translators working online processed about 500m words using the Lionbridge software for companies such as Cisco, Du Pont and General Electric.

Lionbridge applies automation to the quality control process, checking for accuracy, consistency and the elimination of “false friends” – words that look similar in two languages but have entirely different meanings. A British stationery company, for example, tried to launch a non-leaking fountain pen in Spain with the promise that “it won’t leak in your pocket and embarrass you”, assuming that the Spanish word “embarazar” means to embarrass. In fact, it means to make pregnant.
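One piece of such a quality-control pass can be sketched as a simple lookup over the translated text. The false-friend list below is a tiny illustrative sample, not Lionbridge's actual data or tooling:

```python
# Flag known Spanish "false friends" appearing in a translated text.
# The word list is a small invented sample for illustration only.
FALSE_FRIENDS = {
    "embarazada": "means 'pregnant', not 'embarrassed'",
    "actual": "means 'current', not 'actual'",
    "éxito": "means 'success', not 'exit'",
}

def flag_false_friends(translated_text):
    """Return warnings for any known false friends found in the text."""
    cleaned = translated_text.lower().replace(",", " ").replace(".", " ")
    return [(w, FALSE_FRIENDS[w]) for w in cleaned.split()
            if w in FALSE_FRIENDS]

for word, note in flag_false_friends("Estoy embarazada por el error."):
    print(f"check '{word}': {note}")
```

A real system would work from much larger bilingual lists and use context to cut down on spurious warnings, but the principle is the same: flag the suspect word for a human reviewer rather than trust the raw translation.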

The aim of companies like SDL and Lionbridge is to automate as far as possible the entire life cycle of text production, providing aids for authoring, translation, intelligibility, review, compliance, desktop publishing and distribution.

Mark Lancaster says: “Companies’ websites are still mostly primitive and trivial. Updating them and company brochures and getting them out to customers can be expensive if you don’t have a system to manage it.”
Copyright The Financial Times Limited 2008

FT.com / Technology - What IT means to me: ‘I’m a fan of IT, but I’m still a bit cynical’

What IT means to me: ‘I’m a fan of IT, but I’m still a bit cynical’
By Stephen Pritchard

Published: April 2 2008 02:23 | Last updated: April 2 2008 02:23

Alan Middleton has the builders in. The London headquarters of PA Consulting, where Mr Middleton is chief executive, smells of fresh paint. Hoardings in the lobby and atrium show how the new, extended offices will look, with space for more staff and – vitally – more space for meetings, too.

Perhaps surprisingly for a CEO who has done much to bring his company into the digital world, meetings matter for Mr Middleton. On his watch, PA has become one of the first management consultancies to build a presence in Second Life, the online virtual world, providing experience that PA has drawn on to build virtual worlds for clients.

Mr Middleton previously served as head of IT for PA, overseeing significant advances in the company’s technology infrastructure and its ability to support remote and mobile working. He has backed investment in knowledge management, blogging, wikis and podcasts at PA. But he still puts much store on face-to-face meetings.

“I am not fearful of IT,” he explains. “I live in a 17th century house which is fully automated: the heating, light, sound system and even the garden. I can switch on the electric blanket from Hong Kong and toast my wife! In that sense I am a fan of IT, but I’m still a bit cynical.

“I get very frustrated by people’s dependence on e-mail, and everything else that reduces personal contact, but I’ve not been able to reduce it. We are a people business and we need to bring people together.”

Connecting people, he says, should really be why large companies invest in enterprise resource planning (ERP) and knowledge management systems.

“Our system captures who was in the team that worked on a project,” he says. “We publish that information internally, so I can run a simple search to find out who knows what. At that point we have a human bond, I ring that person and say ‘give me a hand’. The human link is simple but very powerful knowledge management.”
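The “who knows what” lookup Mr Middleton describes can be sketched in a few lines. The project records here are invented for illustration and bear no relation to PA's actual Mipac system:

```python
# Minimal "who worked on what" search over project records.
# The records below are invented for illustration only.
projects = [
    {"name": "Retail branch redesign",
     "topics": ["virtual worlds", "retail"],
     "team": ["A. Smith", "B. Jones"]},
    {"name": "Telecoms billing review",
     "topics": ["billing", "telecoms"],
     "team": ["C. Lee"]},
]

def who_knows(topic):
    """Return the people who worked on projects tagged with the topic."""
    people = set()
    for project in projects:
        if topic in project["topics"]:
            people.update(project["team"])
    return sorted(people)

print(who_knows("virtual worlds"))  # -> ['A. Smith', 'B. Jones']
```

The search itself is trivial; the value, as Mr Middleton notes, comes from the human contact it enables once a name is found.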

Such systems, Mr Middleton concedes, fall short of the sophistication often demanded by the knowledge management purists, with their multi-tiered systems and complex tables of metadata that require hours of consultants’ time to fill out. Yet they work. “If you come at it purely from an IT angle, these projects will fail,” he says.

According to Mr Middleton, PA’s internal business system Mipac (management information, planning and control) is driven by one objective: to connect people.

The first generation of Mipac, created in 1995, used Microsoft Exchange for messaging and accounting; and for HR, one of the first UK installations of PeopleSoft. On top of this came KnowledgeNet, the company’s knowledge management system, and the whole was linked by a hard-wired global network at “enormous cost”.

“That was five to eight years ahead of its time,” says Mr Middleton. “We still use the same business solution, but have evolved to what you would expect: lower cost, delivering the same functionality on new technologies and over IP networks and VPNs.”

PA was also an early adopter, and advocate, of mobile working. “We had some of the early [Apple] Macs and the first Mac laptops; I ruined several suits carrying those,” recalls Mr Middleton. “Then we moved to Toshiba laptops. Within a year, we had 2,500 consultants using them. At that time, it was a genuine business advantage and a differentiator.”

The challenge for chief executives, he suggests, is to keep up with technological change and not to become too satisfied with the status quo.

“Attitudes to technology are age-dependent,” he says. “If we have a partner joining us from another firm, he or she will say that our core stuff is fabulous and makes their life hugely easier. That is the traditional role of IT.

“But our younger people have a different view. I think that this is happening everywhere: the existing generation see new joiners as anarchists, while the younger generation sees existing business people as fuddy-duddies.

“People joining see technology as bringing the capability to interact with others, an enabler for their networking and zany ideas. This is a healthy tension that will drive change.”

One example was a presentation, eight years ago, of something that looked like today’s mobile e-mail devices.

“A partner from the PA Technology Centre held up a thing that we had built: a Palmpilot with a GSM module,” says Mr Middleton.

“He told us this was the future, and that in a few years we would all have a device with a camera on it, access to our diaries and corporate e-mail. It would be a phone and we would use it to surf the web – and that it would be about the size of a cigarette packet. Many said that the chap must be past his sell-by date. But he was right.”

One way companies can learn to spot such changes is to ensure their future managers spend time in IT.

Mr Middleton firmly believes that time spent running an IT project is just as important for executives as, say, a spell in finance.

All too often, large companies ask senior executives to “sponsor” IT projects, but those executives often lack the depth of knowledge, not to mention the time, to do so effectively.

At the same time, Mr Middleton has sympathy for the plight of the CIO, who is expected to innovate but also to deliver more with less.

“In recent times, by and large, CIOs have been squeezed really hard on cost. That is in a sense counter-intuitive, as revenues and profitability have been looking great in most organisations, yet CIOs were still being pushed to cut costs.

“Now, as the world wobbles, the danger for CIOs is that the very things that bring value to the bottom line, the innovative things, have been squeezed in the last six years and there is no financial or human capacity left.

“Our business leaders are saying ‘be more innovative and funky’ but there is not a lot of bandwidth to play with.”

Increasingly, companies will look beyond conventional sources of business technology to deliver that innovation. PA, for example, has turned to social networking and user-generated content to help its own business.

“We are looking at how to further our knowledge management through the power of social networking techniques – such as Facebook and Bebo – in a business context. These approaches and technologies will need to reach their second generation before they are fully usable, but they offer exciting opportunities,” he says.

PA’s excursion into virtual worlds, in the shape of Second Life, raised eyebrows both within and outside the company, but has resulted in business wins from organisations as diverse as telecoms operator Telenor and the Hong Kong Jockey Club.

“We built the Hong Kong Jockey Club in Second Life, new branch layouts for banks and have shown how it can be used to train oil tanker drivers and emergency services when responding to forecourt incidents,” Mr Middleton points out.

“We were the first management consulting firm to develop a presence in Second Life. When we started doing that, people said ‘It’s for the birds’. But I said that in 1994 about the internet. Yet within three years, you were dinosaurs if you were not online.

“I’m sure we’ve all made mistakes like that time and time again. If you don’t respond to the opportunities offered by the relentless change, then you’ll struggle.”
Copyright The Financial Times Limited 2008