Monday, January 25, 2010

From bad economies spring new media channels

By Mike DiFranza, founder and president of Captivate Network and chairman of OVAB

Published: January 25 2010 13:31 | Last updated: January 25 2010 13:31

“Progress” isn’t usually the first word that comes to mind when a recession hits, yet that’s often what recessions create.

Historical patterns suggest that recessions have given rise to transformational media, from radio to cable television. If that historical pattern holds up – and there’s no reason to believe it won’t – the current recession’s progeny will be mass acceptance of highly targeted and digital media channels.

For almost 80 years, every major economic crisis to hit the US has vaulted nascent communications media into prominence. The Great Depression was the catalyst for radio to evolve into a major communications medium in the 1930s.

Television took its place as the dominant national medium during the recession of the mid-1950s. Cable television moved from hotel rooms to homes during the energy crisis and subsequent recessions of the late 1970s and early 1980s.

The internet emerged from the military and academic realms into the mainstream during the recession of the early 1990s. After the 2000 recession, online advertising growth exploded.

What is it about a struggling economy that nurtures new media?

The answer has more to do with human nature than economics, though there is some of that at work too.

People are naturally averse to change, but if they have to change to avoid risk, they will. When consumers are buying and profits are rolling in, corporate marketing organisations have no motive to risk a failed campaign by investing in an emerging medium.

But when a serious recession like the one we’re facing now hits, organisations have to reassess what is and isn’t working in their advertising programmes. John Wanamaker, a pioneer of the US department store, once lamented: “Half my advertising dollars work, I just don’t know which half!”

That ambiguity is not an option in today’s new economic reality. Every chief marketing officer understands that effectively engaging consumers who are capable of buying their company’s product is the top priority.

Consumer media consumption trends and technological changes accompanying the current recession portend a much larger shift in the media landscape this time around. Mobile advertising and digital place-based networks, which display content and advertising on screens in public places, are to this era what cable and radio were to years past.

Digital place-based networks turn venues such as elevators, lobbies, airport terminals and taxis into communication channels for today’s marketers. They enable advertisers to target very specific audience segments with engaging content that draws attention to their advertising message. They reach consumers during the 44 per cent of the day that consumers are awake and out of their homes (source: PQ Media), actively making purchase decisions.

Until recently, the public mainly consumed media in the home during predictable hours, such as evening prime time. Today, market dynamics demonstrate that a big percentage of the public gets its news, information and advertising on the go.

According to a recent BIA/Kelsey forecast, digital out-of-home advertising will grow 13.5 per cent over the next four years, outpacing the 1.4 per cent growth for home-consumed advertising. BIA/Kelsey expects advertisers to spend $2.2bn on digital out-of-home advertising this year and $3.7bn by 2013.

A poor economy, however, is only one factor setting the stage for the emergence of transformational media. The other critical element accompanying the post-recession adoption of new media channels is third-party audience measurement data.

Organisations such as the newspaper industry’s Audit Bureau of Circulations (ABC) and broadcast’s Nielsen and Arbitron ratings provide objective credibility for circulation and viewership claims.

The Out of Home Video Advertising Bureau, the North American industry organisation, has created guidelines for calculating the audience sizes of “place-based” digital networks, such as lobby and elevator screens.
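To see the kind of quantity such guidelines measure, consider a simplified, hypothetical audience estimate for a single screen. This sketch is not the published OVAB methodology; the function name, inputs and formula are illustrative assumptions about how venue traffic, visibility and dwell time might combine:

```python
def estimated_impressions(venue_traffic: float,
                          notice_share: float,
                          avg_dwell_seconds: float,
                          loop_seconds: float) -> float:
    """Rough audience estimate for one place-based screen.

    venue_traffic     -- people passing the screen per period
    notice_share      -- fraction positioned to see the screen (0-1)
    avg_dwell_seconds -- average time a person spends near the screen
    loop_seconds      -- length of the full content/ad loop
    """
    # A visitor can only see the slice of the loop that plays while
    # they are present, so dwell time is capped at one full loop.
    exposure_fraction = min(avg_dwell_seconds, loop_seconds) / loop_seconds
    return venue_traffic * notice_share * exposure_fraction

# Example: 10,000 lobby visitors, 60% positioned to see the screen,
# 90-second average wait against a 180-second loop.
print(estimated_impressions(10_000, 0.6, 90, 180))  # 3000.0
```

The point of formalising even a toy version is the one the article makes: a number an auditor can recompute is what turns a venue into a saleable medium.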

Digital out-of-home networks are expected to release independent third-party research over the next few quarters, which will provide advertisers with the equivalent of the broadcast industry’s commercial ratings.

Together, those new research sources will give advertisers the objective measurement, and the confidence, to consider and evaluate campaigns on digital out-of-home networks more rigorously.

Neither digital place-based media nor the recession started the mass media’s market share erosion; audiences have been splintering for years into finer and more elusive pieces.

In the 1940s, viewers would watch televisions through appliance store windows. When the masses could afford televisions, advertisers followed them into their homes.

Today, consumers are on the go and advertisers must engage them out of their homes, on the road, in the air or at the office if they are to prosper in the new economic reality.

Copyright The Financial Times Limited 2010. Print a single copy of this article for personal use. Contact us if you wish to print more to distribute to others.

Tuesday, January 19, 2010

Reality made larger than life

By Alan Cane

Published: January 19 2010 16:41 | Last updated: January 19 2010 16:41

Audiences gasp at what they see: a presenter stands in front of them with nothing in his hands. Yet the large screens on either side of the stage show him holding a flower.

As he waves his clenched fist around, it is as if the flower were really there: it moves in perfect time. Then suddenly it becomes a light sabre; later, the presenter holds a model car and a helicopter, which – on the large screens – appears to fly around the lecture theatre.

A growing number of such awe-inspiring demonstrations of what is labelled “augmented reality” are appearing on the internet and businesses are being encouraged to consider the potential uses of this seamless interaction of the real and the virtual.

Advertising, product design simulation and visualisation, architects’ modelling and – importantly in today’s market – entertainment and sophisticated computer games are among the areas expected to find uses for augmented reality.

In fact, there are already some well-known examples. In televised sport, for instance, advertisers’ logos and advertisements appear on football and cricket pitches where no such logos exist in the physical world; they have been written on the pitch virtually.

Similarly, viewers can estimate how far a long jumper has progressed by comparing lines drawn digitally in the sand indicating the best jump so far, the world record and so on.

Outside sport, environmentalists can hold a pattern on a piece of paper in front of a video camera and, on a screen, see it transformed into a three-dimensional model of an electricity grid, for example.

A number of companies have already launched AR systems. Layar, for example, has combined the Global Positioning System (GPS) with a camera and a digital compass to create a system that enables users to identify their surroundings, extract information about the locality and combine it with their real-world view on the mobile device’s screen.
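The core geometric step in such a location-based AR browser can be sketched briefly. The code below is an illustration of the general technique, not Layar’s actual implementation: given the device’s GPS fix and compass heading, compute the bearing to each point of interest and keep those falling inside the camera’s horizontal field of view (the place names, coordinates and 60-degree default are all invented for the example):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def visible_pois(device_lat, device_lon, heading_deg, pois, fov_deg=60):
    """Return names of POIs within the camera's horizontal field of view."""
    visible = []
    for name, lat, lon in pois:
        b = bearing_deg(device_lat, device_lon, lat, lon)
        # Smallest angular difference between heading and POI bearing.
        diff = abs((b - heading_deg + 180) % 360 - 180)
        if diff <= fov_deg / 2:
            visible.append(name)
    return visible

# Device in central London, facing due north, 60-degree field of view:
# only the POI to the north falls inside the view cone.
pois = [("cafe_north", 51.52, -0.10), ("museum_east", 51.51, -0.08)]
print(visible_pois(51.51, -0.10, 0.0, pois))  # ['cafe_north']
```

A real browser would then project each visible point onto the camera image and overlay its label, but the bearing-versus-heading test above is the filtering step that GPS plus a digital compass makes possible.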

Metaio, a German group, has developed a product to help engineers service mechanical systems by creating a digital image that overlays the physical work in front of them through a head-mounted display. Such systems can act as real-time manuals, for example, displaying every move a mechanic needs to make to repair an engine as they work.

Other applications have yet to find public acceptance and some could prove controversial.

Daphna Steinmetz, chief innovation officer for Comverse, the telecommunications software group, describes research in her laboratory which could see an end to business cards: people would merely point their mobile phone at an individual to enable face recognition software to identify them and bring to the phone everything known about them on the internet.

“The user, with one click, will be able to see the tweets of this person or view their profile in Facebook. We would also add the ability to generate a message, create a phone call or voice message or add to an address book so that immediate communication can be created.”

The question remains whether people will be happy to have their lives exposed in this way to anyone with a mobile phone. Comverse is aware of this as it works to bring the application to market.

In essence, augmented reality is a kind of digital trompe l’oeil which overlays facts and figures from the internet and other sources on to the real world to create a combined image rich in functional benefits.

But in spite of some spectacular demonstrations, in practice, much AR is in a nascent form, awaiting technological advances to make commercial progress.

Industry watchers and investors alike, however, are excited by its potential.

AR is an old idea: the concept was first broached by a cinematographer, Morton Heilig, in 1957 and the term coined by Tom Caudell at Boeing in 1992.

It comprises two broad areas – “object level” AR, where physical objects are augmented by additional data or graphics (this would include computer screen-based AR) and “location level” AR where the users’ view of their surroundings is enriched through additional information.

The widespread development and acceptance of both, however, was hindered by a lack of appropriate technology. Kelly Dempski, director of research at Accenture’s Sophia Antipolis laboratories, points out that displays and tracking technology had been poor.

But then something happened: “The average consumer now has a piece of technology – a phone with a good screen and rich graphics, a camera and a variety of tracking mechanisms ranging from GPS to compasses to onboard image recognition.

“Suddenly everyone has an AR platform in their pocket and businesses are just beginning to find new uses for this platform.”

So the mobile phone was, it seems, the silver bullet.

The re-emergence of AR, however, depended on the maturation of a number of technologies: image sensors, video cameras, and displays – either head-mounted, handheld, fixed or spatial (the last involving the projection of digital information on to physical objects).

It also depended on an array of sensory devices – accelerometers, digital compasses, GPS sensors, wireless sensors and gyroscopes – and, crucially, communications networks and key databases. For example, the Apple iPhone 3GS, complete with compass and accelerometers, is very much the model for mobile AR.

According to Ken Blakeslee, an independent consultant who has worked with a number of AR companies, the tipping point has been reached in technological development: “We are in the very early stages of development in AR, but we can move rapidly. The key thing is that the databases exist.”

He sees potential business applications in retail, vehicle repair, safety and real estate, among others.

Retail has been an early adopter of the technology. David Grunwald of Deloitte, the consultancy, says many consumer technologies are coming together to create “a basket of readiness” for AR applications, including smartphones, Flash software, 3D bar codes and the like, as well as social media: “Together they are producing a rich and fertile set of applications for retail,” he says.

Examples include an online application from Holition, a London-based software developer, whose technology allows prospective buyers of expensive goods such as jewellery and watches to try them on virtually, even if the items do not physically exist. Images of the customer and the item are united on screen in a montage, enabling customers to try on pieces at home or in-store.

Lynne Murray, the group’s head of design, says it is working now to include clothing in the application.

Significant hurdles have yet to be overcome, however. Robin Gear, manager of the innovation unit at PA Consulting, points out that “registration”, the alignment of digital data with a moving real-world view, remains a problem, together with the interaction between the virtual image and the surrounding environment.

Rob Gonda of Sapient Interactive says this is because the processing power to superimpose digital elements on top of real time video captured on a webcam or mobile camera is not sufficient to give the illusion of a seamless image.

John Spindler of ADC, the US networking group, points to a more basic difficulty – the capacity of wireless networks to carry data in the volumes generated by AR: “You have to have the infrastructure in place from a network point of view to support these applications.”

In Mr Spindler’s view, to support AR and other data-heavy applications, US wireless carriers will have to invest heavily and create a different network topology with smaller cells.

Professor Jonathan Raper of the Information Science Department at City University London underlines the technological questions of localisation in space and time that continue to dog development: “In my view, AR is in a waiting room, still looking for the right formula that engages the masses.”

He has no doubt, however, that most progress will be made in the immediate future in location-based AR (he is editor-in-chief of the Journal of Location Based Services) and he is generally optimistic about the future of AR technology: “The steps that AR needs now to occur are improvements in positioning integrity and positioning speed and pervasiveness of good positioning.

“The step that we can envisage making this happen is the second constellation of global positioning satellites which is expected to go live in a few years.”


Thursday, January 14, 2010

Why customer technology will be the new battleground for retail banks

By Stephen Haighton, Chordiant vice president for the Emea region

Published: January 14 2010 11:14 | Last updated: January 14 2010 11:14

For the traditional retail banking industry, competition is fiercer than ever and one of the biggest battlegrounds is likely to be the retention and acquisition of customers.

So what role does technology have to play in this fast evolving and customer-focused banking environment?

Banks have had a tough time trying to foster confidence and loyalty among their customer base against the backdrop of the recent credit crunch. At the same time, several non-banking players, such as supermarket chain Tesco in the UK, have entered the retail banking market looking to capitalise on growing consumer suspicion of traditional banks.

These new financial market players are pushing customer-centricity as a strong selling proposition for attracting new business. In turn, it seems that customers are ready to trust these non-banking institutions as they feel a more personal relationship with these brands.

At the same time, consumer needs are changing as expectations of service levels rise and more channels of communication become available.

As a result, traditional banks need to spend more time listening to their customer base across multiple channels and ensure they are being engaged in meaningful conversations.

However, the quality and relevance of a good conversation with a customer are often underestimated by financial institutions. This is exacerbated by the fact that the traditional financial institution is often encumbered with inflexible legacy systems that were not built with the customer in mind.

For example, recent research commissioned by Chordiant and conducted by Vanson Bourne found that only 57 per cent of customer representatives questioned in the UK’s high street retail banks have software that helps them suggest what products or services would be appropriate for an individual customer.

Furthermore, nearly half of those interviewed do not have software which supports them in conversations with customers who want to leave the bank.

As banks increasingly seek to place the customer at the heart of their business, they will need to deploy more sophisticated customer experience management (CEM) technology in order to maintain their position and their customer base in the market against new customer-centric banking organisations.

CEM technology enables banks to deliver intelligent conversations based upon analysis of past customer behaviour, as well as current responses and mood. This allows them to engage more effectively with customers, quickly measure how a strategy is working and change course with greater speed and economy.

The key is to maximise the value of every conversation, consistently across every channel. Users should be able to deliver highly expressive customer experience strategies using models that predict and react to individual customer expectations, propensities and behaviours.

This behavioural segmentation is combined with powerful real-time decision-making and centrally deployed to any channel across the bank.

By implementing this kind of technology, traditional financial institutions are able to put an end to pre-scripted, inconsistent customer interactions based upon static, outdated market segmentation.

Following Next-Best-Action techniques also allows every customer interaction to become unique, appropriate and consistent. The conversation with the customer is continually guided, with actions adapting as the conversation unfolds. Recommendations are determined in real time, based upon customer responses, mood and instant analysis of customer behaviour.
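At its simplest, a next-best-action engine scores each eligible offer by expected value and picks the winner. The sketch below is a minimal illustration of that idea, not Chordiant’s actual product logic; the actions, margins, propensities and mood factors are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    margin: float           # value to the bank if the customer accepts
    base_propensity: float  # modelled acceptance likelihood (0-1)

def next_best_action(actions, customer_mood, eligible):
    """Pick the action maximising expected value for this customer.

    The mood adjustment is a stand-in for the real-time behavioural
    signals a CEM system would feed into its propensity models.
    """
    mood_factor = {"positive": 1.2, "neutral": 1.0, "negative": 0.6}
    best, best_score = None, float("-inf")
    for a in actions:
        if a.name not in eligible:
            continue  # suitability/eligibility rules are applied first
        score = a.margin * a.base_propensity * mood_factor[customer_mood]
        if score > best_score:
            best, best_score = a, score
    return best.name if best else None

actions = [
    Action("savings_upgrade", margin=40.0, base_propensity=0.30),
    Action("credit_card", margin=120.0, base_propensity=0.05),
    Action("retention_call", margin=25.0, base_propensity=0.60),
]
# The high-propensity retention call beats the higher-margin upgrade.
print(next_best_action(actions, "neutral",
                       {"savings_upgrade", "retention_call"}))
```

The design point is that the ranking is recomputed on every customer response, so the recommendation can change mid-conversation rather than following a pre-built script.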

Financial institutions can also benefit from powerful Visual Command and Control capabilities to simulate different strategies and visualise their impact on customers and business metrics. Once optimised, customer strategies can be deployed at the touch of a button and changed on demand, without IT intervention.

Of course, building one-to-one relationships cannot come at the expense of profitability. Ideally, every decision a bank makes with regard to a customer should cater to that individual’s specific needs but do so in a manner that ensures profitability.

CEM takes the needs of the bank equally seriously, so that customer offers and propositions, while tailored specifically for that customer, are also designed to support the bank’s own business goals.

The importance of banks employing this type of technology cannot be overstated. Many of the non-banking players entering the market are already heavily customer-focused and, without legacy systems to contend with, are well placed to invest in CEM solutions.

Therefore, gaining a full understanding of each customer as an individual, including their likely behaviour, and applying that to every interaction is not only critical for differentiation and loyalty, but it may be the key to survival amid increasing competition.


Monday, January 11, 2010

Who holds the keys to your organisation’s data?

By Tim Dunn, vice-president of CA’s security business in the Emea region

Published: January 11 2010 12:47 | Last updated: January 11 2010 12:47

Organisations have a legal duty to protect their customers’ personal data – but should we really trust them?

Incidents such as unauthorised securities trading at Société Générale and hacking of the Pentagon’s system share a common thread – they were the work of people who gained access, legitimate or otherwise, to privileged user details, in other words, the security crown jewels.

Privileged user management (PUM) first requires knowing what a privileged user is. It might seem a simple question, but it is often a hurdle at which organisations fall.

A privileged user is an individual who, by virtue of function, has significantly greater system access rights than most corporate users. They will include, for example, system administrators and those with emergency accounts.

But because a privileged user has access to various IT resources, they can make use of private and sensitive data within the organisation, create new user profiles, and add to or amend the powers and access rights of existing users. All this can give them a higher level of access to sensitive data than any other employee in the business: the equivalent of the keys to the kingdom.

Amid an ever-increasing wave of security threats and a growing regulatory burden, it is perhaps no surprise that IT managers tend to overlook the privileged access granted to themselves and others to carry out their jobs.

But mistakes can have serious consequences for an organisation’s brand value, customer retention, revenue and support from investors and shareholders.

A growing list of compliance initiatives is aimed at protecting organisations from malicious or inadvertent abuses. The ISO27001 security standard that is commonly used around the world advocates that the allocation and use of privileges should be restricted and controlled. For example, the access privileges associated with each system product (operating system, database management system and each application), and the users to whom they need to be allocated, should be identified.

This means that organisations need an access control policy that allocates access on a need-to-use basis, plus an authorisation process and a record of all privileges allocated.

Corporate executives are pushing their organisations to comply with these regulations or face personal liability and the threat of criminal and civil penalties.

Almost all relevant legislation centres on the principle of “least privilege”. This requires that, in a particular layer of a computing environment, every module – be it a process, a user or a program – must be able to access only the information and resources necessary for its legitimate purpose.

When applied to users, the terms “least user access” or “least privileged user account” (LUA) are also used, referring to the concept that all users at all times should run with as few privileges as possible, and also launch applications with as few privileges as possible.
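In practice, least privilege reduces to a default-deny access check backed by explicit, recorded grants. The class below is a minimal sketch of the controls described above (need-to-use allocation, an authorisation step and a record of every privilege allocated); the class and method names are invented for illustration and do not correspond to any particular product:

```python
from datetime import datetime, timezone

class PrivilegeRegistry:
    """Need-to-use privilege allocation with an audit trail."""

    def __init__(self):
        self._grants = {}    # (user, resource) -> set of permissions
        self.audit_log = []  # record of every allocation made

    def grant(self, approver, user, resource, permission):
        """Allocate one privilege and record who authorised it."""
        self._grants.setdefault((user, resource), set()).add(permission)
        self.audit_log.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "approver": approver, "user": user,
            "resource": resource, "permission": permission,
        })

    def is_allowed(self, user, resource, permission):
        # Least privilege: anything not explicitly granted is denied.
        return permission in self._grants.get((user, resource), set())

reg = PrivilegeRegistry()
reg.grant("security_officer", "alice", "payroll_db", "read")
print(reg.is_allowed("alice", "payroll_db", "read"))   # True
print(reg.is_allowed("alice", "payroll_db", "write"))  # False
```

Note the two properties the article calls for: access is denied unless explicitly allocated, and every allocation leaves an auditable record naming the approver.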

The key step in addressing this challenge is first to treat the privileged user as a major business and risk-management issue. Once the issue is understood at a strategic level, an organisation is in a better position to deploy tools that control, monitor and measure its privileged users, and to make sure the solution helps it move along a proven path or “maturity model”, one that adapts to the changing needs of the business.

An organisation must also adopt best practices throughout, including securing log files, enforcing segregation of duties and introducing individual accountability, to ensure privileged accounts are not shared, privileges are kept up to date and user activity is monitored.

Awareness of the issue is growing. Even so, a recent study into the behaviour and management of privileged users, conducted by software company CA and analyst firm Quocirca, revealed that the security of European organisations, and the trust placed in them, is at risk because of non-compliance with industry standards, poor practice and manual error.

The study found that 41 per cent of the 270 European organisations surveyed had adopted the ISO27001 standard yet admitted that non-compliant practices, such as sharing privileged user account details and retaining default privileged account user names and passwords, still prevail.

More than one third (36 per cent) stated they had implemented ISO27001 and had it certified by an external auditor.

The main problem highlighted by the study was awareness. Respondents admitted to overlooking risks associated with poor PUM because other security threats, such as malware, the internet and Web 2.0 tools, ranked higher in their priority list.

While the majority of privileged users are highly trustworthy, organisations face a growing problem of managing privileged users and their access rights. Abuse is often not intentional, which means there is a need not just to protect the business from its employees, but the employees from themselves.

Clearly, it is in the interest of individual IT managers, the IT department and the overall business to have measures in place to control and monitor privileged users.
