Are we holding back artificial intelligence?
If technology doesn’t serve our day-to-day lives, we will be hesitant to adopt it.
For decades, creative minds have conceived futuristic worlds in which digital technologies — namely artificial intelligence (AI) — are ingrained into human societies. But if we want any of these concepts to transform into a reality of ‘smart cities’, both public and private sector organisations must create networks of open communications technologies so that machines and devices can interact with humans in familiar ways.
Presently, the industry is facing a roadblock: while technical capabilities are expanding, uptake remains limited. This comes down to human behaviour. If technology does not serve as a natural extension of day-to-day life, consumers will be hesitant to adopt it, in turn slowing the road to smart cities.
Through an open approach, we can begin to introduce automation and machine learning so that AI can deliver higher-quality, more lifelike services that engage consumers with a degree of empathy.
After all, humans are the consumers of technology, and if it is to be effective, gimmicks must make way for tangible value — an experience that feels more natural to the end user.
And while the consumer is engaging with these services, the technology itself will be able to learn by tapping into the knowledge of other forms of AI which are part of the wider, open network of machines.
This open network comes down to the availability of APIs, which enable third parties to build on the foundations that vendors set and to explore additional domains for AI-driven value.
This is invaluable when you consider just how much information is generated each day from which machines can learn — 2.5 quintillion bytes, according to IBM.
However, as industry works towards building smart cities, we must appreciate that there is no one-size-fits-all when it comes to AI. Therefore, rather than attempting the impossible task of coding an intelligent platform that can do everything, we need to be use case driven.
Avaya is currently working with organisations worldwide that are departmentalising their AI implementations to focus on specific challenges and objectives, and then socialising those task-oriented digital machines through automation and machine learning.
For example, banks are introducing conversational AI into their IVRs and customer interaction channels so that customers can speak to chatbots as they would to a human. This also creates the opportunity to leverage other innovations, such as blockchain, by communicating information between the bots managing those respective projects.
Meanwhile, leading hotels are experimenting with intelligent rooms that give guests full control of their rooms via a bespoke tablet. The device also gathers information from other devices and digital sources, ranging from tour bookings to restaurant reservations, all controlled through voice.
Hospitals are also piloting AI projects to consolidate the patient experience, starting inside ambulances and extending right through to emergency rooms and across their wards. Once in production, this kind of AI network will accelerate diagnosis and treatment.
As it stands, it will take about another three years for humans to adjust to and embrace the possibilities that AI offers. But as we work towards a future of smart cities, both the public and private sectors must open their technology platforms to the wider IT community to enable more relevant solutions to consumers’ problems and to provide experiences that align with their daily lives.