Posts Tagged ‘enable’

How is AI used? What does it mean?

Thursday, December 24th, 2009

What is AI?

The term ‘AI’ can conjure wildly different (and usually far-fetched) ideas compared to what is currently possible. Yet the AI we do today is a step along the way to the Spielberg and sci-fi image the mass media provides us.

To AI researchers, ‘AI’ is commonly seen as a set of mathematical tools that can be programmed into computers so that software can ‘learn’ in some fashion, making it flexible, adaptive and self-tuning. The scale at which it does any of these things normally determines the complexity and computational burden of the task.

In most cases the motivation is one of two things:

– cheaper, better, faster, more reliable (e.g. engine management)
– impossible to do with manual encoding (e.g. vision)

A few examples:

Data Mining:

More sophisticated data mining uses ‘learning’ to discover the key relationships between inputs and desired outputs – such as profitability relative to certain expenditures or management strategies. We want to find what the key performance indicators really are, not just the ones management ‘imagine’ they might be. Rather than guessing and writing code that mechanistically generates output from guessed approximations of the truth, AI can be used to learn the real relationships hidden in the data.

Currently we are working with financial transactions to identify performance metrics.
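
To make the idea concrete, here is a minimal sketch (not our actual tooling) of using a learned model to rank candidate indicators by how strongly they drive an outcome. The file name, column names and the choice of a random-forest regressor are all hypothetical:

```python
# Minimal sketch: learning which candidate metrics actually drive profitability.
# All column names ("marketing_spend", "profit", ...) are hypothetical examples.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

transactions = pd.read_csv("transactions.csv")   # hypothetical input file

candidate_kpis = ["marketing_spend", "headcount", "discount_rate", "stock_turnover"]
X = transactions[candidate_kpis]
y = transactions["profit"]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, y)

# Rank the candidate indicators by how much the model relied on them,
# rather than by what management assumes matters.
ranking = sorted(zip(candidate_kpis, model.feature_importances_), key=lambda t: -t[1])
for name, importance in ranking:
    print(f"{name:16s} {importance:.3f}")
```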

Artificial Vision:

Vision is one of the most complex domains, with a lot of active research. We humans are unaware of the complexity or difficulty because of our powerful natural image-processing ability. Our visual system is plastic, constantly tuning itself to the world around us to enable us to see. A large fraction of our brain is engaged in understanding a new scene – an astronomical amount of computation compared to present-day computers’ ability. This seemingly effortless task requires us to work out how the streams of information coming through our eyes match up, what is meaningful and what is not, and to generate a 3D perception of the world from a 2D retina. No manually coded solution can do this kind of thing. Machine learning is a key part of automatic machine vision, which is still in its relative infancy. We currently undertake imaging research.
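
As a toy illustration of why learning matters here – writing explicit rules that map raw pixels to labels quickly becomes hopeless – the sketch below trains a standard classifier on the small 8x8 digit images that ship with scikit-learn. It is nowhere near the scale of real vision research; it only shows the learned-versus-hand-coded contrast:

```python
# Minimal sketch: a learned classifier for a toy vision task (8x8 digit images),
# in place of hand-coded pixel rules that would be impractical to write.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

digits = load_digits()                     # 8x8 greyscale images, labels 0-9
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = SVC(gamma=0.001)                     # the classifier learns the mapping
clf.fit(X_train, y_train)                  # from raw pixels to labels
print("test accuracy:", clf.score(X_test, y_test))
```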

Natural Language Processing:

This is another active area of our research. As humans, we find understanding a news feed very easy, because we understand the context and background of the text. There is a compelling drive to enable computers to understand text. AI can be used to extract relationships from text and its context, and to automatically build and use taxonomies to store, sort, file and retrieve it. Yahoo! uses AI tools to generate its menus and taxonomies automatically, with machine sorting. This is a relatively basic use of AI.

More advanced uses require a better understanding of context and content. We’re working to provide automatic learning and rating of financial and business news feeds.
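
A very rough sketch of the basic, Yahoo!-style end of this spectrum: grouping text into crude taxonomy buckets automatically from word statistics. The headlines below are invented examples, and real taxonomy building is considerably more sophisticated:

```python
# Minimal sketch: automatically grouping news items into crude "taxonomy" buckets
# with TF-IDF features and k-means clustering. The headlines are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

headlines = [
    "Central bank raises interest rates again",
    "Quarterly profits beat analyst expectations",
    "New smartphone model announced at trade show",
    "Chip maker reports record semiconductor sales",
    "Bond yields climb as inflation fears grow",
    "Retailer's earnings slump on weak demand",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(headlines)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

for cluster, headline in sorted(zip(labels, headlines)):
    print(cluster, headline)
```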

Engine Management:

A simple example is the engine management system of a modern quality car, which has to adapt to the real and unique state of your engine all the time. In the past, the engine was tuned only by a garage: a compromise in both the engineering and the settings, which just got worse until the next time you could take it in.

The adaptive versions are constantly tuning to cater for instantaneous changes in load, humidity, temperature and fuel quality, as well as longer-term wear and tear. The controller can ‘learn’ the best way to tune your engine for each operating condition in which you use the car. Overall efficiency goes up, the engine lasts longer, and the system can compensate for wear and tear as well as notifying you that something is wrong while the engine still works. In the past I built an engine management system to optimise the fuel efficiency of petrol engines.
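
As a simplified sketch of the idea (not the system I built), the code below keeps a table of fuel-trim corrections, one per coarse operating-condition bin, and nudges each entry from the measured air/fuel error. The bin sizes, learning rate and error signal are illustrative assumptions:

```python
# Minimal sketch: an adaptive fuel-trim table that "learns" a correction for each
# operating condition (here binned by RPM and load) from the measured air/fuel error.
from collections import defaultdict

class AdaptiveFuelTrim:
    def __init__(self, learning_rate=0.05):
        self.learning_rate = learning_rate
        self.trim = defaultdict(float)                 # long-term trim per operating bin

    def _bin(self, rpm, load):
        return (round(rpm / 500), round(load * 10))    # coarse operating-condition bin

    def correction(self, rpm, load):
        """Trim to apply to the base fuel map for this operating condition."""
        return self.trim[self._bin(rpm, load)]

    def update(self, rpm, load, lambda_error):
        """Slowly adapt the stored trim from the oxygen-sensor (lambda) error."""
        self.trim[self._bin(rpm, load)] += self.learning_rate * lambda_error

# Example: repeated lean readings at cruise gradually enrich that bin.
ecu = AdaptiveFuelTrim()
for _ in range(50):
    ecu.update(rpm=2500, load=0.4, lambda_error=+0.02)   # running slightly lean
print(round(ecu.correction(2500, 0.4), 3))
```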

Speech Recognition:

The speech recognition in your phone ‘learns’ to understand your voice. Much as we humans initially make more mistakes with new speakers – especially those with strong accents – and ask them to repeat what they said until we become familiar with their voice, the recogniser adapts to each speaker and eventually no longer needs to.

The same applies to speech generation. We have done some work on speech applications.
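
One simple flavour of this adaptation can be sketched as per-speaker feature normalisation: the recogniser keeps running statistics of a speaker's feature vectors and removes that speaker-specific bias before recognition. Real systems adapt whole acoustic models; this toy version only illustrates the principle:

```python
# Minimal sketch: per-speaker feature normalisation as a simple form of adaptation.
import numpy as np

class SpeakerAdapter:
    def __init__(self, dim):
        self.count = 0
        self.mean = np.zeros(dim)

    def adapt(self, features):
        """Update the running per-speaker mean from a new utterance (frames x dim)."""
        for frame in features:
            self.count += 1
            self.mean += (frame - self.mean) / self.count

    def normalise(self, features):
        """Remove the speaker-specific bias before recognition."""
        return features - self.mean

# Example with random "cepstral" frames standing in for real audio features.
rng = np.random.default_rng(0)
utterance = rng.normal(loc=1.5, scale=1.0, size=(200, 13))
adapter = SpeakerAdapter(dim=13)
adapter.adapt(utterance)
print(np.round(adapter.normalise(utterance).mean(axis=0)[:3], 2))
```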

Stock Price Prediction:

This is an active area of research at most institutional investors. We have also worked in this field, with very good results in off-line testing, and will be returning to it soon.

Special Effects:

In the movie industry, the best animations with totally natural movement are generated by AI. Before AI was used to ‘discover’ the best models for motion, the hand-coded attempts (representing thousands of person-years of programming at vast cost) were obviously ugly and unnatural. In today’s special effects, many of the animals and humans in dangerous action scenes are simulations based on AI solutions to natural motion.

Computer Games:

Games are a common test bed for certain types of AI, where the creatures learn your movements and capitalise on patterns in your play. This makes for more interesting and challenging gameplay.
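
A minimal sketch of the pattern-learning side: an opponent that records the player's move-to-move transitions as a simple Markov chain and predicts the next move so it can counter it. The move names are invented:

```python
# Minimal sketch: a game opponent that learns the player's movement patterns as a
# first-order Markov chain and predicts the most likely next move.
from collections import Counter, defaultdict

class MovePredictor:
    def __init__(self):
        self.counts = defaultdict(Counter)   # last move -> counts of following moves
        self.last = None

    def observe(self, move):
        if self.last is not None:
            self.counts[self.last][move] += 1
        self.last = move

    def predict(self):
        """Most likely next move given the player's last move (None if unseen)."""
        options = self.counts[self.last]
        return options.most_common(1)[0][0] if options else None

predictor = MovePredictor()
for move in ["left", "left", "left", "jump", "left", "left", "left", "jump", "left"]:
    predictor.observe(move)
print(predictor.predict())   # after "left" the player most often continues "left"
```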

Adaptive Computing Infrastructure:

We work on adaptive computer systems that can manage themselves across failures without going down.

What all of these have in common is a need for the system to ‘learn’ and to self-optimise towards certain goals: lower cost, higher speed, better accuracy, greater efficiency and so on.

Temporary Solution for a Lifetime

Saturday, November 28th, 2009

Transaction processing, interpretation, translation, cleansing and disambiguation middleware.

There are many situations where data translation and interpretation need to happen on the fly to integrate disparate corporate systems. Some transactions are clean; others are ambiguous and complex, requiring sophisticated adaptive, self-structuring tools to perform the high-quality translation and interpretation that would otherwise demand expensive human labour.

One such example involved the communication and data translation/interpretation for a multinational with four corporate entities. After six months of work between our clients and their partners in their attempts to inter-operate, we were invited to take over the transaction translation. Within two weeks the entities were able to transact cleanly in their own custom dialects and protocols, with Aeye middleware doing the translation work on the fly. The benefit was that automated transactions could begin flowing as fast as the various partners could agree the connections and provide authorisation and protocol information.

Our clients were very happy. Although the middleware could have been viewed as a temporary solution to bridge the gap between the various partners – buying them valuable time to solve their technical hurdles – our clients found it simpler, safer and more cost-effective to continue using our adaptive, responsive service as a buffer between them and the other parties they wish to connect to, freeing them to focus on their core value generation.
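
The hard part of such work is the ambiguous, adaptive interpretation, but the basic shape of the middleware can be sketched as translating each partner's dialect into a shared canonical form and back out again. The partner names, field names and mappings below are invented for illustration:

```python
# Minimal sketch: middleware that translates each partner's transaction "dialect"
# into a shared canonical form and back out into the receiver's dialect.
# Partner names, field names and mappings are invented for illustration.
DIALECTS = {
    "partner_a": {"OrdRef": "order_id", "Amt": "amount", "Ccy": "currency"},
    "partner_b": {"order_number": "order_id", "total": "amount", "curr": "currency"},
}

def to_canonical(partner, message):
    mapping = DIALECTS[partner]
    return {mapping[field]: value for field, value in message.items() if field in mapping}

def from_canonical(partner, canonical):
    reverse = {v: k for k, v in DIALECTS[partner].items()}
    return {reverse[field]: value for field, value in canonical.items()}

# Partner A sends in its own dialect; Partner B receives in theirs.
incoming = {"OrdRef": "A-1001", "Amt": "250.00", "Ccy": "GBP"}
print(from_canonical("partner_b", to_canonical("partner_a", incoming)))
# {'order_number': 'A-1001', 'total': '250.00', 'curr': 'GBP'}
```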

Engineering the Unseen

We were hired by a multinational transportation company, with three months to deliver on a project, after their project R&D team had left the company and two previous consulting companies had attempted a solution.

The task was a safety-critical control system for a world-first technique. The time frame included commissioning at an off-shore location.

Our job consisted of developing a bespoke control system (including the operating system) and control software to fuse data from a set of unreliable sensors into reliable, safety-critical control of key heavy machinery. The physical equipment was in the final stages of manufacture, so there was little leeway for improvement.

We produced the solution on time, for a system we had never physically seen and for a design we argued required significant changes to improve quality and simplify the task. Even without the requested modifications, we delivered a successful solution and commissioned it on-site in sub-zero temperatures within the target week. The management on all sides were impressed, the project was a success, and further solutions were provided downstream to other projects. It was a world first, delivered on time for our client.

What did we do that was special?

We modelled the physics of the system to an appropriate level of accuracy, which enabled us to test our solution before arriving on site. It required only minor tuning to adapt it to the target machinery and was able to continue adapting to the physical system once in place. The software handled the statistically noisy sensor data in a stable manner.
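
A heavily simplified sketch of the noisy-sensor side of that work: a median vote across redundant sensors tolerates an occasional wildly faulty reading, and a one-dimensional Kalman-style update blends the physics-model prediction with the fused measurement. The plant model, noise levels and numbers are illustrative assumptions, not the actual system:

```python
# Minimal sketch: fusing a simple physics prediction with noisy, unreliable sensor
# readings. A median vote handles a faulty sensor; a 1-D Kalman-style update blends
# the model prediction with the fused measurement.
import random
import statistics

def fuse(redundant_readings):
    """A median vote tolerates a single wildly faulty sensor."""
    return statistics.median(redundant_readings)

def kalman_step(estimate, variance, predicted_change, process_var, measurement, meas_var):
    # Predict from the physics model, then correct with the fused measurement.
    estimate += predicted_change
    variance += process_var
    gain = variance / (variance + meas_var)
    estimate += gain * (measurement - estimate)
    variance *= (1 - gain)
    return estimate, variance

random.seed(0)
true_position, estimate, variance = 0.0, 0.0, 1.0
for _ in range(20):
    true_position += 1.0                                  # the modelled motion
    readings = [true_position + random.gauss(0, 0.5) for _ in range(3)]
    readings[0] += random.choice([0, 0, 5.0])             # one sensor is unreliable
    estimate, variance = kalman_step(
        estimate, variance, predicted_change=1.0, process_var=0.1,
        measurement=fuse(readings), meas_var=0.25)
print(round(true_position, 1), round(estimate, 2))
```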