What is artificial intelligence?

Every day, our use of digital applications and services generates a considerable volume of data. With artificial intelligence (AI), we can now classify this information and give it new meaning. This technology accelerates innovation and digital transformation. Video recommendations, search engine results, voice recognition, personal assistants and self-driving cars are all examples of concrete applications of AI in our daily lives.

Artificial intelligence (AI) is based on a platform's ability to make algorithms interact, in the same way that the neural networks of the human brain interact with one another. To understand a question, predict an intention or recognise an element in an image, AI instantly queries a database, calculates probabilities, then offers a clear, natural answer, as if it were human intelligence.

The definition of artificial intelligence (AI)

 

Artificial intelligence depends on the ability of computing units to replicate human learning. It begins with mimicry; then, as it analyses an increasing volume of data, its network of artificial neurones grows. It continues developing this network until it can reproduce, or even surpass, human capacities for reflection. AI can then recognise elements in images, model data, synthesise information, predict trends and present accurate results, all in natural language.

 

History of artificial intelligence

In the 1940s, when the first computers were being developed, scientists such as Alan Turing already wanted to build an "artificial brain": a machine smarter than humans. To measure progress towards this goal, they came up with a now-famous evaluation: the Turing test.

Its principle is simple: for five minutes, a person exchanges typed messages with two unseen interlocutors, a human and a machine. If, at the end of the conversation, the person cannot tell them apart, the machine passes the test. It has shown that it can reproduce natural language and hold an intelligent dialogue, sometimes even with a touch of humour.

In 1956, the researchers Allen Newell, Cliff Shaw and Herbert Simon created one of the first artificial intelligence languages, IPL, and used it to write a program that solved mathematical problems. The program was quickly able to prove established theorems and explain its reasoning intelligibly. Scientists then developed a self-learning approach: machine learning, in which the machine builds its own AI by training itself to replicate human analysis on a high volume of data.

 

Machine learning and big data (or the ability to gather an exponential volume of data to train AI) are an integral part of ensuring that an AI develops successfully.

How does artificial intelligence work?

 

The scientific design of artificial intelligence is based on three fundamental steps: assimilating data, analysing data, then implementing appropriate responses or actions.
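These three steps can be sketched as a minimal pipeline. Every function name here is illustrative, not part of any OVHcloud API; real systems plug in storage, model training and serving layers at each stage.

```python
# Minimal sketch of the assimilate -> analyse -> act sequence.

def assimilate(raw_events):
    """Capture and normalise incoming interaction data."""
    return [e.strip().lower() for e in raw_events if e.strip()]

def analyse(events):
    """Derive a simple statistic the 'model' can act on."""
    counts = {}
    for e in events:
        counts[e] = counts.get(e, 0) + 1
    return counts

def act(counts):
    """Pick the most frequent event as the 'appropriate response'."""
    return max(counts, key=counts.get)

events = ["Click", "click ", "view", "click"]
top = act(analyse(assimilate(events)))  # "click"
```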
 

  • Data assimilation

Today, every interaction on a website generates data. This data needs to be captured and stored so that artificial intelligence can use it. It sometimes also needs to be anonymised or pseudonymised: under the GDPR, an individual's digital activity must not remain linked to their personal identity without a lawful basis.
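One common pseudonymisation technique is to replace a user identifier with a salted hash before storage, so that stored activity is no longer directly linked to the person. This is a minimal sketch, not legal guidance on GDPR compliance; the salt value is a hypothetical placeholder.

```python
import hashlib

SALT = b"rotate-me-regularly"  # hypothetical secret salt, kept out of the data store

def pseudonymise(user_id: str) -> str:
    """Return a stable, non-reversible token in place of the raw identifier."""
    return hashlib.sha256(SALT + user_id.encode()).hexdigest()

record = {"user": pseudonymise("alice@example.com"), "action": "page_view"}
```

The same input always yields the same token, so activity can still be aggregated per user without storing who that user is.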

To capture information and transcribe it in a way that is intelligible to AI programs, scientists create machine learning algorithms. These are sometimes referred to as artificial neurones; they can, for example, decode thousands of images, transcribing each one pixel by pixel until it forms a data set. Other programs and algorithms are then designed to capture information from big data and redistribute it to the machine learning algorithms.
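The pixel-by-pixel transcription can be sketched as flattening an image into a numeric feature vector. A real pipeline would load pixels with an imaging library; here a tiny 2x2 greyscale "image" stands in.

```python
# A 2x2 greyscale image: each value is a pixel intensity from 0 to 255.
image = [
    [0, 255],
    [128, 64],
]

def to_feature_vector(img):
    """Flatten the pixel rows and scale values to the 0-1 range."""
    return [px / 255 for row in img for px in row]

features = to_feature_vector(image)  # [0.0, 1.0, 0.50..., 0.25...]
```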

Cloud computing offers cutting-edge computing power for this technology. You can leverage the potential of this data instantly, and make full use of it.

 

  • Data analysis

The data processed and decoded by artificial intelligence is the very basis of its self-learning: deep learning. To provide a response that is adapted to different challenges, data scientists establish criteria that must be applied in each situation. Gradually, machine learning algorithms enrich this list of criteria and offer new, more relevant responses.
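The gradual enrichment of criteria can be illustrated with a nearest-centroid classifier that updates its per-class averages as each new labelled example arrives. This is purely illustrative, not the method any particular product uses.

```python
from collections import defaultdict

class NearestCentroid:
    """Incrementally learns one running average per class label."""

    def __init__(self):
        self.sums = defaultdict(float)
        self.counts = defaultdict(int)

    def learn(self, value, label):
        # Each new example refines that class's criteria (its average).
        self.sums[label] += value
        self.counts[label] += 1

    def predict(self, value):
        # Choose the class whose running average is closest to the input.
        return min(self.sums, key=lambda c: abs(value - self.sums[c] / self.counts[c]))

model = NearestCentroid()
for v, lab in [(1.0, "low"), (2.0, "low"), (9.0, "high"), (11.0, "high")]:
    model.learn(v, lab)
```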

The more artificial intelligence is powered by new data, the more the artificial neural network dedicated to deep learning expands. AI learns and understands, which also has a technical impact: we need to guarantee the constant availability of big data, and ensure that incoming data is recent and reliable. Data mining in particular is an analysis process that retrieves and compares data to identify similarities, market trends and actionable insights. Progress in AI relies on these powerful algorithms, and on access to data.
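Identifying similarities between records is often done by scoring feature vectors against one another; cosine similarity is one standard measure. A self-contained sketch using only the standard library:

```python
import math

def cosine_similarity(a, b):
    """Return the cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

sim = cosine_similarity([1.0, 0.0, 1.0], [1.0, 0.0, 1.0])  # 1.0: identical records
```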

Computing power is what drives all of these algorithms. If it is not provisioned efficiently, processing becomes time-consuming and slows the interactions between artificial neural networks, reducing the potential of AI as a result. This is why data scientists quickly needed more powerful graphics cards, and why they seek to optimise the technical resources available to them, such as GPU compute power.

For these professionals, we have implemented solutions like OVHcloud AI Training, within our Public Cloud. This way, data scientists can automatically streamline these resources for AI training without any human intervention.

Deep learning networks are becoming quicker and more powerful. Data processing is almost instantaneous, which multiplies the number of ways in which artificial intelligence can be used. Reinforcement learning is limitless: each new piece of information reinforces the accuracy of AI.

 

  • Setting up appropriate responses and actions

Once the data is interpreted, compared and analysed, artificial intelligence can offer a response or reaction that meets the user’s expectations. This is a form of “cognitive reflex”, which data scientists must define in order for AI to:

  • Offer video recommendations adapted to the web user (relational intelligence)
  • Automate repetitive tasks (intelligent automation)
  • Perform real-time translations (semantic analysis, language interpretation)
  • Identify a promising market trend for a startup or project manager (business intelligence)
  • Distinguish an object or animal in thousands of photographs (visual recognition)
  • Park an autonomous vehicle in the right location (cognitive reflexes, proprioception)
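These "cognitive reflexes" can be modelled as a dispatch table mapping a detected intent to an action. All intent names and handlers below are hypothetical examples, not a real product API.

```python
def recommend_video(ctx):
    """Relational intelligence: suggest content based on recent activity."""
    return f"recommend:{ctx['last_watched']}"

def translate(ctx):
    """Semantic analysis: translate the given text to a target language."""
    return f"translate:{ctx['text']}->{ctx['target_lang']}"

# The reflex table: one handler per recognised intent.
REFLEXES = {
    "recommendation": recommend_video,
    "translation": translate,
}

def respond(intent, ctx):
    handler = REFLEXES.get(intent)
    return handler(ctx) if handler else "unknown-intent"

out = respond("translation", {"text": "hello", "target_lang": "fr"})
```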

Examples of uses for artificial intelligence

 

Predictive models in the public sector

Without the aid of artificial intelligence, managing public infrastructure and services is a complex and time-consuming task. Each decision requires a multitude of factors to be considered. If a local council decides to renovate a station, for example, they will need to deal with issues as diverse as prioritising the work to be carried out according to the planned budget, studying traffic flows to offer alternative routes, optimising replacement transport services, and much more.

By training artificial intelligence (AI) to capture, process and analyse the available data, we can make more accurate projections. AI also enables users to develop decision trees that specify the costs and benefits of each option. These predictions are then used as decision-making tools (business intelligence).
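A cost/benefit decision of this kind can be sketched as a tiny hand-built decision rule; real projects would learn such trees from historical data with a library such as scikit-learn. All figures below are invented for illustration.

```python
# Hypothetical renovation options with invented cost/benefit figures.
options = [
    {"name": "full_renovation", "cost": 900, "benefit": 1300},
    {"name": "partial_renovation", "cost": 400, "benefit": 700},
    {"name": "defer", "cost": 0, "benefit": 0},
]

def choose(budget):
    """Pick the affordable option with the best net benefit."""
    affordable = [o for o in options if o["cost"] <= budget]
    return max(affordable, key=lambda o: o["benefit"] - o["cost"])["name"]
```

As the budget shrinks, the recommendation shifts from full renovation, to partial renovation, to deferring the work.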

AI is a popular solution for operations planning, and this applies to all public administrations and services.

 

Healthcare research and development

Artificial intelligence is at the heart of many healthcare research and development (R&D) programmes. This field is where it offers the most promising results.

For example, early detection of cancers and serious diseases is a major challenge for the medical sector. Following successful visual recognition tests, AI algorithms were integrated into early tumour screening programmes. Compared to a team of neurosurgeons, the MRI recognition algorithm is already 10 times quicker and more effective at screening for brain tumours, comparing each new brain image against thousands of reference images in its databases.

Using predictive analysis and image recognition can help save patients' lives even before symptoms appear, making it a game-changer in the medical field.

By extracting knowledge from thousands of scientific reports, data mining enables scientists to:

  • Better understand the causes of certain illnesses
  • Detect potential contraindications between different medicines
  • Accelerate large-scale scientific research

 

IT security and data protection

Data protection and governance are critical issues. The access, sharing and use of information are essential for both large research centres and small businesses.

Cybersecurity software and AI-based environments can detect potential network vulnerabilities. They can also block malware that mimics human user behaviour.
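Detecting behaviour that deviates sharply from a baseline is one common building block of such tools. A minimal sketch, assuming invented traffic figures: flag request counts that sit more than three standard deviations above the historical mean.

```python
import statistics

# Invented baseline: requests per minute observed during normal operation.
baseline = [20, 22, 19, 21, 20, 23, 18, 22]

def is_anomalous(count, history=baseline, threshold=3.0):
    """Flag a count whose z-score against the history exceeds the threshold."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return (count - mean) / stdev > threshold

alert = is_anomalous(90)  # True: far above normal traffic
```

Production systems use far richer features (timing, geography, session behaviour), but the shape is the same: model "normal", then score deviations.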

 

Real-time machine translation

Customer relationship management is a key challenge for all companies that are committed to delivering outstanding service quality. With natural language processing (NLP), artificial intelligence centralises requests from international customers. They are then instantly translated into the customer service representative’s language, in the style of an interpreter.

The representative can provide an adapted response, which is then perfectly written and translated into the user’s language. Answer templates are also provided by a chatbot, based on previous exchanges.
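The routing step in such a workflow can be sketched as: detect the customer's language, translate the message for the agent, then translate the reply back. The dictionary-based "translator" below is a stand-in for a real NLP model, and the phrase table is invented.

```python
# Toy phrase table standing in for a real machine-translation model.
PHRASES = {
    ("fr", "en"): {"bonjour": "hello"},
    ("en", "fr"): {"hello": "bonjour"},
}

def translate(text, src, dst):
    """Translate known phrases; pass unknown text through unchanged."""
    return PHRASES.get((src, dst), {}).get(text.lower(), text)

incoming = translate("Bonjour", "fr", "en")  # what the agent sees
reply = translate("hello", "en", "fr")       # what the customer sees
```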

 

Planning, inventory tracking and flow management

Commodity management, demand forecasting and real-time inventory management are vital challenges for major distribution and e-commerce players.

By using predictive algorithms to anticipate sales, you can order the right quantity of products from the manufacturer. Until very recently, this seemed almost impossible. With sales figures and data from previous years, professionals can now train AI to manage the flows of a high number of items in real time. The business intelligence program can quickly identify the best sales for a specific period. With the help of predictive models, control algorithms and data visualisation, buyers can order the right number of items at the right time — avoiding waste and stock disruptions as a result.
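A simple version of such a predictive reorder rule: forecast next month's demand as the average of recent months, then order the shortfall between the forecast and the current stock. The sales figures are invented for illustration.

```python
# Invented monthly sales history for one product.
monthly_sales = [120, 135, 128, 140]

def forecast(sales, window=3):
    """Predict next month's demand as the mean of the last `window` months."""
    recent = sales[-window:]
    return sum(recent) / len(recent)

def reorder_quantity(sales, stock):
    """Order only the gap between forecast demand and stock on hand."""
    needed = forecast(sales) - stock
    return max(0, round(needed))

qty = reorder_quantity(monthly_sales, stock=50)
```

Real demand forecasting adds seasonality, promotions and lead times, but the principle of ordering against a forecast rather than a guess is the same.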

 

Robotics and embedded intelligence

Robotics is a form of AI centred on cognitive intelligence. Developers need to design much more than a smart computer connected to a network; they need to build a machine that operates autonomously.

Robots are equipped with sensors (on-board cameras, microphones, radars, networks of connected devices, etc.) so that they can respond appropriately to their environment. Developers also define the cognitive reflexes adapted to each situation.

Embedded intelligence has already proven to have outstanding potential, with the launch of self-driving cars. They are able to understand road safety rules, park alone, and brake to the nearest millisecond if there is any danger.

Artificial intelligence (AI) solutions

 

NVIDIA GPU Cloud

A graphics processor (GPU) is a computing unit. It can be present on a graphics card, such as the NVIDIA V100S, or on a motherboard. The effectiveness of artificial intelligence depends on the efficiency of these computing units.

The NVIDIA GPU Cloud (NGC) brings together GPU-optimised software that automatically takes full advantage of NVIDIA hardware computing units. This facilitates deep learning and GPU-accelerated graphical computing.

 

Jupyter Notebook

Jupyter is a free, open-source, interactive computing notebook (a web application). Explanations, user code, equations and visualisations can be freely shared among users.

Jupyter works with many programming languages and development frameworks, such as TensorFlow, PyTorch and MXNet. From its interface, users can view, edit and run live code, then immediately check the result of each change. Data scientists use Jupyter Notebook to create a variety of automated systems and artificial intelligence models.

 

Apache Spark

Apache Spark is one of the most popular frameworks on the market for massively parallel data processing. It reads information from the database, loads the operations to be performed into memory, then performs all of the computing tasks at once. When the analysis is complete, the computing resources are freed up.

Apache Spark is used to aggregate massive volumes of data and provide detailed analysis reports.
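The shape of such a massively parallel job can be emulated with standard-library map/reduce, so the idea runs anywhere: partition the data, reduce each partition independently (as Spark executors would), then merge the partial results. This is an illustration of the pattern, not PySpark code.

```python
from functools import reduce

# The data set to aggregate, split into four partitions of 25 elements
# (the role Spark's cluster manager plays across worker nodes).
data = list(range(1, 101))
partitions = [data[i:i + 25] for i in range(0, len(data), 25)]

# Stage 1: each partition is reduced independently, in memory.
partials = [reduce(lambda a, b: a + b, part) for part in partitions]

# Stage 2: the partial results are merged into the final answer.
total = reduce(lambda a, b: a + b, partials)  # 5050
```

In real Spark, the same two-stage structure appears as transformations on an RDD or DataFrame followed by an action that triggers the distributed computation.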