A medical doctor looks at an X-ray image of a woman’s breast. He considers an area with a minor anomaly, which he believes to be fatty tissue, common among young women. Still, aware that 1 in 5 breast cancers goes undetected, he asks the Mammography AI for a second opinion. He shares the depersonalized data with the service and receives instant feedback that the spot has a 90 percent chance of being cancerous. The doctor reconsiders all the information and arranges more comprehensive testing, and a few days later the patient is diagnosed with early-stage breast cancer.
Many real-world cases follow the script above: the once futuristic image of close collaboration between experts and machines is rapidly becoming reality. Deep learning opens up a realm of cognitive prediction tasks that machines often perform better and faster than humans. Today, computers are capable of synthesizing chemical compounds, creating realistic images of people who have never existed, managing energy consumption in data centres, and translating and directing conversations in real time. It is surprisingly hard to name a domain that cannot be enhanced by the contemporary wave of Artificial Intelligence (AI).
Artificial General Intelligence (AGI) marks the point when a machine shows intelligent behaviour equivalent to, or indistinguishable from, that of a human. Luckily we have not reached that point (just yet). Current AI still makes plenty of silly mistakes and exhibits biases, often because it lacks a human’s representation of how the world works and so cannot easily spot its own wrong predictions. Machines need humans to set their goals, gather labelled data from which they can generalize, give them feedback when they make mistakes, and decide what to do with all the predictions they make. Humans need machines, but machines need humans too.
Artificial General Intelligence (AGI) is the intelligence of a machine that can successfully perform any intellectual task that a human being can.
Artificial Intelligence (AI) today is focused on performing well at the one narrow task it is designed for (e.g. only facial recognition, only translations, only internet searches, or only driving a car).
Have you figured out how you will adapt and team up with Artificial Intelligence in the years to come? Here are some changes you should expect.
A growing proportion of software is written by computers in the form of neural networks. Think of a machine learning model as a simple function that consumes information and returns predictions. Such a function can be programmed as a series of if/else statements by a human, but it can also be expressed as a sequence of neurons and activations whose weights are optimized by a machine. Code written by a machine in this way is aptly named Software 2.0.
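The contrast can be sketched in a few lines of Python. This is an illustrative toy, not any particular product’s code: the first function is "Software 1.0", hand-written rules; the second learns the same kind of decision boundary from labelled examples with a minimal perceptron (the simplest possible neuron). All names and the training data are invented for the example.

```python
# Software 1.0: a human writes the logic as explicit if/else rules.
def decide_by_rules(x1, x2):
    if x1 > 0 or x2 > 0:  # hand-coded threshold chosen by a programmer
        return 1
    return 0

# Software 2.0: a machine tunes weights from labelled examples.
# A one-neuron perceptron trained with the classic perceptron update rule.
def train_perceptron(samples, epochs=20, lr=0.1):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # 0 when correct; ±1 when wrong
            w[0] += lr * err * x1       # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def predict(w, b, x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# Labelled data describing the same behaviour as decide_by_rules:
samples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(samples)
```

After training, `predict(w, b, ...)` reproduces the hand-written rule on every sample; but nobody wrote the thresholds, the optimizer found them in the data. That is the shift the paragraph describes: the human supplies goals and labelled examples, and the machine writes the function.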