When we talk about intelligence, we usually refer to rational decision-making or some other form of logical reasoning. Creativity is only seldom considered an element of intelligence, although one could argue that creativity is the source of everything new and, as such, a building block of new intelligence. But one topic has risen to prominence in discussions of intelligence: emotional intelligence.

Many books have been written about it, and we can probably agree that emotional intelligence is a fundamental element of intelligence, although discussions of it sometimes involve rather vague concepts, such as building rapport, eye contact, or other forms of relation-oriented ‘intelligence’. The bottom-line difference is that emotional intelligence cannot be measured by standards like right or wrong, good or bad, complete or incomplete. Emotional intelligence is to a certain extent purpose-driven, and its measures will depend on the objectives. So far, no universal standard for defining emotional intelligence has been adopted.

But the point of bringing emotional intelligence into the equation when discussing artificial intelligence is not its purpose or how it is measured, but what it involves. Emotional intelligence is admittedly something that has to do with emotions, with feelings – hence the term ‘gut feeling’. Emotions and feelings need a nervous system to function. Does an artificial intelligence have a nervous system? No, or at least not yet.

If we agree that human intelligence includes emotional intelligence, then we must conclude that artificial intelligence is – right now – far from any human intelligence, and probably will be for a long time. It is also highly questionable whether artificial intelligence as we know it today, based on machine-learning algorithms that categorize, group, detect, and extrapolate, will soon grow beyond merely producing – albeit highly advanced – statistical results. ChatGPT, for example, only produces output by selecting the word or group of words with the highest probability of being suitable.
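To make the statistical nature of this concrete, here is a minimal toy sketch of next-word selection by probability. The vocabulary, context, and probabilities are invented for illustration and have nothing to do with any real language model, which works over learned distributions across tens of thousands of tokens.

```python
import random

# Toy model: for one fixed context, list candidate next words
# with made-up probabilities (illustrative assumption only).
next_word_probs = {
    "the cat sat on the": {"mat": 0.6, "sofa": 0.25, "roof": 0.15},
}

def pick_next_word(context, probs):
    """Sample the next word in proportion to its probability."""
    candidates = probs[context]
    words = list(candidates)
    weights = [candidates[w] for w in words]
    return random.choices(words, weights=weights, k=1)[0]

word = pick_next_word("the cat sat on the", next_word_probs)
print(word)  # most often "mat", sometimes "sofa" or "roof"
```

However sophisticated the real models are, the principle remains the same: the output is a statistically likely continuation, not the product of a felt experience.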

The models we have in place right now would need to incorporate something that at least remotely resembles a nervous system (with all its sensors) to take even a tiny step towards adopting emotional intelligence. Having said that, it is also fair to say that – even though we don't call it a nervous system – many of the pieces that would make one up are already being put together. Let us ask ourselves what happens in virtual reality (VR) or augmented reality (AR) scenarios. There, the user's experience feeds back into the system, with all its reactions, which will no doubt involve emotions. Likewise, what happens during an interaction with a robot or a man-machine interface? Voice and facial recognition surely include elements that are expressions of emotion.

The big question then is: will we be able to build models that take such ‘emotion-derived’ data points into account and use them adequately to represent a certain state – exerting the complex equivalent of an emotion?

In order to take the next step in this evolution, participants need to create environments where the user is not just asking questions or writing prompts, but actually gets involved. This is why application spaces will be the cornerstone of progress in developing something that could at least remotely be called ‘true artificial intelligence’.

Learn more about Data-Centric AI and Controlled Application Spaces. https://lakeside-analytics.com/data-centric-ai/