Emotions and ethics


Cover illustration: Andrea Manzati

People still count in a world of AI, but emotional intelligence will be critical to understanding the ethical challenges that technology brings. Felicity Long, Managing Director, Connected Execution, MediaCom, explains

 

Technology is changing the way we work. Many repeatable tasks are being automated and machine learning is fast becoming entwined in our day-to-day lives via computers, phones and smart devices.

 

This is creating a huge number of benefits in many industries – disease diagnosis, natural disaster prediction and farming, for example – but as people become over-reliant on tech, it’s also creating unexpected issues. Isolation and dependency are just two.

 

Perhaps it is as Roy Amara, a past president of the US-based Institute for the Future, famously pointed out: “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.”

 

Data, and how we use it, is one of the biggest areas of concern. Our data footprints are getting larger every day as we send countless emails, search Google and share personal information online for free – information that is then used to tell us what to buy, what to read and what to view, all in the name of convenience.

 

This is important, not just because of data privacy laws, but also because flawed data (or even its absence) can result in serious harm. As Caroline Criado Perez reveals in Invisible Women, for example, feeding gender-skewed data into an AI tool will only cause it to duplicate those errors, not fix them.

 

In a world where AI is becoming increasingly powerful, handing over our privacy and freedom of choice to the machines raises some serious ethical questions. The simplest of these is: “What data are the algorithms actually using?”

 

Ethics requires a strong degree of both social awareness and self-awareness – something that people in corporate, results-driven environments can let fall by the wayside.

 

Applying emotional intelligence

Despite the increasing influence of tech, Deloitte argues that the future of work will continue to be human because our interpersonal skills are hard to mechanise. Indeed, it predicts that two-thirds of jobs will be soft-skill intensive by 2030, which will make emotional intelligence one of the most important skills in society and in any organisation.

 

Emotional intelligence is defined as the ability to understand and manage your own emotions, as well as recognise and influence the emotions of the people around you.

According to Deloitte, two-thirds of jobs will be soft-skill intensive by 2030

It’s based on four key components: self-awareness, social awareness, self-management and relationship management.

 

Developing a strong EQ can be difficult, but – and I know this might sound left-field – we can learn an awful lot from horses and their riders.

 

That’s because horses are creatures with highly developed emotional intelligence. They can detect positive and negative emotions in humans based on facial expressions and are adept at using emotion to facilitate thought.

 

As prey animals, they also have high self-awareness and social awareness. They are consistently hyper-vigilant in assessing their surroundings through sight, smell, hearing and kinaesthetic senses, and are brilliant at detecting intention and authenticity in people.

As marketers, we need to apply the same approach to our relationship with technology. Data and AI allow us to enter consumers’ lives in increasingly powerful ways, but, like the horse rider, we must develop our own EQs to make sure we harness them in the right way: ethically.

This tension between ethics and technology is not a new one, of course. Similar questions are asked every time a new piece of technology impacts society and changes the way people live.

 

Take the invention of the respirator in the 1950s, for instance; it made it possible to keep patients alive who were unable to breathe unaided. However, it also raised ethical questions, especially in light of successful heart transplant operations in the 1960s.

 

It forced society to ask what should be done with patients on respirators who show no brain response and will never regain consciousness. The answer was to change how we define death.

 

Whether this was the correct response is debatable, but my point is that the development of technology, data and artificial intelligence will force us to answer similarly challenging ethical questions. Like the horse rider, we will have to rely on our own emotional intelligence to answer them.

 

Emotional leaders

To respond to these challenges, we need leaders who can ensure that we are making the right decisions for society as well as business performance.

 

But while many understand the value of EQ, our world is still built for IQ. From our education system to the way we define success in business, we still reward intellect over emotional intelligence.

 

To build a better world, we need to put more focus on helping people develop their EQ. Right now, companies may send out EQ tests to assess everyone, but they rarely explain how to improve or follow up on the results.

 

Which brings me back to horses and a programme called equine-assisted learning – a training course that invites people to interact with horses. A recent study from the University of Kentucky showed that leaders reported a significant improvement in their self-awareness and social awareness after completing it.

 

As the original horse whisperer Buck Brannaman said: “The horse is a mirror to the soul. Sometimes you won’t like what you see. Sometimes you will.”

 

Technology, data and AI are asking all of us the same challenging questions, and we need to look inside our own souls to find the right answers. An ethical future demands it.

