Artificial Intelligence:
The ethical and legal implications



There is much excitement and hype about artificial intelligence

Both incumbent and start-up technology vendors are offering new and innovative AI-enabled products and services.

Businesses in a wide range of industry sectors are pursuing AI strategies.

AI is now firmly on the Board agenda. It could contribute as much as US$15.7 trillion to the global economy in 2030, of which US$6.6 trillion is likely to come from increased productivity and US$9.1 trillion from consumption effects.

(PwC, Sizing the Prize: PwC’s Global Artificial Intelligence Study – Exploiting the AI Revolution, 2017)



The automation of decision-making processes is not new

Software can make “decisions” when specified criteria are satisfied (for example, “buy” and “sell” decisions), and humans can use AI to help improve the quality of their own decision-making. Unlike other software, however, AI can make decisions autonomously, without any human involvement.
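As a simple illustration of the distinction, rule-based “decision” software of the kind described above might look like the following sketch (the function, thresholds and prices are hypothetical, chosen only to show that every outcome is specified in advance by a human):

```python
# A minimal, hypothetical sketch of rule-based "decision" software:
# the program "decides" only when criteria specified in advance are met.

def trading_decision(price: float, buy_below: float, sell_above: float) -> str:
    """Return a 'buy'/'sell'/'hold' decision from fixed, human-specified rules."""
    if price < buy_below:
        return "buy"   # criterion satisfied: price is under the buy threshold
    if price > sell_above:
        return "sell"  # criterion satisfied: price is over the sell threshold
    return "hold"      # no specified criterion met, so no action is taken

# With thresholds set by a human, the software's behaviour is fully
# determined in advance; an AI system, by contrast, may derive its own
# decision rules from data, without a human specifying each criterion.
print(trading_decision(price=95.0, buy_below=100.0, sell_above=120.0))  # "buy"
```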

AI has huge potential to bring accuracy, efficiencies, cost savings and speed to a whole range of formerly human activities and to provide entirely new insights into market and customer behaviour. It has the capability to transform businesses and the services and products they offer.

A decision to adopt AI can raise fundamental ethical and moral issues for society. These complex issues are of vital importance to our future, but they are not typically the domain of lawyers.

In this briefing we focus on the more granular ethical and related legal risks that any business developing or using AI will need to manage, whatever industry sector it occupies. Because legal responsibility is a subset of moral (or ethical) responsibility, a business will need to take into account both the ethical considerations and the legal factors that flow from them if its use of AI is to gain acceptance and be trusted in its sector.

