AI & Algorithms: The CMA’s Research on Algorithms

17.04.2023

Over the past months, with the introduction and development of new tools, the use of Artificial Intelligence has become one of the most hotly debated topics across many fields. While this brings excitement about innovation and the discovery of new methodologies, it also raises concerns about how far AI could reach and how unexpectedly it could reshape many industries.

On the competition law front, while there have been various developments and incidents involving AI tools, no major case has yet arisen at the competition level. However, the CMA’s 2021 research paper on algorithms may give some clues as to how the CMA would approach AI. Algorithms have, after all, become crucial tools for many social media platforms, browsers and e-commerce services. This piece aims to highlight the key points of the CMA’s research on algorithms.

Algorithms

We spend a large portion of our lives online, whether for news, networking, dating, ordering food or planning trips. Without algorithms, which frequently take the form of artificial intelligence, many of these online activities, and the markets that support them, would not be possible. Algorithms have made significant gains in efficiency and effectiveness possible, such as the real-time repricing of portfolios of hundreds of goods.

Additionally, many technology companies, including some of the most strategically important businesses in the world, are built around algorithms. Algorithms can, however, be applied in ways that harm consumers and restrict competition. As algorithmic systems grow more complex, they also become less transparent, and it becomes harder to determine when they are harmful.

Direct Harms to Consumers

Direct harms to consumers relate to unfairness, manipulation and exploitation. Algorithmic systems can be used to personalise a consumer’s experience of a service. Personalisation is more likely to be exploitative if consumers are not aware that it is occurring, if it gives rise to unfair distributive effects, or if it harms vulnerable consumers. In addition, there are potential consumer harms that are not related to personalisation but instead relate to unfair ranking and design of online services.

In the UK, traders have a general duty not to trade unfairly by acting contrary to the requirements of professional diligence in a way that distorts the average consumer’s decisions in relation to a product or service. This can broadly be understood as failing to act in accordance with the acceptable trading practice that a reasonable person would expect. In addition, misleading and aggressive practices are prohibited, including the omission of material information that impairs consumers’ ability to make an informed choice.

Personalised Pricing Harms

Charging different prices to different groups of individuals is an example of personalised pricing, as are strategies with the same aim, such as offering special discounts to certain customers. To boost earnings, businesses tailor prices to what they believe different customers are willing to pay.

The capacity to access enormous amounts of personal data and use analytics to estimate a user’s willingness to pay is one of the key features of digital markets. Such practices may be more potent when digital market providers, for instance platforms, control the user interface and the range of available alternatives, which restricts a user’s ability to switch providers.[1]
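To make the mechanism concrete, below is a minimal, purely illustrative sketch in Python of how a seller might quote a price against an estimated willingness to pay. It is not taken from the CMA’s paper; the behavioural signals, weightings and price floor are all hypothetical assumptions.

# Illustrative sketch only: a hypothetical seller inferring willingness to pay
# from behavioural signals and tailoring the quoted price. All fields and
# numbers are invented for illustration.

from dataclasses import dataclass

@dataclass
class CustomerProfile:
    past_average_spend: float    # average basket value observed for this user
    price_page_revisits: int     # how often the user returned to the product page
    used_price_comparison: bool  # whether the user arrived via a comparison site

def estimate_willingness_to_pay(profile: CustomerProfile, list_price: float) -> float:
    """Crude, hypothetical proxy for willingness to pay."""
    estimate = list_price
    if profile.past_average_spend > list_price:
        estimate *= 1.10   # spends freely elsewhere: assume tolerance for +10%
    if profile.price_page_revisits >= 3:
        estimate *= 1.05   # repeated visits read as strong interest
    if profile.used_price_comparison:
        estimate *= 0.90   # comparison shoppers are assumed to be price sensitive
    return estimate

def personalised_price(profile: CustomerProfile, list_price: float, cost: float) -> float:
    """Quote the estimated willingness to pay, but never below cost plus a small margin."""
    floor = cost * 1.02
    return round(max(floor, estimate_willingness_to_pay(profile, list_price)), 2)

if __name__ == "__main__":
    shopper = CustomerProfile(past_average_spend=180.0, price_page_revisits=4,
                              used_price_comparison=False)
    print(personalised_price(shopper, list_price=100.0, cost=60.0))  # 115.5

Even a rule this simple shows why transparency matters: two consumers are quoted different prices for the same product purely on the basis of data inferred about them.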

Personalised pricing can frequently be advantageous, boosting overall output and consumer welfare. It can, for instance, reduce consumers’ search costs and result in a more exact match between consumers and goods and services. It might also make it possible for businesses to sell at a lower price point to customers who would not otherwise be willing to pay the standard price. In a similar vein, the capacity to offer targeted discounts could help new entrants compete, especially in markets with high switching costs.[2]

Algorithmic Discrimination

The improper use of algorithmic systems to tailor services and offers to customers may lead to unjustified discrimination. In the UK, it is typically against the law for individuals and businesses to discriminate against customers based on "protected characteristics." This covers indirect discrimination, which occurs when a rule is administered equally but nonetheless has disproportionately negative consequences on a certain group without a legitimate reason.[3]
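As a purely illustrative aside, the short Python sketch below shows one way a facially neutral decision rule could be screened for disproportionate effects across groups by comparing outcome rates. The 0.8 threshold mentioned in the comment is a rule of thumb borrowed from US employment-selection guidance, not a test under UK equality law, and the data are invented.

# Illustrative sketch: checking whether an apparently neutral decision rule
# produces disproportionately negative outcomes for one group. The data and
# the 0.8 rule-of-thumb threshold are assumptions, not a legal test.

from collections import defaultdict

def approval_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions is a list of (group_label, was_approved) pairs."""
    totals: dict[str, int] = defaultdict(int)
    approved: dict[str, int] = defaultdict(int)
    for group, approved_flag in decisions:
        totals[group] += 1
        if approved_flag:
            approved[group] += 1
    return {group: approved[group] / totals[group] for group in totals}

def impact_ratio(decisions: list[tuple[str, bool]], group_a: str, group_b: str) -> float:
    """Ratio of group_a's approval rate to group_b's; values well below 1.0 may warrant scrutiny."""
    rates = approval_rates(decisions)
    return rates[group_a] / rates[group_b]

if __name__ == "__main__":
    outcomes = ([("A", True)] * 40 + [("A", False)] * 60 +
                [("B", True)] * 80 + [("B", False)] * 20)
    print(f"approval-rate ratio A/B: {impact_ratio(outcomes, 'A', 'B'):.2f}")  # 0.50, below 0.8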

Equality law, in particular through the concept of indirect discrimination, prohibits many of the discriminatory effects of algorithmic systems that have been reported in recent years. However, enforcement is difficult, and to date no legislation has been passed in the UK that is designed specifically to tackle discrimination in AI systems[4]. Nonetheless, data protection law, through the General Data Protection Regulation (GDPR)[5], contains the fairness principle, which enables the ICO to intervene where an individual is unfairly affected. The processing of personal data that leads to unjust discrimination would therefore contravene the fairness principle under the GDPR.

As noted above, traders in the UK are generally required not to engage in unfair business practices, which includes complying with equality and data protection law.[6]

Unfair Ranking and Design

Finding a good or service requires time and effort. By gathering, sorting and retrieving the options that best fit customers’ demands, businesses can add value for customers. A well-designed choice architecture, including default options and rankings, helps consumers make decisions more effectively. If there is enough competition, knowledgeable and engaged consumers can switch platforms when they are dissatisfied with the results of one.[7]

The default choices and rankings, however, may reflect what is in the company's interest, perhaps at the expense of the interests of consumers, if companies are not upfront about the criteria they apply. This is particularly true when customers are likely to mistake sorted results and default settings for objective advice.

The CMA defines[8] unfair ranking and design as the use of algorithmic systems to modify rankings or other design features in order to influence what a consumer sees and gain a commercial advantage, in a way that ultimately degrades or misrepresents the offering to the consumer.

The CMA highlights two important ways in which platforms may use unfair ranking and design:

(a) A platform may manipulate rankings of results to favour certain options, because it derives benefit from a commercial relationship, such as higher commission payments or revenue shares.

(b) Platforms may use other unfair design practices (‘dark patterns’) to exploit consumers’ behavioural biases for commercial gain, including the use of misleading scarcity messages, which exploit consumers’ loss aversion and tendency to be influenced by the actions of others (social proof).

The Australian Competition and Consumer Commission (ACCC) provides an example case: it launched legal proceedings against Trivago in 2018, alleging that the site’s hotel comparison service had for nearly four years misled consumers through advertising that claimed to show the best rates on rooms. In fact, the highlighted prices were often not the cheapest, but those Trivago prioritised for advertisers willing to pay the highest cost-per-click rates for promotion. This prioritisation was achieved using an algorithm that placed significant weight on the online hotel booking sites that paid these higher fees.[9]

In a judgment of the Federal Court of Australia delivered on 20 January 2020, Trivago was found to have misled consumers since at least December 2016 by representing its website as helping to impartially identify the cheapest rates available for a given holiday globally.[10]
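To illustrate the kind of ranking behaviour described in the Trivago case, the sketch below shows a hypothetical comparison site that scores offers partly by price and partly by the fee the advertiser pays per click. The fields, weights and scoring formula are invented for illustration and are not taken from the judgment or the CMA’s paper.

# Illustrative sketch: a hypothetical comparison-site ranking in which the fee
# an advertiser pays is weighted into the score, so the top-ranked offer is not
# necessarily the cheapest. All weights and data are assumptions.

from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    nightly_price: float   # price shown to the consumer
    cost_per_click: float  # fee the provider pays the platform per click

def rank_offers(offers: list[Offer], commission_weight: float) -> list[Offer]:
    """Higher score ranks first. With commission_weight = 0 the ranking is purely
    by price; raising it lets higher-paying providers displace cheaper offers."""
    def score(offer: Offer) -> float:
        return -offer.nightly_price + commission_weight * offer.cost_per_click
    return sorted(offers, key=score, reverse=True)

if __name__ == "__main__":
    offers = [
        Offer("CheapStay", nightly_price=90.0, cost_per_click=0.10),
        Offer("BigBooker", nightly_price=105.0, cost_per_click=2.50),
    ]
    print([o.provider for o in rank_offers(offers, commission_weight=0.0)])   # ['CheapStay', 'BigBooker']
    print([o.provider for o in rank_offers(offers, commission_weight=10.0)])  # ['BigBooker', 'CheapStay']

The consumer sees the same “top result” label in both cases, which is why the CMA emphasises transparency about the criteria that feed into rankings and defaults.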

In conclusion, many marketplaces and businesses depend on algorithms to operate. Algorithms have significantly increased productivity and enable businesses to provide customers with better goods and services. However, businesses may also misuse them, whether deliberately or accidentally, harming consumers and competition, frequently by amplifying existing problems and weaknesses in markets and in consumer behaviour.

As the CMA’s research discussed above suggests, where the abuse of algorithms restricts competition or harms consumers, for instance by misleading them, competition authorities are likely to step in. In the future, the use of AI seems likely to remain under discussion in many respects.


References

  • Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A. and Rieke, A. (2019), ‘Discrimination through optimization: How Facebook's ad delivery can lead to skewed outcomes’.
  • Cameron, N. (n.d.), ‘Trivago fined $44.7m by Federal Court for misleading consumers’, CMO. Available at: https://www.cmo.com.au/article/697458/trivago-fined-44-7m-by-federal-court-misleading-consumers/.
  • CMA (2021), ‘Algorithms: How they can reduce competition and harm consumers’. Available at: https://www.gov.uk/government/publications/algorithms-how-they-can-reduce-competition-and-harm-consumers
  • Equality and Human Rights Commission (n.d.), ‘Delivering services and the law’. Available at: https://www.equalityhumanrights.com/en/multipage-guide/delivering-services-and-law [Accessed 16 Apr. 2023].
  • Zuiderveen Borgesius, F. (2018), ‘Discrimination, artificial intelligence, and algorithmic decision-making’, Council of Europe.

Cases

Australian Competition and Consumer Commission v Trivago N.V. [2020] FCA 16


[1] CMA (2021), ‘Algorithms: How they can reduce competition and harm consumers’. Available at: https://www.gov.uk/government/publications/algorithms-how-they-can-reduce-competition-and-harm-consumers

[2] Ibid.

[3] Ibid.

[4] Zuiderveen Borgesius, F (2018), ‘Discrimination, artificial intelligence, and algorithmic decision-making’, Council of Europe.

[5] Regulation (EU) 2016/679 (General Data Protection Regulation).

[6] Equality and Human Rights Commission. www.equalityhumanrights.com. (n.d.). Delivering services and the law | Equality and Human Rights Commission. [online] Available at: https://www.equalityhumanrights.com/en/multipage-guide/delivering-services-and-law [Accessed 16 Apr. 2023].

[7] Ali, M, Sapiezynski, P, Bogen, M, Korolova, A, Mislove, A, & Rieke, A (2019), ‘Discrimination through optimization: How Facebook's ad delivery can lead to skewed outcomes’.

[8] CMA (2021), ‘Algorithms: How they can reduce competition and harm consumers’. Available at: https://www.gov.uk/government/publications/algorithms-how-they-can-reduce-competition-and-harm-consumers

[9] Australian Competition and Consumer Commission v Trivago N.V. [2020] FCA 16

[10] Cameron (CMO), N. (n.d.). Trivago fined $44.7m by Federal Court for misleading consumers. [online] www.cmo.com.au. Available at: https://www.cmo.com.au/article/697458/trivago-fined-44-7m-by-federal-court-misleading-consumers/.


Tagged with: Technology, Artificial Intelligence, AI, Algorithms, Competition, GDPR
