Intelligent procurement of AI solutions
24 June 2020
The use of artificial intelligence in commerce, especially online commerce, is fast gaining traction, and sometimes in a manner that is not so visible to consumers or other purchasers.
In light of this, the ICO and The Alan Turing Institute have published guidance for businesses (here) on how to explain decisions made by AI, which should be of interest to all businesses trading in the EU.
The guidance is designed to help organisations navigate the regulatory requirements, legal risks and ethical issues involved in using AI to make decisions about individuals. It is split into three parts, each aimed at a particular audience:
1. The Basics of Explaining AI: outlines the legal framework for explaining AI to individuals, including what needs to go into an explanation and the benefits and risks associated with doing so.
2. Explaining AI in Practice: aimed at technical teams; sets out tasks that organisations can use to help them explain AI-assisted decisions to individuals.
3. What Explaining AI means for your Organisation: aimed at senior management; covers the roles, policies, procedures and documentation that organisations can put in place to ensure they provide meaningful explanations.
This short article highlights key areas to consider when procuring AI, and how you can de-risk your purchase and use of AI solutions.
- Due diligence
Have you carried out due diligence on your supplier (including from a physical, financial and data privacy perspective), and obtained references from other customers?
- Type of access/licence
When licensing ready-made solutions (licence or SaaS), think about the contract term. The supplier may favour a longer-term contract to ensure a healthy revenue stream, but that might not be the best position for your business. So, consider how long you would like the agreement to last initially – do you need a trial licence/access? Are you being offered a fixed term licence, perpetual licence or subscription-based access? If fixed term, check what extension options are available to you and at what cost. Also, think about whether you need to negotiate any form of exclusivity.
- Third party access
If you think group companies or third-party suppliers/contractors may need access to the software to use, modify or support it, these rights should be set out.
- Software development
If the solution is being developed or customised for you, see if you can articulate key requirements, deliverables, timescales and associated milestones. Setting this out upfront in the contract will ensure that you and your supplier have a clear understanding of what is expected, by when and for what cost. This route (with certainty of cost and timing) tends to suggest a traditional ‘waterfall’ approach should be used. However, if your requirements are less certain and/or your speed to market/deployment is more important, an agile approach may be better (albeit riskier from a budget and cost perspective).
Acceptance testing is also key and will enable you to set out how and against what criteria the deliverables will be tested. Ideally, milestone payments should be linked to testing of key deliverables. The provision should also set out what happens if the deliverables fail to meet the acceptance criteria. Here, the supplier can be required to undertake root cause analyses and correct the failures at no additional cost to the customer.
If there are people in the development team who have particular knowledge or skills that are vital to the project, such as the technical lead, architect or project manager, a ‘key personnel’ clause will help to ensure you have and retain access to the right people for the job.
- Maintenance and hosting
It’s important that support obligations are clearly defined (including patching if applicable) and that adequate SLAs are agreed for response and fix times. You may also need suitable uptime/availability commitments if the software is being hosted.
- Use of Open Source
Open source has become a key driving force in the development of AI. However, the low barriers that deliver those benefits can also create heightened risk from a quality perspective and potentially jeopardise IP ownership, depending on how permissive or restrictive the open source licence is. To gain some control over this risk, your contract should contain a provision requiring the supplier to seek your advance consent to the use of open source code.
- Who owns any new IP
If the supplier is developing code for you, the contract should address who owns the new IP in the solution. Do you need to own the IP or, if that is not negotiable from the supplier’s perspective, is there alternative contractual protection that allows you to achieve your objective but allows the IP to reside with the supplier, such as the supplier agreeing not to use or license the new IP in certain fields?
What does data protection law have to do with AI?
Where AI doesn’t involve the use of personal data, it falls outside the remit of privacy law. However, in many cases, especially in health and marketing activities, vast amounts of personal data (which may include special category (formerly, ‘sensitive’) data) are processed to train and test AI models. On deployment, personal data is collected and fed through the model to make decisions about individuals. Those decisions about individuals – even if they are only predictions or inferences – are themselves personal data.
Data privacy law is technology neutral – although the GDPR and the DPA 2018 do not directly reference AI or machine learning, they have a significant focus on large scale automated processing of personal data. Specific reference is made to the use of profiling and automated decision-making. The GDPR gives data subjects various rights:
Right to be informed
Articles 13 and 14 of the GDPR give individuals the right to be informed of:
- the existence of solely automated decision-making producing legal or similarly significant effects;
- meaningful information about the logic involved; and
- the significance and envisaged consequences for the individual.
Right of access
Article 15 of the GDPR gives individuals the right of access to:
- information on the existence of solely automated decision-making producing legal or similarly significant effects;
- meaningful information about the logic involved; and
- the significance and envisaged consequences for the individual.
Right to object
Article 21 of the GDPR gives individuals the right to object to processing of their personal data, specifically including profiling, in certain circumstances. There is an absolute right to object to profiling for direct marketing purposes.
Rights related to automated decision-making including profiling
Article 22 of the GDPR gives individuals the right not to be subject to a solely automated decision producing legal or similarly significant effects. There are some exceptions to this, and in those cases organisations are obliged to adopt suitable measures to safeguard individuals, including the rights to:
- obtain human intervention;
- express their view; and
- contest the decision.
DPIAs
Article 35 of the GDPR requires organisations to carry out DPIAs (Data Protection Impact Assessments) if their processing of personal data, particularly when using new technologies, is likely to result in a high risk to individuals. A DPIA is always required for any systematic and extensive profiling or other automated evaluation of personal data, which are used for decisions that produce legal or similarly significant effects on people.
DPIAs are therefore likely to be an obligation if you are looking to use AI systems to process personal data, and you should carry them out prior to the processing in order to identify and assess the levels of risk involved. DPIAs should be ‘living documents’ that you review regularly, including when there is any change to the nature, scope, context or purposes of the processing.
The ICO has published additional guidance on DPIAs, including a list of processing operations which require a DPIA. The list mentions AI, machine learning, large-scale profiling and automated decision-making resulting in denial of a service, product or benefit.
If a DPIA indicates there are residual high risks to the rights and freedoms of individuals that cannot be reduced, you should consider taking legal advice and must consult with the ICO prior to the processing.
Privacy by design
Last but not least, Article 25 of the GDPR requires controllers to implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation. These measures are intended to integrate the necessary safeguards to meet the requirements of the GDPR and protect the rights of data subjects. Article 25 also requires controllers to implement appropriate technical and organisational measures to ensure that, by default, only personal data necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of processing, the period of storage and their accessibility. This may create some tension with the use of AI solutions, which needs careful planning and consideration from the outset – all of which should be explored when considering which supplier to use.
For another example of how data privacy can have an impact on AI, read our article: The privacy risks of an artificially intelligent exit to lockdown.
If you need help procuring your AI solution or have questions regarding the regulatory implications of how it can be used, do get in touch.