Out-Law News 5 min. read
25 Aug 2022, 3:34 pm
Businesses planning to deploy artificial intelligence (AI) as part of their ESG strategy must consider data and intellectual property (IP) related issues, two legal experts have warned.
Speaking at an online event covering various IP and data issues, Cerys Wyn Davies and Mark Marfé of Pinsent Masons spoke about the increasing use of AI by businesses to accelerate their climate change and sustainability goals. However, they noted that deploying AI requires investment, a commitment to scale, visibility across the supply chain, prioritisation and collaboration.
Bella Philips of Pinsent Masons, who chaired the event, said that many businesses across many sectors are partnering with technology suppliers to create new AI systems. For example, technology supplier mCloud has recently signed a memorandum of understanding (MOU) with Aramco to explore the co-development of AI-powered innovations to facilitate the carbon reduction of complex, energy-intensive assets throughout Saudi Arabia.
Marfé highlighted one of the key challenges of developing AI systems with a partner: these systems rely on input data, some of which may be of core importance and highly confidential to the business providing it – in this case, Aramco. Aramco may wish to limit access to its data, both in terms of personnel and purposes, while the technology supplier – in this case, mCloud – will want to remain free to work on other similar projects in the future, including with Aramco’s competitors.
“Aramco could limit access to its data – particularly highly confidential information – by anonymising it or reducing the amount of data that's being shared. But AI tools are only as good as the data on which they're trained, so if you limit what you provide, you're going to reduce the value in the outputs,” Marfé said.
Marfé explained: “The challenge for the data provider is that data is not ‘owned’ – rather, there are ‘ownership-like’ rights that subsist over data. Because the data provider doesn’t own the data by default – nor can it simply limit what data it provides – it needs to build and maintain data ownership into the project documentation. This means being wary of contract terms that may erode that data ownership.”
Marfé said that the data provider should consider how ‘derived data’ – created by combining or processing existing training data – may be used by the supplier outside of the project and, indeed, how much derived data needs to be created.
“Another question is whether Aramco’s data should be combined with third party data. Combinations of datasets can create some very valuable outputs, but what is the knock-on effect? Does the third-party data have restrictive terms of use meaning Aramco might be unable to sell or sublicense the AI system because of the third-party data resting within it?” Marfé said.
“Also consider whether there are any time limits on non-disclosure or non-use of your data. If, for example, the obligations on the supplier not to use Aramco data on other projects expire after five years, mCloud will be free to use it on projects for Aramco’s competitors,” Marfé said.
Marfé added that while confidentiality agreements between AI development partners can provide certainty about data use, there is a “positive obligation” to restrict the use of sensitive data and keep it secret.
Cerys Wyn Davies highlighted the IP considerations around AI development. She explained that an AI solution is made up of a number of different elements and that it is important to assess how each element can be protected by IP.
“Visible hardware relevant to the AI solution will be protected by copyright or design rights but protection for software within the tool is more challenging,” Wyn Davies said. “Software as such is excluded from patent protection but showing that the AI tool can be used to create an invention that has a technical effect can open the door to patentability. Copyright can protect software but the limitation to be aware of here is that copyright will only prevent your source code being copied and will not protect the functionality of your software more broadly.”
Protecting details of your AI tool as trade secrets could also be a valuable form of protection in conjunction with other IP rights, according to Wyn Davies, but the ability to enforce your rights will depend upon you consistently treating this information as confidential, she added.
“One of the hot topics at the moment is IP protection for the outputs of AI, whether that be an invention, design or other work,” Wyn Davies said. “There is doubt about the patentability of AI-developed inventions because of the lack of a human inventor, but there is clarity that, in the UK at least, certain computer-generated works, such as a piece of art, will attract copyright protection even though no human author was involved in their creation. This is an area that IP offices around the world are looking at now and is one to watch,” she added.
Where parties are collaborating on AI development, Wyn Davies emphasised the importance of agreeing in advance what background IP each party is to contribute to the project and the ownership and user rights in respect of developed IP. She said: “When there are many collaborators working together, it's absolutely essential that they consider how the IP is going to be apportioned. We often see disputes – particularly where one party’s contribution to a collaboration has been purely financial. If, for example, a firm just pays a developer to carry out the AI development work, then the developer will own the IP despite the fact that the firm has paid them for it.”
Wyn Davies said it can often be difficult for the parties to decide whether to contribute significant amounts of background IP to the development project or to hold back their ‘crown jewels’, but explained that, generally, “the more you put in the better the results” – provided that the other party’s use of that IP is carefully regulated by contract.
“Who will own the IP in the AI solution that you have created can often be an emotive question,” Wyn Davies said. “Even parties who agree to joint ownership of any arising IP can run into problems. Companies need to closely regulate what that actually means, because joint ownership of intellectual property means different things in different jurisdictions”.
“It also depends on which intellectual property rights are involved and where the IP was developed, so the contract needs to spell out exactly what joint ownership looks like in any particular case,” she said.
Agreements for collaboration on AI development should also deal with additional issues, including which party should carry out freedom-to-operate checks, who is responsible for applying for registered IP protection and who will pay for it. They should also address what will happen if the operation of the AI tool is alleged to infringe third-party rights. Wyn Davies said: “In order to preserve the true value of your AI solution you need to protect the IP in it and carefully contractually regulate the way in which you are going to commercialise the fruits of your collaboration.”
Wyn Davies said reform to IP regulatory frameworks around AI across the world was “fast moving”, and that different approaches were being taken in Europe, the UK and the US. “Higher levels of regulation are being proposed within Europe, but we're waiting for the autumn to see the direction of travel,” she added.