Intellectual property (IP) rights, such as copyright, have proved extremely adaptable over the years in accommodating new technologies, but probably the greatest challenge IP faces is accommodating advances in artificial intelligence ("AI"), often referred to as "machine learning".
AI covers many things. In essence, it encompasses a range of algorithm-based technologies, which in some forms seek to copy human thought processes to solve complex tasks.
An AI system usually involves the creation of an algorithm, which uses a collection of data to model some aspect of the world. It then applies this model to a new data set, in order to make predictions, recommendations or classifications.
Some AI systems are fully automated when put into use, so that the AI output, and any decisions based on it, are implemented without any human intervention or oversight.
In other systems, the AI output is combined with other information and then considered by a human, who makes a decision based on it. (This is often referred to as having a "human in the loop".)
The most relevant IP rights when considering AI are copyright and patents.
It may seem a strange place to start when considering IP and a machine, but a case involving a monkey is informative.
The so-called "monkey selfie case" (Naruto -v- Slater) in the USA was concerned with the authorship of a photograph. The camera equipment had been set up by Mr Slater (shutter speed, lens, aperture, etc) near a troupe of monkeys. One of them pressed the shutter button and took a photograph of himself.
After a protracted series of cases, with the monkey being represented by PETA (People for the Ethical Treatment of Animals), the US courts decided that a non-human could not be the author of a copyright work.
A similar result would have occurred under English law. Under the Copyright, Designs and Patents Act 1988 (CDPA) (s11) the author is the first owner of a work (generally speaking).
While a human photographer would normally be the author and first owner of a photographic work taken by him, if an animal (or machine?) “took” a photograph, when the photographer had set everything up, who would own the copyright?
S9(3) of the CDPA provides that if a photograph (an artistic work) is “computer generated”, the author will be the person by whom the arrangements necessary for the creation of the work were undertaken. “Computer generated” means that the work is generated by a computer in circumstances such that there is no human author of the work.
It is worth noting that s 9(3) applies only in the UK, and very few other countries have similar legal provisions. This means that copyright protection for AI-generated works may simply not exist in other jurisdictions.
Computer-generated works have a shorter period of protection than other works – 50 years from the end of the calendar year in which the work was made.
This leaves open the question of what the position would be with AI machines that are capable of making their own judgments and decisions, with no human input.
Under the current law, it is likely that the author and first owner of any copyright work created by such a machine would be the legal person who created the AI machine – under s 9(3), they would be the person who made the arrangements necessary for the work to be created.
There are two aspects of patentability to consider: first, the AI machine itself; and second, the output from an AI machine.
Certainly, some AI machines have been patented, but there are a number of hurdles to overcome. One problem is that it may be difficult or impossible to describe how the invention works – because it all occurs in a "black box". An essential requirement of a patent is that the specification must include a description which teaches an ordinary skilled person how to perform the invention.
This problem can sometimes be overcome by setting out the algorithms, source code or other description of how the invention works. However, that has its own issues, because a patent cannot be granted for such things as a mathematical method, a method for performing a mental act, pure business methods and programs for computers “as such”.
It may be possible to overcome the problem if it can be claimed successfully that the invention makes a technical contribution or has a technical effect – perhaps a new method or approach derived from using the AI machine. Alternatively, the AI machine creator may try to keep the algorithms or machine logic confidential and protect it as a trade secret.
Turning to the output issue, it is conceivable that an AI machine could make or develop a patentable invention, but can it own that invention and any resulting patent?
Section 7 of the Patents Act 1977 provides that a patent for an invention will be granted primarily to the inventor. An "inventor" is defined as the actual deviser of the invention. The devising will most likely relate to the development of the programming logic or of the algorithms. Since devising an invention is a human activity, and patents are property rights which can only be held by a legal person, logically the "deviser" cannot be the AI machine.
There is currently a gap in the law where an AI machine makes an invention without any human intervention. There is no equivalent to s 9(3) CDPA, which applies only to copyright works. No one (except perhaps the devisers of the underlying algorithms) can claim to be the inventor and so no one can apply for a patent. (Some applicants for patents have tried to name their AI machines as inventors, leading the UK Intellectual Property Office to issue a statement: "An AI inventor is not acceptable as this does not identify a 'person', which is required by law. The consequence of failing to supply this information is that the application is taken to be withdrawn.")
In the DABUS case, Dr Stephen Thaler applied for patents in the USA, the UK and the EU, claiming that the inventor was an AI machine called DABUS (Device for the Autonomous Bootstrapping of Unified Sentience). Dr Thaler also claimed that he had acquired the right to the grant of the patents by “ownership of the creativity machine, DABUS”.
In the UK, the case came before the High Court, which had little difficulty in confirming the decision of the Intellectual Property Office. In essence, an "inventor" under the Patents Act 1977 must be a person; DABUS, being a machine and not a person, could not be an inventor; and Dr Thaler could not derive a right to the grant of the patents simply from his ownership of DABUS.
Interestingly, the High Court Judge added a postscript to the judgment. He made the point that the question of whether the owner/controller of an AI machine that "invents" something can be said, himself, to be the inventor had not been raised in the case. He went on to say: "I in no way regard the argument that the owner/controller of an AI machine is the 'actual deviser of the invention' as an improper one… It would be wrong to regard this judgment as discouraging an applicant from at least advancing that contention, if so advised."
With increasing levels of sophistication, the creative link between the creator of the AI machine and the work created by the AI machine will inevitably reduce. This may give rise to the argument that a truly autonomous and “free thinking” AI machine should be entitled to own the copyright in creative works made by it. The same arguments would apply to patents and possibly also the one outlined in the postscript to the DABUS judgment.
However, this in turn would create further questions. For example, who would be entitled to any royalty stream or other revenue from the exploitation of the AI-created work or patented invention? Should it be the owner of the AI machine at the time of the creation? Should the rewards be shared with the original manufacturer of the AI machine? How could an AI machine effectively assign any IP rights?
The UK Intellectual Property Office recently called for views on various issues concerning AI and IP rights. However, for the time being, we simply do not have definitive answers to the various questions raised.
Barely was the ink dry on this article when we had a potentially ground-breaking decision of the Australian Federal Court (Thaler -v- Commissioner of Patents, 30 July 2021), in which it was decided that an AI machine (DABUS again) could be an "inventor" for the purposes of a patent application. The decision may be appealed, but for now Australian patent law (which, unlike the legal systems referred to above, has no provision precluding an AI machine from being an inventor) recognises an AI machine as an "inventor". The judge went on to say that Dr Thaler, as the owner and controller of DABUS, would own any inventions made by DABUS when they came into his possession, without the need for an assignment.
It remains to be seen whether the decision will be overturned on appeal or if any other jurisdictions adopt similar reasoning to that of the Australian Court, but the unanswered questions outlined above remain – for now.