June 14, 2022

Using AI to Develop a Better Clean Room

Holland & Knight IP/Decode Blog
Jacob W. S. Schneider

If you send your computer-generated MacGyver episode ideas to the production company, it is unlikely to review them. When companies do solicit creative ideas from the general public, they will likely require a signed agreement before you share anything. Such an agreement often says that whoever receives your idea will not share it with others at the company unless and until the idea is licensed or purchased from you. The individuals tasked with reviewing idea submissions are subject to policies that prevent them from sharing the ideas with others at the company.

The reason for these precautions is simple: If the company reviews your ideas, rejects them and later produces something even remotely similar to your ideas, then you will feel wronged. And people who feel wronged file lawsuits.

Despite these precautions, others' ideas do spread within companies, or can incorrectly appear to spread, and companies may use "clean rooms" (also called "white rooms") to limit that spread and ensure that what the company creates is truly independent and homegrown. This post explores the use of clean rooms and how AI could be leveraged to develop a better clean room.

Ideas Are Stubborn, Infectious

The reason companies either ignore idea submissions or carefully wall them off from most of their employees is that intellectual property is a unique area of law. Intellectual property largely protects intangible things: ideas, inventions, creative expression and brand goodwill (among others). These intangible things can spread fast, and you cannot delete an idea from someone's mind once it is shared, nor can you easily fix the erroneous appearance that an idea has spread.

That final point is on the mind of every trial lawyer. Lawyers will say that we "cannot unring the bell" when explaining that a statement made in front of the jury cannot be taken back. Once a jury hears that a civil defendant is wealthy, his wealth will necessarily rest in their minds. And while the judge may give an instruction to ignore the defendant's wealth, the fact may affect a juror's judgment of the case. Such statements could lead to a mistrial, and lawyers and judges often huddle together in a sidebar during trial to prevent the idea's disclosure before it happens. The inability to unring a bell is also why we are culturally so protective of TV/movie spoilers. For example: Once you know that Darth Vader is Luke Skywalker's father, you will never forget that fact or the person who ruined it for you.

Ideas are like files you cannot delete, but they are also "infectious." If an engineer hears an idea from you, her mind is "infected" with the idea and it could influence her development of products. That fact pattern has resulted in many lawsuits for trade secret misappropriation and other claims. And when litigation arises, it is difficult to prove or disprove that the engineer used your idea in the product. The facts will show that she knew about the idea and created the product, but because she cannot "unring the bell," she cannot say for certain that her mind did not use the idea in some way, however slight.

Our minds are swirling soups of ideas from various places, which leads to this ambiguity regarding what inspires us and how we reach conclusions. Take two Beatles as examples. The melody for "Yesterday" came to Paul McCartney in a dream. After he woke up, fell out of bed and dragged a comb across his head, he asked everyone where the melody came from because he was certain he had heard it somewhere else. In time, however, he learned that it was his. George Harrison had the opposite experience. He wrote "My Sweet Lord" in 1969 while the Beatles were together, but released it shortly after the band broke up. The song has a chord progression and structure very similar to the 1962 Chiffons hit "He's So Fine," which sparked a lawsuit that Harrison lost. The court credited Harrison's testimony that he did not consciously draw on "He's So Fine" while he wrote "My Sweet Lord," but concluded that he had subconsciously conjured the older tune:

What happened? I conclude that [Harrison], in seeking musical materials to clothe his thoughts, was working with various possibilities. As he tried this possibility and that, there came to the surface of his mind a particular combination that pleased him as being one he felt would be appealing to a prospective listener; in other words, that this combination of sounds would work. Why? Because his subconscious knew it already had worked in a song his conscious mind did not remember. Bright Tunes Music Corp. v. Harrisongs Music, Ltd., 420 F. Supp. 177, 180 (S.D.N.Y. 1976).

Neither Paul nor George could explain the inspirations for their hit songs. Paul's sprouted from his own brilliant mind, while George subconsciously used another songwriter's idea.

Because ideas can infect engineers (just like anyone else), companies often conclude that this risk of idea contagion is far too high: It is better to throw the idea in the trash before reading it or ensure that whoever does read it cannot infect those responsible for product development. When those policies fail, however, we first investigate how far the idea spread, then determine whether anyone infected with the idea worked on the product. If an infected engineer worked on the product, that is a fact that the plaintiff will exploit during litigation.

Enter the Clean Room

Companies borrow from virology to prevent the spread of an idea by keeping a group of engineers separate from anyone who has come in contact with the idea. These "clean" engineers operate in a clean room environment that is completely separate from the idea. There is no need to attempt the impossible task of probing these engineers' minds or deleting facts from them, because whatever product they develop could not have been influenced by the idea. They never learned of the idea, so it could not have had any influence on the final product.

Imagine, however, a scenario where a company is exposed to an idea, but it is not one that it wants to avoid. The idea is a great one. The company would relish the opportunity to develop the idea, but the idea's originator refuses to sell or license it. In this scenario, the company could establish a clean room and hope that the clean engineers independently develop the same idea or something equally strong and valuable. It is a gamble that invites litigation because the idea's originator will not know that the company's engineers independently developed the same idea. And when litigation arises, there will be intense scrutiny surrounding what the inputs to the clean room were (what the engineers were told to do) and how "clean" the clean room really was (did the engineers have any contact with others who knew of the idea?). Engineers tend to be smart and organized people (and some even keep notebooks of their work and discoveries), but even they will forget exactly what inspired them to pursue a particular idea three years ago or precisely how they developed that idea into a product.

The AI Clean Room

AI is an ideal candidate for clean room work where a company hopes that the output is a specific idea because AI suffers none of the human mind's ambiguity.

This blog has cataloged a few examples of AI solving problems: generating new MacGyver plotlines, creating works of art from natural language and writing prior art "inventions." Like all other software, these AI programs are a matter of input, instructions and output. The programs take input (text), run a series of instructions (the AI's source code) and produce output (e.g., a piece of art).

This process is in sharp contrast to the human mind's process because AI's input, instructions and output are known to us, even if they are complex. An AI clean room can do what we could never do with human engineers: omit specific facts from the process entirely and confirm that others' original ideas played no part in the decision-making.

Should litigation result, a company defending an AI's independent discovery of the idea could effectively "open the mind" of the software to show what it took as input and what instructions it executed to independently derive the idea. If the input and instructions are free from the original idea's influence, then that would be a powerful fact in the company's favor that no set of clean engineers could provide.
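
What such a record could look like is easy to sketch. The following is a minimal, illustrative Python example that assumes the clean room's AI can be wrapped in a single solver function; the file names, record fields and use of SHA-256 hashes are hypothetical choices rather than a prescribed standard. The idea is simply to fingerprint the input (the problem specification) and the instructions (the solver's source code) before the run, and the output afterward, so the record can later be "opened" and inspected.

import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path
from typing import Callable

def sha256_of(data: bytes) -> str:
    """Fingerprint the exact bytes of an input or source file."""
    return hashlib.sha256(data).hexdigest()

def run_and_record(problem_spec_path: str, solver_source_path: str,
                   solver: Callable[[dict], list], output_dir: str) -> dict:
    """Run a clean-room solver and write an audit record of what it saw.

    The record hashes the input (the problem specification) and the
    instructions (the solver's source code) before the run, and the
    output afterward, so a reviewer can later confirm exactly what
    went in and what came out.
    """
    spec_bytes = Path(problem_spec_path).read_bytes()
    code_bytes = Path(solver_source_path).read_bytes()

    record = {
        "run_at": datetime.now(timezone.utc).isoformat(),
        "input_sha256": sha256_of(spec_bytes),
        "instructions_sha256": sha256_of(code_bytes),
    }

    # Run the solver on the recorded input and nothing else.
    candidates = solver(json.loads(spec_bytes))

    out_dir = Path(output_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    (out_dir / "candidates.json").write_text(json.dumps(candidates, indent=2))
    record["output_sha256"] = sha256_of((out_dir / "candidates.json").read_bytes())
    (out_dir / "audit_record.json").write_text(json.dumps(record, indent=2))
    return record

Because the record is written at the moment of the run, the company does not have to reconstruct years later what the software saw; the hashes tie the stored problem specification and solver code to the output they produced.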

There are some practical considerations with using an AI clean room. First, with today's technology, an AI clean room would work only for limited, well-structured (i.e., easy for computers to process) problems. AI is currently incapable of astonishing leaps in understanding. AI will not guess the two postulates of Special Relativity, as Einstein did before proving them correct. AI can, however, systematically explore many, or even all, possible solutions to a well-structured problem. Google's AlphaFold solution to the protein folding problem is an example of how AI, with the right input and instructions, can derive astonishing solutions that are independent from human thought.
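
As a concrete illustration of what "well-structured" means here, the sketch below enumerates and scores every combination in a small, entirely hypothetical design space. The parameters and the scoring rule are invented for illustration; in practice the score would come from a simulator or a trained model.

from itertools import product

# A toy, well-structured problem: every candidate design is a combination of
# a few discrete parameters, and each combination can be scored automatically.
MATERIALS = ["steel", "aluminum", "plastic"]
THICKNESSES_MM = [1, 2, 5]
COATINGS = ["none", "anodized"]

def score(material: str, thickness_mm: int, coating: str) -> float:
    """Stand-in for a simulator or model that rates a candidate design."""
    base = {"steel": 3.0, "aluminum": 2.0, "plastic": 1.0}[material]
    return base + 0.5 * thickness_mm + (1.0 if coating == "anodized" else 0.0)

def explore_all_designs() -> list[dict]:
    """Enumerate and score every combination (the whole space, not a sample)."""
    results = []
    for material, thickness, coating in product(MATERIALS, THICKNESSES_MM, COATINGS):
        results.append({
            "material": material,
            "thickness_mm": thickness,
            "coating": coating,
            "score": score(material, thickness, coating),
        })
    return sorted(results, key=lambda r: r["score"], reverse=True)

Because the space is enumerable and the scoring is mechanical, the process leaves nothing to intuition: the output follows entirely from the listed parameters and the scoring rule.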

Second, companies that seek to use an AI clean room will also have to confront the argument that they would not have pursued the AI at all if the original idea had never been submitted. For example, one might argue that the company sought to build a better mousetrap only because it received the original mousetrap idea. Even if the AI was completely clean of the original idea, the motivation to build the AI could be traced back to the original idea and questioned. The rebuttal to this argument would be fact-specific. If the company at issue was already in the business of making mousetraps, then the decision to create an AI mousetrap designer appears less inspired by the originator's idea than by a general desire to leverage technology to build better products. The company could also tune the AI to solve a more abstract problem (trapping any small animal) and hope that it produces a trap suitable for mice. This more abstract AI would produce many useless trap designs, but its input and instructions would be further removed from the original idea.
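
To make that abstraction concrete, the clean room's objective could be written to average performance across many target animals, so that catching mice is only one component of what the software optimizes. The sketch below is purely illustrative; the animal list and the toy effectiveness model are invented for the example.

# Body masses in kilograms; the values and the model below are illustrative only.
TARGET_ANIMALS_KG = {"mouse": 0.02, "vole": 0.03, "chipmunk": 0.1, "rat": 0.3}

def trap_effectiveness(trigger_force_newtons: float, animal_mass_kg: float) -> float:
    """Toy model: effectiveness peaks when the trigger force suits the animal's weight."""
    return 1.0 / (1.0 + abs(trigger_force_newtons - animal_mass_kg * 10.0))

def abstract_objective(trigger_force_newtons: float) -> float:
    """Average effectiveness across every target animal, not just mice."""
    scores = [trap_effectiveness(trigger_force_newtons, mass)
              for mass in TARGET_ANIMALS_KG.values()]
    return sum(scores) / len(scores)

An objective written this way shows on its face that the software was not pointed exclusively at the mouse problem.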

Third, it would be wise to develop the AI clean room with engineers who were not contaminated by the original idea. Although the AI clean room would be provably free of the original idea's influence, if contaminated engineers built it, jurors may incorrectly conclude that those engineers somehow biased the AI toward the original idea.

Fourth, an AI clean room is likely to produce a large number of potential solutions. Each should be cataloged to document the breadth of the AI's search, because that record tends to show that the process was not results-driven. In other words, the AI was not searching only for the original idea, but for any and all ideas that solved the problem at hand.
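
A minimal sketch of such a catalog, assuming candidates shaped like small dictionaries (as in the search example above), might write every candidate, rejects included, to a timestamped file in the order produced; the file format and field names are illustrative.

import csv
from datetime import datetime, timezone

def catalog_candidates(candidates: list[dict], path: str = "clean_room_catalog.csv") -> None:
    """Record every candidate the clean-room AI produced, including those
    that will later be rejected, so the breadth of the search is preserved."""
    if not candidates:
        return
    fieldnames = ["recorded_at", "rank", *candidates[0].keys()]
    recorded_at = datetime.now(timezone.utc).isoformat()
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        for rank, candidate in enumerate(candidates, start=1):
            writer.writerow({"recorded_at": recorded_at, "rank": rank, **candidate})

A catalog like this makes the negative results part of the record, which is exactly what a purely results-driven search would lack.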

Even with these considerations in mind, an AI clean room would be a powerful tool for provably independent idea development. Companies should consider using one when faced with well-structured problems that AI could solve via known inputs and instructions.
