Since the release of Copilot, GitHub's programming artificial intelligence, at the end of June 2021, the software has been at the heart of many debates among developers. Numerama asked them for their opinions: is it a real tool of the future that allows faster coding, or mere window dressing?
Entire functions, APIs, and lines of code: this is what Copilot, the artificial intelligence from GitHub and OpenAI, promises to write all by itself. Unveiled last week, Copilot is presented by its creators as the future of computing, a “” tool that would reduce programming time. The promise of Copilot is immense: the software would understand the context of the code and even adapt to the style of the developers who use it.
Yet, barely announced, the arrival of Copilot is already creating a stir and fueling many debates in the coding community. From MEP Julia Reda to countless blog posts, Twitch livestreams, YouTube videos, and forum topics, many experts and testers are weighing in on the subject across the Internet. Not reliable enough, full of errors, lacking understanding and reflection, licensing problems, the end of inventiveness in code: the list of faults and criticisms aimed at the software is long. But why is Copilot blamed so much?
Copilot’s impressive promises
What is Copilot? Developed by the GitHub platform together with OpenAI, a company specializing in the design of artificial intelligence, Copilot is presented as a revolutionary feature. Unveiled on June 29 by GitHub, Copilot promises to help developers write code faster, in particular by offering them snippets, or even entire functions, and thus automating a whole part of the process.
On its site, GitHub Copilot boasts of being able to automate certain repetitive tasks, to offer code based on developer feedback, to understand the context of a project in order to adapt to it, to learn the style of developers in order to adjust to their way of working, and to know what to suggest to users who are stuck.
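To make those promises concrete, the workflow GitHub describes looks roughly like this: a developer writes a comment and a function signature, and the assistant proposes a full body. The function below is a hypothetical illustration written for this article, not actual Copilot output; the name `parse_expenses` and its behavior are assumptions.

```python
# Illustrative sketch: a developer types the comment, docstring, and
# signature; a Copilot-style assistant would then suggest the body.
# Everything below the docstring stands in for such a suggestion.

def parse_expenses(expenses_string: str) -> list[tuple[str, float, str]]:
    """Parse lines of the form 'DATE AMOUNT CURRENCY' into tuples.

    Blank lines and lines starting with '#' are ignored.
    """
    expenses = []
    for line in expenses_string.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        date, amount, currency = line.split(" ")
        expenses.append((date, float(amount), currency))
    return expenses
```

The point of such tools is that the developer supplies only the intent (comment and signature) and reviews the generated body, rather than typing it out.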
In short, Copilot’s promises are particularly impressive. For now, however, the software is not yet available to the general public: only a few developers registered for the technical preview have been able to get their hands on the tool, and they are starting to share their impressions, little by little. While many agree that Copilot is going to make a big difference in IT, they do not all agree on one point in particular: is it for better or for worse?
Solve “niche problems”
The first question to answer is: does Copilot live up to its promises? Does the software let you write better code, faster? Does it suggest functions and code snippets relevant to the project, and does it understand the context?
For the developers we interviewed, the answers vary. Copilot is already a tool with some usefulness. “”, explains Kyle, an American developer who had access to the technical preview and with whom Numerama spoke. On the Internet, many others share his opinion. “”, estimates a person interviewed by the specialized publication InfoWorld; “”, acknowledges InfoQ; “”, says another tester on his blog.
“”, Kyle adds. Copilot is, for the moment, too limited a tool. “”, Kyle continues.
Of course, we must not forget that Copilot is still in its technical preview phase, a stage dedicated precisely to testing and to the final improvements to be made to the product. Not to mention the fact that the more the AI trains with coders and developers, the more relevant it will become. There is a good chance that the most mundane mistakes spotted so far will not be repeated.
But that’s not the only downside of Copilot.
Lots of limitations
For engineer Stanislas Signoud, Copilot fluctuates between a “” tool. “”, he asserts. A particularly harsh opinion, but one shared by several other engineers and specialists.
First of all, because there are some glaring examples where Copilot does not work at all, even on simple problems. Stanislas Signoud has thus spotted many errors. “”, he laments.
The consequences of this badly generated code can be found across a great many lines, which could lengthen programmers’ work more than anything else. In the end, with Copilot, “”, estimates InfoQ, a media outlet specializing in IT.
“”, explains Stanislas Signoud. Same story with Kyle, who works in Rust: “”.
This is not really a surprise: GitHub has itself acknowledged that Copilot works best on widely used languages, those on which it has, logically, been able to train the most. But this lack of training on certain languages could have a direct consequence for many projects in the future: “”, continues Stanislas Signoud.
Copyleft lines of code?
Another criticism leveled at Copilot concerns the source of the code it offers to its users, and this represents one of the thorniest issues facing the platform. The root of it all is, however, precisely what makes Copilot strong: the artificial intelligence has trained on billions of lines of open source code, some of it from repositories (where a project’s data and shared files are stored) hosted publicly on GitHub.
Contrary to what one might have expected, it is the open source nature of that code that is the problem here. Open source code is indeed accessible to everyone for free, but the use that can be made of it is controlled. This is particularly the case for software under a copyleft license: its code is freely accessible and can be reused, but only on the condition that the product in which it ends up is released under the same terms, and thus also made freely accessible.
The problem with Copilot, however, is that we cannot know whether the pieces of code or the functions it offers are original creations or copy-pastes of code found in open source repositories – and therefore potentially under copyleft.
“”, explains Me Bérénice Ferrand, a lawyer specializing in intellectual property law. “”.
When contacted, GitHub told us that it was not answering questions for the moment, but that it would make more information about the design of the artificial intelligence available “”. For now, it is impossible to know where the lines of code suggested by Copilot come from. And, short of doing manual comparisons, there is no real way to tell for sure whether they are truly original.
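As a rough idea of what such a “manual comparison” could look like, the sketch below checks whether a suggested snippet appears, up to whitespace differences, in a local corpus of open source files. This is only an illustration under simplifying assumptions: real provenance detection would need token-level or semantic matching across billions of lines, and the functions here (`normalize`, `snippet_in_corpus`) are invented for this article.

```python
# Hypothetical sketch of a whitespace-insensitive search for a code
# snippet inside a directory of open source files. A verbatim match
# would suggest the snippet was copied rather than generated.

import re
from pathlib import Path


def normalize(code: str) -> str:
    """Collapse all whitespace so formatting differences don't hide a match."""
    return re.sub(r"\s+", " ", code).strip()


def snippet_in_corpus(snippet: str, corpus_dir: str) -> list[str]:
    """Return the paths of .py files in corpus_dir containing the snippet."""
    needle = normalize(snippet)
    matches = []
    for path in Path(corpus_dir).rglob("*.py"):
        if needle in normalize(path.read_text(errors="ignore")):
            matches.append(str(path))
    return matches
```

Even this naive check only catches verbatim copies; lightly rewritten code, which is the more likely output of a statistical model, would slip through, which is why the provenance question remains so hard to settle.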
The fact that Copilot may one day become a paid product adds another problem to this legal headache: what if users unknowingly used snippets covered by a copyleft license? If Copilot offers them a copy-paste of a feature under copyleft, and a developer uses it in a project that is not open source, how can that problem be legally resolved?
No legal framework
Above all, for the moment, there is not yet a legal framework in France that can protect the users or designers of software such as Copilot. “Data mining”, says Me Ferrand. This directive would provide a legal framework for companies using Copilot, “”, she adds.
But, for the moment, this directive does not yet exist legally. “”, she explains. Even if data mining (analyzing a large amount of data to extract interesting information from it) is technically not allowed by intellectual property law, there is some leniency. “”
This lack of a legal framework may not be a problem in the long term, however: France must transpose the European Union directive into its body of law. No date has yet been set for the transposition of the directive, even if Me Ferrand thinks it could happen soon. But the legal vagueness does not exactly inspire confidence.
What future for Copilot?
Beyond all the technical and legal limits, the last obstacle Copilot faces is above all the fears and opinions of developers themselves. Far from being enthusiastic about being assisted by an artificial intelligence, some are genuinely reluctant. Stanislas Signoud is categorical: “”
Still others criticize the fact that GitHub “” of the whole free culture, citing as examples the problems around copyleft and the liberties it has taken by training its AI on the open source code of the platform’s users. “”, continues Stanislas.
But whether or not one agrees with GitHub’s methods and with how Copilot is used, programming-assistance software is multiplying and growing in popularity. Microsoft’s IntelliCode tool came out a few years ago, and it relies on much the same principle, suggesting snippets of code to developers. For now, Stanislas Signoud believes that Copilot will likely remain niche software, little used, for the next 5 or 10 years. “”, he explains.
Either way, the movement has already been underway for years and certainly will not stop here: Copilot may be the first artificial intelligence for programming assistance, but it certainly will not be the last.