Is GitHub Copilot safe to use at work, or should you avoid it?

GitHub Copilot made news as a revolutionary, AI-powered tool designed to help with coding and reduce software developers’ workloads. Sounds good, doesn’t it? In reality, it has both advantages and disadvantages. Find out whether GitHub Copilot is secure and legal to use, and what risks come with it.

What is GitHub Copilot?

GitHub Copilot is an AI-based tool designed to interpret natural language prompts and turn them into code suggestions based on project context. It was launched in 2021 and is the result of a collaboration between GitHub, Microsoft, and OpenAI.

GitHub Copilot is based on OpenAI Codex, a modified version of the GPT language model, which uses machine learning to “understand” human prompts and mimic human responses by repeatedly predicting the next word that should appear in a text. In addition, Copilot has been trained on a large amount of open-source code to enhance its ability to generate accurate and functional code suggestions.
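
To make the “repeatedly predicting the next word” idea concrete, here is a toy sketch in Python. The lookup table is invented purely for illustration; a real model like the one behind Copilot learns likely continuations from enormous amounts of text and code:

    # Toy illustration of next-word generation (not Copilot's real model).
    # The lookup table below is made up; a real language model learns
    # likely continuations from billions of lines of text and code.
    import random

    NEXT_WORD = {
        "def": ["add(a,"],
        "add(a,": ["b):"],
        "b):": ["return"],
        "return": ["a + b"],
    }

    def generate(prompt: str, max_steps: int = 10) -> str:
        words = prompt.split()
        for _ in range(max_steps):
            candidates = NEXT_WORD.get(words[-1])
            if not candidates:
                break  # no known continuation, stop generating
            words.append(random.choice(candidates))
        return " ".join(words)

    print(generate("def"))  # -> "def add(a, b): return a + b"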

While the tool was designed to make developers’ lives easier and reduce their workload, experts disagree on whether it works as intended. Some have expressed concerns about the security of GitHub Copilot, mainly around licensing, fair use, telemetry, and the general risk of relying on artificial intelligence to produce secure code.

Is GitHub Copilot safe to use at work?

GitHub Copilot is advertised as a revolutionary tool capable of reshaping how software will be developed in the future. However, some experts point out that while it shows promise, it may not be entirely safe for business settings because of the risks involved.

These risks include the unclear legal status of Copilot-generated code as well as the fact that it’s a cloud service that sends data between the user and GitHub’s servers. That data might contain sensitive information, which is why using Copilot in confidential projects might not be the best idea.

Knowing the risks associated with Copilot is important because developers can still use it as a shadow IT tool even if a company does not officially allow it. The term “shadow IT” refers to using applications and tools inside an organization without obtaining approval, and GitHub Copilot may very well fall into this category.

The potential risks of using GitHub Copilot at work

As mentioned, GitHub Copilot involves some risks. These risks mainly relate to cybersecurity and legality but also include ethical issues. If you’re wondering whether Copilot is safe to use at work, it’s worth looking into these problems.

Security and privacy concerns

According to a 2021 study, around 40% of Copilot-generated code in security-relevant scenarios contained security vulnerabilities. Of course, this study was conducted on an early version of Copilot, but that doesn’t change the fact that artificial intelligence is exactly that: artificial. The generated code is not written with intent but predicted from training data, and GitHub takes no responsibility for errors. It’s up to the user to check every suggestion for possible vulnerabilities.
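
To picture what such a vulnerability looks like, here is a hypothetical Python example (not taken from the study): a database lookup of the kind an AI assistant can plausibly suggest, next to the safer version a careful reviewer would insist on:

    # Hypothetical example: an SQL-injection-prone snippet of the kind an
    # AI assistant may suggest, and a safer parameterized alternative.
    import sqlite3

    def find_user_unsafe(conn: sqlite3.Connection, username: str):
        # Vulnerable: user input is pasted straight into the SQL string,
        # so a username like "x' OR '1'='1" returns the whole table.
        query = f"SELECT * FROM users WHERE name = '{username}'"
        return conn.execute(query).fetchall()

    def find_user_safe(conn: sqlite3.Connection, username: str):
        # Safer: a parameterized query lets the driver handle escaping.
        return conn.execute(
            "SELECT * FROM users WHERE name = ?", (username,)
        ).fetchall()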

On top of that, GitHub Copilot uses telemetry to collect and analyze some information related to how the tool is used, which may include code snippets. The collected data is sent to GitHub’s servers and thus leaves the company, which may violate security and confidentiality policies. Some of this telemetry can be turned off in Copilot’s settings. Nevertheless, Copilot runs in the cloud rather than locally, communicating with GitHub’s servers, so you should always think twice before including it in a highly confidential project.

GitHub Copilot has been trained on countless public GitHub projects. The problem is that these projects use different licenses. Open-source projects are released under several kinds of licenses, and while some allow free code modification and redistribution, others do not. For example, the GPL requires that all derivative works make their source code available and be placed under the same license. In other words, deriving code from a GPL-licensed project in a commercial closed-source product may constitute copyright infringement.

Copilot’s training data doesn’t distinguish between these licenses. There is also no guarantee that the tool won’t suggest code that is simply copied from another public project, so if you accept Copilot’s suggestions, you could unknowingly include plagiarized or license-encumbered code in your project.

GitHub has addressed this problem, and Copilot can now warn users when a suggestion matches public code. However, this feature is not perfect, and errors can happen. If the tool slips up and suggests license-covered code anyway, and you fail to comply with the license terms, you risk legal consequences.

Code quality

GitHub Copilot generates code by predicting what should come next after each part. It doesn’t write code with intent or understand what it’s doing the way a human programmer does. It can certainly be helpful, but its suggestions might be flawed: they can contain mistakes, biases, and bugs inherited from the training data.

Like all generative AI programs, Copilot is limited by its training data and is not very good at coming up with something new and unique. It can generate inefficient code even when better solutions exist.
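
As a hypothetical illustration, both Python functions below return the items two lists have in common. The first resembles a “works, but wasteful” suggestion that rescans the lists at every step; the second uses the idiomatic set operation:

    # Hypothetical example of correct-but-inefficient generated code.

    def common_items_slow(a: list, b: list) -> list:
        # O(len(a) * len(b)): each "in" check rescans a whole list.
        result = []
        for item in a:
            if item in b and item not in result:
                result.append(item)
        return result

    def common_items_fast(a: list, b: list) -> list:
        # O(len(a) + len(b)): a set intersection does the same job
        # (note: the result's ordering is not preserved).
        return list(set(a) & set(b))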

Efficiency

Copilot’s impact on programmer efficiency is mixed: on the one hand, it’s a tool designed to reduce the burden on programmers by suggesting ready-to-use code snippets. On the other hand, it can create more work for code reviewers.

Reviewing AI-generated code is different from reviewing human work. If someone has repeatedly done an excellent job on a task, you can generally trust them to maintain that performance. Artificially generated code, however, can be unpredictable and contain errors and vulnerabilities in unexpected places. This adds work for programmers, who have to fix these problems, and for reviewers, who have to check AI-generated code more carefully.

Other concerns

Last but not least, GitHub Copilot, like all generative AI tools, raises concerns about the future of education and the job market. Some experts note that its widespread use could lead to over-dependence and stifle developers’ growth.

It’s also impossible to predict what generative artificial intelligence will look like in the future. Chances are that today’s flawed and limited tools will become so good that hiring junior developers will no longer seem necessary. This, in turn, could lead to an unsustainable future in which no organization wants to train beginners, yet everyone is desperate to hire experienced senior programmers.

How to reduce the risk of using GitHub Copilot at work

GitHub Copilot is certainly a promising tool, but by no means is it a complete solution for every coding project. If you want to build it into your processes, it’s worth following some ground rules:

  • Review the code. Generative AI is not perfect and requires human oversight. It’s safe to say that AI-generated code needs more review than human-written code because it can be more unpredictable. Copilot’s suggestions should be treated as what they are: just suggestions.
  • Test the code. Copilot’s code may be imperfect and contain security vulnerabilities. Always check and test it before adding it to your project (see the sketch after this list).
  • Watch out for unauthorized access to data. GitHub Copilot doesn’t run locally; it’s a cloud-based service, and GitHub may store information about the tool’s usage on its servers. Keep this in mind if you are working on a confidential project that contains sensitive data.
  • Check for plagiarism. We recommend using additional tools to check code for plagiarism and possible license violations. Copilot compares short code snippets with public data to warn of plagiarism, but it’s not perfect. It’s always better to double-check than to risk legal consequences.
  • Keep Copilot updated. GitHub updates Copilot, continuously providing it with better functionalities and security features. Keep the software updated to benefit from the latest improvements.
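
To make the “review and test” advice concrete, here is a minimal sketch of pinning down an accepted suggestion with unit tests before it ships. The slugify helper is hypothetical; the point is that edge cases are where generated code most often breaks:

    # Minimal sketch: guard an accepted AI suggestion with tests.
    # The slugify helper is hypothetical; run with pytest or directly.

    def slugify(title: str) -> str:
        # Imagine this body was accepted from a Copilot suggestion.
        return title.strip().lower().replace(" ", "-")

    def test_basic():
        assert slugify("Hello World") == "hello-world"

    def test_strips_whitespace():
        assert slugify("  Padded Title  ") == "padded-title"

    def test_empty_input():
        # Edge cases are where generated code often breaks; spell them out.
        assert slugify("") == ""

    if __name__ == "__main__":
        test_basic()
        test_strips_whitespace()
        test_empty_input()
        print("All checks passed.")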

Can I use GitHub Copilot at work?

Although GitHub Copilot is a promising tool, it’s not without problems. We don’t discourage you from using it altogether, but we recommend caution, especially at work. Since Copilot may not be the most secure solution and may expose confidential data, it’s best to consult your supervisors before implementing it in your coding process.

If you’re already using Copilot, remember that the responsibility for writing secure and legal code lies with you, not your tools. Use its suggestions and work smarter, but keep in mind that all AI-powered tools should be treated as help, not as a final solution.