
GitHub Copilot EU AI Act Compliance Profile

GitHub (Microsoft)

Risk Classification: MINIMAL (Minimal Risk AI System; voluntary obligations)
Model Info
Provider: GitHub (Microsoft)
Category: coding
Obligations: 4 apply (~20h effort)
Ensure AI Literacy of Staff
Label Deep Fakes and AI-Generated Content for Public
Provide Explanation of AI Decisions to Affected Persons
Cooperate with Regulatory Authorities
$ npx complior scan

Your risk depends on how you use GitHub Copilot

Usage Context           Risk Level   Obligations
Internal coding tool    MINIMAL      3 obligations (~12h)
Customer support bot    LIMITED      7 obligations (~32h)
HR screening / hiring   HIGH         19 obligations (~120h)
Credit decisions        HIGH         19 obligations (~120h)
Medical triage          HIGH         19 obligations (~120h)
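The table above can be sketched as a simple lookup. This is an illustrative snippet only, not part of the complior CLI; the function name `classify_usage` and the data layout are assumptions made for this example.

```python
# Illustrative sketch: the same tool maps to different EU AI Act risk tiers
# depending on how it is deployed. Values mirror the table in this profile.
USAGE_RISK = {
    "internal coding tool": ("MINIMAL", 3, 12),
    "customer support bot": ("LIMITED", 7, 32),
    "hr screening / hiring": ("HIGH", 19, 120),
    "credit decisions": ("HIGH", 19, 120),
    "medical triage": ("HIGH", 19, 120),
}

def classify_usage(context: str) -> tuple[str, int, int]:
    """Return (risk level, obligation count, estimated effort in hours)."""
    return USAGE_RISK[context.strip().lower()]
```

The point is that risk classification attaches to the usage context, not to the tool itself: the same Copilot deployment jumps from 3 obligations to 19 when it touches hiring, credit, or medical decisions.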

Why this tool is classified as MINIMAL

AI pair programmer providing code completions and suggestions in IDEs.

Applicable Articles

Article 4: Ensure AI Literacy of Staff
REQUIRED · DEADLINE PASSED
Obligation under Article 4 for GitHub Copilot deployers.
Article 50(4): Label Deep Fakes and AI-Generated Content for Public
REQUIRED · AUG 2026
Article 26(11) / Article 86: Provide Explanation of AI Decisions to Affected Persons
REQUIRED · AUG 2026
Article 26(10) / Article 21: Cooperate with Regulatory Authorities
REQUIRED · AUG 2026
Article 50(1): Disclose AI Interaction to Users — Chatbot/Assistant
PROVIDER: GitHub (Microsoft)
Article 50(2): Mark AI-Generated Content — Machine-Readable
PROVIDER: GitHub (Microsoft)
Article 53(1)(a)-(b) / Annex XI / Annex XII: GPAI Technical Documentation per Annex XI
PROVIDER: GitHub (Microsoft)
Article 53(1)(b) / Annex XII: GPAI Downstream Provider Information (Annex XII)
PROVIDER: GitHub (Microsoft)
Article 53(1)(c): GPAI Copyright Compliance Policy
PROVIDER: GitHub (Microsoft)
Article 53(1)(d): GPAI Publish Training Data Summary
PROVIDER: GitHub (Microsoft)
Article 55: GPAI Systemic Risk — Model Evaluation and Adversarial Testing
PROVIDER: GitHub (Microsoft)

Who does what

GitHub (Microsoft) (provider): their job

  • Ensure AI Literacy of Staff (Article 4)
  • Disclose AI Interaction to Users — Chatbot/Assistant (Article 50(1))
  • Mark AI-Generated Content — Machine-Readable (Article 50(2))
  • Mark AI-Generated Images — C2PA/Watermark (Article 50(2))
  • GPAI: Technical Documentation per Annex XI (Article 53(1)(a)-(b) / Annex XI / Annex XII)

You (deployer): your job

  • Ensure AI Literacy of Staff (Article 4)
  • Label Deep Fakes and AI-Generated Content for Public (Article 50(4))
  • Provide Explanation of AI Decisions to Affected Persons (Article 26(11) / Article 86)
  • Cooperate with Regulatory Authorities (Article 26(10) / Article 21)

Risk Assessment Reasoning

GitHub Copilot's underlying model is classified as a GPAI model with systemic risk under Articles 51-56 of the EU AI Act; the model exceeds the 10M MAU threshold. The additional obligations this triggers, including adversarial testing and incident reporting, fall on the provider, GitHub (Microsoft). For deployers using Copilot as an internal coding tool, the system itself remains minimal risk.

Frequently Asked Questions

What is GitHub Copilot's EU AI Act risk classification?


GitHub Copilot is classified as MINIMAL risk under the EU AI Act when deployed as an internal coding tool. Higher-risk usage contexts (e.g. HR screening, credit decisions, or medical triage) raise the classification and the obligations that apply.

What are my obligations if I deploy GitHub Copilot?


As a GitHub Copilot deployer, you have 4 base obligations (~20 hours estimated effort). Key articles: Article 4, Article 50(4), Article 26(11) / Article 86, Article 26(10) / Article 21.

What is GitHub Copilot?


GitHub Copilot is an AI pair programmer from GitHub (Microsoft) that provides code completions and suggestions in IDEs.

What are the EU AI Act deadlines for GitHub Copilot?


Already passed:

  • Ensure AI Literacy of Staff (2025-02-02)

Upcoming (2026-08-02):

  • Label Deep Fakes and AI-Generated Content for Public
  • Provide Explanation of AI Decisions to Affected Persons
  • Cooperate with Regulatory Authorities
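The deadline split above can be computed mechanically. A minimal sketch, assuming the dates listed in this profile; the function name `split_deadlines` is hypothetical and not part of any real tool:

```python
from datetime import date

# Deadlines taken from this profile (ISO dates).
DEADLINES = {
    "Ensure AI Literacy of Staff": date(2025, 2, 2),
    "Label Deep Fakes and AI-Generated Content for Public": date(2026, 8, 2),
    "Provide Explanation of AI Decisions to Affected Persons": date(2026, 8, 2),
    "Cooperate with Regulatory Authorities": date(2026, 8, 2),
}

def split_deadlines(today: date):
    """Partition obligations into already-passed and upcoming deadlines."""
    passed = {k: d for k, d in DEADLINES.items() if d <= today}
    upcoming = {k: d for k, d in DEADLINES.items() if d > today}
    return passed, upcoming
```

Running this against a date in mid-2025 puts only the AI-literacy obligation in the passed bucket, with the three August 2026 obligations upcoming.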

Check GitHub Copilot compliance in your codebase

One command to scan. Open-source CLI.

$ npx complior scan