gbert large paraphrase cosine EU AI Act Compliance Profile
Provider: deutsche-telekom
Your risk depends on how you use gbert large paraphrase cosine
| Usage Context | Risk Level | Obligations (est. effort) |
|---|---|---|
| Internal coding tool | MINIMAL | 3 (~12 h) |
| Customer support bot | LIMITED | 7 (~32 h) |
| HR screening / hiring | HIGH | 19 (~120 h) |
| Credit decisions | HIGH | 19 (~120 h) |
| Medical triage | HIGH | 19 (~120 h) |
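Read programmatically, the table above is just a lookup from deployment context to risk tier. The sketch below mirrors it in Python; the context keys and the helper name are illustrative only, not part of the Act or of any official tooling.

```python
# Illustrative sketch: the contexts, tiers, and effort figures mirror the
# table above; none of these names come from the EU AI Act or official tools.
RISK_BY_CONTEXT = {
    "internal_coding_tool": ("MINIMAL", 3, 12),
    "customer_support_bot": ("LIMITED", 7, 32),
    "hr_screening":         ("HIGH",   19, 120),
    "credit_decisions":     ("HIGH",   19, 120),
    "medical_triage":       ("HIGH",   19, 120),
}

def risk_profile(context: str) -> str:
    """Return the risk tier and estimated compliance effort for a context."""
    tier, obligations, hours = RISK_BY_CONTEXT[context]
    return f"{tier}: {obligations} obligations (~{hours} h estimated effort)"

print(risk_profile("customer_support_bot"))
# -> LIMITED: 7 obligations (~32 h estimated effort)
```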
Why this model is classified as MINIMAL
gbert large paraphrase cosine is a German-language (de) sentence similarity model by deutsche-telekom, fine-tuned from deepset/gbert-large and built with the sentence-transformers framework. It is MIT-licensed and has roughly 7K downloads on HuggingFace.
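For orientation, here is a minimal usage sketch. It assumes the model is published on the HuggingFace Hub under the ID deutsche-telekom/gbert-large-paraphrase-cosine and loads with the sentence-transformers library; the example sentences are illustrative.

```python
# Minimal sketch, assuming the model ID below is correct and the
# sentence-transformers library is installed (pip install sentence-transformers).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("deutsche-telekom/gbert-large-paraphrase-cosine")

sentences = [
    "Der Zug kommt pünktlich an.",       # "The train arrives on time."
    "Die Bahn trifft rechtzeitig ein.",  # paraphrase of the same sentence
]

# Encode both sentences and compare the embeddings with cosine similarity.
embeddings = model.encode(sentences)
score = util.cos_sim(embeddings[0], embeddings[1])
print(f"Cosine similarity: {score.item():.3f}")
```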
Applicable Articles
- Art. 4 (AI literacy): applies to all deployers; the compliance deadline passed on 2025-02-02.
Who does what
deutsche-telekom (provider): their job
- Provider obligations are still being compiled.
Risk Assessment Reasoning
This model is classified as Minimal Risk under the EU AI Act. No mandatory compliance obligations apply, but voluntary codes of practice are encouraged. AI literacy training (Art. 4) is recommended for all deployers.
Frequently Asked Questions
What is gbert large paraphrase cosine's EU AI Act risk classification?
gbert large paraphrase cosine is classified as MINIMAL risk by default under the EU AI Act; high-risk deployment contexts such as HR screening or credit decisions raise the applicable tier (see the table above).
What are my obligations if I deploy gbert large paraphrase cosine?
As a gbert large paraphrase cosine deployer, you have one base obligation (~8 hours estimated effort). Key article: Art. 4 (AI literacy).
What is gbert large paraphrase cosine?
gbert large paraphrase cosine is a Sentence Similarity model by deutsche-telekom with roughly 7K downloads on HuggingFace. It is MIT-licensed.
What are the EU AI Act deadlines for gbert large paraphrase cosine?
Already passed: the AI Literacy obligation (Art. 4) took effect on 2025-02-02.
Check gbert large paraphrase cosine compliance in your codebase
One command to scan. Open-source CLI.