LogoS 7Bx2 MoE 13B v0.2 EU AI Act Compliance Profile
RubielLabarta
Your risk depends on how you use LogoS 7Bx2 MoE 13B v0.2
| Usage Context | Risk Level | Obligations |
|---|---|---|
| Internal coding tool | MINIMAL | 3 obligations (~12h) |
| Customer support bot | LIMITED | 7 obligations (~32h) |
| HR screening / hiring | HIGH | 19 obligations (~120h) |
| Credit decisions | HIGH | 19 obligations (~120h) |
| Medical triage | HIGH | 19 obligations (~120h) |
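The usage-to-risk mapping above can be encoded as a small lookup table, for example to gate deployments in CI. A minimal sketch (risk tiers and obligation counts are taken from the table above; the dictionary and function names are illustrative, not part of any official tooling):

```python
# Hypothetical lookup of EU AI Act risk tier by usage context,
# mirroring the table in this profile. Names are illustrative.
RISK_PROFILE = {
    "internal_coding_tool": {"risk": "MINIMAL", "obligations": 3, "est_hours": 12},
    "customer_support_bot": {"risk": "LIMITED", "obligations": 7, "est_hours": 32},
    "hr_screening": {"risk": "HIGH", "obligations": 19, "est_hours": 120},
    "credit_decisions": {"risk": "HIGH", "obligations": 19, "est_hours": 120},
    "medical_triage": {"risk": "HIGH", "obligations": 19, "est_hours": 120},
}

def risk_for(usage: str) -> dict:
    """Return the risk profile for a known usage context.

    Raises ValueError for unknown contexts, forcing an explicit
    classification before deployment rather than a silent default.
    """
    try:
        return RISK_PROFILE[usage]
    except KeyError:
        raise ValueError(f"Unknown usage context: {usage!r}; classify it before deploying.")
```

Failing closed on unknown contexts is deliberate: defaulting an unclassified use case to MINIMAL would understate your obligations.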
Why this tool is classified as GPAI
LogoS 7Bx2 MoE 13B v0.2 is a 13-billion-parameter text-generation model by RubielLabarta, fine-tuned from yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B and built with the transformers library. It supports English and Spanish, is licensed under apache-2.0, and has 7.7K downloads on HuggingFace.
Who does what
RubielLabarta (provider): their job
- Provider obligations are still being compiled
You (deployer): your job
- AI Literacy (Art. 4)
- AI Disclosure (Art. 50)
- Synthetic Content Labeling (Art. 50)
Risk Assessment Reasoning
This model is classified as General-Purpose AI (GPAI) under the EU AI Act. GPAI providers must comply with transparency obligations (Art. 53), including technical documentation and copyright policy disclosure. Deployers must ensure AI literacy training (Art. 4) for all staff interacting with the system. Text generation models may produce synthetic content; deployers using this for public-facing applications must disclose AI-generated content (Art. 50).
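For the Art. 50 disclosure point above, one common pattern is to attach a visible label to any model output shown to end users. A minimal sketch, assuming a plain-text chat surface (the notice wording, constant, and function name are illustrative choices, not text prescribed by the Act):

```python
# Illustrative Art. 50 disclosure helper; the exact wording and
# placement of the notice are a deployer's design decision.
AI_DISCLOSURE = "This response was generated by an AI system (LogoS 7Bx2 MoE 13B v0.2)."

def label_output(generated_text: str) -> str:
    """Append an AI-generated-content disclosure to user-facing output."""
    return f"{generated_text}\n\n---\n{AI_DISCLOSURE}"
```

For non-text modalities (images, audio), Art. 50 labeling typically requires machine-readable marking rather than an appended sentence, so this sketch only covers the plain-text case.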
Frequently Asked Questions
What is LogoS 7Bx2 MoE 13B v0.2's EU AI Act risk classification?
LogoS 7Bx2 MoE 13B v0.2 is classified as GPAI under the EU AI Act. The risk level of your specific deployment, however, depends on your use case: internal tools may be minimal risk, while HR screening or credit decisions escalate to high risk. The model has 13B parameters.
What are my obligations if I deploy LogoS 7Bx2 MoE 13B v0.2?
As a deployer of LogoS 7Bx2 MoE 13B v0.2, you have 3 base obligations (~16 hours of estimated effort). Key articles: Art. 4 and Art. 50.
What is LogoS 7Bx2 MoE 13B v0.2?
LogoS 7Bx2 MoE 13B v0.2 is a text-generation model by RubielLabarta with 7.7K downloads on HuggingFace. It is licensed under apache-2.0.
What are the EU AI Act deadlines for LogoS 7Bx2 MoE 13B v0.2?
All three deadlines have already passed:
- AI Literacy (Art. 4): 2025-02-02
- AI Disclosure (Art. 50): 2025-08-02
- Synthetic Content Labeling (Art. 50): 2025-08-02
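Deadline checks like these are easy to automate in a compliance script. A minimal sketch, using the dates from this profile (the dictionary and helper name are illustrative):

```python
from datetime import date

# Obligation deadlines from this profile (dates are from the EU AI Act
# phase-in schedule as listed above; names are illustrative).
DEADLINES = {
    "AI Literacy (Art. 4)": date(2025, 2, 2),
    "AI Disclosure (Art. 50)": date(2025, 8, 2),
    "Synthetic Content Labeling (Art. 50)": date(2025, 8, 2),
}

def overdue(today: date) -> list[str]:
    """Return the obligations whose deadline is on or before `today`."""
    return [name for name, d in DEADLINES.items() if d <= today]
```

Passing `today` explicitly (rather than calling `date.today()` inside) keeps the check deterministic and testable.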
Check LogoS 7Bx2 MoE 13B v0.2 compliance in your codebase
One command to scan. Open-source CLI.