What Is EU AI Act? Medical Device Compliance Guide 2025
- Beng Ee Lim
The EU AI Act is the world's first comprehensive artificial intelligence regulation, and it's already reshaping medical device compliance. With fines of up to 7% of global annual turnover and key deadlines that began in February 2025, medical device companies can't afford to get this wrong. Many companies are still unsure whether they're affected or what they need to do.
Quick Answer: The EU AI Act is a risk-based regulation. AI-enabled medical devices face dual compliance within a single integrated CE marking route: the notified body assesses the AI Act together with the MDR/IVDR. High-risk AI systems need notified body assessment, AI literacy training, and enhanced post-market surveillance. Medical devices have 36 months to comply (until 2 August 2027), but AI literacy requirements have applied since February 2025.
This guide covers everything you need to know about EU AI Act compliance for medical devices, including classification criteria, timeline requirements, and implementation roadmaps.

What Is the EU AI Act and Why Should Medical Device Companies Care?
The European Union Artificial Intelligence Act (Regulation (EU) 2024/1689) entered into force on August 1, 2024, establishing the world's first comprehensive AI regulatory framework. Like the EU Medical Device Regulation (MDR) and the EU In Vitro Diagnostic Medical Devices Regulation (IVDR), the AI Act follows the New Legislative Framework (NLF) model.
The critical reality: the AI Act is horizontal legislation that applies across all industry sectors as part of the NLF portfolio. For medical device manufacturers, this means that if a device incorporates AI- or machine learning-enabled functions, the AI Act applies to those functions.
Why This Matters for Your Business
Financial Risk: Infringements of the AI Act are punishable by administrative fines of up to €35 million or 7% of global annual turnover, whichever is higher. The top tier applies to prohibited practices; most other infringements, including breaches of the high-risk obligations, are capped at €15 million or 3%. A worked example of the "whichever is higher" rule follows below.
Market Access Impact: For AI-enabled medical devices, the AI Act is applied within the MDR/IVDR CE marking route: one integrated notified body assessment and a single CE mark (see MDCG 2025-6 on the integrated approach).
Competitive Pressure: Companies that delay implementation face regulatory barriers while competitors with compliant AI systems capture market share.
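To make the financial exposure concrete, here is a minimal Python sketch of the "whichever is higher" rule; the turnover figures are hypothetical and the tiers are simplified.

```python
# Minimal sketch: the AI Act's top fine tier is the greater of EUR 35 million
# or 7% of worldwide annual turnover (lower tiers apply to other infringements).

def max_fine(annual_turnover_eur: float, pct: float = 0.07,
             floor_eur: float = 35_000_000) -> float:
    """Return the maximum administrative fine for a given global turnover."""
    return max(floor_eur, pct * annual_turnover_eur)

# Hypothetical company with EUR 2 billion in global annual turnover:
print(f"EUR {max_fine(2_000_000_000):,.0f}")   # EUR 140,000,000
# A smaller company with EUR 100 million turnover still faces the EUR 35M floor:
print(f"EUR {max_fine(100_000_000):,.0f}")     # EUR 35,000,000
```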
Is Your Medical Device Affected by the EU AI Act?
The AI Act uses a risk-based classification system that determines compliance requirements. Most medical devices with AI components fall into the "high-risk" category, triggering substantial compliance obligations.
High-Risk AI Medical Device Classification
Automatic High-Risk Classification: A medical device AI system is high-risk when it is intended to be used as a safety component of a product, or is itself a product, covered by the EU Medical Device Regulation (2017/745) or the EU In Vitro Diagnostic Medical Devices Regulation (2017/746), and that product requires third-party (notified body) conformity assessment.
In practice: under the MDR, Class IIa, IIb, and III devices, and Class I devices that are sterile, have a measuring function, or are reusable surgical instruments, are high-risk; self-certified (plain) Class I devices are not. Under the IVDR, Class B, C, and D devices and sterile Class A devices are high-risk; non-sterile Class A devices are not.
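As a rough illustration of the mapping above (a simplified sketch, not legal advice; the helper function and its inputs are hypothetical), the rule of thumb is that an AI-enabled device is high-risk whenever its MDR/IVDR route requires notified body involvement:

```python
def likely_high_risk(regulation: str, device_class: str,
                     special: set = frozenset()) -> bool:
    """Simplified sketch: an AI-enabled device is high-risk under the AI Act
    if its MDR/IVDR classification requires third-party (notified body)
    conformity assessment. Real classification needs regulatory review."""
    regulation, device_class = regulation.upper(), device_class.upper()
    if regulation == "MDR":
        if device_class in {"IIA", "IIB", "III"}:
            return True
        # Class I only when sterile, measuring, or a reusable surgical instrument
        return device_class == "I" and bool(set(special) & {"sterile", "measuring", "reusable"})
    if regulation == "IVDR":
        if device_class in {"B", "C", "D"}:
            return True
        return device_class == "A" and "sterile" in special
    raise ValueError("expected 'MDR' or 'IVDR'")

print(likely_high_risk("MDR", "IIa"))               # True
print(likely_high_risk("MDR", "I"))                 # False (self-certified)
print(likely_high_risk("IVDR", "A", {"sterile"}))   # True
```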
AI System Definition Under the Act
The AI Act (Article 3(1)) defines an AI system as a machine-based system designed to operate with varying levels of autonomy, that may exhibit adaptiveness after deployment, and that infers from the input it receives how to generate outputs (predictions, content, recommendations, or decisions) that can influence physical or virtual environments.
Practical Examples:
Diagnostic algorithms analyzing medical images
Clinical decision support systems
Predictive analytics for patient outcomes
AI-powered surgical navigation systems
Machine learning models for drug dosing
Limited-Risk vs. High-Risk Distinction
Limited-Risk AI Systems:
Simple rule-based systems
Basic data processing without learning capabilities
Systems with minimal impact on clinical decisions
Limited-Risk Obligations: While manufacturers of medical devices deemed limited-risk under the AI Act escape the complexity of notified body assessment under the Act, they must still comply with certain provisions (notably transparency obligations) in addition to the medical device regulations.
Critical EU AI Act Timeline for Medical Devices
Understanding the phased implementation is crucial for compliance planning. The AI Act obligations apply at different times based on risk classification and system type.
Already in Effect (February 2, 2025)
AI Literacy Requirements: Providers and deployers of AI systems must ensure "a sufficient level of AI literacy" among their staff and other persons dealing with the operation and use of AI systems on their behalf. For manufacturers of medical devices that include an AI system, this means additional staff training.
Prohibited AI Practices: Systems using manipulative techniques or exploiting vulnerabilities are banned.
Near-Term Deadlines (2025)
July 10, 2025: GPAI Code of Practice published
August 2, 2025: Obligations for general-purpose AI (GPAI) models apply, 12 months after entry into force
Critical Implementation Date: August 2027
Medical Device Extension: Medical devices that qualify as high-risk AI systems will have an extra year (until 2 August 2027) to comply with the applicable requirements for high-risk systems.
What This Means:
New AI-enabled medical devices must be compliant before they can be placed on the market after 2 August 2027
Existing devices need compliance only if they undergo "significant changes"
The term "placing on the market" refers to individual devices, not device classes
Dual Compliance Requirements: AI Act + MDR/IVDR
The most complex aspect of AI Act compliance is managing dual regulatory pathways that impose overlapping but distinct requirements.
AI-Specific Compliance Requirements
Risk Management Systems:
AI-specific risk assessment protocols
Algorithmic bias detection and mitigation (a minimal sketch follows this list)
Human oversight and intervention capabilities
Performance monitoring throughout the lifecycle
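As one concrete instance of bias detection (referenced above), here is a minimal Python sketch; the subgroups, metric, and 5% gap threshold are illustrative assumptions rather than AI Act requirements:

```python
from collections import defaultdict

def sensitivity_by_subgroup(records, max_gap=0.05):
    """Compute per-subgroup sensitivity (true positive rate) and flag subgroups
    falling more than `max_gap` below the best-performing subgroup.
    `records` is an iterable of (subgroup, y_true, y_pred) with binary labels."""
    tp, fn = defaultdict(int), defaultdict(int)
    for subgroup, y_true, y_pred in records:
        if y_true == 1:
            (tp if y_pred == 1 else fn)[subgroup] += 1
    sens = {g: tp[g] / (tp[g] + fn[g]) for g in set(tp) | set(fn)}
    best = max(sens.values())
    flagged = {g: s for g, s in sens.items() if best - s > max_gap}
    return sens, flagged

# Illustrative records: (subgroup, ground truth, model prediction)
data = [("A", 1, 1), ("A", 1, 1), ("A", 1, 0),
        ("B", 1, 1), ("B", 1, 0), ("B", 1, 0)]
sens, flagged = sensitivity_by_subgroup(data)
print(sens)     # {'A': 0.67, 'B': 0.33} (approx.)
print(flagged)  # {'B': 0.33...} -> investigate, mitigate, and document
```

Gaps flagged this way feed into the AI-specific risk assessment and its documentation.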
Data Governance:
High-quality dataset requirements with documentation
Data bias assessment and mitigation strategies
Training data version control and traceability (a manifest sketch follows this list)
Input data relevance and quality validation
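For training data version control and traceability (referenced above), one lightweight approach is a content-hashed manifest per dataset version, so any model release can be traced to the exact data it was trained on. This is a sketch under assumed file-based datasets; the paths and fields are illustrative.

```python
import hashlib
import json
import pathlib
from datetime import datetime, timezone

def dataset_manifest(data_dir: str, version: str, source: str) -> dict:
    """Pin a dataset version by hashing every file it contains."""
    root = pathlib.Path(data_dir)
    entries = {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*")) if p.is_file()
    }
    return {
        "dataset_version": version,
        "source": source,  # provenance, e.g. acquisition site and export date
        "created_utc": datetime.now(timezone.utc).isoformat(),
        "file_count": len(entries),
        "sha256": entries,
    }

# Illustrative usage (paths are hypothetical):
manifest = dataset_manifest("data/train_v3", version="3.0.0",
                            source="site-A export, January 2025")
pathlib.Path("manifests").mkdir(exist_ok=True)
pathlib.Path("manifests/train_v3.json").write_text(json.dumps(manifest, indent=2))
```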
Transparency and Explainability:
Algorithm decision-making transparency
User notification of AI system interaction
Documentation of AI system capabilities and limitations (a model-card-style sketch follows this list)
Instructions for use with AI-specific information
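One common way to document capabilities and limitations in a reviewable form (referenced above) is a model-card-style record. The fields and values below are illustrative assumptions, not a template mandated by the AI Act.

```python
# Model-card-style record for a hypothetical device; values are invented for illustration.
model_card = {
    "system_name": "ExampleCAD",
    "intended_purpose": "Flag suspected lung nodules on chest CT for radiologist review",
    "intended_users": ["board-certified radiologists"],
    "inputs": "Non-contrast chest CT, slice thickness 0.5-2.0 mm",
    "outputs": "Bounding boxes with per-finding confidence scores",
    "performance": {"sensitivity": 0.91, "specificity": 0.88,
                    "test_set": "internal hold-out v2"},
    "known_limitations": [
        "Not validated for patients under 18",
        "Reduced performance at slice thickness above 2.0 mm",
    ],
    "human_oversight": "Advisory only; a radiologist confirms or rejects every finding",
}
```

The same record can feed the instructions for use and the user-facing notice that an AI system is involved.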
MDR/IVDR Integration Challenges
Documentation Overlap: While some requirements seemingly overlap with the requirements under the EU Medical Devices Regulation (MDR) and In Vitro Diagnostic Medical Devices Regulation (IVDR), manufacturers will need to conduct a gap analysis to remediate any process shortcomings.
Notified Body Coordination: Notified bodies designated under both the AI Act and the MDR/IVDR can perform a single, integrated conformity assessment. However, the added scope may lengthen assessment timelines and delay affixing the CE mark to medical devices in the EU.
Step-by-Step AI Act Compliance Implementation
Phase 1: Assessment and Gap Analysis (Months 1-3)
Determine AI Act Applicability:
Evaluate if your medical devices include "AI systems" as defined
Assess high-risk classification criteria
Identify prohibited AI practices to avoid
Conduct Compliance Gap Analysis:
Compare current MDR/IVDR processes against AI Act requirements
Identify documentation and process gaps
Assess staff training needs for AI literacy
Key Deliverables:
AI system classification report
Compliance gap analysis document
Resource allocation plan for implementation
Phase 2: Foundation Building (Months 4-8)
Implement AI Literacy Programs: Since February 2, 2025, organizations must ensure employees involved in AI use and deployment have adequate AI literacy.
Training Requirements:
Technical staff understanding AI system operations
Regulatory team knowledge of AI Act requirements
Quality team capability for AI-specific assessments
Clinical team understanding of AI decision-making
Establish Dual Documentation Systems: Because each regulation imposes specific requirements, it may be more practical to maintain distinct technical documentation supporting compliance with the AI Act and with the MDR and/or IVDR.
Phase 3: System Development and Validation (Months 9-18)
Risk Management Integration:
Extend existing ISO 14971 risk management to include AI-specific risks
Implement algorithmic bias detection protocols
Establish human oversight mechanisms (see the routing sketch after this list)
Develop performance monitoring systems
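To make human oversight mechanisms concrete (see the item above), here is a minimal sketch of confidence-based routing, where low-confidence or out-of-distribution outputs are withheld from automatic release and queued for clinician review; the threshold and queue are illustrative assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class OversightRouter:
    """Route AI outputs either to release or to a human review queue."""
    confidence_threshold: float = 0.80          # illustrative cut-off
    review_queue: list = field(default_factory=list)

    def route(self, finding_id: str, confidence: float,
              out_of_distribution: bool) -> str:
        if out_of_distribution or confidence < self.confidence_threshold:
            self.review_queue.append((finding_id, confidence))
            return "held_for_human_review"
        return "released_with_ai_label"         # users are told the output is AI-generated

router = OversightRouter()
print(router.route("finding-001", 0.95, out_of_distribution=False))  # released_with_ai_label
print(router.route("finding-002", 0.55, out_of_distribution=False))  # held_for_human_review
print(router.review_queue)                                           # [('finding-002', 0.55)]
```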
Data Governance Implementation:
Access and use reliable datasets in compliance with the AI Act
Implement data quality management systems (an input-validation sketch follows this list)
Establish data bias assessment protocols
Create training data documentation standards
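As a concrete instance of a data quality gate (see the item above), inputs can be checked against the device's documented operating envelope before the model runs; the field names and acceptance ranges below are hypothetical.

```python
def validate_input(metadata: dict) -> list:
    """Return the reasons an input falls outside the documented envelope.
    An empty list means the input is acceptable for automated analysis."""
    issues = []
    if metadata.get("modality") != "CT":
        issues.append("unsupported modality")
    thickness = metadata.get("slice_thickness_mm")
    if thickness is None or not 0.5 <= thickness <= 2.0:
        issues.append("slice thickness outside validated range (0.5-2.0 mm)")
    if metadata.get("patient_age", 0) < 18:
        issues.append("patient age below validated population")
    return issues

# Illustrative usage:
print(validate_input({"modality": "CT", "slice_thickness_mm": 1.0, "patient_age": 54}))  # []
print(validate_input({"modality": "MR", "slice_thickness_mm": 3.0, "patient_age": 54}))
# ['unsupported modality', 'slice thickness outside validated range (0.5-2.0 mm)']
```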
Phase 4: Notified Body Engagement (Months 19-24)
Notified Body Selection: Reach out to your notified body to learn about their accreditation under the AI Act and their expectations on AI Act requirements.
Submission Preparation:
Prepare dual certification documentation
Coordinate AI Act and MDR/IVDR assessments
Plan for extended review timelines
Prepare for potential additional requirements
Estimated Cost Analysis: EU AI Act Compliance Investment
Initial Implementation Costs
AI Literacy Training: €25,000-75,000
Staff training program development
External training provider costs
Internal resource allocation for training delivery
Documentation and Process Development: €75,000-200,000
AI-specific technical documentation
Process integration and gap remediation
Legal and regulatory consulting
System Development: €100,000-500,000
Risk management system enhancements
Data governance infrastructure
Performance monitoring capabilities
Human oversight mechanism implementation
Notified Body Assessment: €100,000-300,000
Dual certification assessment fees
Extended review timelines
Potential additional testing requirements
Ongoing Compliance Costs
Annual AI Audits: €50,000-100,000
Enhanced Post-Market Surveillance: €25,000-75,000 annually
Staff Training Updates: €15,000-30,000 annually
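Taking the ranges above at face value, a quick back-of-the-envelope total (a sketch only; actual figures depend on portfolio size and existing QMS maturity):

```python
# Ranges in EUR thousands, copied from the estimates above (illustrative only).
initial = {
    "AI literacy training": (25, 75),
    "Documentation and process development": (75, 200),
    "System development": (100, 500),
    "Notified body assessment": (100, 300),
}
ongoing_annual = {
    "Annual AI audits": (50, 100),
    "Enhanced post-market surveillance": (25, 75),
    "Staff training updates": (15, 30),
}

def total(ranges):
    return sum(lo for lo, _ in ranges.values()), sum(hi for _, hi in ranges.values())

print("Initial: EUR %dk-%dk" % total(initial))                   # EUR 300k-1075k
print("Ongoing: EUR %dk-%dk per year" % total(ongoing_annual))   # EUR 90k-205k per year
```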
ROI and Business Impact
Market Access Value: Companies with compliant AI systems gain competitive advantages in the €160 billion European medical device market.
Innovation Acceleration: Structured AI governance enables more systematic and safer AI development, potentially reducing development risks and improving product quality.
Global Positioning: EU AI Act compliance positions companies for similar requirements emerging in other markets.
Common Compliance Mistakes to Avoid
Critical Implementation Errors
Underestimating Timeline Requirements: Many companies assume the August 2027 deadline provides ample time, but the complexity of dual certification requires an early start.
Inadequate AI Literacy Investment: Treating AI literacy as a checkbox requirement rather than building genuine organizational capability for long-term compliance.
Siloed Compliance Approach: Managing AI Act compliance separately from MDR/IVDR rather than integrating requirements into unified systems.
Insufficient Notified Body Planning: Waiting too long to engage notified bodies who may have limited capacity for dual assessments.
Documentation and Process Pitfalls
Generic Risk Assessments: Using standard risk management approaches without addressing AI-specific risks like algorithmic bias and data quality.
Poor Data Documentation: Insufficient documentation of training data sources, bias assessment, and data governance protocols.
Inadequate Human Oversight Design: Implementing superficial human oversight rather than meaningful intervention capabilities.
Future Outlook and Strategic Recommendations
Regulatory Evolution Trends
Standards Development: Watch for new European harmonised and international standards (e.g., on quality management or data quality); conformity with harmonised standards creates a presumption of conformity with the AI Act.
Guidance Publications: Guidance on the interplay between the AI Act and MDR/IVDR (MDCG 2025-6) was published in June 2025.
Global Harmonization: The EU AI Act is influencing AI regulation development worldwide, creating opportunities for companies with robust compliance frameworks.
Strategic Positioning for Success
Proactive Implementation: Companies implementing AI Act compliance early gain competitive advantages through:
Streamlined market access processes
Enhanced investor confidence
Improved product quality and safety
Global regulatory leadership positioning
Innovation Enablement: Rather than viewing AI Act compliance as burden, leading companies use it as framework for:
Systematic AI development processes
Enhanced clinical validation protocols
Improved post-market surveillance capabilities
Stronger patient safety outcomes
About Complizen
Complizen simplifies regulatory compliance for medtech companies using AI, helping life-saving innovations reach patients faster. We've guided hundreds of companies through FDA, EU MDR, and global regulatory pathways, from early-stage startups to established medical device manufacturers. Our mission is to make regulatory expertise accessible so breakthrough medical technologies can improve lives worldwide.
Frequently Asked Questions
When do EU AI Act requirements actually apply to my medical device?
AI literacy requirements already apply (since February 2025). High-risk AI system requirements apply to medical devices from 2 August 2027, and new devices placed on the market after that date must be compliant from day one.
Can I use existing MDR/IVDR documentation for AI Act compliance?
Partial overlap exists, but AI Act requires specific documentation for data governance, algorithmic transparency, and AI-specific risk management that typically requires separate or enhanced documentation.
What happens if my medical device AI doesn't qualify as "high-risk"?
You still need AI literacy compliance and must inform users that they are interacting with an AI system. However, you avoid the AI Act's notified body assessment requirements (your existing MDR/IVDR obligations still apply).
How do I know if my notified body can handle AI Act assessments?
Contact your notified body directly about their AI Act accreditation status and timeline. Many are still developing AI assessment capabilities.
Can I delay compliance until August 2027?
While medical devices have until August 2027, early implementation provides competitive advantages and avoids last-minute compliance risks when notified body capacity may be constrained.