Definition of Shadow AI
Shadow AI refers to the use of artificial intelligence tools and services by employees within an organization without the approval, oversight, or even knowledge of the IT department or Chief Information Security Officer (CISO).
Shadow AI is an extension of the Shadow IT concept, but with specific characteristics that make it harder to detect and potentially more dangerous.
Examples of Shadow AI
- A developer who pastes source code into ChatGPT for debugging
- A manager who uses Claude to draft strategic memos from internal data
- A sales rep who uploads customer data into an AI tool to generate personalized emails
- A designer who uses Midjourney or DALL-E with briefs containing confidential information
- An HR professional who uses an AI tool to screen resumes without evaluating for bias
- An employee who installs an AI browser extension without checking its data access permissions
Why Shadow AI Is Riskier Than Traditional Shadow IT
| Factor | Shadow IT | Shadow AI |
|---|---|---|
| Detection | Invoices, network traffic, SSO | Often invisible (free, web-based tools) |
| Data exposure | Stored on the platform | Potentially used to train models |
| Speed of adoption | Weeks/months | Minutes |
| Regulation | GDPR, NIS2 | GDPR + EU AI Act + NIS2 + DORA |
| Irreversibility | Data can be recovered | Data potentially embedded in AI models |
Legal and Regulatory Risks
Shadow AI exposes organizations to sanctions under multiple regulations:
- GDPR: Transfer of personal data to servers outside the EU, lack of legal basis for AI processing
- EU AI Act: Use of AI systems without risk classification, absence of mandatory documentation
- NIS2 and DORA: Failure to map ICT suppliers, unmanaged supply chain risks
The CNIL explicitly recommends maintaining an AI processing register and conducting impact assessments for large-scale personal data processing.
Key Figures
- 75% of French professionals use generative AI at work, mostly without formal approval (Odoxa/Microsoft, 2024)
- 50%+ of enterprises use generative AI in production (Gartner, 2024)
- 214: average number of AI tools detected by Avanoo at client organizations, compared to 7 known by IT
- $4.88M: average cost of a data breach (IBM, 2024)
How to Govern Shadow AI
Shadow AI governance rests on three pillars:
- Visibility: Map all AI usage across the organization using combined methods (SSO, browser extension, proxy)
- Classification: Assess each tool's risk based on objective criteria (data handling, hosting, compliance)
- Framework: Define clear policies and offer approved alternatives rather than outright bans
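The visibility pillar above can be sketched as a simple log-matching pass. This is a minimal illustration, not a production detector: the domain list, log format, and function name are assumptions chosen for the example, and real proxy logs (e.g. Squid or a SIEM export) would need a proper parser and a maintained inventory of AI services.

```python
# Illustrative sketch: flag requests to known AI tools in simplified proxy logs.
# The domain list below is a small, hypothetical sample, not an exhaustive inventory.
AI_DOMAINS = {
    "chat.openai.com",
    "claude.ai",
    "gemini.google.com",
    "midjourney.com",
}

def flag_ai_requests(log_lines):
    """Return (user, domain) pairs for requests hitting known AI domains.

    Assumes each line is 'timestamp user domain' (a simplified format
    for this sketch); malformed lines are skipped.
    """
    hits = []
    for line in log_lines:
        parts = line.split()
        if len(parts) != 3:
            continue  # skip lines that do not match the expected format
        _timestamp, user, domain = parts
        if domain in AI_DOMAINS:
            hits.append((user, domain))
    return hits

# Example usage with fabricated log entries:
logs = [
    "2024-06-01T09:14 alice chat.openai.com",
    "2024-06-01T09:15 bob intranet.example.com",
    "2024-06-01T09:16 carol claude.ai",
]
print(flag_ai_requests(logs))
```

In practice this pattern is only one of the combined methods mentioned above; SSO audit logs and browser-extension telemetry catch usage that never crosses a corporate proxy.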
For a comprehensive guide on this topic, see our Shadow AI in the workplace guide 2026.
Related Terms
- Shadow IT: IT technologies used without IT approval
- SaaS Management: Centralized governance of SaaS applications
- NIS2/DORA Compliance: European cybersecurity regulations
- EU AI Act: European regulation on artificial intelligence
Olivia Dubois is a Shadow AI expert and Chief AI Officer at Avanoo. An HEC Paris graduate and former BCG consultant, she helps enterprises detect and govern Shadow AI and Shadow IT.