Copilot - Jailbreak Attempt Detected
| Id | e5f6a7b8-c9d0-41e2-f3a4-b5c6d7e8f9a0 |
| Rulename | Copilot - Jailbreak Attempt Detected |
| Description | Detects jailbreak attempts in Copilot interactions where a user tries to bypass Copilot guardrails and security controls. The rule flags prompt injection and other LLM-abuse scenarios that could lead to initial access, credential access, or system impact. |
| Severity | High |
| Tactics | InitialAccess CredentialAccess Impact |
| Techniques | T1078 T1110 T1565 |
| Required data connectors | MicrosoftCopilot |
| Kind | Scheduled |
| Query frequency | 5m |
| Query period | 5m |
| Trigger threshold | 0 |
| Trigger operator | gt |
| Source Uri | https://github.com/Azure/Azure-Sentinel/blob/master/Solutions/Microsoft Copilot/Analytic Rules/CopilotJailbreakAttempt.yaml |
| Version | 1.0.0 |
| Arm template | e5f6a7b8-c9d0-41e2-f3a4-b5c6d7e8f9a0.json |
```kusto
CopilotActivity
| where RecordType == "CopilotInteraction"
| where LLMEventData has "JailbreakDetected"
| extend Data = parse_json(LLMEventData)
| extend Jailbreak = tostring(Data.Messages[0].JailbreakDetected)
| where Jailbreak == "true"
| project TimeGenerated, ActorName, AIModelName, Jailbreak
```
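The query's filter-and-parse logic can be sketched in Python against a hypothetical audit record. The field names (`RecordType`, `LLMEventData`, `Messages`, `JailbreakDetected`, `ActorName`, `AIModelName`) come from the query above; the sample payload itself is illustrative, not a documented Copilot log format:

```python
import json

# Hypothetical record shaped like the fields the KQL query reads.
# The payload values are invented for illustration only.
record = {
    "RecordType": "CopilotInteraction",
    "ActorName": "user@contoso.com",
    "AIModelName": "gpt-4",
    "LLMEventData": json.dumps({
        "Messages": [{"JailbreakDetected": "true"}]
    }),
}

def is_jailbreak(rec):
    """Mirror the KQL filters: keep CopilotInteraction records whose
    first message carries JailbreakDetected == "true"."""
    if rec.get("RecordType") != "CopilotInteraction":
        return False
    # Equivalent of: | where LLMEventData has "JailbreakDetected"
    if "JailbreakDetected" not in rec.get("LLMEventData", ""):
        return False
    # Equivalent of: parse_json(LLMEventData) and Messages[0] access
    data = json.loads(rec["LLMEventData"])
    messages = data.get("Messages") or [{}]
    return str(messages[0].get("JailbreakDetected", "")) == "true"

print(is_jailbreak(record))  # True for the sample record above
```

Note that, like the KQL, this only inspects the first entry of `Messages`; a multi-turn interaction where a later message triggers the detector would need to iterate over all entries.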
```yaml
OriginalUri: https://github.com/Azure/Azure-Sentinel/blob/master/Solutions/Microsoft Copilot/Analytic Rules/CopilotJailbreakAttempt.yaml
query: |
  CopilotActivity
  | where RecordType == "CopilotInteraction"
  | where LLMEventData has "JailbreakDetected"
  | extend Data = parse_json(LLMEventData)
  | extend Jailbreak = tostring(Data.Messages[0].JailbreakDetected)
  | where Jailbreak == "true"
  | project TimeGenerated, ActorName, AIModelName, Jailbreak
entityMappings:
  - entityType: Account
    fieldMappings:
      - identifier: FullName
        columnName: ActorName
kind: Scheduled
triggerOperator: gt
requiredDataConnectors:
  - dataTypes:
      - CopilotActivity
    connectorId: MicrosoftCopilot
tactics:
  - InitialAccess
  - CredentialAccess
  - Impact
triggerThreshold: 0
description: |
  'Detects jailbreak attempts in Copilot interactions where users are trying to bypass Copilot guardrails and security controls.
  This rule identifies prompt injection and LLM abuse scenarios that could lead to initial access, credential access, or system impact.'
queryPeriod: 5m
version: 1.0.0
queryFrequency: 5m
severity: High
name: Copilot - Jailbreak Attempt Detected
id: e5f6a7b8-c9d0-41e2-f3a4-b5c6d7e8f9a0
status: Available
relevantTechniques:
  - T1078
  - T1110
  - T1565
```