Hi Xiao Bo Gu,
Thanks for posting in the Microsoft Q&A Forum!
The suspension is not something Copilot decides on its own; it is an automated enforcement of Microsoft's Responsible AI and usage policies.
Here’s what’s happening:
- If a user violates policy guidelines (e.g., inappropriate content or unsafe prompts), the system applies a temporary lockout - in this case, 1 hour.
- This is a built-in safeguard, not a manual action by Copilot. It is part of Microsoft's compliance and safety framework designed to prevent misuse.
- After the suspension period ends, access is automatically restored.
If you believe this was a mistake, you can report your concern to Microsoft here.
For more references:
Microsoft 365 Copilot Privacy and Security
Communication Compliance for Copilot
Hope this provides more context. Let us know if there's anything else we can assist you with!
If the answer is helpful, please click "Accept Answer" and kindly upvote it. If you have extra questions about this answer, please click "Comment".
Note: Please follow the steps in our documentation to enable email notifications if you want to receive email notifications for this thread.