- 8:47 · AI Model Penetration: Testing LLMs for Prompt Injection & Jailbreaks · 21.6K views · 8 months ago · YouTube · IBM Technology
- 2:10 · Poetic Prompts Can Jailbreak AI? New Study Shows 62% of Chatbot … · 431 views · 4 months ago · YouTube · WION
- 6:41 · AI Jailbreaking Demo: How Prompt Engineering Bypasses LLM Securi… · 3.1K views · Sep 26, 2024 · YouTube · Packt
- 7:51 · What Is Prompt Injection Attack | Hacking LLMs With Prompt Injecti… · 5.3K views · Jun 20, 2024 · YouTube · Simplilearn
- 15:47 · How AI jailbreaks work and what stops them. (GPT, DeepSeek, Lla… · 11.1K views · Oct 21, 2024 · YouTube · Microsoft Mechanics
- Safeguarding AI against 'jailbreaks' and other prompt attacks · 4 months ago · microsoft.com
- 1:00 · What is Jailbreaking an AI? | What is by Digit EP7 | #jailbreakai #genera… · 1.9K views · Aug 18, 2024 · YouTube · Digit
- This AI Chatbot is Trained to Jailbreak Other Chatbots · Jan 3, 2024 · vice.com
- 2:49 · French hackers show how easy it is to 'jailbreak' Musk's Grok 3 • FRA… · 87.4K views · Feb 20, 2025 · YouTube · FRANCE 24 English
- 10:05 · We tried to jailbreak our AI (and Model Armor stopped it) · 4.4K views · 5 months ago · YouTube · Google Cloud Tech
- 0:53 · How to mitigate GenAI security threats with Azure AI Content Safe… · 1.4K views · Aug 15, 2024 · YouTube · Microsoft Azure
- Poetic prompts can jailbreak AI, study finds 62 per cent of chatbot… · 4 months ago · India Today · Armaan Agarwal
- Study reveals poetic prompts could jailbreak AI · 4 months ago · Mashable · Christianna Silva
- Microsoft: 'Skeleton Key' Jailbreak Can Trick Major Chatbots Into Beh… · Jun 26, 2024 · pcmag.com
- AI Jailbreak | IBM · Nov 12, 2024 · ibm.com
- 3:01 · [2025] How to use ChatGPT dan prompt - Unlock ChatGPT ( ChatG… · 33.6K views · 11 months ago · YouTube · TenorshareOfficial
- 1:06 · Lawmakers press experts on AI chatbot risks amid growing safety… · 4 months ago · The Independent · Vishwam Sankaran
- 9:03 · Private & Uncensored Local LLMs in 5 minutes (DeepSeek and Dolphin) · 605.1K views · Feb 5, 2025 · YouTube · David Bombal
- 5:28 · Notion AI with ChatGPT Hack! · 55.1K views · Dec 26, 2024 · YouTube · Tool Finder
- 1:52 · What is a prompt and how do I write one? · 8.4K views · 7 months ago · YouTube · Microsoft 365
- 6:28 · Episode 4: Indirect Prompt Injection Explained | AI Red Teaming 101 · 3K views · 9 months ago · YouTube · Microsoft Developer
- Jailbreak AI | IBM · Mar 11, 2025 · ibm.com
- The 2026 Guide to Prompt Engineering | IBM · 8 months ago · ibm.com
- It's Surprisingly Easy to Jailbreak LLM-Driven Robots · Nov 11, 2024 · ieee.org
- 10:30 · How Jailbreakers Try to “Free” AI · 258.9K views · Sep 28, 2024 · YouTube · Sabine Hossenfelder
- 0:47 · These AI Prompts Will 10x Your Coding Journey | Learn Faster & C… · 32.9K views · Mar 10, 2025 · YouTube · GeeksforGeeks
- 41:28 · How Microsoft Approaches AI Red Teaming | BRK223 · 10K views · May 27, 2024 · YouTube · Microsoft Developer
- 2:29 · Azure AI Content Safety Prompt Shields · 2.4K views · Jun 4, 2024 · YouTube · Microsoft Developer
- 10:32 · Introduction to Generative AI and LLMs (Part 1 of 18) | Generative A… · Jun 25, 2024 · Microsoft · v-trmyl
- ChatGPT: how to activate DAN mode to jailbreak and use the… · May 24, 2024 · xataka.com