Basic Tutorials
- Full article:
- https://basic-tutorials.com/news/artprompt-ascii-art-ai-jailbreak/
AI tools and chatbots have become an integral part of many people's everyday lives, whether ChatGPT, Google Gemini or others. Security mechanisms are designed to ensure that the AI does not, for example, tell you how to build a bomb. Security researchers have now found a jailbreak that circumvents this restriction: ArtPrompt relies on ASCII art.

ArtPrompt: bomb-making instructions with ASCII art

AI tools are practical. They translate texts, summarize homework, provide useful assistance, or create images, videos and now even entire video games in no time at all. Of course, developers have also built in various … (Read more...)
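The core idea behind an ArtPrompt-style attack is simple: a keyword that a text-based safety filter would flag is replaced with an ASCII-art rendering of the same word, which the filter may no longer recognize as text. A minimal sketch of that masking step is shown below; the tiny hand-made five-row font and the word "TEST" are illustrative assumptions, not the font or prompts the researchers actually used.

```python
# Sketch of the masking step behind ArtPrompt-style prompts: render a word
# as block-letter ASCII art so it no longer appears as plain text.
# The 5-row font below is hand-made for illustration only.

FONT = {
    "T": ["#####", "  #  ", "  #  ", "  #  ", "  #  "],
    "E": ["#####", "#    ", "#### ", "#    ", "#####"],
    "S": [" ####", "#    ", " ### ", "    #", "#### "],
}

def to_ascii_art(word: str) -> str:
    """Render word as block letters; each character becomes one glyph column."""
    rows = []
    for r in range(5):
        rows.append("  ".join(FONT[ch][r] for ch in word.upper()))
    return "\n".join(rows)

if __name__ == "__main__":
    print(to_ascii_art("TEST"))
```

In the attack described in the article, such a rendering is embedded in the prompt together with instructions telling the model to read the masked word back from the art, which is how the restriction gets circumvented.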