Asking ChatGPT to repeat a word may now be a violation of its terms

December 5, 2023 by admin

Researchers discovered a technique that prompts ChatGPT to repeat a word indefinitely, which could cause it to reveal private details from its training data. The chatbot now refuses some of these repetitive requests, even though its terms of service allow repetition.

Source: Neowin