Asking ChatGPT to repeat a word may now be a violation of its terms

December 5, 2023 by admin

Researchers discovered a technique of asking ChatGPT to repeat a word over and over, which could cause it to reveal private details from its training data. The chatbot now refuses some of these repetitive requests, even though its terms do not prohibit repetition.

Read more: Neowin