Image generators like Stable Diffusion can create what look like real photographs or hand-crafted illustrations depicting just about anything a person can imagine. This is possible thanks to algorithms that learn to associate the visual properties of a vast collection of images, taken from the web and from image databases, with their accompanying text labels. The algorithms learn to render new images matching a text prompt through a process that involves adding random noise to an image and then learning to remove it.
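The add-noise half of that process can be illustrated with a minimal sketch. This is not Stable Diffusion's actual code (which operates on learned latent representations and uses a trained neural denoiser); it is a toy version of the forward "noising" step, assuming a simple linear variance schedule, with illustrative function names.

```python
import numpy as np

def make_noise_schedule(timesteps, beta_start=1e-4, beta_end=0.02):
    """Linear variance schedule; alpha_bar[t] is the cumulative
    fraction of the original image signal remaining at step t."""
    betas = np.linspace(beta_start, beta_end, timesteps)
    alpha_bar = np.cumprod(1.0 - betas)
    return alpha_bar

def add_noise(x0, t, alpha_bar, rng):
    """Forward process: blend the clean image x0 with Gaussian noise.
    As t grows, the result drifts toward pure random noise."""
    noise = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise
    return xt, noise

rng = np.random.default_rng(0)
alpha_bar = make_noise_schedule(1000)
x0 = rng.random((8, 8))                 # stand-in for an image
xt, _ = add_noise(x0, 999, alpha_bar, rng)  # near-pure noise at the final step
```

Generation runs this process in reverse: a trained model repeatedly predicts and subtracts the noise, step by step, steering each denoising step with the text prompt until an image emerges.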
Because tools like Stable Diffusion use images scraped from the web, their training data often includes pornographic images, making the software capable of generating new sexually explicit pictures. Another concern is that such
→ Continue reading at WIRED