Adobe Beta-Testing New Tool to Detect Manipulated Images

Adobe released a beta version of a Photoshop tool that will make it easier to determine whether an image is authentic or has been manipulated. The so-called attribution tool, which will first be tested with a select group of users, enables photo editors to attach more detailed, secure metadata to images. In addition to identifying who created the image, the metadata will record how it was altered and whether AI tools were used to do so. Adobe said it will also be clear if the metadata itself has been tampered with. This could be a step toward combating deepfakes.
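Adobe has not detailed how its attribution tool works internally, but the tamper-evidence it describes is typically achieved by cryptographically signing the metadata together with the image. The sketch below is purely illustrative and assumes a symmetric HMAC scheme with a demo key; the function names attach_metadata and verify_metadata are hypothetical, and real provenance systems generally rely on asymmetric signatures and certificate chains rather than a shared secret.

```python
# Illustrative sketch only: NOT Adobe's implementation, just a minimal example of
# how cryptographically signed metadata makes tampering detectable.
import hashlib
import hmac
import json

SECRET_KEY = b"demo-signing-key"  # hypothetical key; real systems use asymmetric signatures


def attach_metadata(image_bytes: bytes, metadata: dict) -> dict:
    """Bundle metadata with a signature computed over the image and the metadata."""
    payload = json.dumps(metadata, sort_keys=True).encode()
    digest = hashlib.sha256(image_bytes + payload).digest()
    signature = hmac.new(SECRET_KEY, digest, hashlib.sha256).hexdigest()
    return {"metadata": metadata, "signature": signature}


def verify_metadata(image_bytes: bytes, record: dict) -> bool:
    """Return True only if neither the image nor the metadata was altered."""
    payload = json.dumps(record["metadata"], sort_keys=True).encode()
    digest = hashlib.sha256(image_bytes + payload).digest()
    expected = hmac.new(SECRET_KEY, digest, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])


if __name__ == "__main__":
    image = b"raw image bytes"
    record = attach_metadata(image, {"creator": "Jane Doe", "edits": ["crop"], "ai_used": False})
    print(verify_metadata(image, record))   # True: nothing has changed
    record["metadata"]["ai_used"] = True    # tamper with the metadata
    print(verify_metadata(image, record))   # False: tampering detected
```

Because the signature covers both the pixels and the metadata, editing either one without re-signing (which requires the key) causes verification to fail, which is the property Adobe says its tool will surface to viewers.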

Scientists and Military Look for Key to Identifying Deepfakes

The term “deepfakes” describes the use of artificial intelligence and computer-generated imagery to make a person (usually a well-known celebrity or politician) appear to do or say things he or she never did. For example, actor Alden Ehrenreich’s face was recently replaced with Harrison Ford’s in footage from “Solo: A Star Wars Story.” The technique can be used simply for entertainment or for more sinister purposes. The more convincing deepfakes become, the more unease they create among AI researchers and the military and intelligence communities. As a result, new methods are being developed to help detect and combat the technology.