A deepfake is a realistic artificial image or video created using deep learning AI trained on authentic images, videos, and audio clips of a target individual. As threat actors collect more data on the people they want to impersonate, their deepfakes become more convincing and harder to detect.
To assess organizations’ preparedness against deepfakes targeting executives and board members, BlackCloak sponsored a study by the Ponemon Institute of nearly 600 U.S. security professionals. The results highlight some alarming trends.