Nvidia's AI makes fake videos that cannot be distinguished from real ones
In its pursuit of cutting-edge artificial intelligence (AI), Nvidia is slowly opening a Pandora's box. The company recently presented an unsupervised machine-learning method that can dramatically alter the content of any video it is fed.
What it can do
Many things. For example, it can change the color of a person's skin (or a military uniform), the weather on a street, or the time of day; add a car to a road; or remove the spots from a leopard. And these are just a few of Nvidia's demos.
How it works
Previous methods for training such AI required large amounts of paired data for comparison. Because so much data was needed, it was problematic to teach the AI to see patterns and regularities. So instead, the AI was taught to treat a picture like a coloring book, where each element corresponds to a typical subtask, such as filling in color, changing colors, increasing the resolution, blurring the picture, and so on.
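The article does not describe Nvidia's method in detail. One common way unsupervised translation systems learn without paired examples is a cycle-consistency constraint: translating an image to another domain and back should reproduce the original. Below is a minimal sketch of that idea, assuming toy stand-in "translator" functions (the names `day_to_night` and `night_to_day` are hypothetical, not part of Nvidia's system):

```python
import numpy as np

# Toy sketch of the cycle-consistency idea used by unsupervised
# image-to-image translation methods. These "translators" are
# hypothetical stand-ins for learned networks, not Nvidia's model.

def day_to_night(img):
    """Toy 'translator': darken the image."""
    return img * 0.5

def night_to_day(img):
    """Toy inverse 'translator': brighten the image back."""
    return img * 2.0

def cycle_consistency_loss(img, forward, backward):
    """Mean absolute error between an image and its round-trip translation.

    With no paired data, training can still penalize a translator pair
    whenever translating A -> B -> A fails to reproduce the original.
    """
    return np.mean(np.abs(backward(forward(img)) - img))

img = np.random.rand(4, 4, 3)  # a tiny fake RGB "frame"
loss = cycle_consistency_loss(img, day_to_night, night_to_day)
print(loss)  # 0.0 here, since the toy pair is an exact inverse
```

In a real system both translators would be neural networks trained jointly, and this loss would be one term alongside adversarial losses that make the translated frames look realistic.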
Why it's cool, and why it's not
On the one hand, this opens up incredible possibilities for video processing. On the other hand, it frightens people. For example, Nvidia's AI can already produce fake videos that an untrained person fails to recognize as fake in 99% of cases. And the Canadian startup Lyrebird recently taught an AI to speak in other people's voices. With such tools, anyone with imagination and persistence can fabricate plausible footage of events that never happened: infants on crosses, armed conflicts, aliens, or a sold-out Timati concert.
Source: The Next Web