Advertisers are starting to realise that they can use “deep fake” technology to better serve diverse communities without having to actually hire people from them.
Up to now, most people have encountered deep fakes in contexts where reality is distorted for nefarious reasons – such as making a politician appear to say something they didn’t, or inserting a Hollywood actress into an adult film she was never anywhere near.
Altering the ethnicity of models in advertising, however, is new territory. Advertising companies can now generate realistic images of humans to cater to shoppers’ preferences, whatever their demographic.
It’s all about AI
One such company, Tangent.ai, claims its AI algorithms can help customers see what products will look like on them – whether that means changing the colour of a model’s lipstick or going further and completely altering the model’s appearance.
It says that 44 per cent of consumers are likely to become repeat buyers after a personalised experience. What better way, you could argue, to personalise that experience than by making the model look like someone of a similar background and age?
So, with the advent of deep fake technology and the ability to digitally alter how people appear, it’s inevitable that advertisers will start using it too.
After all, using computer-generated models to represent multiple demographics reduces both the need for, and the cost of, hiring people from those backgrounds.
Just think about that for a moment. One model, but multiple ethnicities all covered in one shoot, all thanks to the power of AI. It’s like Photoshop turbo boosted.
For the greater good?
Advertisers will argue that it’s about creating a wider diversity in the faces we see, and therefore a good thing. On the other hand, those concerned will say that it’s about losing cultural identity and jobs.
However, what if it’s not about the colour of a model’s skin, but merely about changing their make-up or hair colour? Or making it look as though they are speaking another language, perfectly lip-synced? And what happens when these models aren’t even real to begin with?
It’s something already being experimented with in Asia, where digitally created models and influencers have garnered huge followings. It’s also something author William Gibson touched on in his 1996 novel Idoru.
For now though, the potential ethical and financial ramifications are mind-boggling.