
AI means influencer agencies are tightening creator regulations

Content creators and agencies continue to experiment with generative AI, from AI-generated images and sounds to social media filters and virtual AI influencers. For agencies and their clients, failing to disclose the use of AI content risks damaging a brand and its reputation. And for influencers, it could undermine their authenticity and engagement with followers. As a result, influencer agencies are having to tighten their safety measures.

Ben Jeffries, CEO of influencer marketing company Influencer, said that audience trust could decline if followers perceive content as “manipulative.” Jeffries pointed to risks such as deepfake technology, misinformation and harmful biases and stereotypes that could be perpetuated by AI-generated content. “Brands must strike a balance between using technology to enhance their influencer campaigns and maintaining the human element that makes them truly resonate with consumers,” Jeffries said. “After all, AI is only as unbiased as the data it’s trained on, and if that data is flawed or discriminatory, the content it produces will be, too.”

In case you’re wondering, AI adoption among influencers is widespread: some 94.5% of U.S. creators surveyed this year said they use AI for content editing and generation, according to The Influencer Marketing Factory.

Amy Luca, EVP and global head of social at the agency network, said conversations around legal contracts will increasingly cover transparency and disclosure when AI is used in content.

Influencers have to be “very clear and tacit about the ground rules for how that content is created,” Luca told Digiday. “I think that’s something we’re going to see a lot more from an ethics perspective, but more importantly from a brand safety and brand transparency perspective. Because at the end of the day, it only takes one influencer to produce content that is not authentic or is done through AI and for that to be found out.”

Just as an influencer promoting a mascara shouldn’t wear fake lashes to demonstrate the product’s abilities, undisclosed AI in content should be treated the same way.

As more influencers turn to AI not just for content creation but for scaling and business efficiency tools, Ryan Detert, CEO of influencer marketing company Influential, said the technology allows them to grow their businesses rapidly. Detert said his company ensures that data is consented to and anonymized, with AI brand safety tactics in place to protect clients – a safety mechanism implemented seven years ago.

“Many are also using AI to generate or inspire creative concepts, creative briefs, or customer and creator support tickets,” Detert said. “To us, AI is a tool just like a calculator, a force multiplier to get more efficient.”

Another AI use case taking off among creators is licensing their likeness or voice for AI integration – in effect, cloning themselves. This has already happened with a 23-year-old Snapchat mega influencer whose AI version of herself ‘dates’ more than 1,000 of her followers for $1 a minute.

AI can also help brands directly. For example, several AI tools support brand safety by analyzing uploaded content or captions and providing insight into how the information will be perceived by different audiences, whether by region, age or other demographics.
