Publicis Media Updates Ad Technology Assessment Process To Avoid Cultural Stereotypes, But Getting Information About Algorithms ‘Is Difficult’
Agencies and their clients may be more aware of where they get their data and technology these days, but when it comes to delivering on promises of a more inclusive and less discriminatory advertising landscape, there are still a lot of dark corners to be inspected. Publicis Media is trying to shed some light.
With the help of its multicultural practice group, the media agency arm of Publicis Groupe is updating its process for evaluating technology and data providers, including demand- and supply-side platforms as well as data management tools. Publicis Media is also incorporating new methods to assess whether these companies’ systems perpetuate audience-segmentation stereotypes or unfairly deprioritize the advertising inventory of minority-owned publishers.
Specifically, Publicis Media is adding a new set of 30 to 40 questions to an already lengthy request-for-information process that includes approximately 1,200 questions, which the agency submits to providers on an annual basis and which takes each provider approximately two months to complete. The addition of new metrics to the process – which the agency has dubbed Verified – aims to help clients assess potential technology and data vendors by getting them to provide more information on topics such as audience modeling, methodology and taxonomy. The process also aims to convince tech companies to provide access to the algorithmic systems used to make decisions such as audience targeting and inventory pricing.
While the answers to the added questions may not fully reveal what goes on inside often opaque ad technology systems, the new requests reflect issues that have not necessarily been addressed in a formal way before.
The latest list of questions includes, “Are you modeling your ethnic, racial, sexual orientation, gender, and religious segments?” and follows up by asking providers for “a detailed explanation of the process for each group.” Another question asks, “What guidelines or checks and balances do you have in place to ensure the accuracy of data relating to ethnic, racial, sexual orientation, gender and religion?” The responses can allow the agency to compare the level of detail technology providers supply, or how much attention they have paid to their technology’s impact on multicultural equity.
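One simple way to picture the kind of “checks and balances” that question is probing for: comparing a vendor’s modeled demographic labels against a small independently verified panel and reporting accuracy per group. The sketch below is a hypothetical illustration of that idea, not Publicis Media’s actual methodology or any vendor’s real data.

```python
# Hypothetical accuracy check: modeled demographic labels vs. a verified panel.
from collections import defaultdict

def per_group_accuracy(modeled, verified):
    """Share of panelists whose modeled label matches the verified label,
    broken out by verified group."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for person_id, true_label in verified.items():
        totals[true_label] += 1
        if modeled.get(person_id) == true_label:
            hits[true_label] += 1
    return {g: hits[g] / totals[g] for g in totals}

# Toy panel: the vendor's modeled labels vs. independently verified labels.
modeled  = {"p1": "A", "p2": "B", "p3": "A", "p4": "B", "p5": "A"}
verified = {"p1": "A", "p2": "A", "p3": "A", "p4": "B", "p5": "B"}

print(per_group_accuracy(modeled, verified))
```

Uneven accuracy across groups – here the model labels group A correctly two times out of three but group B only half the time – is exactly the kind of gap the new questions are designed to surface.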
“Currently, there is no market standard in terms of ethical review of data and algorithms,” said Shelley Pinsonneault of Publicis Media, who oversees the agency’s global technology team that conducts the Verified assessments. The decade-old data and technology assessment process aims to help clients determine which approaches suit them best, rather than scoring technology vendors or labeling them as doing something right or wrong. “Each brand has a different threshold for how they approach data and use the information to make decisions,” she said.
Every year, Publicis Media staggers its assessments by category, and so far it has only incorporated the new questions and criteria addressing multicultural concerns and impact when inspecting data companies last year. Now it plans to add the questions to assessments of technology providers in a dozen other areas, such as DSPs or digital out-of-home advertising technology.
Exploring how DSPs value minority ad inventory
The aim of Publicis Media is to get answers about how advertising systems are designed, in order to understand whether their data flows or their automated rules and decision-making processes could negatively affect people in certain cultural groups or lead to discrimination against media that target niche groups, explained Jennifer Garcia, SVP, multicultural data science and research lead at Cultural Quotient, the agency’s multicultural practice group.
For example, to assess how algorithmic advertising systems make decisions that may affect the delivery of ads in media reflecting a diversity of groups and cultures, Garcia and her team could examine how DSPs decide whether to bid on a publisher’s inventory, how much to bid, and how and where ad impressions show up on web pages or in apps – all to assess how ad systems prioritize, or de-emphasize, publishers of certain types of content.
“Depending on the type of relationship that exists between certain publishers and the DSP, it may or may not play into which publishers a DSP treats preferentially. And some endemic minority groups and publishers may be at a disadvantage, which also plays into the vicious cycle of a lack of data or information to help inform future multicultural campaigns for clients genuinely invested in the segment,” Garcia said.
Publicis Media also aims to ensure that audience segmentation does not reinforce stereotypes or put all people in certain groups into a monolithic category, Garcia said. Her team is therefore looking for details on how audiences are constructed. For example, when reviewing a segment reaching African American men ages 18 to 34, she said, “It’s not just the hip-hop listener and the sneaker wearer. There are things like photography and art, entrepreneurship, interest in all levels of college, food, family, health – there is so much more. When we don’t bring these opportunities to the fore, we are missing an extremely important opportunity to effectively reach an audience with not only purchasing power but also influence in society,” Garcia added.
Getting details on algorithms is not easy
Publicis Media may spend millions of dollars on behalf of advertisers, but it still struggles to access all the details it could wish for about how data is obtained or how technologies are constructed, according to Pinsonneault. “The manner in which [tech firms are] going to open the hood and talk about their algorithms is difficult,” she said.
Nonetheless, the agency is pressing tech companies for information about the ethical considerations underlying their algorithms. For example, a new question in the evaluation questionnaire asks technology providers: “What is your approach to guarantee the algorithmic [or] AI bias doesn’t exist in your audience?”
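One common way vendors could operationalize an answer to that question is a demographic-parity check: does the audience model select people from each group at comparable rates? The sketch below uses the conventional “four-fifths” rule of thumb as a flag threshold; the threshold, group names and data are illustrative assumptions, not anything Publicis Media or its vendors have specified.

```python
# Hypothetical demographic-parity check on an audience model's selections.
def selection_rates(records):
    """Per group: share of people the model selected into the audience."""
    counts = {}
    for group, selected in records:
        n, k = counts.get(group, (0, 0))
        counts[group] = (n + 1, k + (1 if selected else 0))
    return {g: k / n for g, (n, k) in counts.items()}

def disparate_impact_ratio(records):
    """Lowest group selection rate divided by the highest."""
    r = selection_rates(records)
    return min(r.values()) / max(r.values())

# Toy records: (group, was this person selected by the model?)
records = [("g1", True), ("g1", True), ("g1", False),
           ("g2", True), ("g2", False), ("g2", False)]
ratio = disparate_impact_ratio(records)
print(ratio, "flag" if ratio < 0.8 else "ok")
```

Here group g2 is selected at half the rate of g1, so the ratio falls below the 0.8 threshold and the segment would be flagged for review.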
“Of course, tech companies don’t [let] other companies have access to the source code or algorithms. It’s a trade secret,” said Tae Wan Kim, associate professor of business ethics at Carnegie Mellon University’s Tepper School of Business, who studies ethical issues in AI. “A better approach is to rely on a trusted third party who verifies the algorithms for the company,” he said.
Although Publicis Media itself does not work with third-party auditors in its Verified assessment process, the agency plans to add a query to its RFI asking whether partners have engaged third-party auditors, which could include companies like Neutronian, which assesses data supply and data models, and TruthSet, which assesses the accuracy of demographic audience data.
Kim pointed to tools that help explain algorithmic models, such as LIME and SHAP, and argued that companies ultimately have a financial incentive to build trust in their algorithmic systems and data through auditing. Noting the rise of AI ethics roles at companies like Salesforce, he said, “An ethically justified algorithm will also be a great asset.”
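LIME and SHAP are full libraries, but the underlying idea – measure how much each input feature drives a model’s output by perturbing it and watching the output change – can be sketched in a few lines. The toy “model” below is a hypothetical stand-in for an opaque audience scorer and bears no relation to either library’s actual API.

```python
# Toy permutation-importance sketch of the idea behind explanation tools.
import random

def toy_model(features):
    # Pretend audience score: leans heavily on feature "a", barely on "b".
    return 5.0 * features["a"] + 0.1 * features["b"]

def permutation_importance(model, rows, feature, trials=50, seed=0):
    """Average absolute change in the model's output when one feature's
    values are shuffled across rows -- a crude sensitivity measure."""
    rng = random.Random(seed)
    base = [model(r) for r in rows]
    deltas = []
    for _ in range(trials):
        shuffled = [r[feature] for r in rows]
        rng.shuffle(shuffled)
        for r, v, b in zip(rows, shuffled, base):
            perturbed = dict(r)
            perturbed[feature] = v
            deltas.append(abs(model(perturbed) - b))
    return sum(deltas) / len(deltas)

# Deterministic toy data so the run is reproducible.
rows = [{"a": random.Random(i).random(), "b": random.Random(i + 99).random()}
        for i in range(50)]
for f in ("a", "b"):
    print(f, round(permutation_importance(toy_model, rows, f), 3))
```

Feature “a” dominates the score by construction, and the shuffle test surfaces that, which is the kind of signal a third-party auditor would look for when a vendor cannot open the source code itself.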