Data sold separately: The AI action figure trend

  • Katrina Ingram

Some call it the “Barbie box challenge,” others refer to it as “AI action figures” (#actionfiguretrend), but the results are similar. People are using AI to generate images of themselves as packaged doll-like figures, complete with accessories that relate to their life or job.


This moment reminds me of a similar one that involved a personality quiz which people also thought was harmless fun.

Cambridge Analytica was a political consulting firm that became the centre of a scandal close to a decade ago. It harvested personal data from millions of Facebook users, which was used to build psychological assessments to drive targeted political advertising. By 2018, whistleblowers like Christopher Wylie were sharing intimate details of the company’s inner workings. Surprisingly, how Cambridge Analytica acquired its data wasn’t really all that secret - they just asked for it.


This is your digital life


The personality quiz at the centre of the scandal, called “This is your digital life,” was developed in 2013 by data scientist Aleksandr Kogan. People voluntarily completed versions of this quiz, which was marketed as revealing something insightful about a topic most people can’t resist learning more about - themselves!


The quizzes promised to reveal important details about all kinds of personal topics - from your sex life to your work life. Yet the real objective was to collect data - not only from you but also from your connections. It was this latter detail that was particularly salient once the scandal came to light. Not only had the quiz takers put themselves at risk, they had also implicated their friends and family, because Facebook did not have appropriate safeguards in place. Ultimately, this led to a ban on personality quizzes on Facebook and a lot of outrage at the time of the scandal.


Same but different


If we think about the action figure trend, it has a similar dynamic. Tell this system a bunch of things about you and it will generate an image of you. It'll be fun! The more information you share, the more detailed the image the system can generate. We see this in three different levels of prompts in a USA Today story entitled “We turned ourselves into action figures using AI. Here’s how it went.” The first reporter, Mary, entered a fairly basic prompt with a photo of herself and got a pretty basic rendering.


"Using this reference picture, can you create a Barbie action figure in 3D of a journalist wearing jeans and a newsprint tank top with a laptop, book, and newspapers in the box? Can you make the box (black) colored, with the box labeled as (Mary)?" - USA Today


The next prompt gives more detail, and the last one, by reporter Greta Cross, pairs a very long and detailed prompt with her image. You can check out the exact wording in the story, but for comparison, Mary’s prompt is 45 words while Greta’s is 143.


We know that generative AI will produce better results with more context. Thus, there is more incentive to share data.

People forget that this information is data and that it might be shared with a bunch of third parties for other purposes. This is the piece that bears some similarity to the Cambridge Analytica personality quizzes. The data sharing is voluntary and is wrapped in a layer of innocuous fun.


In addition, this trend is coaching people on how to use the AI tool, providing a visual demonstration of how more data in the prompt equals a better, more precise result. It's not too much of a stretch to imagine someone applying that learning and oversharing sensitive details of their lives. What goes into a chatbot doesn't stay with a chatbot. The same caution applies to hackers and social engineering attacks - they also find your personal data incredibly useful.


How to become a Narcissist-in-Chief


But it's more than just data sharing that's concerning about this trend. Indulging in 'Barbie-fying' oneself speaks volumes. In the ‘look at me’ world of social media, generative AI kicks things up to a whole other level. It proves to be the perfect tool to create more ‘look at me’ moments. It’s optimized for self-indulgence and personalization. Trends like this one and the recent Studio Ghibli trend - applying the animation style of Hayao Miyazaki to personal photos - feed those impulses. People take part even when they are aware of the harms, such as the high environmental costs of generating these images and the use of copyrighted data to train these models. Miyazaki has called this use of AI an 'insult to life itself'.


There is evidence to suggest that our digital lives are at least part of why we are becoming more narcissistic. The irony of narcissistic personalities is that while they seem on the surface to be uber-confident, they are often very fragile.


What are the social, political and cultural implications of our self-absorbed society?

We’re seeing this experiment unfold in real time. It’s displayed in the behaviour of our elected officials and business leaders, some of whom are the most self-absorbed, emotionally fragile folks imaginable. It’s in the influencer marketplace, where ‘success’ can yield millions but can also take its toll on mental health when life becomes a never-ending public relations exercise. Even regular people need to participate in this social theatre at some level, as it becomes a necessary aspect of building a business or a career.


In a sad irony, it is also AI chatbots that people are turning to for support. A Harvard Business Review report ranks therapy and companionship as the number one use for generative AI in 2025. One can only imagine how much sensitive and detailed data sharing must be taking place when a chatbot is used as a therapist or friend. The product becomes a 'solution' to the problem created by the product.


By Katrina Ingram, CEO, Ethically Aligned AI

 

Ethically Aligned AI is a social enterprise aimed at helping organizations make better choices about designing and deploying technology. Find out more at ethicallyalignedai.com

© 2025 Ethically Aligned AI Inc. All rights reserved.
