[Image: A person in a hoodie and blazer taking a selfie with a smartphone]

Orit’s Two Bodies

Limn’s AI gets ahead (and torso) of itself

In December 2024, Limn received an email from contributor Orit Halpern. She had visited our new website and noticed that her author photo, mysteriously, had been digitally manipulated, with an unfamiliar body appended onto her headshot.

When Limn built the new website, one designer put together a Photoshop script that cropped the images our contributors had shared with us over the past decade to fit a certain aspect ratio and, where necessary, expanded the background using the software's generative fill feature. When the script extended the top and bottom of the photo in question, generative fill invented a body in a conservative sleeveless dress.

Limn editor Gökçe Günel interviewed Orit about the experience. By then, Limn had removed the image, but it remained the top search engine result for her name. Below, we share excerpts from the conversation, which reveal how Limn itself is not immune to ghostwriters.

Gökçe Günel: Could you tell me how you became aware of the digital manipulation?

Orit Halpern: I was googling my citation index, or something self-centered like that. I noticed this image and wondered, why does it look wrong to me? Then I realized, oh, because it’s not my body.

A student in a class also warned me, “Professor Halpern, you’ve got an AI-generated fake of yourself online.” A designer on my [research] team then said, “They were trying to increase the resolution of the picture. You had given them a picture with different specs, and so they just tried to fill in the rest. The generative AI decided that you had, like, pudgy little forearms and made you wear a top you would never pick.” It is super normative.

But it’s not only people who love to look in mirrors. The image shows that machines love themselves, too. It’s machine narcissism at play. It’s the first image that comes up if you search “Orit Halpern,” even though there are so many other images of me on the internet. You can only understand this as AI narcissism. “We made it. We have to pick that picture.” As you can see, it’s all about narcissism here, whether it’s mine or the machine’s.

GG: As editors, we discussed that generative AI had corporatized your image—and I should say, I agree, I don’t think you’d ever wear that top. Did you do anything to get rid of the photograph?

OH: Well, Limn removed the image [from the website], and I thought that would solve the problem. For a while, it did, but then suddenly the image reappeared. Now I’m really curious about the way it haunts us, because none of us know where it is. We’re always worried about data being erased, but what happens when you can’t erase it?

A curious question for you as an editor: Where is our information actually going, and who controls it, and who’s managing it? And can you erase the data you select? Maybe these images were stored somewhere else, and maybe you didn’t have full control over that archive. Maybe it’s continuing to train AI? It leads to interesting questions about the actual organization of the infrastructure underneath websites.

GG: What would it take to have that kind of independence for storing or erasing your data?

OH: Less cloud, I guess, would be the answer. We’ve opted for an energy-intensive computational solution for building network infrastructure, which is very fast and seamless. But what about its political economy?  As we all know, the cloud comes with new kinds of platform organizations, which change who owns, who stores, and who deals with data. We’ve had many different network infrastructures throughout the history of computing, and opting for platforms where all our data is online has certain implications, both for our personal lives and also, more broadly, the right to free speech, public democracy, and all the systems that depend on information transparency. It leads to questions about corporate power and the informational asymmetries such power creates.

GG: This digitally altered author photograph says something about the digitally altered nature of global democracy.

OH: Oh, absolutely, it’s definitely a giveaway. Democracy is about representation. In this case, my representation is being determined by the demand to train AI, and not by the media ecology of my activities or my own will to represent myself. It reveals how little we own our own media assets.

The shocking thing would have been if the image had actually disappeared. You would think, “Oh, wow, nobody stole our data. I can’t believe it!” Then again, maybe I’m also pleased that they cared enough to steal it, but not enough to hide that fact. Corporate negligence offers a sort of ghost transparency that allows us to glimpse the desert of the real, a state where the distinction between reality and its representation has completely collapsed.

That said, I wouldn’t want to rely on the sloppiness of large platforms as the only safeguard for democracy and education [against] propaganda and fake news.

GG: Basically, the AI search engine liked the AI-generated image and didn’t want to let go.

OH: Totally. AI search platforms can’t stop showing that image. They think, “That image was made by us. We love it.” Not that AIs are conscious. We should not be anthropomorphizing them this way. But still, I’m gifting them my narcissism.

GG: Do you have a copy of the unaltered image?

OH: I can’t find it. The irony is, this image will live forever, but the original is lost to history. As a historian, I perceive AI as a storage medium, a form of memory. What gets maintained even as the original data sets are lost? What happens when the digitally altered image is actually the only one you have?