
3D model will let people face the future

Computer scientist Roni Sengupta develops AI-based technology to predict facial changes as someone ages or transitions.

The many faces of Carolina computer scientist Roni Sengupta, who is working on a facial remodeling technology that uses AI. (photos by Megan Mendenhall | composite by Corina Prassos)

In the early 2000s, coming out as transgender wasn’t an option in Kolkata, India.

At 10, Sengupta didn’t have the language for why she secretly dressed up in her mother’s clothes. As a teen, she told her partners, though not her parents, that she knew she was a woman.

“I hoped initially it would go away, like if I pretended to be masculine enough it would eventually go away. But I always knew I was a woman,” says Sengupta, an assistant professor in the College of Arts and Sciences’ computer science department. In spring 2023, Sengupta began the transition process. Her colleagues, friends and wife have been incredibly supportive.

Sengupta’s experience inspired her work in computer vision, a field of artificial intelligence that trains computers to see like a human eye. Sengupta’s lab is creating a 3D model that tracks and predicts changes in someone’s face as they age. In the next five years, she hopes it will help people who are transitioning from one gender to another.

The heart of her work

Sengupta began with degrees in engineering, then pursued a more creative career as a research intern working with 3D reconstruction, computer graphics and AI techniques. In a postdoctoral research position at the University of Washington, she explored new projects and worked with mentors on other graphics technology.

“I want to tell a story,” Sengupta says. “I have always felt like images, videos and other visual mediums were the best way to do that.”

One of Sengupta’s current projects tackles 3D facial modeling. She and her lab aim to create a high-quality model that the AI on someone’s phone or computer could efficiently read to design a realistic face or make realistic changes to someone’s face.

The AI part of Sengupta’s current work relies on artificial neural networks. One model uses the cloud of images provided to AI by the internet. The other is a small, separate neural model just for one person that no one else can access. Someone could use it to change their facial expression, create a realistic avatar of themself or privately track physical changes and create personalized predictions.
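The split described above (one large model trained on shared internet data, plus a small private model fit only to one person) can be illustrated with a toy sketch. Everything in the code is hypothetical: the random-projection "shared" feature extractor, the linear `PersonalModel`, and the toy per-photo target are all stand-ins chosen for illustration, not Sengupta's actual system. The sketch only shows the pattern: frozen shared features feed a tiny model that is trained on, and never leaves, one person's own data.

```python
import numpy as np

rng = np.random.default_rng(0)

def shared_feature_extractor(images):
    """Stand-in for a large cloud-trained network: maps images to features.
    A fixed random projection plays the role of frozen pretrained weights."""
    proj = np.random.default_rng(42).normal(size=(images.shape[1], 16))
    return np.tanh(images @ proj)

class PersonalModel:
    """Tiny per-person model (here just a linear head) kept on-device."""
    def __init__(self, dim):
        self.w = np.zeros(dim)
        self.b = 0.0

    def predict(self, feats):
        return feats @ self.w + self.b

    def fit(self, feats, targets, lr=0.05, steps=300):
        # Plain gradient descent on mean squared error over this
        # one person's data only.
        for _ in range(steps):
            err = self.predict(feats) - targets
            self.w -= lr * feats.T @ err / len(err)
            self.b -= lr * err.mean()

# One person's daily photos (random stand-ins) and a toy per-photo
# attribute, e.g. days since starting treatment.
photos = rng.normal(size=(50, 32))
days = rng.normal(size=50)

feats = shared_feature_extractor(photos)          # shared, generic part
personal = PersonalModel(feats.shape[1])          # private, personal part
before = np.mean((personal.predict(feats) - days) ** 2)
personal.fit(feats, days)
after = np.mean((personal.predict(feats) - days) ** 2)
print(after < before)  # the private model adapts to this person's data
```

The design choice mirrors the privacy argument in the paragraph above: the shared extractor never sees the individual's raw data during personalization, and the personal model is small enough to train and store locally.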

“Most trans women and men capture photos of themselves every day just to notice the smallest changes,” Sengupta shares. “This data exists, and to build accurate models from this would help physicians see how many changes have happened because of the hormone replacement therapy dosage. And they would have a better understanding of the entire timeline.”

Once the technology is more mature and the trans community feels secure using it, Sengupta wants to give people undergoing hormone therapy — a process that can take anywhere from two to five years — a personalized image of what they might look like at the end of their treatment.

The technology also has the potential to help people recovering from injuries see what they’ll look like after healing or facial reconstruction surgery and to provide emotional support for families with missing or deceased relatives.

Ethical unknowns

This kind of advanced computer vision, specifically facial modeling, is expensive, high-powered and already available to corporations like Disney or Netflix. While some might object to making such technology publicly accessible, Sengupta focuses on the good it can offer, such as soothing the anxiety of a young trans person during hormone therapy.

“There is a sense of concern and anxiety for a lot of transgender people,” Sengupta says. “We just don’t know what is going to happen.”

Sengupta hopes to ease this anxiety for others — and that drives her research at Carolina.