Sora is upending video production. It’s now remarkably easy to generate production-quality clips with AI using just a few prompts. I was curious to see whether Sora lived up to the hype by creating an AI-generated video that looked realistic.

As a reporter and writer currently going through IVF, I’ve been sitting on an idea to do an explainer or exposé on the fertility industry, especially on cryobanks. A story on this subject deserves to be as realistic as possible. But maybe Sora could help me get started and, possibly, create some of the aerial shots and street views as B-roll to complement my commentary.


I can be shy on camera, so I wanted to see if Sora could create a clip that didn’t require me to agonize over light, lines and accidental laughs. 

When fertility meets machine learning, something gets lost in the algorithm. Sora couldn’t even spell the word “uterus.”

Here’s my hands-on experience with Sora. I’ll also share what I’ve learned about identifying fake AI-generated videos.

(Disclosure: Ziff Davis, CNET’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.)

My (AI) directorial debut 

OpenAI’s Sora 2 was released in September 2025 as the successor to the flagship model from 2024. As of today, Sora is free, and Sora 2 no longer requires an invite or a code.

I used the original Sora and went straight to the chatbox to describe my video. 


First prompt: “I am going through IVF as a reporter/writer. I want to produce an explainer video, which will become part of a series of videos, ultimately leading to an exposé on the fertility industry. For this first explainer video, create a montage of IVF clips, pull in news headlines and create custom graphics.” 

The resulting video was unusual. The “embryologist” literally face-planted into the precious dish, which was a clear indication that it was AI-generated. The embryos in those dishes are worth tens of thousands of dollars. 

The clips were also surrounded by nonsensical text, like “Brakfoots of tecmofolitgy” and “Breaknctve tennology.”

Two screenshots of AI-generated videos of an IVF lab using Sora

Hair down in the lab? Nonsensical words? Definitely AI-generated.

Amanda Smith/Sora

The second video I produced was better, but it still felt like it was generated by a machine. It’s all about context. I’ve never seen a fertility doctor with their hair down or with a stethoscope around their neck, and AI just doesn’t understand that. All the clips felt like stock medical imagery. 

I realized I had to do it one clip at a time, so I came up with a list of a few visual ideas. For example: “Show an embryo developing in the dish and reaching blastocyst” (a critical moment in IVF). 

It looked like a mass of cells, but it didn’t look 100% like an embryo.

A screenshot of an AI-generated video of an embryo using Sora

It almost looks like an embryo.

Amanda Smith/Sora

I followed up by editing my prompt, asking Sora to make the bubbles bigger and remove the light in the embryo.

While that was loading, I added a few more videos to the queue. 

“Create a video of the female reproductive anatomy to use in an explainer.” Like the embryo above, the result didn’t look scientifically accurate (another giveaway that the video was fake), and Sora continued to make unusual spelling errors, such as “uteryus.”

I had to go back and remix clips, such as directing Sora to “make it look more clinical.” It was frustrating at times, especially with the poor spelling.

A screenshot of an AI-generated video of the human female reproductive system by Sora

Where are the ovaries?

Amanda Smith/Sora

At this point, it was clear that Sora couldn’t produce real words, and it certainly didn’t know what female anatomy looks like.

A screenshot of an AI-generated video of the human female reproductive system by Sora

Nonsensical language and extra parts added to the reproductive system: definitely AI-generated.

Amanda Smith/Sora

I moved away from science to see if Sora would do better at producing a cute baby video. I asked for “a close-up of a newborn, with golden light and purity.”

And finally, we got there.

A screenshot of an AI-generated video of a newborn baby using Sora

This newborn baby looked accurate.

Amanda Smith/Sora

I continued with this theme, requesting clips of a newborn’s adorable feet. 

I asked Sora to remove the baby’s face and zoom in on the tiny feet, but it didn’t do that. It also gave the baby too many or too few toes in several clips (these tools struggle with fingers and toes).

Four screenshots of AI-generated videos of newborn babies using Sora

These babies have too many or too few toes.

Amanda Smith/Sora

Next, I requested a pregnant woman holding her belly. This one worked, and it even got the hands right.

A screenshot of an AI-generated video of a pregnant person using Sora

Sora’s generation of a pregnant person looked decent, though a little lumpy.

Amanda Smith/Sora

Then, I asked for a table with all the IVF meds and needles spread out on it. The little vials could be passable, but the black fluid in the needles? Yikes.

A screenshot of an AI-generated video of IVF medications and needles using Sora

At first glance, these AI-generated IVF medications looked accurate. Then I noticed the black fluid in the needles.

Amanda Smith/Sora

For reference, here’s what my IVF meds looked like.

A photo of all my IVF medications and needles

My actual IVF medications and needles.

Amanda Smith/CNET

To continue testing its capabilities, I prompted Sora to create a video of IVF and fertility news headlines. It struggled with specificity like this.

I don’t even know what language it’s using.

A screenshot of an AI-generated video showing newspaper headlines and articles featuring IVF, generated by Sora

What is this AI language?

Amanda Smith/Sora

For one last shot, I asked for a baby growing in a woman’s belly, but it just showed the belly and added more (out of proportion) hands. I thought we’d moved on from the extra limbs issue.

Two screenshots of AI-generated videos of a pregnant belly using Sora

Extra fingers and hands featured in these AI-generated videos by Sora.

Amanda Smith/Sora

I wasn’t having the best luck with Sora, so I opened ChatGPT for some prompt ideas for my video, and it got weirder. 

ChatGPT recommended asking Sora for: 

“An egg floating in a galaxy, symbolizing creation and possibility.”

“A garden of glass flowers representing embryos growing in a lab.”

“A digital twin of the human reproductive system made of data streams and light.”

“Futuristic fertility lab run by AI robots, sterile minimal white environment, cinematic lighting.”

“Butterfly emerging from a test tube, symbolizing hope and transformation.”

You can’t even wear perfume or use scented shampoo on the egg retrieval and embryo transfer days, so the thought of a garden of glass flowers in an IVF lab is ludicrous. And most of the suggestions were a little too metaphorical to be helpful.

Time to call it. I downloaded the few videos that were passable.  

Four screenshots of AI-generated videos of an IVF lab, a pregnant person, a newborn baby and IVF medications using Sora

The four least bad video clips generated by Sora.

Amanda Smith/Sora

The verdict on Sora errors

While I didn’t create a cinematic masterpiece with Sora, I did get a few decent clips that I can use in my first explainer-style video. That video will be a mix of me talking to the camera, historical footage of our IVF journey, stock videos and these AI clips. 

Would I use Sora again? Sure, but I’ll wait for the newer version to see if it’s any better. This first attempt seemed to miss the point entirely. Using AI to visualize my IVF story only revealed the serious blind spots of the technology.  

And for a story as delicate as my fertility journey, there should be no errors or inconsistencies.