Actors should have as much control over the data harvested from scans of their body as they do over nudity scenes, the actor Olivia Williams has said, amid heightened concern over artificial intelligence’s impact on performers.

The star of Dune: Prophecy and The Crown said she and other actors were regularly pressed to have their bodies scanned by banks of cameras while on set, with few guarantees about how the data would be used or where it would end up.

“A reasonable request would be to follow the precedent of the ‘nudity rider’,” she said. “This footage can only be used in the action of that scene. It cannot be used in any other context at all, and when the scene has been edited it must be deleted on all formats.”

Williams pointed to vague clauses in contracts that appeared to give studios wide-ranging rights over a performer’s likeness “on all platforms now existing or yet to be devised throughout the universe in perpetuity”.

A renewed debate over the impact of artificial intelligence on actors has been prompted by widespread condemnation of the creation of an AI actor known as Tilly Norwood. Actors fear the data could be used to train AI models on their likenesses or poses, paving the way for the technology to eventually take away work.

Lead and supporting actors, as well as stunt performers and dancers, have told the Guardian they have been “ambushed” into undergoing body scans while on set. Several said they had no time to agree how the resulting data would be treated, or whether it could be used to train AI models.

Williams said she had tried and failed to have wide-ranging clauses removed from her contracts. She also investigated owning her own body-scan data so she could license it for limited use, but lawyers advised her the law was too unclear, and the legal fees involved in trying to reclaim her data proved prohibitive.

“I don’t necessarily want to be paid any more money for the use of my likeness,” she said. “I just don’t want my likeness to appear in places where I haven’t been, doing things I haven’t done, saying things I haven’t said.

“They make up the law as they go along and no one is stopping them – creating a precedent, reinforcing the precedent. I sign it, because if I don’t, I lose the job.”

Williams said she was speaking out for the sake of young actors who faced little choice but to go through the scans, with few guarantees about what would happen to the data. “I have known a 17-year-old who was persuaded into a scanner – and like the child-catcher scene in Chitty Chitty Bang Bang, she obliged,” she said. “She was a minor, so her chaperone had to give consent. Her chaperone was her grandmother, unaware of the law.”

The issue is the subject of talks between Equity, the UK performing arts union, and Pact, the UK screen sector’s trade body. “We’re demanding that AI protections are mainstreamed in the major film and TV agreements to put consent and transparency at the heart of scanning on set,” said Paul W Fleming, Equity’s general secretary.

“It is within the industry’s reach to implement basic minimum standards which would be a gamechanger for performers and artists working in UK TV and film.”

Pact said in a statement: “Producers are well aware of their obligations under data protection law and these issues are being considered as part of the collective negotiations between Pact and Equity. As the negotiations are ongoing, we cannot comment in any detail.”