I attended FGCU’s panel on AI and the Future of Creativity, and it was not what I expected. I thought I would hear the usual warnings about automation and job loss. As a design student, I have often experienced AI less as a tool and more as a looming threat to the future of creative work.

I hoped to gain clarity from professionals working in a range of fields, including philosophy, music, entrepreneurship and business.

What I did not expect was that rather than framing AI as a simple form of replacement or progress, the panel revealed a far more nuanced conversation about responsibility, creativity and what it means to be human in an increasingly automated world.

One discussion that stood out exposed clear disagreement among the speakers: transparency in AI and what it truly means for ownership and consent. Transparency was repeatedly emphasized as a cornerstone of ethical AI, yet Carolyn Culbertson challenged whether full transparency is even achievable. She pointed to anonymous user inputs and uncredited training data, arguing that questions of ownership inevitably extend to questions of consent.


Culbertson spoke from a relatable standpoint, given that her own work has been used without permission to generate content.

Another panelist suggested that users should credit AI when it is involved in the creative process, a point quickly contested by FGCU Associate Dean of Accreditation & Assurance of Learning Matthew Sheep, who argued that stigma often discourages disclosure. 

Seeing perspectives clash in real time made it clear that the issue is unresolved, and that panels like this push students to engage with ethical uncertainty rather than accept simplified conclusions. That nuance emerged specifically because the panel brought together speakers from different disciplines who were willing to challenge one another.

Throughout my studies, AI integration has been encouraged, and I have used it as a tool, particularly in early stages such as idea generation or visualization. 

However, this discussion revealed how inseparable those tools are from larger questions of regulation, ownership and responsibility. It made me realize that advocating for ethical boundaries in digital spaces is crucial to maintaining rightful ownership of your content. 

At the same time, the panel reframed AI as neither untouchable nor inevitable. These systems are human-made, which means people are accountable for how they function and whom they benefit. I feel uneasy knowing major corporations largely fund and develop these technologies, leaving consumers with little control over how their data and creative work are used.

When consent is abstracted or ignored, the promise of AI innovation comes at a major human cost.

FGCU students should pay attention to panel discussions like those hosted by the Office of Public Policy Events. These events are accessible, free and held on campus, but more importantly, they expose students to real-world issues through active dialogue rather than passive instruction. Hearing professionals disagree, question one another, and ground abstract ideas in lived experience is far more valuable than reading a biased article or sitting through a one-directional lecture.

For students preparing to enter the workforce, understanding how global issues shape industries is not optional. In a moment where technology is advancing faster than ever before, spaces that allow for open disagreement are more necessary than ever.