UCF Center for Ethics Archives | UCF News

UCF Researchers Receive Meta Support to Study Motor Learning in EMG-Based Interfaces
/news/ucf-researchers-receive-meta-support-to-study-motor-learning-in-emg-based-interfaces/
Thu, 19 Mar 2026 13:00:54 +0000
Meta funding will support research on gamified muscle-based human-computer interaction while embedding ethics directly into engineering design.

UCF researchers are partnering with Meta Platforms Inc. to study how people learn to control digital systems using muscle signals, work that could improve human-computer interaction in virtual and augmented environments.

Supported by a gift from Meta, the two-year project uses electromyography (EMG)-based human-machine interface technology as a platform for investigating motor learning through gamified training systems. While EMG systems are often studied in the context of prosthetic limb control, the broader goal of the project is to understand how adaptive interfaces can become more intuitive and embodied over time.

“This Meta support will enable my lab to work on real-world problems that can have an immediate impact on neurotechnologies.” — Mohsen Rakhshan, assistant professor

UCF was selected through Meta’s competitive funding initiative, in part because of its interdisciplinary approach pairing engineering with philosophy and ethics.

Mohsen Rakhshan, an assistant professor in UCF’s Department of Electrical and Computer Engineering and the Disability, Aging and Technology (DAT) faculty cluster initiative, and Jonathan Beever, a professor of philosophy and director of the UCF Center for Ethics, will lead the project.

“This Meta support will enable my lab to work on real-world problems that can have an immediate impact on neurotechnologies,” Rakhshan says. “The impact ranges from individuals using augmented and virtual reality for entertainment to individuals with amputation or paralysis seeking to improve their quality of life. It also gives my engineering students the opportunity to integrate ethics research into their technical work.”

Advancing Motor Learning Through EMG

EMG-based interfaces translate electrical signals generated by muscle activity into digital commands, allowing users to control devices through subtle physical gestures. In immersive environments, these systems can enable more natural interaction with virtual objects. In rehabilitation settings, they can assist in training neural prostheses.
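To make this concrete, here is a minimal sketch of how an EMG signal might be turned into a discrete command. This is an illustration, not the UCF team's actual pipeline: the window length, threshold, channel count and gesture names are all assumptions. It rectifies the signal by squaring, smooths it into a per-channel RMS envelope, and maps the most active channel to a gesture:

```python
import numpy as np

def rms_envelope(emg, window=50):
    """Smoothed per-channel RMS envelope of a (samples, channels) EMG array."""
    kernel = np.ones(window) / window
    # moving average of the squared signal down each channel, then sqrt
    smoothed = np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), 0, emg ** 2
    )
    return np.sqrt(smoothed)

def decode_command(emg, threshold=0.1, commands=("rest", "left", "right")):
    """Map the most active channel to a gesture; 'rest' if all channels are quiet."""
    activation = rms_envelope(emg).mean(axis=0)  # mean envelope per channel
    if activation.max() < threshold:
        return commands[0]
    return commands[1 + int(np.argmax(activation))]
```

Real systems add filtering, artifact rejection and learned classifiers, but the rectify-smooth-decide structure is the common starting point.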

The UCF team is using this technology to examine how people learn new motor skills in digital environments, particularly through gamified interaction tasks designed to strengthen human-computer coordination. By training both the participant and the signal-processing algorithm (often called a “decoder”) simultaneously, through a process known as co-adaptation, researchers aim to create systems that improve alongside the user.

Professor Jonathan Beever (left) and Assistant Professor Mohsen Rakhshan (right) discuss an EMG-based interface prototype.

“A significant challenge for most of these systems is that they require constant retraining or calibration of the decoder,” Rakhshan says. “Retraining after each use can discourage individuals from using these devices long term. The human nervous system is plastic — it can adapt and improve performance over time. But if the decoder is constantly reset or kept static, it may prevent the nervous system from leveraging that plasticity. We aim to develop a co-adaptive loop between the human and the device.”
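The co-adaptive loop Rakhshan describes can be illustrated with a toy simulation; everything here (the linear decoder, the simulated user's motor map, the learning rates) is an assumption for illustration, not the lab's algorithm. A simulated user maps a 2-D target into 4-channel "EMG" activity, a linear decoder maps that activity back to a cursor, and after each trial both sides take a small gradient step that shrinks the shared error:

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_dims = 4, 2
W = rng.normal(scale=0.1, size=(n_dims, n_channels))  # decoder weights
M = rng.normal(scale=0.5, size=(n_channels, n_dims))  # simulated user's motor map

def run_trial(target, W, M, lr_dec=0.1, lr_user=0.05):
    emg = M @ target                   # user's muscle pattern for this target
    cursor = W @ emg                   # decoded cursor position
    err = target - cursor              # task error seen by both sides
    W_new = W + lr_dec * np.outer(err, emg)              # decoder adapts (LMS step)
    M_new = M + lr_user * (W.T @ np.outer(err, target))  # user adapts toward decoder
    return W_new, M_new, float(np.linalg.norm(err))

errors = []
for _ in range(200):
    target = rng.normal(size=n_dims)
    target /= np.linalg.norm(target)   # unit-length reach target
    W, M, e = run_trial(target, W, M)
    errors.append(e)
```

Because the decoder and the simulated user both move to reduce the same error, performance improves without ever resetting the decoder, which is the property the retraining problem motivates.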

Rather than focusing solely on stable decoding, the project investigates how adaptive systems can enhance motor learning, improve user confidence and promote a stronger sense of embodiment in human-machine interaction.

If successful, the research could inform next-generation EMG systems used in immersive computing, rehabilitation technologies and assistive devices.

A prototype EMG-based interface device that will be used to explore how people interact with systems that translate muscle signals into digital commands.

Embedding Ethics Into Engineering

A defining feature of the project is the integration of ethics alongside engineering from the outset.

“Interdisciplinary collaboration between ethics and technical experts is the best path forward for responsible innovation.” — Jonathan Beever, professor

Longitudinal EMG studies can reveal subtle motor signatures that uniquely identify individuals, raising questions about privacy and data protection. Adaptive systems may also influence a user’s sense of agency: whether individuals feel genuinely in control of the interface. For example, if an EMG system begins adjusting its interpretation of muscle signals automatically, users may feel the device is responding to them intuitively or, in some cases, acting unpredictably. Researchers want to better understand how these dynamics affect trust, confidence and long-term use.

To address these questions, Beever will be embedded within the UCF Laboratory for Interaction of Machine and Brain (LIMB), contributing directly to experimental design and evaluation. The team will conduct structured assessments of agency and embodiment while examining potential privacy leakage from EMG signal data.

“Interdisciplinary collaboration between ethics and technical experts is the best path forward for responsible innovation,” Beever says. “Technological advancement must be guided toward good ends. Our work emphasizes not only ethical research practices but also deeper questions about autonomy and agency in human-machine interfaces.”

A Three-Phase Study

The longitudinal study will involve 30 participants completing 10 sessions over two months, allowing researchers to measure both short-term and long-term motor learning outcomes.

The project will occur in three phases:

Phase 1: Standardizing muscle signal data so artificial intelligence systems can more accurately interpret user intent.

Phase 2: Training both participants and machine learning models simultaneously — a co-adaptive process designed to improve human-computer interaction through gamified tasks.

Phase 3: Conducting structured evaluation of agency, embodiment and privacy risks while developing a publishable ethics framework for adaptive EMG-based systems.
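The article does not detail how Phase 1's standardization works; a common baseline (an assumption here, not necessarily the team's method) is per-channel z-scoring against a short calibration recording, so signals from different sessions and users land on a comparable scale before any model sees them:

```python
import numpy as np

def fit_standardizer(calibration_emg):
    """Per-channel mean and std from a (samples, channels) calibration recording."""
    mu = calibration_emg.mean(axis=0)
    sigma = calibration_emg.std(axis=0) + 1e-8  # guard against flat channels
    return mu, sigma

def standardize(emg, mu, sigma):
    """Z-score each channel so downstream models see zero-mean, unit-variance input."""
    return (emg - mu) / sigma
```

Fitting the statistics once per session, then reusing them, is what lets a decoder trained on earlier data keep working without full recalibration.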

“There has been a significant increase in industry interest in using biological signals such as EMG (from muscles) and EEG (from the brain) to interact with virtual and augmented reality, consumer electronics, prostheses for individuals with amputation and robotic systems for individuals with paralysis,” Rakhshan says.


This research is supported by a gift from Meta. The project is conducted by faculty, staff and students in UCF’s Department of Electrical and Computer Engineering, the Disability, Aging and Technology research cluster and the UCF Center for Ethics.

UCF Is Exploring the Intersection of Art and Artificial Intelligence
/news/ucf-is-exploring-the-intersection-of-art-and-artificial-intelligence/
Fri, 14 Jul 2023 17:05:04 +0000
In honor of Artificial Intelligence Appreciation Day (July 16), here’s how UCF is exploring this rapidly evolving technology in the arts.

When photography was invented in 1822, some painters believed it was the end of art. But in fact, photography became its own medium that helped launch the modern art movement. Today, artificial intelligence (AI) — the simulation of human intelligence processes by computer systems — is transforming the art world as we know it, along with many other industries.

The impact of this cutting-edge technology on the arts is a hot topic, as a new generation of “generative” AI applications can create works of art in seconds from a few words typed into a text box — raising questions about the creative process, ethical values and more. While people around the world navigate the intersection of AI and human creativity, UCF is exploring the possibilities that lie ahead.

A prime example was this year’s UCF Celebrates the Arts, which turned the spotlight on the intersection of arts and technology for the first time. Through dance, concerts, film, discussions and more, the university’s best and brightest artists merged visual arts with AI and other revolutionary technologies to showcase the endless creative possibilities at this intersection.

The REALity of ARTificial Intelligence panelists from left to right: Jonathan Beever, Keidra Daniels Navaroli, Angela Hernandez-Carlson, Stephen Fiore and Ruben Villegas. (Photo courtesy of Stephen Kuebler)

Knights and the Central Florida community alike saw AI in action while attending the festival’s REALity of ARTificial Intelligence event, held in collaboration with the Center for Computer Vision.

An event attendee creates AI-generated artwork at a computer station. (Photo courtesy of Stephen Kuebler)

Attendees gained hands-on experience using AI tools at computer stations, where they typed in keywords and statements to create their own art through generated scriptwriting and photographs. AI even assisted with the event’s ambiance as AI-generated techno music played in the background. There to help attendees make sense of it all were experts in ethics, visual art and computer science, such as Google Brain research scientist Ruben Villegas, who explored the idea of AI as “the new paintbrush” during a panel discussion on the technology’s functions and how it’s reshaping the creation of art, music, writing and more.

Stephen Kuebler

When the world needs answers for today’s most challenging problems, UCF is often looked to as a leader in innovation. This is what inspired Professor Stephen Kuebler, co-organizer of REALity of ARTificial Intelligence and founding associate director of UCF’s Center for Ethics, to design an experience that would inform the community about must-know innovations and techniques in AI technology.

“Communities at all levels need to get ahead of the curve and understand the implications of AI so we can implement it in ethical and sensible ways that truly improve the quality of life for all,” Kuebler says. “[UCF] is well positioned to do just that because we are a vibrant community of growth and change. But the first step is education and awareness.”

The intersection of art and technology represents a shift and milestone in the evolution of art itself.

Left: A stained glass window made by Eric Standley. Right: “Théâtre D’opéra Spatial” by Jason Allen.

The ’90s kickstarted a technology revolution that gave artists more tools — from digital colors to 3D printing — to express their creative visions while also improving the accessibility of their art. More recently, memorable installations like artist Eric Standley’s stained glass windows made from stacked laser-cut paper, or designer Jason Allen’s “Théâtre D’opéra Spatial” — one of the first AI-generated pieces to win the Colorado State Fair’s annual art competition — are examples of how advanced technology has enabled artists to transform and manipulate their artwork.

Kuebler says he believes that, as human beings, we tend to think that only intelligent beings have the ability to create.

“AI challenges our view of art, creativity and how we value things,” he says. “Very few claim AI tools are anything like self-aware or intelligent, and yet they produce new images, stories and other works that can be exceptionally appealing.”

Understandably, AI tools are receiving backlash from many human artists who fear for their own professional futures. Their main concern: Why would anyone pay for art when they could generate it themselves?

Kuebler argues that a lack of familiarity with the AI-art making process may be causing particular distrust among human artists.

“The works are inherently derivative because they are generated by an algorithm that samples a database of existing art,” he says. “But in many ways, human artists do the same because their work is informed and inspired by everything that came before.”

Kuebler explains that the value we place on artwork changes once we know how it was created. Some people ascribe less value when they discover it was created by a computer.

Yet despite the wave of criticism and fierce debate, AI-generative platforms can give rise to new types of artists and art genres, and may even deepen our appreciation for artists who use traditional, hands-on methods, whose work carries a new level of authenticity.

The integration of visual arts and AI is just one of many avenues that UCF is exploring.

“Simulation technology is maximized by the inclusion of spoken word poetry,” Welcome says, “[to create] a more accurate representation of the live performance.” (Photo by Nick Leyva ’15)

When professional speaker and Orlando Poet Laureate Shawn Welcome ’17 was invited to perform spoken word poetry for a hologram installation at UCF Celebrates the Arts 2023, it was his curiosity about the technology that drove him to participate.

The merging of hologram technology with the art of spoken word “represents something new for both disciplinary areas that we’ve yet to discover,” says Welcome, an English alum and current applied sociology graduate student.

The hologram device was acquired by UCF thanks to a gift from Brooks Rehabilitation. The new technology produces a lifelike person in hologram form that is used to train UCF’s future healthcare professionals to assess and treat patients.

Guests get an up-close look at the hologram patient simulation tool at UCF Celebrates the Arts 2023. (Photo by Kadeem Stewart ’17)

The wonders of this simulation tool were amplified during UCF Celebrates the Arts through a collaboration between researchers and the arts and humanities. A series of holograms, including Welcome’s poetry performance, were displayed in the lobbies and public spaces of the Dr. Phillips Center for the Performing Arts, where the annual art showcase is held.

The hologram installation is something Welcome says he is still processing.

“As one uses body language and tone in addition to the actual words to creatively communicate, the hologram captures what you simply can’t capture in 2D,” Welcome says. “And for anyone in the world to engage with that is really fascinating.”

As technology evolves, it’s no surprise that UCF, like many artists, is bringing it to the human experience to push boundaries and meet the changing aesthetic tastes and needs of society.

“[UCF] afforded me the opportunity to think critically about the intersection [of art and technology],” Welcome says. “[The university] gets to imagine what the future holds with accessibility to technology like this … and for better or worse, that is always a good thing.”
