



AI Avatars: Real-Time Interaction and Fusion Performance
Through 5G network slicing and real-time edge computing on cloud servers, live performers and their AI avatars can collaborate with near-zero latency.
During the performance, the singer and the host can issue commands in real time, and the AI avatars improvise their interactions based on the live situation. This unique fusion of the real and virtual worlds creates a truly immersive experience.
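For readers curious how such a latency budget might be handled in software, here is a minimal, purely illustrative Python sketch of an edge-side loop that maps a performer's command to an avatar reaction and checks it against a per-frame time budget. The command names, reaction names, and the 20 ms budget are assumptions for demonstration only, not details of the actual production system.

```python
# Illustrative sketch only: an edge-side loop that turns a live command into an
# avatar reaction and checks the processing time against a per-frame budget.
# All names and the 20 ms figure are hypothetical assumptions.
import time

FRAME_BUDGET_MS = 20  # hypothetical budget for an "imperceptible" response


def handle_command(command: str) -> str:
    """Map a live command to an avatar reaction (placeholder logic)."""
    reactions = {
        "wave": "avatar_wave",
        "sing_along": "avatar_harmonize",
    }
    return reactions.get(command, "avatar_idle")


def edge_loop(incoming_commands):
    """Process commands one at a time, flagging any that exceed the budget."""
    for command in incoming_commands:
        start = time.perf_counter()
        reaction = handle_command(command)
        elapsed_ms = (time.perf_counter() - start) * 1000
        status = "ok" if elapsed_ms <= FRAME_BUDGET_MS else "over budget"
        print(f"{command} -> {reaction} ({elapsed_ms:.2f} ms, {status})")


if __name__ == "__main__":
    edge_loop(["wave", "sing_along", "spin"])
```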


AI Multimodal Semantic Recognition: Real-Time Action Response
Using multimodal AI semantic recognition combined with 5G network slicing, the virtual avatars can instantly understand the context of the live performance and generate an appropriate response in real time.
For example:
- When a singer improvises, the AI avatar can respond instantly with matching movements or facial expressions.
- When the host gives a command, the AI avatar can immediately perform the corresponding action.
This real-time interaction model, which links semantic understanding with action responses, transforms the performance from a one-way output into an immersive stage experience full of communication and surprises.
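To make the link between semantic understanding and action response concrete, below is a minimal rule-based Python sketch that fuses two hypothetical modality signals (an audio cue from the singer and a spoken command from the host) into a single avatar action. The event names and the priority rule are illustrative assumptions; a real system would presumably rely on learned multimodal models rather than hand-written rules.

```python
# Illustrative sketch only: fusing two hypothetical modality signals into one
# avatar action. Event names and fusion rules are assumptions for demonstration.
from dataclasses import dataclass
from typing import Optional


@dataclass
class MultimodalEvent:
    audio_cue: Optional[str] = None     # e.g. "improvised_riff", "chorus_start"
    host_command: Optional[str] = None  # e.g. "dance", "bow"


def choose_action(event: MultimodalEvent) -> str:
    """Pick an avatar action, letting an explicit host command take priority
    over an inferred audio cue (a simple stand-in for the real model)."""
    if event.host_command == "dance":
        return "avatar_dance"
    if event.host_command == "bow":
        return "avatar_bow"
    if event.audio_cue == "improvised_riff":
        return "avatar_match_expression"
    if event.audio_cue == "chorus_start":
        return "avatar_sync_choreography"
    return "avatar_idle"


# Example: the singer improvises while no host command is given.
print(choose_action(MultimodalEvent(audio_cue="improvised_riff")))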

5G AI: A Multidimensional Duet Across Time and Space
Ding Dang x Della "Night Cat" & "Night Roam" at the 20th KKBOX Music Awards
Imagine your favorite singer not only performing on stage, but also interacting with an AI avatar in a "parallel dimension." This isn't science fiction; it's the new performance experience made possible by 5G AI technology.
At the 20th KKBOX Music Awards, singer Ding Dang partnered with her virtual avatar, Della, to perform a stunning medley of her classic hits "Night Cat" and "Night Roam." Throughout the performance, the two shared playful, charming interactions, as if two performers were conversing across the real and virtual worlds, showcasing the boundless possibilities of 5G AI collaboration.

Source: KKBOX


