GINI2

Immersive VR Drumming & Smart Music Education

TL;DR: I researched, designed, developed, and deployed the GINI2 application from scratch during my time at AWS-i institute. GINI2 empowers music teachers to create immersive VR content, helping students learn fundamental drumming techniques and sheet music notation through a highly interactive digital approach.

The video below presents a short overview of the project, followed by detailed information:


Role: Main R&D Engineer (Architected, developed, and deployed from scratch)

About GINI2

Learning to play a musical instrument is traditionally demanding, but immersive technology can entirely redefine the experience. GINI2 is a comprehensive Virtual Reality (VR) application that I built from the ground up to revolutionize music education. Unlike standard VR rhythm games designed purely for entertainment, I engineered GINI2 specifically as a pedagogical tool to teach both practical drumming skills and music theory, such as reading standard sheet music notation.

By integrating VR headsets, game pedals, and advanced sensor technologies, GINI2 empowers music teachers to create and deliver highly immersive, user-centered digital learning content without requiring extensive technical expertise.

⚙️ Key Features & Technical Highlights

  • Advanced Motion Tracking & Real-Time Feedback: Utilized the VR system’s tracking controllers to precisely record stroke movements and rhythmic sequences. The application analyzes this data to provide learners with real-time feedback and actionable tips to improve their drumming technique.
  • Immersive VR UI & Dynamic Data Architecture: To seamlessly connect practical drumming with music theory, I designed a forward-facing VR interface that visualizes sheet music in real time. I proposed and engineered the data structure that drives the note sheet, and later extended it with a dynamic tracking bar that moves synchronously with the musical timeline to show precisely when to strike each note.
  • Automated Sheet Music Generation: Implemented the real-time conversion of physical drumming sequences into sheet music. The system recognizes individual drum strokes and instantly displays them as standard musical notation, enabling an intuitive “learning by doing” approach that connects physical movement with note reading.
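One plausible shape for this stroke-to-notation pipeline is a timestamped note event snapped to a rhythmic grid. The sketch below is illustrative only; the type and member names are assumptions, not the actual GINI2 schema, and it quantizes to a sixteenth-note grid purely as an example.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical note-event record: which drum was struck and when
// (in seconds from the start of the piece).
public enum DrumPart { Snare, HiHat, Kick, Tom1, Tom2, Crash, Ride }

public struct NoteEvent
{
    public DrumPart Part;
    public double TimeSec;   // raw hit time reported by the tracking system
    public int GridIndex;    // position on the sixteenth-note grid
}

public static class SheetBuilder
{
    // Snap a raw hit time to the nearest sixteenth-note slot so it can be
    // rendered as standard notation. The tempo defines the grid spacing.
    public static int QuantizeToSixteenth(double timeSec, double beatsPerMinute)
    {
        double sixteenthSec = 60.0 / beatsPerMinute / 4.0; // duration of one 16th
        return (int)Math.Round(timeSec / sixteenthSec);
    }

    // Convert a stream of recognized strokes into quantized note events.
    public static List<NoteEvent> Build(IEnumerable<(DrumPart part, double t)> hits,
                                        double bpm)
    {
        var notes = new List<NoteEvent>();
        foreach (var (part, t) in hits)
            notes.Add(new NoteEvent { Part = part, TimeSec = t,
                                      GridIndex = QuantizeToSixteenth(t, bpm) });
        return notes;
    }
}
```

Quantizing against the song's tempo is what lets slightly imprecise physical strokes still land on a readable notation slot.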

  • Dynamic Real-Time Shaders for Visual Cues: Overcoming the challenges of controlling predefined HLSL values in real time, I engineered a custom dynamic shader within Unity. Using a programmatic C# handle for run-time control, I dynamically manipulated the shaders on the virtual drum surfaces, projecting specific colors and precise radii perfectly synchronized with the note timeline to visually guide the learner.
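A run-time handle of this kind can be sketched in Unity with a MaterialPropertyBlock, which updates shader parameters per renderer without cloning materials. This is a hedged sketch only: the property names `_CueColor` and `_CueRadius` are placeholders for whatever the actual drum-surface shader exposes, not the real GINI2 interface.

```csharp
using UnityEngine;

// Sketch of run-time shader control for a drum-surface cue.
// "_CueColor" / "_CueRadius" are assumed shader property names.
public class DrumCueController : MonoBehaviour
{
    static readonly int CueColorId  = Shader.PropertyToID("_CueColor");
    static readonly int CueRadiusId = Shader.PropertyToID("_CueRadius");

    [SerializeField] Renderer drumSurface;
    MaterialPropertyBlock block;

    void Awake() => block = new MaterialPropertyBlock();

    // Called each frame by the note timeline: grow and tint the cue circle
    // as the target note approaches (progress runs 0 → 1).
    public void ShowCue(Color color, float progress)
    {
        drumSurface.GetPropertyBlock(block);
        block.SetColor(CueColorId, color);
        block.SetFloat(CueRadiusId, Mathf.Clamp01(progress));
        drumSurface.SetPropertyBlock(block);
    }
}
```

Using a property block rather than `renderer.material` avoids instantiating a separate material per drum, which matters for VR frame budgets.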

Filling yellow circles warn of upcoming notes to play
Feedback to the player according to their timing: correct and on time (green), correct but off time (orange), incorrect (red)
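The three feedback colors imply timing windows around each target note. A minimal sketch of that classification follows; the ±50 ms and ±200 ms window widths are illustrative assumptions, not GINI2's tuned values.

```csharp
using System;

public enum HitFeedback { OnTime, OffTime, Incorrect }

public static class TimingJudge
{
    // Window widths are assumptions for illustration only.
    const double OnTimeWindowSec  = 0.05; // within ±50 ms  → green
    const double OffTimeWindowSec = 0.20; // within ±200 ms → orange

    public static HitFeedback Judge(bool correctDrum, double targetSec, double hitSec)
    {
        if (!correctDrum) return HitFeedback.Incorrect;            // red
        double delta = Math.Abs(hitSec - targetSec);
        if (delta <= OnTimeWindowSec)  return HitFeedback.OnTime;  // green
        if (delta <= OffTimeWindowSec) return HitFeedback.OffTime; // orange
        return HitFeedback.Incorrect;                              // red: far off
    }
}
```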
  • Teacher-Student Collaboration: Developed a collaborative framework supporting a customizable curriculum. This tool allows educators to prescribe specific, step-by-step exercises and interact with learners to share feedback in shared digital sessions.

  • Mathematical Performance Scoring: Developed a scoring backend that leverages the known timing of registered notes and user collision data. By measuring whether the user hit the correct drum and calculating the exact time difference between the target note and the actual hit, the system presents performance scores.
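The scoring idea described above can be sketched as a function of the timing error. The linear falloff and the 250 ms cutoff below are assumptions for illustration, not the exact GINI2 formula.

```csharp
using System;

public static class PerformanceScorer
{
    // Illustrative scoring rule: a correct hit earns up to 100 points,
    // decaying linearly with timing error and reaching zero at maxErrorSec.
    public static double ScoreHit(bool correctDrum, double targetSec,
                                  double hitSec, double maxErrorSec = 0.25)
    {
        if (!correctDrum) return 0.0; // wrong drum scores nothing
        double error = Math.Abs(hitSec - targetSec);
        return 100.0 * Math.Max(0.0, 1.0 - error / maxErrorSec);
    }
}
```

Because both the registered note time and the collision time are known, the score needs nothing beyond this time difference and the drum-identity check.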

  • Full-Body VR Immersion: Integrated physical gaming foot pedals to trigger virtual bass drum and hi-hat notes. By mapping real-world foot mechanics directly to the VR drum kit, the system provided a highly tactile and realistic full-body drumming experience.

🔧 Engineering Insights & System Constraints

Building a real-time VR music system from scratch presented unique engineering challenges. I mitigated many of these by designing highly efficient algorithms and strictly adjusting the Level of Detail (LoD) for 3D geometry to maintain high performance.

During R&D, I also identified and documented specific hardware-based tracking limitations: a slight tracking delay in current VR sensors leads to minor inaccuracies during extremely fast drumming movements. Additionally, accurately simulating the physical “rebound” (the springing back of the stick) remains a strict hardware constraint. Because of this, I architected GINI2 not as a standalone replacement for physical practice, but as a highly engaging, intuitive supplement to in-person drumming lessons.



Please note: I am the original architect of GINI2 and developed more than 85% of the entire VR drumming application, from scratch to the final version, including all core functions and various features. The technical research, system foundations, and prototype development associated with this VR drumming system are entirely my own work during my employment at AWS-i.