Project Details
Description
No widely used system gives a first-person view (FPV) drone pilot an acoustic channel for participating in drone control and operation. In FPV flying, the pilot wears a pair of goggles that displays real-time video from the drone’s perspective. Ocularcentrism limits the feedback mechanisms available for FPV pilots to experience a sensation of flight. This problem stems from rotor noise and from audio signals that are markedly inferior to the high-resolution video feed. Propeller noises that pilots are accustomed to hearing from afar are likewise detached from the immersive video feed. As a result, pilots are left with an aurally impoverished flying experience. Enhancing aural feedback therefore has the potential to transform FPV drone piloting.
The purpose of this research project is to enhance the embodied telepresence experienced by FPV drone pilots through the integration of real-time, data-driven sound design. This study aligns artistic creativity with innovative drone technologies to improve upon the PI’s previous transdisciplinary research evaluating the qualitative effects of immersive sonification in FPV drone piloting. It seeks to address significant human-drone interaction questions related to ocular and aural experiences shared between the human mind-body and the drone machine.
(i) How does an FPV pilot's understanding of aerodynamic spatial awareness and movement change with the introduction of ocular and synthesized aural feedback?
(ii) What types of additional drone sensors, algorithms, sonification techniques, and sound designs further enhance the out-of-body aerial experience?
(iii) How can immersive sound designs be customized to the three major FPV flying types: freestyle, racing, and aerial cinematography?
By incorporating electroacoustic transducers, such as headphones, and employing precise methods derived from physical computing, FPV drone pilots hear real-time sounds that are both spatial and designed to enhance the feeling of flight. The sound designs are controlled by in-flight data sourced from drone sensors, flight controller algorithms, aerobatics, and operator inputs. By expanding the sensory dimensions of the aerial experience beyond the visual realm, the resulting auditory output enables FPV drone pilots to transcend the limitations of ocularcentrism in drone control and access an aural landscape that induces an enriched sense of spatial awareness while airborne. The research project engages a major research gap in artistic creativity within human-drone interaction and significantly contributes to the FPV community by making advancements in aural feedback within drone technology.
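The mapping from in-flight data to sound described above can be illustrated with a minimal sonification sketch. This is an assumption-laden toy, not the project's actual design: the telemetry field (airspeed), the pitch range, and the linear mapping are all hypothetical choices made for illustration.

```python
import math

def speed_to_frequency(speed_mps, f_min=220.0, f_max=880.0, v_max=40.0):
    """Map airspeed (m/s) linearly onto a pitch range (Hz).

    The ranges here (220-880 Hz, 40 m/s ceiling) are illustrative
    assumptions, not parameters taken from the project.
    """
    v = max(0.0, min(speed_mps, v_max))  # clamp to the expected range
    return f_min + (f_max - f_min) * (v / v_max)

def sine_frame(freq_hz, n_samples=64, sample_rate=48000):
    """Synthesize one short audio frame of a sine tone at the mapped pitch."""
    return [math.sin(2 * math.pi * freq_hz * n / sample_rate)
            for n in range(n_samples)]

# Example: faster flight maps to a higher tone.
slow = speed_to_frequency(5.0)    # gentle cruise
fast = speed_to_frequency(35.0)   # racing pass
frame = sine_frame(fast)          # one frame of the resulting tone
```

In a real system this per-frame mapping would run continuously against live telemetry from the flight controller, and the single sine oscillator would be replaced by the project's designed, spatialized sound palette.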
| Status | Not started |
|---|---|
| Effective start/end date | 1/01/26 → 31/12/27 |