I stand corrected...

Thank you very much, this was "quite" a helpful answer! 

Stefan

----- Message from Dylan Marcus <dy...@mach1.tech> ---------

 Date: Tue, 4 Jan 2022 14:49:43 -0500
 From: Dylan Marcus <dy...@mach1.tech>
 Subject: Re: [Sursound] Ambisonics with AirPods head tracking
 To: Surround Sound discussion group <sursound@music.vt.edu>

On Jan 4, 2022, at 2:39 PM, Stefan Schreiber <st...@mail.telepac.pt> wrote:

> In other words:
>
> Is there a way to get access to the orientation data of the AirPods?
>
> And in which format are these? Cardan angles (yaw, pitch, roll)? Quaternions?
>
> “Is there a simple process for listening to FOA Ambisonic recordings on
> AirPods, with head tracking?”
>
> The question is of course of general interest.
>
> I suppose that Apple currently does not give easy access to the AirPods' 3DoF orientation data. But actually, why not?

They actually do, via the CMHeadphoneMotionManager API.
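
(For anyone who wants to try this, here is a minimal sketch of reading that orientation in Swift, iOS 14+. It assumes an NSMotionUsageDescription entry in Info.plist and is untested as written.)

import CoreMotion

// Minimal sketch: stream AirPods head orientation via CMHeadphoneMotionManager.
let manager = CMHeadphoneMotionManager()

if manager.isDeviceMotionAvailable {
    manager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // CMAttitude answers the "Cardan angles or quaternions?" question
        // with "both": Euler angles (in radians) and a quaternion.
        print("yaw \(attitude.yaw), pitch \(attitude.pitch), roll \(attitude.roll)")
        let q = attitude.quaternion
        print("quaternion (\(q.w), \(q.x), \(q.y), \(q.z))")
    }
}
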
Here is an example app we made when this was introduced: https://github.com/Mach1Studios/M1-AirPodOSC

This app just shows how to access the orientation and send it to any host via OSC; feel free to use and fork it however you like.
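
(The gist of what such an app does, as a hypothetical sketch: read the attitude and pack yaw/pitch/roll into a minimal OSC message sent over UDP. The "/orientation" address, host, and port below are made-up illustration values, not the app's actual ones.)

import CoreMotion
import Network

// Build a minimal OSC message: padded address string, padded type-tag
// string (",fff"), then big-endian IEEE 754 floats.
func oscMessage(address: String, floats: [Float]) -> Data {
    // OSC strings are null-terminated and padded to a 4-byte boundary.
    func padded(_ bytes: [UInt8]) -> Data {
        var b = bytes + [0]
        while b.count % 4 != 0 { b.append(0) }
        return Data(b)
    }
    var msg = padded(Array(address.utf8))
    msg.append(padded(Array(("," + String(repeating: "f", count: floats.count)).utf8)))
    for f in floats {
        var bits = f.bitPattern.bigEndian
        msg.append(Data(bytes: &bits, count: 4))
    }
    return msg
}

let connection = NWConnection(host: "192.168.1.10", port: 9000, using: .udp)
connection.start(queue: .main)

let headphones = CMHeadphoneMotionManager()
headphones.startDeviceMotionUpdates(to: .main) { motion, _ in
    guard let a = motion?.attitude else { return }
    connection.send(content: oscMessage(address: "/orientation",
                                        floats: [Float(a.yaw), Float(a.pitch), Float(a.roll)]),
                    completion: .contentProcessed { _ in })
}
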
We also have some iOS examples of this being applied to spatial audio playback via our SDK: https://github.com/Mach1Studios/Pod-Mach1SpatialAPI

You could even modify our “mach1-transcode-example” to play back FOA (transcoding to our Mach1 Spatial vector channel-based format and using Mach1Decode to play it back):

https://github.com/Mach1Studios/Pod-Mach1SpatialAPI/tree/master/Examples/mach1-transcode-example/mach1-transcode-example

——

When it comes to using the orientation from AirPods, we are happy with the performance and access via CMHeadphoneMotionManager; however, this is limited when distributing an app that uses it (read below). As for how the orientation is used in “Apple’s Spatial Audio”, we are EXTREMELY unhappy; you can read about that in detail here:

https://research.mach1.tech/posts/feedback-on-apple-spatial-audio/

and https://research.mach1.tech/posts/observations-and-limitations-of-apple-spatial-playback-implementation/

It should be noted that we have been doing tests of creating open-source example codebases to let artists/labels distribute spatial music via an iOS app until common music services support more spatial audio playback. We have been met with a lot of pushback from Apple, which we have been tracking here: https://research.mach1.tech/posts/submitting-spatial-music-apps-to-ios-app-store/

- - - -

Best,

Stefan

P.S.: It is of course possible (and very likely perfectly legal) to hack this device, because you have bought it. (Otherwise Apple would be claiming that some head-tracker sensor functions of their head-/earphones are “copy-protected”, or the like. This seems a nonsensical idea right from the start. Trade secrets? But why are you selling a product in the first place? It is your risk if *your* customer discovers how your product functions, even outside the limited world of iOS...

So I suppose the AirPods should send some position data over the BT5 interface from time to time, and you have to check the data coming from the AirPods themselves, not from the iPhone/iPad. Are any talented "makers" lurking on this list?)

P.S. 2: What is pretty clear is that Apple *does not tell you* how to read out the head-tracking sensor data.

A different story is getting access to the position/orientation data of an iPhone/iPad itself, which should be supported by the OS. (I didn't check this now. For Android, see:

https://developer.android.com/guide/topics/sensors/sensors_position )
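
(On iOS this is indeed supported, via CoreMotion's CMMotionManager; a minimal, untested sketch:)

import CoreMotion

// Minimal sketch: read the iPhone/iPad's own orientation (not the AirPods').
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // 60 Hz updates
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        print("yaw \(attitude.yaw), pitch \(attitude.pitch), roll \(attitude.roll)")
    }
}
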
----- Message from Hugh Pyle <hp...@cabezal.com> ---------

 Date: Tue, 4 Jan 2022 08:19:42 -0500
 From: Hugh Pyle <hp...@cabezal.com>
 Subject: [Sursound] Ambisonics with AirPods head tracking
 To: Surround Sound discussion group <sursound@music.vt.edu>

Hi sursounders,

I've been away from the Apple audio ecosystem for a long while. But now I have some AirPods, and their head-tracked / spatialization functions are interesting.

Is there a simple process for listening to FOA Ambisonic recordings on AirPods, with head tracking? I have a number of recordings from TetraMic and H3-VR that I'd like to render for immersive headphone playback, with head-tracked rotation. Ideally an audio-only production workflow that doesn't involve expensive proprietary tools. But it's OK if the best answer involves publishing to YouTube or embedding the Ambisonic stream in some other video format.

Thanks for any suggestions -

Hugh