Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-14 Thread Dave Malham
e <rica...@justnet.com.au> > Sent: Tuesday, June 14, 2016 7:07 AM > To: 'sursound@music.vt.edu' > Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project > > > The main mechanisms for disambiguating 'con

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-14 Thread Martin Richards
"'sursound@music.vt.edu'" Sent: Tuesday, 14 June 2016, 12:35 Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project > The main mechanisms for disambiguating 'cones of confusion' (and this includes front-back reversals) are: pinnae effects (Battea

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-13 Thread umashankar manthravadi
e 14, 2016 7:07 AM To: 'sursound@music.vt.edu' Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project > The main mechanisms for disambiguating 'cones of confusion' (and this includes front-back reversals) are: pinnae effe

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-13 Thread Richard Lee
> The main mechanisms for disambiguating 'cones of confusion' (and this includes front-back reversals) are: pinnae effects (Batteau) and head-movements (Wallach) - so, without either of these mechanisms at play, one would expect directional ambiguity. You can test the relative importance of the

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-13 Thread umashankar manthravadi
Lennox <p.len...@derby.ac.uk> Sent: Monday, June 13, 2016 2:38 PM To: Surround Sound discussion group <sursound@music.vt.edu> Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project Stephan, hi The main mechanisms for disambiguating 'cones of confusion' (a

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-13 Thread Dave Malham
- > From: Sursound [mailto:sursound-boun...@music.vt.edu] On Behalf Of Stefan > Schreiber > Sent: 13 June 2016 02:03 > To: Surround Sound discussion group > Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project > > Hi Archontis, > > sorry for the relatively

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-13 Thread Peter Lennox
.@derby.ac.uk t: 01332 593155 https://derby.academia.edu/peterlennox https://www.researchgate.net/profile/Peter_Lennox -Original Message- From: Sursound [mailto:sursound-boun...@music.vt.edu] On Behalf Of Stefan Schreiber Sent: 13 June 2016 02:03 To: Surround Sound discussion group Subject:

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-12 Thread Stefan Schreiber
Hi Archontis, sorry for the relatively late response. I was travelling and had some problems posting anything on sursound during my trip. (I finally know what went wrong...) Anyway, many thanks for the (as always) clear and well-informed answer you gave to my posting. It is quite remarkabl

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-07 Thread John Leonard
Re the OSSIC cans: if anyone wants a set of these when they become available, I seem to have a one-time code for 20% off the sale price. Let me know off-list: first come, first served. John Please note new email address & direct line phone number email: j...@johnleonard.uk phone +44 (0)20 3286 594

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-07 Thread Politis Archontis
Hi Stefan, On 07 Jun 2016, at 04:35, Stefan Schreiber <st...@mail.telepac.pt> wrote: Politis Archontis wrote: But instead of combining all microphones to generate the binaural directivities (as in ambisonics), it interpolates only between the two adjacent microphones that should be cl
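
The contrast drawn in this preview is between ambisonic binaural rendering, which combines all microphone signals, and a pair-wise approach that crossfades only between the two array microphones adjacent to each ear direction. A minimal sketch of that pair-wise idea, assuming a horizontal circular array, a simple linear crossfade, and illustrative names (none of this is the specific method discussed in the thread):

    import numpy as np

    def pairwise_ear_signal(mic_signals, mic_azimuths_deg, ear_azimuth_deg):
        # Crossfade between the two array microphones adjacent to one ear direction.
        # mic_signals:      (n_mics, n_samples) recordings from a horizontal circular array
        # mic_azimuths_deg: azimuth of each microphone on the circle
        # ear_azimuth_deg:  direction of the listener's ear after head tracking
        az = np.asarray(mic_azimuths_deg, dtype=float)
        diff = (az - ear_azimuth_deg) % 360.0          # angular offset of each mic, in [0, 360)
        idx_ccw = np.argmin(diff)                      # nearest mic counter-clockwise of the ear
        idx_cw = np.argmin((360.0 - diff) % 360.0)     # nearest mic clockwise of the ear
        d_ccw = diff[idx_ccw]
        d_cw = (360.0 - diff[idx_cw]) % 360.0
        total = d_ccw + d_cw
        if total == 0.0:                               # a microphone sits exactly at the ear direction
            return mic_signals[idx_ccw]
        # Linear crossfade: the closer microphone gets the larger weight.
        return (d_cw / total) * mic_signals[idx_ccw] + (d_ccw / total) * mic_signals[idx_cw]

    # Example: 8 mics every 45 degrees; the left ear sits roughly at head yaw + 90 degrees.
    # left_ear = pairwise_ear_signal(mics, np.arange(0, 360, 45), head_yaw_deg + 90.0)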

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-06 Thread Stefan Schreiber
's system uses binaural recordings, and they know what they're doing. Decorrelation, and software reverb, can help with a sense of externalisation, though you can go too far. Ciao, Dave Hunt From: Aaron Heller Date: 4 June 2016 20:53:09 BDT To: Surround Sound discussion group

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-06 Thread Stefan Schreiber
Aaron Heller wrote: My experience with FOA-to-binaural rendering is pretty much the same as what Archontis says. I hear directional information and head tracking effects, but have never experienced the externalization and verisimilitude that direct dummy head "Direct dummy head" recordings

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-06 Thread len moskowitz
The folks at OSSIC claim to be able to decode B-format to binaural with a personalized HRTF. Their headphones measure - in a very short time - the response of your head and pinna. It does head tracking too. http://www.ossic.com/ Len Moskowitz (mosko...@core-sound.com) Core Sound LLC www.core

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-06 Thread Politis Archontis
> Ciao, > > Dave Hunt > > >> From: Aaron Heller >> Date: 4 June 2016 20:53:09 BDT >> To: Surround Sound discussion group >> Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project >> >> >> My experience with FOA-to-bin

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-06 Thread Politis Archontis
Hi Peter, I have heard 7th-order real-time decoding with headtracking, and that’s from a real microphone array. There was no perceptible latency. And I think the AmbiX plugins can handle the rotations and (N+1)^2 short convolutions for the same order without problem (head-tracking performance
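
The rendering chain described here is: rotate the (N+1)^2 ambisonic channels by the tracked head orientation, then apply one short convolution per channel and ear and sum. A minimal numpy sketch of the convolve-and-sum stage, assuming the rotation has already been applied and that precomputed left/right binaural decoding filters are available; the names and array shapes are assumptions for illustration, not the AmbiX plugin interface:

    import numpy as np
    from scipy.signal import fftconvolve

    def binaural_from_ambisonics(rotated_ambi, filters_left, filters_right):
        # Sum of (N+1)^2 short convolutions, one filter pair per ambisonic channel.
        # rotated_ambi:  ((N+1)^2, n_samples) head-rotated ambisonic signals
        # filters_left:  ((N+1)^2, filter_len) binaural decoding filters for the left ear
        # filters_right: ((N+1)^2, filter_len) binaural decoding filters for the right ear
        left = sum(fftconvolve(ch, h) for ch, h in zip(rotated_ambi, filters_left))
        right = sum(fftconvolve(ch, h) for ch, h in zip(rotated_ambi, filters_right))
        return np.stack([left, right])

    # 7th order as mentioned above: (7 + 1)^2 = 64 channels, so 128 short convolutions
    # per block, which is comfortably real-time on current hardware.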

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-06 Thread Peter Lennox
ation, though you can go too far. Ciao, Dave Hunt > From: Aaron Heller > Date: 4 June 2016 20:53:09 BDT > To: Surround Sound discussion group > Subject: Re: [Sursound] Using Ambisonic for a live streaming VR > project > > > My experience with FOA-to-binaural render

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-05 Thread Dave Hunt
too far. Ciao, Dave Hunt From: Aaron Heller Date: 4 June 2016 20:53:09 BDT To: Surround Sound discussion group Subject: Re: [Sursound] Using Ambisonic for a live streaming VR project My experience with FOA-to-binaural rendering is pretty much the same as what Acrhontis says. I he

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-04 Thread Aaron Heller
My experience with FOA-to-binaural rendering is pretty much the same as what Archontis says. I hear directional information and head tracking effects, but have never experienced the externalization and verisimilitude that direct dummy head or Algazi and Duda's motion-tracked binaural recordings

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-04 Thread Stefan Schreiber
Stefan Schreiber wrote: As long as we don't use personalised HRTFs (accessible HRTF personalisation methods already seem to be in development), < objective comaprison > between loudspeaker and binaural decoders are just not possible. "objective comparisons", of course. "Comaprison" re

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-04 Thread Stefan Schreiber
Politis Archontis wrote: Hi Jörn, On 03 Jun 2016, at 15:27, Jörn Nettingsmeier <netti...@stackingdwarves.net> wrote: Note however that while the quality of first-order to binaural is quite good because the listener is by definition always in the sweet spot, first-order over speakers

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-04 Thread Politis Archontis
Hi Jörn, On 03 Jun 2016, at 15:27, Jörn Nettingsmeier <netti...@stackingdwarves.net> wrote: Note however that while the quality of first-order to binaural is quite good because the listener is by definition always in the sweet spot, first-order over speakers can be difficult for multipl

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-03 Thread Jörn Nettingsmeier
On 06/03/2016 02:28 AM, Bob Burton wrote: Very excited to find out how to do this. In my research, YouTube must render the ambisonic files before playing, so when you live stream, it is stereo. When you play back later, it can be ambisonic. Is there anyone else who streams ambisonics live? I did

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-03 Thread Politis Archontis
Hi Antoine, I don't know of any out-of-the-box solutions for that, maybe other people on the list do, but a DiY solution is to use one of the audio programming environments that can stream audio, and they have modules for ambisonic decoding and rotation. Puredata (Pd) has such objects from whe
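
For the DIY route described here (stream the B-format, then rotate and decode at the client), the head-tracking piece reduces to a sound-field rotation ahead of the binaural decode. A minimal sketch of the yaw-only rotation for first-order B-format (W, X, Y, Z); the channel ordering, sign convention, and function name are assumptions for illustration rather than the behaviour of any particular Pd object:

    import numpy as np

    def rotate_foa_yaw(w, x, y, z, yaw_rad):
        # Rotate a first-order B-format frame (W, X, Y, Z) about the vertical axis.
        # Compensating for a head turn of +yaw means rotating the sound field by -yaw;
        # pass whichever sign the tracker convention requires.
        c, s = np.cos(yaw_rad), np.sin(yaw_rad)
        x_rot = c * x - s * y      # only X and Y mix under a horizontal rotation
        y_rot = s * x + c * y
        return w, x_rot, y_rot, z  # W (omni) and Z (height) are unchanged

    # Per audio block: read yaw from the head tracker, rotate the B-format signals,
    # then feed the rotated W, X, Y, Z to an HRTF-based first-order binaural decoder.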

Re: [Sursound] Using Ambisonic for a live streaming VR project

2016-06-02 Thread Bob Burton
Very excited to find out how to do this. In my research, YouTube must render the ambisonic files before playing, so when you live stream, it is stereo. When you play back later, it can be ambisonic. Is there anyone else who streams ambisonics live? On Thu, Jun 2, 2016 at 9:13 AM, Antoine Simon wro