On 04-12-12 17:26, joerg-cyril.hoe...@t-systems.com wrote:
> Hi,
>
> Maarten Lankhorst wrote:
>> Alsa's native period is ~22ms (1024 samples / 44100 or 48000) with dmix,
>> despite claiming it to be otherwise..
> What I don't understand is why you talk about ALSA at this level. DSound
> talks to mmdevapi, and that's DSound's underlying API, not ALSA.
>
> So if Wine's mmdevapi claims a 10ms period, it ought to behave like a
> native shared mode driver with a 10ms period.
>
> IOW, I expect 10ms decreases of padding from winealsa when it claims a
> 10ms period, even if dmix/ALSA uses 21.333ms, PA-ALSA uses 50ms or
> Jack-ALSA uses 150ms.
>
> That means winealsa needs more polish...
>
> I talked about this decoupling between the app <-> mmdevapi <-> ALSA/OSS/XY
> some time ago on wine-devel. I still believe this is less prone to app bugs
> than having mmdevapi publish the 20, 50 or 150ms period of its back-end,
> because no native app was ever tested with such periods. MS apps are not
> prepared for them.

[Citation needed]
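(Just so we're sure we mean the same thing: roughly this is the event-driven
shared-mode render loop an app runs against mmdevapi. It's only a sketch, with
setup and error handling left out, not actual Wine or application code. The
point is that the app only ever sees the period and padding mmdevapi reports,
never the back-end's.)

#define COBJMACROS
#include <windows.h>
#include <audioclient.h>

/* Sketch only: 'client', 'render' and 'event' are assumed to have been set up
   with the usual IMMDeviceEnumerator / IAudioClient_Initialize(
   AUDCLNT_SHAREMODE_SHARED, AUDCLNT_STREAMFLAGS_EVENTCALLBACK, ...) dance. */
static void render_loop(IAudioClient *client, IAudioRenderClient *render, HANDLE event)
{
    REFERENCE_TIME period;
    UINT32 buffer_frames, padding, writable;
    BYTE *data;

    IAudioClient_GetDevicePeriod(client, &period, NULL); /* 100000 = 10ms on Windows */
    IAudioClient_GetBufferSize(client, &buffer_frames);
    IAudioClient_Start(client);

    for (;;)
    {
        WaitForSingleObject(event, INFINITE);             /* should fire every period */

        IAudioClient_GetCurrentPadding(client, &padding); /* frames still queued */
        writable = buffer_frames - padding;               /* ~10ms worth per wakeup if
                                                             the driver keeps its word */
        if (FAILED(IAudioRenderClient_GetBuffer(render, writable, &data)))
            break;
        /* ... generate 'writable' frames into 'data' ... */
        IAudioRenderClient_ReleaseBuffer(render, writable, 0);
    }
}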
Now if you were talking about IAudioClock, and preferably had some good tests
that fail with current winepulse, I would believe you.

> If you think that's not a good path to go, please raise your voice.

VERY LOUDLY!

> (Some may argue that winecoreaudio has been using 20ms since day 1, not 10ms.
> Maybe that's close enough.
> OTOH, I've mentioned today that my Mac machine stutters at the render test,
> so wineCoreAudio is not yet in its best shape).

I haven't looked at winecoreaudio at all.

>
>> GetStreamLatency is also used for calculating the period size, see
> http://msdn.microsoft.com/en-us/library/windows/desktop/dd370874%28v=vs.85%29.aspx
> "clients can use this latency value to compute the minimum amount of data
> that they can write during any single processing pass."
> That one paragraph is indeed interesting. I have no explanation of it yet.
> What's true is that 10ms won't get you far when the back-end's period is
> 20, 50 or 150ms.
> However, I was talking about a continuous supply of 10ms chunks.
>
> (We are already past the ancient bug that "ALSA/dmix won't start if not fed
> at least one ALSA period full of samples" - imagine the period were 150ms.)

Or it sometimes refuses to flush the last remainder for writing if it's less
than one period.

>
>> I don't think it's used as such in this commit yet, but in the mixer
>> rework it's used to calculate fragment length.
> I'm not familiar with DSound. For sure, GetStreamLatency
> should be involved in buffer size calculations, e.g. as a lower limit.

Indeed! :D

>>> have dsound call into the system openal to perform multichannel mixing and
>>> 3D processing. (...)
>> ... Must print this out in poster format and frame it above my bed..
> This must sound like going full circle to you. Wasn't some earlier version
> of DSound and/or mmdevapi based on OpenAL until some devs concluded that it
> was not a viable path?
>
>
>> Except WAIT_OBJECT_0 is defined as 0.
> I don't want to remember that.

You're now never going to forget that WaitForSingleObject, and
WaitForMultipleObjects with bWaitAll=FALSE, return the index of the first
signaled event, and that WAIT_OBJECT_0 is 0. ;-)

~Maarten
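PS: for anyone reading along, a contrived illustration (not code from the
patch) of why that is easy to trip over:

#include <stdio.h>
#include <windows.h>

int main(void)
{
    /* WAIT_OBJECT_0 is defined as 0, so the return value of
       WaitForMultipleObjects(..., bWaitAll=FALSE, ...) is in practice just
       the index of the signaled handle (WAIT_OBJECT_0 + index). */
    HANDLE events[2];
    DWORD ret;

    events[0] = CreateEventW(NULL, FALSE, FALSE, NULL);
    events[1] = CreateEventW(NULL, FALSE, FALSE, NULL);

    SetEvent(events[1]);
    ret = WaitForMultipleObjects(2, events, FALSE, INFINITE);
    printf("ret = %lu\n", ret); /* prints 1, i.e. WAIT_OBJECT_0 + 1 */

    /* A check like "if (ret == WAIT_OBJECT_0)" therefore only matches
       events[0], which is easy to get wrong when you forget that
       WAIT_OBJECT_0 == 0. */
    return 0;
}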