Speaking as a gwave user, the most useful aspect of playback was when
something went wrong: an accidental deletion or overwriting of content
(usually with a crash too, if the wave was big...).
Being able to revert to a previous version via playback was the
easiest way to solve the problem.

Speaking as someone working on an Augmented Reality use case for
wfp... essentially not dealing with text at all, but with 3D models
placed and positioned using data stored in blips. The idea of playing
back the whole creation of a 3D scene is very appealing, especially if
the scene was made by a large group collaboration.

So those are the two uses in my mind, and while I have no real
knowledge of the inner workings of the wave client or server beyond an
overview of the protocol, it seems that having "key frames" might be
the best compromise between storage and loading speed.
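
Roughly what I have in mind, as a hypothetical sketch (all names here
are invented for illustration; this is nothing like the actual wave
code): store a full snapshot every N versions, so that reverting or
seeking to any version replays at most N deltas instead of the whole
history.

```python
# Hypothetical "key frame" sketch: snapshot every N versions so a seek
# replays at most N deltas.  build_keyframes, state_at, and apply_delta
# are invented names, not Wave in a Box APIs.

KEYFRAME_INTERVAL = 100

def build_keyframes(deltas, apply_delta, initial_state):
    """Walk the delta history once, saving a snapshot every N versions."""
    keyframes = {0: initial_state}
    state = initial_state
    for version, delta in enumerate(deltas, start=1):
        state = apply_delta(state, delta)
        if version % KEYFRAME_INTERVAL == 0:
            keyframes[version] = state
    return keyframes

def state_at(version, deltas, keyframes, apply_delta):
    """Seek to an arbitrary version: load the nearest earlier key frame,
    then replay only the deltas between it and the target version."""
    base = max(v for v in keyframes if v <= version)
    state = keyframes[base]
    for delta in deltas[base:version]:
        state = apply_delta(state, delta)
    return state
```

The trade-off is exactly storage versus loading speed: a smaller
KEYFRAME_INTERVAL stores more snapshots but replays fewer deltas per
seek.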

-Thomas
arwave.org

~~~~~~
Reviews of anything, by anyone;
www.rateoholic.co.uk
Please try out my new site and give feedback :)



On 22 February 2011 15:06, Paul Thomas <[email protected]> wrote:
> Thanks, that is interesting.
>
> One point of playback is to quickly get updated on what you have missed, so
> you don't really need to see every single change.
>
> It is kind of like flicking through the unread blips, except that doesn't have
> blip-level history. It would be good if you could flick through unread changes.
>
> Using history to revert or fork a wave might be used less often, so that sort
> of history doesn't need to be played back smoothly; it just needs to be usable.
>
> ----- Original Message ----
> From: David Hearnden <[email protected]>
> To: [email protected]
> Sent: Tue, 22 February, 2011 12:43:23
> Subject: Re: is wave playback a priority right now?
>
> Hi Gerardo,
>
> It depends on what kind of playback experience you would like.
>
> In Google Wave, playback does not necessarily play things chronologically,
> but instead can reorder things to make the history simpler.  e.g., if two
> users A and B are concurrently adding their own replies, then the playback
> history can show A's complete reply as one history frame, then B's reply in
> a subsequent frame, even though there was no point in chronological history
> where A's reply was complete and B hadn't started replying  ...if that makes
> sense.  So mild reordering of the operation history in order to make it
> simpler is one complex part of playback.
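
Dave's reordering idea can be sketched on a toy data model (the
(author, text) op representation here is invented for illustration,
not the real wave op types): interleaved ops from concurrent authors
are regrouped so each author's reply appears as one contiguous frame.

```python
# Toy sketch of reordering an interleaved history so each author's
# concurrent reply shows up as a single playback frame.

def frames_by_author(ops):
    """Group interleaved (author, text) ops into one frame per author,
    keeping authors in order of first appearance (dicts preserve
    insertion order in Python 3.7+)."""
    frames = {}
    for author, text in ops:
        frames.setdefault(author, []).append(text)
    return list(frames.items())
```

This regrouping is only safe because the two authors' ops are
concurrent and independent; reordering a history where later ops
depend on earlier ones would change the result, which is why this is
one of the complex parts.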
>
> Another part of playback is grouping segments of history into "frames",
> where the boundaries between frames are historically interesting events
> (starting editing, stopping editing, participants being added and removed,
> etc).  Finding a good set of rules to group operations into useful frames is
> another complex part of playback.
>
> Being able to step backwards as well as forwards adds more complexity,
> because of the difference between "reversible" and "invertible" ops (the
> inverse of an invertible op is derivable from the op itself; the inverse of
> a reversible op, however, depends on the state to which it is applied).
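
The invertible/reversible distinction can be illustrated on a toy
string document (the op shapes below are invented for this sketch, not
the actual wave operations):

```python
# An insert op that carries its text is *invertible*: its inverse is
# derivable from the op alone.  A bare delete(pos, length) op is only
# *reversible*: inverting it requires the state it was applied to.

def invert_insert(op):
    """Inverse of insert(pos, text): delete the same range.  Needs no
    state, because the op itself tells us what was inserted."""
    pos, text = op
    return ("delete", pos, len(text))

def invert_delete(op, state):
    """Inverse of delete(pos, length): re-insert the deleted text.  We
    must consult the pre-delete state to recover those characters."""
    pos, length = op
    return ("insert", pos, state[pos:pos + length])
```

Stepping backwards through a history of bare deletes therefore needs
either the intermediate states or ops enriched with the deleted text.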
>
> There are many other cases where adding some improvement to the feature can
> add significant complexity, e.g., efficiently moving wave state between two
> points in history, rather than applying all the operations one by one.
>
> So starting out with a simple goal of just playing back the operations
> individually, in order to play forwards through history, would be a good
> start.  Perhaps adding in some simple framing (no re-ordering) to group ops
> based on timestamp so that chunks of edits appear as a single frame?  I
> think that would be the start of a reasonably usable playback feature. The
> web client can create a wave model on an empty state, stub out the incoming
> operation stream component (MuxConnector) with a new one that's hooked up to
> some play/pause UI control, and fetch the entire operation history from the
> server, putting those ops in the operation stream based on that UI control.
> It will probably be quite slow, and won't scale for waves with big
> histories, but it's certainly a great start.
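
That forward-only loop might look something like the sketch below.
Everything here is invented (only MuxConnector is named in the thread;
this is not the actual web-client structure): a model built on empty
state, the fetched history, and a play/pause/step control feeding ops
in one at a time.

```python
# Minimal sketch of forward-only playback: apply the fetched op
# history to an empty state one op at a time, under a play/pause/step
# control.  All names are illustrative.

class PlaybackStream:
    def __init__(self, history, apply_op, initial_state):
        self.history = list(history)  # full op history from the server
        self.apply_op = apply_op
        self.state = initial_state
        self.position = 0             # index of the next op to apply
        self.playing = False

    def step(self):
        """Apply one op; return False once history is exhausted."""
        if self.position >= len(self.history):
            self.playing = False
            return False
        self.state = self.apply_op(self.state, self.history[self.position])
        self.position += 1
        return True

    def play(self):
        """Run forward until paused or the history runs out."""
        self.playing = True
        while self.playing and self.step():
            pass  # a real UI would render and pause between frames
```

As Dave says, applying every op individually like this won't scale to
big histories, but it is the simplest thing that works.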
>
> Beyond that, you'd probably want to have a separate endpoint (maybe even a
> separate protocol, rather than the client/server operation protocol) for
> delivering a more compact representation of the history to the client.
> e.g., do some basic framing, and compose the ops in each frame down to
> only a few ops per frame.  That will significantly reduce the client-side
> processing, and sounds reasonably doable right now.
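
That server-side compaction could be sketched like this (timestamps in
milliseconds; the (timestamp, text) ops and string concatenation are
invented stand-ins for real wave deltas and delta composition):

```python
# Sketch of compacting a history server-side: split the op list into
# frames wherever there is a pause in editing, then compose each
# frame's ops into a single op before sending it to the client.

FRAME_GAP_MS = 5000

def frame_by_timestamp(ops):
    """Split (timestamp, text) ops into frames wherever consecutive
    timestamps differ by more than FRAME_GAP_MS."""
    frames, current = [], []
    for op in ops:
        if current and op[0] - current[-1][0] > FRAME_GAP_MS:
            frames.append(current)
            current = []
        current.append(op)
    if current:
        frames.append(current)
    return frames

def compose_frames(frames):
    """Compose each frame into a single (start_timestamp, combined) op."""
    return [(f[0][0], "".join(text for _, text in f)) for f in frames]
```

The client then applies a few composed ops per frame instead of the
entire raw history, which is where the processing saving comes from.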
>
> -Dave
>
> On Tue, Feb 22, 2011 at 6:31 AM, Gerardo Lozano <[email protected]> wrote:
>
>> What would be the best way to approach playback implementation?
>>
>> This is what we've got:
>>
>> We've been looking at the code for the past few days now, and we think that
>> a good approach is to somehow get the history of the wavelet deltas
>> (either from memory or from the store) and then either apply the deltas
>> (with, or in, an instance of WaveletState) or append them (with, or in, an
>> instance of WaveletProvider) each time playback is requested.
>>
>> To us, it seems that the most viable way to implement playback is to get
>> the
>> delta history from the store (with last week's implementation) and then
>> somehow build up from that.
>>
>> What would you guys recommend doing?
>>
>>
>>
>> 2011/2/8 James Purser <[email protected]>
>>
>> > Not at the moment, but if anyone wants to pick it up and run with it,
>> then
>> > please feel free :)
>> >
>> > James
>> >
>> > On Wed, Feb 9, 2011 at 5:17 AM, Yuri Z <[email protected]> wrote:
>> >
>> > > AFAIK - playback is not a priority at the moment and no one is working
>> on
>> > > it. If someone does - please correct me.
>> > >
>> > > 2011/2/8 Gerardo Lozano <[email protected]>
>> > >
>> > > > Hi everybody!
>> > > >
>> > > > Is anybody planning on working on wave playback? This is on the WIAB
>> > > > roadmap, but it has a blank status.
>> > > >
>> > > > Thanks!
>> > > >
>> > > > --
>> > > >
>> > > > Gerardo L.
>> > > >
>> > >
>> >
>>
>>
>>
>> --
>>
>> Gerardo L.
>>
>
