Is this analysis some horrifically long-running thing that cannot happen in
the same thread? That might be the only reason I can think of for needing a
pause...

Brian

On Wednesday, June 22, 2016, Brian Geffon <briangef...@gmail.com> wrote:

> I think you might be misunderstanding certain continuation guarantees
> you'll have.
>
> First, with transformations you'll never have a Content-Length header,
> because the content length cannot be committed to until the final
> transformation is complete, so the response will always be sent with
> chunked transfer encoding.
>
> Second, regarding the transformation callbacks: you will NEVER be called
> into your consume() while it's currently running. That is, if you're
> currently consuming a chunk you need not worry about "pausing", because
> you're already paused in that sense; there is underlying mutual exclusion.
> The same is true of handleInputComplete().
>
> Given that, I still believe the method I'm describing will work for you: in
> your consume() method, just keep buffering until you either get a
> handleInputComplete() or observe that you've exceeded your limit. If you're
> in consume() and your limit is exceeded, read back all of your buffered data
> and produce() it, set a flag that you're no longer buffering, and
> immediately produce all data that arrives in consume() going forward.
> Finally, if handleInputComplete() is called before you hit the limit, it
> means you were obviously below the limit, so you would do your analysis and
> then produce the buffered content, modify it, or discard it. Unless I'm
> missing something, I believe what you want to do can be done today. Does
> that all make sense?
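>
> For concreteness, a minimal sketch of that buffer-until-limit approach,
> assuming the atscppapi TransformationPlugin interface (consume(),
> handleInputComplete(), produce(), setOutputComplete()); the 1 MB limit and
> the doAnalysis() helper are made-up placeholders, not part of the API:
>
> #include <cstddef>
> #include <string>
> #include <atscppapi/TransformationPlugin.h>
>
> class BufferingTransformation : public atscppapi::TransformationPlugin {
> public:
>   explicit BufferingTransformation(atscppapi::Transaction &transaction)
>     : atscppapi::TransformationPlugin(transaction, RESPONSE_TRANSFORMATION) {}
>
>   void consume(const std::string &data) override {
>     if (!buffering_) {              // limit was already exceeded: pass everything through
>       produce(data);
>       return;
>     }
>     buffer_ += data;
>     if (buffer_.size() > kLimit) {  // limit hit: flush the buffer and stop buffering
>       produce(buffer_);
>       buffer_.clear();
>       buffering_ = false;
>     }
>   }
>
>   void handleInputComplete() override {
>     if (buffering_) {               // stayed under the limit: analyse, then emit
>       doAnalysis(buffer_);          // hypothetical analysis step
>       produce(buffer_);
>     }
>     setOutputComplete();
>   }
>
> private:
>   static constexpr std::size_t kLimit = 1024 * 1024; // hypothetical 1 MB limit
>   void doAnalysis(const std::string &) { /* placeholder */ }
>   std::string buffer_;
>   bool buffering_ = true;
> };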
>
> If you'd like we can try to chat on IRC
>
> Brian
>
> On Wednesday, June 22, 2016, David Ben Zakai <david.benza...@gmail.com> wrote:
>
>> Hi guys,
>> Thanks for your feedback
>>
>> James,
>> I can expose another method, let's say *void resume(uint ms)*, that will
>> schedule the resume for the user. Sounds good?
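>>
>> As a rough sketch (not a worked-out patch), that convenience method could
>> wrap the pause() continuation proposed in the original message below and
>> schedule it through the C API; the parameter type is illustrative, and
>> TSContSchedule()'s exact signature and thread-pool enum vary across ATS
>> versions, so this follows the 6.x shape:
>>
>> // Hypothetical convenience wrapper: schedule the resume continuation for
>> // the caller after "ms" milliseconds instead of handing back a raw TSCont.
>> void TransformationPlugin::resume(unsigned int ms)
>> {
>>   TSCont cont = pause();                         // proposed pause() from this thread
>>   TSContSchedule(cont, ms, TS_THREAD_POOL_TASK); // fire the resume callback later
>> }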
>>
>> Brian,
>> I want to buffer up to a certain limit. When the 'Content-Length' header is
>> missing we can't know the file size, so we'll start buffering to file while
>> keeping track of the aggregated size. If we figure out that the file size
>> is greater than the limit, then we'd like to skip the analysis and just
>> pass all the data through. For that we first need to produce all of the
>> file data (read chunks of it and produce them, N times); that's why we need
>> to pause the transformation: we don't want to start buffering the data that
>> keeps arriving while we drain the file. When we are done draining the file
>> we can resume the transformation and pass on the rest of the data.
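>>
>> A rough sketch of that flow (a fragment, not a complete plugin:
>> appendToFile(), readFileChunk(), the member variables and the limit are
>> hypothetical; pause() is the API proposed in the original message below,
>> and TSContSchedule() follows the 6.x C API):
>>
>> void FileBufferingTransformation::consume(const std::string &data) {
>>   if (passthrough_) {               // file already drained: pass data straight through
>>     produce(data);
>>     return;
>>   }
>>   appendToFile(data);               // hypothetical: append this chunk to the temp file
>>   bytesBuffered_ += data.size();
>>   if (bytesBuffered_ > kLimit) {    // too large to analyse: give up on the analysis
>>     TSCont resumeCont = pause();    // proposed API: hold further deliveries for now
>>     std::string chunk;
>>     while (readFileChunk(chunk)) {  // hypothetical: read the temp file in chunks...
>>       produce(chunk);               // ...and send each chunk downstream
>>     }
>>     passthrough_ = true;
>>     TSContSchedule(resumeCont, 0, TS_THREAD_POOL_TASK); // resume the transformation
>>   }
>> }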
>>
>> Thanks
>>
>> On Wed, Jun 22, 2016 at 3:59 AM Brian Geffon <briangef...@gmail.com>
>> wrote:
>>
>> > The thing I'm missing about this: why can't you just keep buffering to the
>> > same file while you're doing the analysis? I don't see why you need to
>> > pause anything.
>> >
>> > Brian
>> >
>> > On Friday, June 17, 2016, David Ben Zakai <david.benza...@gmail.com>
>> > wrote:
>> >
>> > > Hi all,
>> > >
>> > > I'd like to suggest an API change in the CPP API Transformation
>> > interface.
>> > >
>> > > My own use case is that I'd like to be able to pause the transformation,
>> > > handle what I can from the file, and release the buffered content before
>> > > resuming and releasing the rest of the data.
>> > > Basically what I'm trying to achieve:
>> > >
>> > >    1. Write data to file (up to a certain amount)
>> > >    2. File analysis
>> > >    3. Produce file data (and any leftovers) downstream
>> > >
>> > > When the file reaches a certain size limit, I'd like to be able to pause
>> > > the transformation, produce all of its content downstream, and then
>> > > resume the transformation.
>> > >
>> > > API Changes:
>> > > TransformationPlugin.h:
>> > > /**
>> > >  * Call this method if you wish to pause the transformation.
>> > >  * Schedule the returned continuation to resume the transformation.
>> > >  * If the continuation is scheduled and fired after the transform has been
>> > >  * destroyed, it won't do anything beyond cleanup.
>> > >  * Note: you must either schedule the continuation or destroy it (using
>> > >  * TSContDestroy) yourself, otherwise it will leak.
>> > >  */
>> > > TSCont pause();
>> > >
>> > > Internally, the continuation wraps the "resumeCallback" static function:
>> > >
>> > > /** Resume callback */
>> > > static int resumeCallback(TSCont cont, TSEvent event, void *edata);
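>> > >
>> > > For clarity, a rough usage sketch of the proposed API (the 100 ms delay is
>> > > arbitrary, and TSContSchedule()'s exact signature and thread-pool enum
>> > > vary across ATS versions; this follows the 6.x shape):
>> > >
>> > > // Inside a TransformationPlugin subclass, once we decide to pause:
>> > > TSCont resumeCont = pause();
>> > >
>> > > // Either schedule it so the transformation resumes later
>> > > // (here after 100 ms, on the task threads)...
>> > > TSContSchedule(resumeCont, 100, TS_THREAD_POOL_TASK);
>> > >
>> > > // ...or, if we decide never to resume, destroy it to avoid a leak:
>> > > // TSContDestroy(resumeCont);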
>> > >
>> > > You can view my jira issue here
>> > > <https://issues.apache.org/jira/browse/TS-4523>
>> > > You can view my pull request here
>> > > <https://github.com/apache/trafficserver/pull/712>.
>> > >
>> > > Thank you!
>> > >
>> >
>>
>
