I might have gotten through drafting the part that seemed most complex to me; unsure.

                        # mask is True between consecutive aligned regions that
                        # don't touch (a gap); those gaps separate fetch ops
                        region_mask = aligned_tails[idx:next_idx-1] < aligned_offsets[idx+1:next_idx]
                        # indices of the last region before each gap
                        region_bounds = region_mask.nonzero()[:,0]
                        region_bounds += idx
                        # one op per gap boundary, plus one final op
                        next_op_ct = op_ct + region_bounds.shape[0] + 1
                        # each op ends at the tail preceding a gap; the last
                        # op ends at the overall aligned end
                        offset_length_tail_idx_ops[op_ct:next_op_ct-1,TAIL] = aligned_tails[region_bounds]
                        offset_length_tail_idx_ops[next_op_ct-1,TAIL] = aligned_end
                        # each op starts at the offset following a gap; the
                        # first op starts at the overall aligned start
                        region_bounds += 1
                        offset_length_tail_idx_ops[op_ct+1:next_op_ct,OFFSET] = aligned_offsets[region_bounds]
                        offset_length_tail_idx_ops[op_ct,OFFSET] = aligned_start
                        offset_length_tail_idx_ops[op_ct:next_op_ct,OP] = OP_FETCH | OP_PLACE

The draft above is intended to produce a list of pages that do the consolidation I was talking about: fetches are merged into their surrounding pages (by using aligned bounds), and adjacent pages are merged into a single fetch (by comparing non-overlapping bounds for a gap).
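For intuition, here's a minimal standalone sketch of that merge step in NumPy. The function name `consolidate` and the sample offsets/tails are hypothetical, not the draft's actual variables, and this omits the op-table bookkeeping; it only shows how gap boundaries split the aligned regions into consolidated fetch ranges:

```python
import numpy as np

def consolidate(aligned_offsets, aligned_tails):
    """Merge touching aligned [offset, tail) regions into consolidated fetches.

    A boundary exists between region i and i+1 only when region i's tail is
    strictly below region i+1's offset (a gap); otherwise the two regions
    merge into one fetch.
    """
    # gap[i] is True where a gap separates region i from region i+1
    gap = aligned_tails[:-1] < aligned_offsets[1:]
    bounds = np.nonzero(gap)[0]
    # each consolidated fetch starts at the first offset or an offset
    # following a gap, and ends at a tail preceding a gap or the last tail
    starts = np.concatenate(([aligned_offsets[0]], aligned_offsets[bounds + 1]))
    ends = np.concatenate((aligned_tails[bounds], [aligned_tails[-1]]))
    return np.stack([starts, ends], axis=1)

# three page-aligned regions; the first two touch, the third is separate
offsets = np.array([0, 4096, 16384])
tails   = np.array([4096, 8192, 20480])
print(consolidate(offsets, tails))  # [[    0  8192]
                                    #  [16384 20480]]
```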

[.. function not fully drafted yet, maybe a third done. Next is listing all the ranges to output to the user, and updating the loop variables over unfetched regions. There's still a loop, but the intent is to only loop over sparse and cached file regions (so that accessed sparse regions get cached).]
