sorry, I meant rows.as_dict(key="buyer.id")
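
For anyone following along, here is a minimal standalone sketch of what
rows.as_dict(key="buyer.id") is meant to do, using plain dicts in place of
web2py Rows. The sample tables ("items", "buyer") and their values below are
hypothetical, just standing in for a joined select:

```python
# Standalone sketch: index a list of row-like dicts by a plain or
# dotted key, mimicking the proposed Rows.as_dict behavior.

def as_dict(rows, key='id'):
    out = {}
    # split "table.field" keys so joined rows can be indexed too
    pcs = key.split(".") if "." in key else None
    for r in rows:
        if key in r:
            # plain key: index each row by that field's value
            out[r[key]] = r
        elif pcs:
            # dotted key: walk nested dicts, e.g. r['buyer']['id']
            val = r
            for p in pcs:
                val = val[p]
            out[val] = r
    return out

# hypothetical join result: each row nests one dict per table
rows = [
    {"items": {"id": 10, "name": "hat"}, "buyer": {"id": 1}},
    {"items": {"id": 11, "name": "cap"}, "buyer": {"id": 2}},
]
by_buyer = as_dict(rows, key="buyer.id")
# by_buyer[1]["items"]["name"] -> "hat"

# the dict([(r['id'], r) ...]) pattern from further down the thread,
# applied to flat (non-join) rows
flat = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
item_dict = dict([(r["id"], r) for r in flat])
# item_dict[2]["name"] -> "b"
```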

On Oct 29, 1:44 pm, "mr.freeze" <nat...@freezable.com> wrote:
> I am still testing your solution but here is mine in the meantime.  It
> lets you say rows.as_list(key="buyer.id") for joins
>
>     def as_dict(self,
>                 key='id',
>                 compact=True,
>                 storage_to_dict=True,
>                 datetime_to_str=True):
>
>         rows = self.as_list(compact, storage_to_dict, datetime_to_str)
>         out = {}
>         # split "table.field" keys so joined rows can be indexed too
>         pcs = key.split(".") if "." in key else None
>         for r in rows:
>             if key in r:
>                 # plain key: index each row by that field's value
>                 out[r[key]] = r
>             elif pcs:
>                 # dotted key: walk nested dicts, e.g. r['buyer']['id']
>                 val = r
>                 for p in pcs:
>                     val = val[p]
>                 out[val] = r
>
>         return out
>
> On Oct 29, 1:03 pm, mdipierro <mdipie...@cs.depaul.edu> wrote:
>
> > > It seems to me that with a join, the as_dict key cannot always be a
> > > single field. I have a proposed solution in trunk
>
> > On Oct 29, 12:22 pm, "mr.freeze" <nat...@freezable.com> wrote:
>
> > > as_list seems to be recursing properly now.  Same error on as_dict but
> > > I am digging deeper.
>
> > > On Oct 29, 12:12 pm, mdipierro <mdipie...@cs.depaul.edu> wrote:
>
> > > > I actually think I fixed the recursive dict handling but it still
> > > > could use some testing.
>
> > > > Massimo
>
> > > > On Oct 29, 11:58 am, mdipierro <mdipie...@cs.depaul.edu> wrote:
>
> > > > > True. The issue with JOIN is not an easy one to fix.
>
> > > > > Can you send me a patch about one or both issues?
>
> > > > > Massimo
>
> > > > > On Oct 29, 11:43 am, "mr.freeze" <nat...@freezable.com> wrote:
>
> > > > > > Also, the as_list function only converts the first level of
> > > > > > DALStorage to dict when storage_to_dict is true.  Both functions
> > > > > > (as_list, as_dict) should probably recurse through all levels,
> > > > > > right?
>
> > > > > > On Oct 29, 11:27 am, "mr.freeze" <nat...@freezable.com> wrote:
>
> > > > > > > Works for normal queries but throws a KeyError on id for joins.
>
> > > > > > > On Oct 29, 10:12 am, mdipierro <mdipie...@cs.depaul.edu> wrote:
>
> > > > > > > > ok, in trunk, take a look.
>
> > > > > > > > On Oct 29, 9:26 am, Renato-ES-Brazil <caliari.ren...@gmail.com> 
> > > > > > > > wrote:
>
> > > > > > > > > I agree.
>
> > > > > > > > > On 29 out, 12:19, "mr.freeze" <nat...@freezable.com> wrote:
>
> > > > > > > > > > I think it is worth adding an as_dict function to Rows 
> > > > > > > > > > personally.
>
> > > > > > > > > > On Oct 29, 9:01 am, mdipierro <mdipie...@cs.depaul.edu> 
> > > > > > > > > > wrote:
>
> > > > > > > > > > > My bad again
>
> > > > > > > > > > > item_dict=dict([(r['id'],r) for r in db(db.items.id > 
> > > > > > > > > > > 0).select
> > > > > > > > > > > ().as_list()])
>
> > > > > > > > > > > On Oct 29, 8:58 am, Fran <francisb...@googlemail.com> 
> > > > > > > > > > > wrote:
>
> > > > > > > > > > > > On Oct 29, 1:52 pm, mdipierro <mdipie...@cs.depaul.edu> 
> > > > > > > > > > > > wrote:
>
> > > > > > > > > > > > > Oops, there should be no asterisk.
> > > > > > > > > > > > > item_dict=dict([(r.id,r) for r in db(db.items.id > 
> > > > > > > > > > > > > 0).select()])
>
> > > > > > > > > > > > Great, that works ;)
>
> > > > > > > > > > > > This one still fails:
> > > > > > > > > > > > item_dict=dict([(r.id,r) for r in db(db.items.id > 
> > > > > > > > > > > > 0).select().as_list
> > > > > > > > > > > > ()])
>
> > > > > > > > > > > > 'dict' object has no attribute 'id'
>
> > > > > > > > > > > > Of course, this is already a dict:
> > > > > > > > > > > > db(db[table].id > 0).select().as_list()
>
> > > > > > > > > > > > F
You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
