For a website with many dynamic URLs based on request.vars /
request.args, you will need to crawl and scrape it. But for a more
static layout I would prefer an internal web2py solution over scraping,
so that the sitemap updates automatically when you add new functions or
controllers.

So is there a way to list the exposed functions in a controller?
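As I understand it, web2py exposes the top-level functions in a controller file that take no arguments, treating names beginning with a double underscore as private. So one rough approach is to scan the controllers folder with a regex. A minimal sketch (the `list_exposed` helper and the regex are my own, not a web2py API):

```python
import os
import re

# web2py exposes top-level controller functions that take no arguments;
# names starting with a double underscore are treated as private.
FUNC_DEF = re.compile(r'^def\s+(\w+)\s*\(\s*\)\s*:', re.M)

def list_exposed(controllers_path):
    """Map each controller name to the action names it appears to expose."""
    sitemap = {}
    for fname in sorted(os.listdir(controllers_path)):
        if not fname.endswith('.py'):
            continue
        with open(os.path.join(controllers_path, fname)) as f:
            source = f.read()
        actions = [name for name in FUNC_DEF.findall(source)
                   if not name.startswith('__')]
        sitemap[fname[:-3]] = actions
    return sitemap
```

From inside an app you could point this at os.path.join(request.folder, 'controllers') and feed the result into a sitemap view. Being a plain text scan, it won't see actions that are generated dynamically at runtime.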

Here is some nice CSS to create a graphical layout: 
http://astuteo.com/slickmap/demo/

Richard


On Dec 22, 8:15 pm, Benigno <bca...@albendas.com> wrote:
> Actually, if you have anything dynamic (say you have a blog, you may
> have as many pages as posts, plus as many pages as categories or tags
> or whatever), and that may not be reflected in your menu.
>
> I think it may be easier to implement it using something like
> BeautifulSoup (http://pypi.python.org/pypi/BeautifulSoup/3.0.7a)
> and indexing your website by building a tree-like structure as you go.
> I think there are other libraries that specialize in traversing all
> the URLs of a given domain, but I have never used them.
>
> Cheers,
> Benigno.
>
> On Dec 21, 9:30 pm, Mengu <whalb...@gmail.com> wrote:
>
> > This changes depending on your application's structure.
>
> > On Dec 21, 6:23 pm, Leandro - ProfessionalIT <lsever...@gmail.com>
> > wrote:
>
> > > How to implement this in web2py?

--

You received this message because you are subscribed to the Google Groups 
"web2py-users" group.
To post to this group, send email to web...@googlegroups.com.
To unsubscribe from this group, send email to 
web2py+unsubscr...@googlegroups.com.
For more options, visit this group at 
http://groups.google.com/group/web2py?hl=en.