While I can't answer your question, if no one else can, here's a method
I use when building a website:

I give each page a unique ID on the <body> element. Mine is generated
by a PHP script, but you could set it manually as well. Then, in my
main JavaScript file, I use a switch statement that tests
$('body').attr('id'), with one case per page. That way you cut down on
code that probes for page-specific IDs, and your HTML stays lighter
too, assuming you'd otherwise need lots of IDs.

For example, if my <body> is #main-page:

switch ($('body').attr('id')) {
        case 'main-page':
                // main-page-specific setup goes here
                break;
        case 'register-page':
                // register-page-specific setup goes here
                break;
}
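Tying this back to your caching question: the whole dispatch can live in one cached events.js, and only the matching case runs on each page. Here's a minimal sketch of that idea; the function name and return strings are hypothetical, just to show the dispatch on its own without jQuery:

```javascript
// Dispatch page-specific setup based on the body element's ID.
// dispatchForPage is a hypothetical helper name, not from the post.
function dispatchForPage(pageId) {
    switch (pageId) {
        case 'main-page':
            // bind main-page event handlers here
            return 'main-page setup ran';
        case 'register-page':
            // bind register-page event handlers here
            return 'register-page setup ran';
        default:
            // unknown page: do nothing page-specific
            return 'no page-specific setup';
    }
}

// In the real site this would run once the DOM is ready, e.g.:
// $(function () { dispatchForPage($('body').attr('id')); });
```

The point is that the browser downloads and caches one file, but never walks handler code for pages it isn't on.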

I also use this method to target CSS to individual pages, if needed.
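For example, a couple of rules scoped by the body ID (selectors here are hypothetical):

```css
/* Applies only where <body id="main-page"> */
#main-page h1 { color: #336; }

/* Applies only on the registration page */
#register-page form { width: 30em; }
```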

On Jun 26, 8:15 pm, jack <[EMAIL PROTECTED]> wrote:
> Is there a significant performance impact on detecting for events on
> elements which are not found on a certain page?
>
> For example, when using jQuery to code large sites I have always used
> a global event include, i.e. events.js, to attach a $(document).ready
> for all possible site events. This allows the script to be cached on
> first page load, so "inside pages" don't have to download their events
> in a separate, page-specific, uncached file, i.e. "somepage.js".
>
> I've always wondered though if this approach could negatively impact
> page performance.
>
> Would for example having a 10kb file with events like
>
> $('#very-specific-id-not-on-every-page').hover(function() {
>     // do something
> }, function() {
>     // do something else
> });
>
> have a noticeable performance hit? Do you think that effect would be
> greater than the effect of not caching those events in a global
> include?
>
> I realize this isn't a cut-and-dried issue, but I figured you guys
> might have some insight into "best practices" for jQuery.
>
> thanks,
> jack
