Hello,
On 06/08/2004 12:04 PM, Aaron Wolski wrote:
Just curious as to how people handle search engine optimization when most of the page content is dynamically built from the db. Don't the bots need to crawl the static pages and match your keywords to actual words in the file?
Practically, only Google matters these days, as most sites get over 70% of their leads from Google.
Since Yahoo! dropped their affiliation with Google, many, including myself and my client, have seen a significant increase in Y! referrals. 70% is not the case anymore.
Keeping all your eggs in one basket is a bad decision at best.
Yes, but if you keep optimizing for Google, you will also be optimizing for most of the other engines with significant share, because Google has always been the most successful at providing relevant results to users, and the others tend to imitate Google.
For Google, it matters that your pages are served as fast as possible. If your pages take too long to be served, Google assumes it is causing too much load on your site and slows down, leaving more time between crawls.
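One common way to make dynamically built pages fast enough for crawlers is simple file-based output caching, so repeat requests skip the database entirely. A minimal sketch (the cache directory, TTL, and page-building code are all assumptions for illustration, not anyone's actual setup):

```php
<?php
// Hypothetical file-based output cache: serve a saved copy of the
// generated page if it is fresh, so crawlers get a fast response.
$cacheFile = 'cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
$ttl = 600; // seconds a cached copy is considered fresh (assumed value)

if (file_exists($cacheFile) && time() - filemtime($cacheFile) < $ttl) {
    readfile($cacheFile); // fast path: no db queries at all
    exit;
}

ob_start(); // slow path: build the page as usual, capturing the output
// ... run your queries and emit the page here ...
echo "<html><body>page built from the db</body></html>";

file_put_contents($cacheFile, ob_get_contents()); // save for next request
ob_end_flush(); // send the freshly built page to the client
?>
```

Even a short TTL like this can cut response times dramatically on pages a crawler hits repeatedly.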
There is a myth based on a misinterpretation of Google's explanation for not indexing as many pages of dynamic sites: that it is because of URLs with query parameters (?some=thing&this=too).
http://www.google.com/webmasters/2.html#A1
Query parameters are not the reason why Google does not index so many pages. I can demonstrate that just by letting you see that Google indexes over 700,000 pages of php.net, many of which have query parameters:
http://www.google.com/search?q=site%3Aphp.net
So, do not bother with all that bogus advice telling you to use URL rewriting, because that is not what matters. What matters is that your pages are served as fast as possible and cause as little load on your server as possible.
Speed is a factor, and page size is a factor, but the number of query strings within a URL is why Google (and other bots) only go so deep into a site - for fear of getting caught in an endless loop.
They are getting better, however.
It's definitely not bogus information. I can get a site's pages indexed a lot quicker with URL rewriting than I can without.
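For anyone wanting to try it, the rewriting side is straightforward with Apache's mod_rewrite. A minimal .htaccess sketch (the script name and parameter here are made-up examples, not from any site discussed in this thread):

```apache
# Hypothetical rule: map the clean URL /product/123
# to the real dynamic script /product.php?id=123
RewriteEngine On
RewriteRule ^product/([0-9]+)$ product.php?id=$1 [L,QSA]
```

The QSA flag keeps any extra query parameters the request already carried, so existing links keep working.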
My point is that if your pages are served slowly, URL rewriting does not matter and only a subset of your pages will be indexed. OTOH, if you make your pages serve very fast, URL rewriting is not necessary, at least for Google. I know that from experience: I have seen Google crawl thousands of pages with no URL rewriting several times.
As for other search engines, I don't know, because it is possible that they try to copy Google's crawling logic as they understand it, which may not be the way it actually works, since Google does not disclose it.
--
Regards, Manuel Lemos
PHP Classes - Free ready to use OOP components written in PHP http://www.phpclasses.org/
PHP Reviews - Reviews of PHP books and other products http://www.phpclasses.org/reviews/
Metastorage - Data object relational mapping layer generator http://www.meta-language.net/metastorage.html
-- PHP General Mailing List (http://www.php.net/) To unsubscribe, visit: http://www.php.net/unsub.php