on Tue, Feb 19, 2002, Kent West ([EMAIL PROTECTED]) wrote:

> Sorry for the Off Topic post, but I have a lot of confidence in the
> opinions/knowledge of the folks on this list.
>
> I am not a web developer; know next to nothing about it.
>
> I have a co-worker who is developing some web pages. I've encouraged him
> to pass his work through the W3C HTML validator. He says it fails, and
> that if he recodes to pass, his work appears differently on different
> browsers. For example, he has two frames next to each other that he
> wants to have look like one piece, but Netscape 6.x puts a buffer around
> each frame unless he inserts a non-W3C-approved tag, so that there's a
> gulf between the two frames.
>
> So my question is this:
> Are the W3C standards insufficient to allow the web
> designers to do what they need to do, or is my
> co-worker missing a technique that he needs to know?
>
> In other words, are the W3C standards sufficient to provide a
> browser-agnostic world, with all the features that designers need? Or
> does the W3C-approved label simply mean that the page is coded to the
> least common denominator, and is therefore not practical for
> PHB-oriented web sites?
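A quick aside on the frames question before the main points. The gap between
adjacent frames is usually governed by frameborder, which HTML 4.01 Frameset
does define, plus border/framespacing on FRAMESET, which are vendor extensions
and fail the validator. That mismatch is the likely reason a validating page
still shows a seam in some browsers. A minimal sketch, not your co-worker's
actual markup (the file and frame names are made up):

    <!-- frameborder="0" on FRAME is valid HTML 4.01 Frameset markup;
         border="0" and framespacing="0" on FRAMESET are proprietary
         Netscape/IE extensions, which is what trips the W3C validator. -->
    <frameset cols="200,*" border="0" framespacing="0">
      <frame src="nav.html"  name="nav"  frameborder="0">
      <frame src="main.html" name="main" frameborder="0">
      <noframes><body>A frames-free version belongs here.</body></noframes>
    </frameset>

The practical choices are to keep the page valid and accept a small gap in
browsers that insist on one, or to drop frames for a table or CSS layout,
which sidesteps the problem entirely.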
A few random points.

First. At most sites, some version of IE accounts for 75 - 90% of traffic. For
designers writing for the masses, designing to that platform makes a fair bit
of sense, repugnant as it may be to those of us on the GNU/Linux fringe.
Personally, you'll have to pry Galeon out of my cold, dead hands. The ultimate
surfing platform today is GNU/Linux, not Legacy MS Windows.

Second. There's a bit of a rant I'm kicking around under the title of
"Arachnophobia". It turns out that the things you do which are
standards-friendly are also pretty damned helpful at getting search engines to
trawl through and catalog your site. Google is the equivalent of a blind
person searching the web. A blind person with 300 million best friends who
hang on his every word. The same things that lead to cross-platform issues --
Javascript, Java, Flash, text-as-images, CGI, "personalization" -- all tend to
reduce the ability (and/or inclination) of a search engine to crawl and
archive a site. As the online field becomes ever more fiercely competitive,
it's exposure, not "the message", that's vital for survival. My current
employer knows this lesson and exploits it intensively. Much as I'd like to
see browsers drive standards compliance, I suspect the real kick is going to
come from search engines, and that's spelled G-O-O-G-L-E everywhere that's
anywhere. I've encountered a surprising number of sites with Flash-only front
pages which are entirely excluded from search engines. Darwin still rules.

Third. For those of you who're browsing from GNU/Linux but are having problems
with poorly designed sites, particularly with fucked up font face and size
preferences, CSS is your friend. I've created a userContent.css suitable for
use with Galeon, Mozilla, Konqueror, Skipstone, and other standards-compliant
browsers. It's extensively commented (though largely simple), and makes a hell
of a difference in creating a readable web. You're welcome to try it:

    http://kmself.home.netcom.com/Download/userContent.css

For "before" and "after" demonstrations, see pages with identical markup but
differing stylesheets:

    http://kmself.home.netcom.com/Download/test.html
    http://kmself.home.netcom.com/Download/test-css.html

...results may vary if you have your own stylesheet in place. (A minimal
sketch of the sort of rules such a stylesheet contains follows the signature,
below.)

Peace.

-- 
Karsten M. Self <kmself@ix.netcom.com>     http://kmself.home.netcom.com/
 What part of "Gestalt" don't you understand?     There is no K5 cabal
  http://gestalt-system.sourceforge.net/          http://www.kuro5hin.org
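P.S. For anyone curious what such a user stylesheet looks like without
downloading mine, here is a minimal sketch of the idea; the selectors and
values are illustrative only, not copied from the file above. In
Mozilla-family browsers (Mozilla, Netscape 6.x, Galeon) userContent.css lives
in the chrome/ subdirectory of your profile, and "!important" is what lets
user rules win over page-author styles under the CSS2 cascade.

    /* Minimal userContent.css sketch -- illustrative, not the file above. */
    body, p, td, li, div {
      font-family: serif !important;   /* ignore hard-coded font faces      */
      font-size: 1em !important;       /* restore your configured base size */
    }
    tt, pre, code {
      font-family: monospace !important;
    }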