As a developer, I find it odd that "Industry Leaders" would be pressuring programmers to write more secure code. Any developer I know will write the most secure code they can - to a point. That point is normally how much the client is willing to pay for that security. For me to lock down every one of my applications, both while coding it and through security reviews upon completion, could easily double or triple the bill the client has to pay. When clients balk at a 5- or 6-hour change request because it will cost them $400-$700 (for example only), they would walk away from the table if that number suddenly jumped to $2,000 just to make sure every possible security hole had been addressed.
That said, a good programmer will address as many security holes as possible during development. But there is no crystal ball that says, "In 1 year, a new bug will be found in the underlying system - if you code things this way instead, you'll avoid security concerns due to this bug."

So, I guess it comes down to this: everyone needs to strive for better and more secure code/applications/environments. However, the bottom line will come into play eventually - and the end user will foot the bill every time, in one way or another. When a solution can be found that addresses security well, without passing costs onto the end user, while meeting the requirements of the application (which can include cost effectiveness and ease of use), then we will see very large strides in the security landscape. In the meantime, it's like Curtis has said - we must strike a balance and be prepared if we choose wrongly.

My thoughts.

Shawn

-----Original Message-----
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On Behalf Of Curtis Sloan
Sent: Wednesday, May 19, 2004 8:24 PM
To: CLUG General
Subject: Re: [clug-talk] Buggy software and usability issues ...

On Wed May 19 2004 18:12, Andrew Graupe wrote:
> bogi wrote:
> > http://www.cnn.com/2004/TECH/biztech/05/19/computer.security.ap/index.html
> >
> > A good reading, especially for usability and security groups and
> > developers. Cheers
> > Szemir
>
> <snip>
> ease-of-use, security, and a large number of features. In my
> experience, security is not a set destination; it is different for
> everyone.

Adding to Andrew's thoughts...

Further to the security/accessibility continuum principle is the idea of "security as a process". It's never a finished product.

Commentary on the article...

Making "cybersecurity" real is at odds with the "capitalism" that brought pervasive computing and Microsoft ubiquity to market. How would we like to see government-mandated limitations on the numbers of each deployed OS?
"Sorry, you can't use Linux because it would form a quorum majority over the anti-monoculture policy limits. Instead, you get to use... let's see: BeOS. Yes, we know it's not developed or supported, but that's what's left on the list." ;-P

I sincerely hope that this is not just another cry for a new/different "security" scapegoat (since blaming end users hasn't worked out).

The final 2 paragraphs are accurate and telling:

------------
"Cybersecurity is everyone's responsibility, including the vendors, the users, enterprises and government agencies," said Greg Garcia of the Information Technology Association of America, one of the industry's leading trade groups.

<snip>

Both groups, however, said they oppose government mandates on security.
------------

As long as security is voluntary and optional, it isn't. :-P That's largely because the need for security is a human problem, not a technological one.

My 2 cents,
Curtis

_______________________________________________
clug-talk mailing list
[EMAIL PROTECTED]
http://clug.ca/mailman/listinfo/clug-talk_clug.ca

