Security is an important part of any product. Whether it's a network-attached drive, a router, a computer, or even a flash drive, if a customer's data can be accessed by an unauthorized user, it's bad. I'm not a security expert, but I have found my share of security flaws in products I've tested. Some were really bad (giving access to data over the Internet). Some weren't as bad (giving access to limited information on the LAN). I've pushed for them all to be fixed.
However, there are two things that always wind up triggering "battles" between product management and SQA:
1) Exposing applications unnecessarily. An example would be a server listening on port 80 when it could easily listen on a random port, such as 37272. I'll put a bug in stating that it should be moved, and invariably it comes back as "security through obscurity isn't secure." I completely agree with that and have even blogged about it in the past (see http://artofsqa.blogspot.com/2011/03/insecurity-through-perspicuity.html).
However, I believe in insecurity through visibility (yes, I've used "perspicuity" in the past, but "visibility" is a much more common word, easier to type and spell, and its connotations fit the phrase better). I'm not saying that "hiding" the app makes it more secure, but exposing it means that if there is a security flaw, the "bad guys" are more likely to find it before you do. After all, most port scanners that hit random IP addresses only probe a limited set of common ports. Why let your app get hit by port scanners if it doesn't have to be?
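To make that concrete, here's the kind of quick exposure check an SQA engineer might run against a unit under test: probe the handful of "popular" ports that drive-by scanners actually hit and see which ones answer. This is just a sketch in Python; the device address and port list are made-up examples, not any particular product's test suite.

import socket

# Probe a few commonly scanned ports on the unit under test.
# DEVICE and COMMON_PORTS are assumptions for illustration only.
DEVICE = "192.168.1.50"
COMMON_PORTS = [21, 22, 23, 80, 443, 445, 8080]

for port in COMMON_PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        result = s.connect_ex((DEVICE, port))  # 0 means the port accepted the connection
        print(f"{DEVICE}:{port} {'open' if result == 0 else 'closed/filtered'}")

A service sitting on 37272 simply won't show up in a sweep like this, which is exactly the point: not more secure, just less visible to the people looking for easy targets.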
2) Sending data in a way that appears to be insecure, even if it isn't. For example, a product sends the source code of its PHP files when you do a GET without logging in. No customer data is exposed, and the product has open source firmware, so anybody can get the source code anyway. There is no real risk, but it is still bad from a marketing perspective.
A perception of insecurity is just as bad as a flaw in security. Because of the pride factor in being the first to disclose a vulnerability, some security researchers do not take the exploitability of the vulnerability into account and are quick to disclose. In the example above, the disclosure might state that pages are accessible without logging in. A disclosure like this means somebody researching your product might shy away from purchasing it because it's "insecure."
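The nice thing about this kind of perception problem is that it's cheap to check for in an automated test pass. Here's a rough sketch in Python: request a PHP page without authenticating and flag it if the server hands back raw PHP source instead of rendered HTML. The device address and page name are hypothetical; a real check would walk every page the web UI serves.

import requests

# Fetch a page without logging in and look for leaked PHP source.
# The URL is a made-up example for illustration.
URL = "http://192.168.1.50/admin/status.php"

resp = requests.get(URL, timeout=5)
if "<?php" in resp.text:
    print("FAIL: unauthenticated GET returned raw PHP source")
else:
    print(f"PASS: no PHP source returned (HTTP {resp.status_code})")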
Friday, August 24, 2012