Wednesday, November 24, 2010

SQA and the Scientific Method

My son has been learning about the scientific method in his science class. As I've been helping him with his homework, I realized that I use the scientific method when I find a bug.

For example, suppose you're testing remote access software installed on a Windows client. You notice that one system keeps losing its connection to the server. This is something I ran into once. Now, if you're a test monkey, you'll write up a bug saying, "Brokey brokey, no worky" and let development figure it out.

However, if you're reading this blog, you like making it easy for developers. So, you'll wind up asking yourself, "Why does this one system have a problem with disconnecting from the server?" At this point, you've just started approaching this from a scientific point of view.

Next, you'll do some research and eliminate variables. What's unique about the one system with the problem? What could cause the connection to drop? Is it the network it's connected to? Is it a bad cable? Does it just not like me?

Once you've decided what could be causing the problem, you'll start with the first hypothesis. You'll want the simplest and easiest to test, so maybe it's the network. You'll test the hypothesis by moving the "bad" computer to the same network as the "good" computer. In fact, you could even use the same network cable that the "good" computer used. If it still fails, you've eliminated three variables (network, cable, and port on the switch). If it works, you've narrowed the cause down to one of those three.

If it still fails, it's back to the hypothesis and experiment loop. You'll want to keep eliminating variables until you find the cause of the problem. Maybe it's faulty hardware. Maybe it's another app. Maybe it's a feature unique to the computer.

In my case, the failing system was a laptop. After some experimentation, I traced the problem to the SpeedStep feature. If I turned that off, it worked fine. I entered the bug. When a developer got it, the root cause was found in minutes. It turned out that the API used to time the 60-second keep-alive packet failed if the processor speed changed. When the app launched, CPU usage was high, so the processor ran at full speed. Once it went idle, the processor slowed down, which slowed the timer down. The client then missed the keep-alive deadline, and the server assumed it had disconnected and closed the pipe.
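The post doesn't name the API involved, but the failure mode is easy to sketch. Assume, as an illustration, that the timer converted a raw CPU cycle count to seconds using the clock frequency sampled once at startup; all the names and numbers below are hypothetical:

```python
# Hypothetical sketch of the root cause: a timer that converts CPU cycle
# counts to seconds using the frequency measured once at app launch.
# When SpeedStep later halves the clock, cycles accumulate at half the
# expected rate, so the timer runs slow and the keep-alive fires late.

KEEPALIVE_INTERVAL = 60.0   # seconds between keep-alive packets
SERVER_TIMEOUT = 90.0       # illustrative: server drops a silent client

def elapsed_seconds(cycles, startup_hz):
    """What the buggy timer believes has elapsed, given a raw cycle count."""
    return cycles / startup_hz

startup_hz = 2_000_000_000  # full speed while CPU usage is high at launch
idle_hz = 1_000_000_000     # SpeedStep halves the clock once the app idles

# Real time until the timer *thinks* 60 seconds have passed: it needs
# 60 * startup_hz cycles, but cycles now accumulate at only idle_hz.
cycles_needed = KEEPALIVE_INTERVAL * startup_hz
real_seconds = cycles_needed / idle_hz
print(real_seconds)                    # 120.0: the packet goes out 60 s late

# By the time the server's timeout expires, the timer has barely moved:
believed = elapsed_seconds(SERVER_TIMEOUT * idle_hz, startup_hz)
print(believed)                        # 45.0: timer thinks only 45 s passed
print(real_seconds > SERVER_TIMEOUT)   # True: the pipe is already closed
```

With the clock halved, the keep-alive arrives a full interval late, well past the (assumed) server timeout, which matches the disconnects described above. Turning SpeedStep off pins the frequency at its startup value, so the conversion stays correct and the timer fires on schedule.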

A good bug report starts with a question, then some research. After that, it's a cycle of coming up with a hypothesis, testing it, and repeating until you can prove a hypothesis and find the cause. Finally, you report the findings to a developer through a bug report and, hopefully, get the bug fixed.

Friday, May 21, 2010

Visual Studio 2010 Launch Event

Earlier this week, I went to the Microsoft Visual Studio 2010 launch event. It was informative and gave me a few ideas about how my testing is going to have to change in the upcoming months.

Microsoft seems to be pushing multi-touch screens. Testing a multi-touch screen is going to be different. At first, it's going to have to be mainly manual testing since I don't see any tools that would allow automating it. In fact, I just don't see how you can automate testing gestures. Sure, you could have some sort of simulator or even wire directly into the input stream, but a fully-automated system won't be able to account for human movement. The automated gestures would be too perfect. Touch screen interface testing is all about how it "feels," and you can't automate human perception (at least not yet).

Two more hot topics are SharePoint and Windows Phone 7. SharePoint doesn't directly affect my testing, but the company I work for is moving towards it. It's going to change how I organize my test plans and testing tools as well as how I find information such as PRDs and MRDs. It's definitely much better than throwing everything in a network share, but it's going to take some getting used to.

Windows Phone 7 will likely affect my testing. I would be surprised if I don't wind up having to test some apps on that platform in the near future. It looks like it has some cool features, but we'll have to see how it does against the big boys in the mobile market. Sure, the market is expanding, but there are already some major players in there with a strong market share. Even MS will have a hard time gaining a large chunk of share, no matter how good they can make Windows Phone 7.

I did notice a couple of things during the event. Microsoft supplied lunch, and in with my sandwich was some fresh fruit. I thought it was kind of ironic that, of all the fruit to choose, Microsoft wound up giving out apples at a Microsoft event.

The other thing I noticed was that just about every person giving a demo wound up lost in their own application. They couldn't find menu items. They didn't see a missing brace in the code. They didn't know where the icon to launch their demo was on the desktop. It pretty much confirms "Troy's Theory of Technical Throngs:" the amount of time it takes to find something on a computer is directly proportional to the number of people watching you.