By Jim Grey
“Our product is a Web app,” the CEO said. “Why wouldn’t it work flawlessly on an iPad?”
The answer, of course, was that we didn’t build it with the iPad in mind. When we got it to work on our PCs on Chrome, we declared it done. But some guy at one customer site decided to run it on his iPad, and he was upset because it didn’t work “at all.” Support escalated the case to Engineering, where one of the testers poked around the product on her personal iPad. She found that she could log in, navigate, and enter data on any screen. “It was a pain. I’d rather do it on my laptop any day. But it did work,” she said.
So we were puzzled by our user’s bad experience. And then we learned that he had an original iPad. Our tester had a fourth-generation iPad – on more modern hardware and the latest version of iOS. Unless we wanted to shop for an old iPad on eBay, we were never going to get to the bottom of this complaint. The user wasn’t thrilled when we told him that we couldn’t support his old tablet.
Crushed under the weight of platform support
If you don’t tell your users what they can use to access and run your product, they will run it on whatever they want. When it doesn’t work, they’ll demand you support it. This will crush you. You need to set some limits.
This isn’t meant to be done by fiat. You need to think it through: What are your customers likely to want to use? You need to be there. How will they access your product – devices, operating systems, browsers? If your product will be installed on their equipment, what hardware (or virtual servers), operating systems, and databases are they likely to use? If your product interfaces with other products, what versions will your users want to use?
What you’re doing is limiting the edge cases, like that user’s original iPad. You’re also limiting the list to what you can meaningfully test.
Creating a matrix of supported platforms
Create a spreadsheet that says your product runs on and with these versions of these other products. If you support several versions of your product, the list of supported platforms will probably vary a little from version to version.
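If you'd rather keep the matrix somewhere scriptable than in a spreadsheet, the same idea fits in a small data structure. Here's a minimal sketch; the product versions, platforms, and version numbers below are all made up for illustration, not taken from any real matrix:

```python
# A supported-platforms matrix as data, keyed by product version.
# Every platform and version here is a hypothetical example.
SUPPORTED_PLATFORMS = {
    "4.2": {  # product version
        "browsers": ["Chrome 34+", "Firefox 28+", "IE 10, 11"],
        "databases": ["Oracle 11.2.0.x", "SQL Server 2012"],
        "mobile": ["iPad (4th gen, iOS 7+)"],
    },
    "4.1": {
        "browsers": ["Chrome 30+", "Firefox 24+", "IE 9, 10"],
        "databases": ["Oracle 11.2.0.x", "SQL Server 2008 R2, 2012"],
        "mobile": [],
    },
}

def platforms_for(product_version: str) -> dict:
    """Look up what a given product version runs on and with."""
    return SUPPORTED_PLATFORMS.get(product_version, {})
```

However you store it, the point is the same: one authoritative list, per product version, that support and sales can hand to customers.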
Give the spreadsheet to support and sales and say, “Use this to set expectations. If they want to use something that’s not on this list, try to talk them into sticking to the matrix. If they insist on going their own way, tell them we might not help them if they have problems.”
This isn’t a once-and-done job. New versions of your supported platforms will be released. Your users will want to use entirely new products with your product. You’ll have to decide when and whether to add them to your matrix. You’ll need to remove old platforms from the matrix sometimes. You need to keep gathering information and keep updating the matrix.
You can’t support it all: why I hate Android

Still stinging from the customer who had the bad iPad experience, the CEO asked, “Do we have any idea how the mobile Web site behaves on Android? It needs to run on Android, too. It needs to run on any device they have in their hands.”
The problem with Android is that there are eleventeen base versions in active use, each of which a vendor can customize for their devices. And the devices themselves run the processing-power and screen-resolution gamut. When I explained that I’d need to buy a hundred devices and hire twenty testers to test them all, the CEO became less excited about Android.
You can limit the scope of Android, and any other platform with a lot of permutations, by deciding which permutations you’ll support and testing only those. As I write this, the 2.3.x (Gingerbread) versions of Android are the most widely used. You can buy a couple of popular devices on that version and test just those.
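To make that concrete, here's a rough sketch of trimming the Android permutations down to a testable set. The device names are real-world examples from that era, but the usage shares and the cutoff are invented for illustration:

```python
# Trim the universe of Android devices down to a short, testable list:
# keep only devices on a base version we commit to, with enough usage
# to matter. The share numbers and threshold are assumptions.
DEVICES_IN_THE_FIELD = [
    {"device": "Galaxy S II",  "android": "2.3.6", "share": 0.22},
    {"device": "Nexus S",      "android": "2.3.6", "share": 0.11},
    {"device": "Droid X",      "android": "2.3.4", "share": 0.06},
    {"device": "Kyocera Echo", "android": "2.2.1", "share": 0.01},
]

SUPPORTED_ANDROID = {"2.3"}  # base versions we'll stand behind

def test_targets(devices, min_share=0.05):
    """Keep only popular devices on a supported base version."""
    return [
        d for d in devices
        if any(d["android"].startswith(v) for v in SUPPORTED_ANDROID)
        and d["share"] >= min_share
    ]

print(test_targets(DEVICES_IN_THE_FIELD))
```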
The only challenge is that where there are lots of permutations, you’ll have plenty of users on unsupported permutations who aren’t going to be happy.
This reminds me of the browser interoperability problems from several years ago, and why companies usually limited their Web apps to work only on Internet Explorer. Browsers are better now, and it’s more common for Web apps to work on any modern browser. Here’s hoping Android settles down as browsers have.
You can’t test it all: Supported vs. Certified
The items on your supported platforms matrix need to be things your product actually works with. As much as possible, design your product up front to work with them. But if you’ve just created a supported platforms matrix for a product that’s been around a while, the best you can do is test your product with those supported platforms to find out where the bugs are. The knee-jerk approach is to do a “full” (whatever that means) regression test on every platform.
One place I worked, our supported platforms matrix grew huge. When we printed it, we needed a jeweler’s loupe to read it. For each version of our product, there were fifty or more supported platforms. I couldn’t afford the army of testers I’d need to do “full” regression passes on them all.
But all that testing really wasn’t necessary. You can look at what’s new in the latest version of a supported platform and make a pretty good assessment of the risk to your product, and size your testing accordingly. For example, a product I used to work on ran on the Oracle and SQL Server databases. When Oracle 11g Release 2 (11.2.0.1) came out, it offered enough new stuff that we weren’t sure how it would run with our product. So in our next release, we did our normal release testing, including regression, on that database. We also ran some special tests that exercised our most complicated SQL queries on it. We found and fixed some bugs that way.
When Oracle issued 11.2.0.2, we looked at what was new in that release and saw no reason to believe that any of it would create new bugs in our product. Our experience was that if a database version was going to throw bugs in our product, those bugs would be foundational and would appear right away under even light testing. We had a bug-fix release coming up, so we installed 11.2.0.2 on that test server and tested the bug fixes on it. The release sailed through testing, so we shipped it and added 11.2.0.2 to the supported platforms matrix.
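If it helps to see that thought process written down, here's a sketch of sizing the test effort by assessed risk, in the spirit of the 11.2.0.1 vs. 11.2.0.2 example above. The risk levels and the test plans attached to them are my assumptions, not a formal method:

```python
# Map an assessed risk level for a new platform version to a test plan.
# Both the levels and the plans are illustrative assumptions.
TEST_PLAN_BY_RISK = {
    "high":   ["full regression", "targeted tests on our hairiest SQL"],
    "medium": ["regression on core workflows"],
    "low":    ["run the next bug-fix release's tests on the new platform"],
    "none":   [],  # analysis alone; add to the matrix without new testing
}

def plan_for_platform_update(risk: str) -> list:
    """Pick a test plan for a new platform version based on assessed risk."""
    return TEST_PLAN_BY_RISK[risk]

# 11.2.0.1 brought enough new behavior to rate "high"; 11.2.0.2 rated "low".
print(plan_for_platform_update("high"))
print(plan_for_platform_update("low"))
```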
Platforms that got the so-called “full” test treatment were listed as “Certified” on the matrix. What we told our customers was that we had “performed a certification test” on the platform, had fixed the bugs we found, and were highly confident of a good experience on that platform.
Platforms that got light testing were listed as “Supported” on the matrix. Sometimes our analysis showed low enough risk that we did no testing and still called the platform Supported. We told our customers that we expected that they would have a good experience on that platform.
The bottom line for both Certified and Supported was that if the customer experienced a problem, we’d take their call and resolve their issue.
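If you keep the matrix as data, the two levels can live right on each entry. A minimal sketch, reusing the hypothetical structure from earlier; the wording of the customer messages paraphrases the distinction above:

```python
# Tag each matrix entry as certified (full test pass) or supported
# (light or no testing, backed by risk analysis). Names are illustrative.
from dataclasses import dataclass

@dataclass
class PlatformEntry:
    name: str
    level: str  # "certified" or "supported"

    def customer_message(self) -> str:
        if self.level == "certified":
            return (f"We performed a certification test on {self.name}, fixed "
                    "what we found, and are highly confident in your experience.")
        return (f"We expect you'll have a good experience on {self.name}. "
                "Either way, if you hit a problem, we'll take your call.")

print(PlatformEntry("Oracle 11.2.0.1", "certified").customer_message())
print(PlatformEntry("Oracle 11.2.0.2", "supported").customer_message())
```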
The bigger bottom line is that you want a manageable list of platforms on and with which you believe your users will have a good experience with your product, a list that won’t kill you to keep up with. By understanding what your customers want, limiting the list to the most common and important platforms, and varying the depth of testing based on risk, you can keep up.
2 replies on “A thing called platform support and why Android makes me nuts”
I borrowed Jim’s concept of “supported” vs. “certified” (with a slightly different slant) for inclusion in a 3-part article for SearchSoftwareQuality on mobile testing back in 2011. I’ve also shared the concept with a Fortune 500 company as well as a few startups, and it’s always been pretty well received: high-value targets are addressed, and no one feels like the rest of the space has been permanently written off.
[The articles can be accessed here, if you’re interested (free sign-up required):
http://searchsoftwarequality.techtarget.com/tip/Mobility-application-testing-mobile-devices-on-a-budget
http://searchsoftwarequality.techtarget.com/tip/Mobile-application-testing-Cost-effective-strategies (this one has Jim’s concept)
http://searchsoftwarequality.techtarget.com/tip/Mobile-Web-applications-Monitoring-test-triggers ]
I love adaptation. I don’t remember where I got the original idea of supported vs. certified, but it was in a different form when I first encountered it. I adapted it for use at two, going on three, different companies, and it was a little different each time. Context drove the changes.