There is a lot of information, and plenty of opinions, available online on building an automation framework. This blog is an attempt to distill my view from all the information out there, along with some additional things I have figured out and worked on.
Here are some of the challenges associated with testing a product that is supposed to work on different platforms: on the web, natively on desktop, and on mobile devices. I have also written a separate blog on “What to test in a cross-platform application?” As a benchmark, I will consider Skype as the application under test.
- OS combinations: An application may need to work on different flavors of each OS type it supports. Imagine an application that is supposed to work on Web, Windows and Mac, and needs to support at least 3 versions of each OS type (based on OS usage as of Dec 2013). That gives 9 distinct configurations, and since a test for a two-endpoint application like Skype involves a caller and a callee on different configurations, the permutations of those 9 configurations taken two at a time come to 9 × 8 = 72, which means you need to run each test 72 times. Obviously you can rule out some combinations based on usage, application design, etc. Now add mobile devices and OSes to this mix and the matrix grows exponentially.
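To see where the 72 comes from, here is a quick sketch. The platform and version names are made up for illustration; the assumption is 3 platforms with 3 supported versions each, and a test that involves two distinct endpoints, as in a Skype call:

```python
from itertools import permutations

# Hypothetical support matrix: 3 platforms x 3 versions each = 9 configurations.
platforms = {
    "Web": ["Chrome", "Firefox", "IE"],
    "Windows": ["7", "8", "8.1"],
    "Mac": ["10.7", "10.8", "10.9"],
}

configs = [(os_name, ver) for os_name, vers in platforms.items() for ver in vers]
print(len(configs))   # 9 configurations

# A call test has a caller and a callee on two *different* configurations:
pairs = list(permutations(configs, 2))
print(len(pairs))     # 9 * 8 = 72 runs per test case
```

Pruning by real-world usage shrinks this list, but adding mobile OS versions multiplies it again, which is why the matrix grows so quickly.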
- Installer/Setup Differences: Many times, software testers find themselves spending a lot of time just setting up for the tests. Different platforms require the use of native package formats such as RPM and MSI. Multi-platform installers such as InstallAnywhere, JExpress, InstallBuilder, or IzPack address this need.
- Feature Parity – Developers are often restricted to the lowest-common-denominator subset of features that is available on all platforms. This may hinder the application’s performance or prohibit developers from using the platforms’ most advanced features.
- Hardware Differences – If your application uses system resources such as audio or video, the different hardware combinations add to the complexity of testing.
- UI Differences – Different platforms often have different user interface conventions, which cross-platform applications do not always accommodate. For example, applications developed for Mac OS X and GNOME are supposed to place the most important button on the right-hand side of a window or dialog, whereas Microsoft Windows and KDE have the opposite convention. Though many of these differences are subtle, a cross-platform application that does not conform to these conventions may feel clunky or alien to the user. When a user is working quickly, such opposing conventions may even result in data loss, such as in a dialog box confirming whether the user wants to save or discard changes to a file.
- Security concerns: Cross-platform execution environments may suffer cross-platform security flaws thus creating a fertile environment for cross-platform malware.
The pyramid above allows you to find bugs more easily and quickly, as you detect issues early on with more code coverage through unit tests and component tests running in CI. The pyramid also allows you to reduce the cost of fixing each bug: any bug found higher up in the pyramid is more difficult to debug and fix, and consumes a lot more time.
Keeping the above pyramid in mind, here are the things that you would need to consider when building an automation framework for an application like Skype:
- Application development technology: This is important for being able to create and write unit tests for the application under test. It also determines how many people will be able to contribute to the framework and tests.
- Frequency of changes in functionality or UI: This would obviously be dependent on the stage of your application. If your application is in its initial stages of development then the functionality and UI may be changing a lot to meet evolving customer needs.
- Future changes: Will the application be in maintenance mode, rewritten, or modified significantly in the near future?
- Powerful vs. easy-to-write test cases: This is also determined by the technical capabilities of the team members and the velocity with which newer tests need to be added.
Automation framework vs Automation tool
A lot of people confuse an automation framework with an automation tool. There are several companies that offer different tools which can be part of a framework and used to achieve the desired level of automated testing. It would be fair to say:
- TestComplete, Eggplant are tools that can be used in a framework to achieve the desired results.
- Selenium, WebDriver, Appium, Robotium are tools/libraries that are used within a framework to automate certain layers of the above pyramid.
- TestNG, Cucumber, JUnit, RSpec are test runners that are used within a framework to organize and execute tests.
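To illustrate the runner’s role (using Python’s unittest here as a stand-in for TestNG/JUnit, with a stubbed login check in place of real application code), the runner only discovers, organizes and executes tests; everything else you build around it is the framework:

```python
import unittest

class LoginTests(unittest.TestCase):
    """Example test class; the runner groups and executes these methods."""

    def test_valid_credentials(self):
        # In a real framework this step would drive the app through a tool
        # such as Selenium; here we assert against a stubbed result.
        self.assertTrue(self.fake_login("alice", "secret"))

    def test_invalid_credentials(self):
        self.assertFalse(self.fake_login("alice", "wrong"))

    @staticmethod
    def fake_login(user, password):
        # Hypothetical stand-in for the application under test.
        return (user, password) == ("alice", "secret")

# The runner's job: load, organize, and execute the tests, then report.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(LoginTests)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("tests run:", result.testsRun)
```

Swap the runner (TestNG for Java, RSpec for Ruby) and the surrounding framework layers stay the same.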
Considering all the factors mentioned above, define the technology stack (Ruby, .NET, Win32, Java, Web/HTML, web toolkits [GWT], etc.) that can be used for testing your application. You can also decide to use different technologies for different platforms and still use them within the same framework through wrapper classes, as long as they can send and receive responses in some common protocol like JSON. I will share specific details about this mechanism in a later blog.
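As a rough sketch of the wrapper idea (all class, action and element names here are hypothetical), each platform-specific driver can hide behind a wrapper that exchanges JSON messages with the framework core, so the core never depends on any one tool:

```python
import json

class PlatformWrapper:
    """Base wrapper: the framework core only ever sees JSON strings."""

    def execute(self, request_json):
        request = json.loads(request_json)
        result = self.handle(request["action"], request.get("args", {}))
        return json.dumps({"status": "ok", "result": result})

    def handle(self, action, args):
        raise NotImplementedError

class WebWrapper(PlatformWrapper):
    """A real implementation would delegate to e.g. Selenium WebDriver."""

    def handle(self, action, args):
        if action == "click":
            return "clicked %s" % args["element"]
        raise ValueError("unsupported action: %s" % action)

# The core speaks the same protocol to every platform wrapper:
wrapper = WebWrapper()
response = wrapper.execute(
    json.dumps({"action": "click", "args": {"element": "loginButton"}}))
print(response)
```

A desktop or mobile wrapper would subclass the same base and answer the same JSON requests, which is what lets one framework drive several technology stacks.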
Building the framework
While building an automation framework, consider putting in place a multi-layered approach so that each layer can be built, maintained and grown independently of the others. Here are the different components/layers to build out when defining a framework:
- First, build out the ‘common’ utility functions/methods, comprising:
- Setup/Teardown – Start with the most used platform for your application. The setup should include starting required services, database connections, etc. The teardown should clean up all test-specific state. You always want each test to start clean, so your setup may also invoke certain teardown methods.
- Reporting mechanism: The test framework you use will largely cover this, but building utilities on top of it allows you to clearly trace and debug your code as it executes. I will soon write a separate blog on reporting.
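A minimal sketch of the setup/teardown idea, using Python’s unittest as the runner; the database connection and cleanup methods here are stubs standing in for real environment plumbing:

```python
import unittest

class BaseTest(unittest.TestCase):
    """Hypothetical base class: every test starts clean and cleans up after itself."""

    def setUp(self):
        # Defensive cleanup first, in case an earlier run left state behind.
        self.cleanup_artifacts()
        self.db = self.open_database_connection()

    def tearDown(self):
        self.db = None          # stand-in for closing services/connections
        self.cleanup_artifacts()

    # --- stubs standing in for real environment plumbing ---
    def open_database_connection(self):
        return {"connected": True}

    def cleanup_artifacts(self):
        self.temp_files = []

class SampleTest(BaseTest):
    def test_db_is_ready(self):
        self.assertTrue(self.db["connected"])

suite = unittest.defaultTestLoader.loadTestsFromTestCase(SampleTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("successful:", result.wasSuccessful())
```

Every test class inherits the same clean-start/clean-exit behavior, so individual tests never have to repeat that plumbing.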
- Next build things for the basic technology layer:
- Objects that would be used in all the tests, e.g. the WebDriver instance.
- Define test data using a data-driven and/or keyword-driven model, and/or the ability to read it from a properties file.
- Build the capability to call different types of API clients, e.g. an HTTP client for REST API testing.
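A minimal sketch of the properties-file piece of this layer, assuming a Java-style `key=value` properties file holding environment-specific test data (the keys and values here are invented; in practice the lines would come from a file on disk):

```python
def load_properties(lines):
    """Parse simple key=value properties lines, skipping comments and blanks."""
    props = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# In practice these lines would come from open("env.properties"); inlined here:
raw = """
# environment-specific test data
base.url = https://staging.example.com
api.timeout = 30
default.user = testuser
""".splitlines()

props = load_properties(raw)
print(props["base.url"])
print(int(props["api.timeout"]))
```

Keeping environment data outside the test code is what lets the same tests run unchanged against different environments and platforms.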
- Third, build out an abstraction layer of ‘Components’ that uses the two previous layers to construct the tests.
- The components, in combination with the keywords and other technology functions/methods, give you a flexible way to implement both functional and data-driven tests.
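To make the component/keyword idea concrete, here is a toy sketch (all component and keyword names are invented) in which a test is just a sequence of keywords plus data, dispatched to component methods that wrap the lower layers:

```python
class LoginComponent:
    """Component layer: wraps lower-level technology calls behind business keywords."""

    def __init__(self):
        self.actions = []  # records what happened, in place of a real driver

    def open_app(self):
        self.actions.append("open_app")

    def enter_credentials(self, user, password):
        # A real component would type into fields via the technology layer.
        self.actions.append("enter_credentials:%s" % user)

    def submit(self):
        self.actions.append("submit")

def run_keyword_test(component, steps):
    """Interpreter: each step is (keyword, kwargs); dispatch to the component."""
    for keyword, kwargs in steps:
        getattr(component, keyword)(**kwargs)
    return component.actions

# Data-driven: the same steps can be replayed with different data rows.
steps = [
    ("open_app", {}),
    ("enter_credentials", {"user": "alice", "password": "secret"}),
    ("submit", {}),
]

trace = run_keyword_test(LoginComponent(), steps)
print(trace)
```

Because the test is pure data, non-programmers can add cases, and swapping the component’s internals (say, Selenium for Appium) leaves every test untouched.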
The key here is to be able to extend and push the limits of one or two tools (or more) to accomplish the end goal. By separating out the layers and keeping them flexible, you get a unified core framework that other layers can utilize while remaining customized to the type of system you’re working with. Build the framework so that it is tool-independent, and so that current tools can be replaced with new ones if they prove better or solve more problems in the future. Think forward from the product perspective as well: if the product is going to add features or functionality that are crucial for the business, the framework and automation coverage need to be prepared for that. In my next blog I will share more detailed information on creating a framework for a cross-platform product like Skype.