We already know that millions of people around the world use mobile phones and tablets, and the number of users keeps rising. This drives the exponential growth of the global mobile applications market. Companies which develop mobile applications must deliver high-quality products that stand out from the crowd in order to receive good feedback from customers. This has become a challenge, considering the diversity of mobile devices and operating systems. To ensure broad and efficient coverage, these applications must undergo automated testing, which is why generic solutions must be found, namely solutions that cover the testing of the applications to the greatest possible extent.
Lately, the most important trend in testing mobile applications on a wide range of devices and operating systems has been cloud testing, which supports both manual and automated testing. However, this is quite an expensive solution, and, for the most part, project budgets do not cater for such an approach.
There is quite a large number of cloud testing services which can be used for the automated testing of mobile applications; some examples are Testdroid, SauceLabs, AWS Device Farm and Browserstack. Each provider has hundreds or thousands of mobile devices, ranging from low-end products to top-of-the-range phones and tablets. The technical details behind these solutions differ from one provider to another. As a result, the tests written for one provider will not be 100% compatible with another, an issue that becomes salient when switching providers.
The prices of these services differ depending on the provider and the subscription type. There are cheap subscriptions which might fit in the project budget, but these barely offer 10% of the capabilities provided by a full subscription, and our tests have shown them to be hardly usable in the long run or for efficient testing of the applications. Full subscriptions offer many functionalities which can ensure efficient testing, but their prices range from hundreds of dollars per month to thousands of dollars for enterprise subscriptions.
Because of these high costs, we decided to develop a solution of our own, to replace cloud testing to a great extent. Obviously, this is not 100% possible: a solution as complex and efficient as the current enterprise offerings would require a large and sustained investment of time and resources, and running the tests on emulators is not relevant for obtaining real results (real devices do not behave the same way emulators do, and the execution speed differs considerably).
Our plan was to find an alternative, practical solution that provides a testing service good enough to work on various devices. Obviously, we did not attempt to reinvent the wheel. The solution we put forward still requires physical devices, but there is no cloud platform dictating a particular way of writing and executing tests, no need for the devices to be physically located where the project is, and no need to hand our tested applications over to a third party. Everything is tested on the internal network, without running into security problems.
For starters, we tried to focus on the most popular mobile operating systems: Android and iOS. We started off from a small idea: having a generic framework that we can use as basis for testing mobile applications.
Once we decided on the devices that needed coverage, we shifted our focus towards choosing an automation tool. After evaluating several candidate solutions, we chose Appium.
Appium is an open-source tool for testing native, hybrid and mobile web applications. It is built on top of Selenium, but it focuses on testing mobile applications.
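As Appium reuses Selenium's capabilities-based session model, a test declares the device and application it needs as a set of "desired capabilities". The sketch below builds such a payload with nothing but the standard library; the device name, platform version and app path are hypothetical example values, not part of our framework.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of the "desired capabilities" an Appium session is started with.
// All concrete values below are hypothetical examples.
public class AppiumCapabilities {

    // Serialize a flat string map as a minimal JSON object.
    static String toJson(Map<String, String> caps) {
        return caps.entrySet().stream()
                .map(e -> "\"" + e.getKey() + "\": \"" + e.getValue() + "\"")
                .collect(Collectors.joining(", ", "{", "}"));
    }

    public static void main(String[] args) {
        Map<String, String> caps = new LinkedHashMap<>();
        caps.put("platformName", "Android");        // target OS
        caps.put("platformVersion", "9");           // hypothetical OS version
        caps.put("deviceName", "Nexus 5X");         // hypothetical device
        caps.put("app", "/builds/app-debug.apk");   // hypothetical app path
        System.out.println(toJson(caps));
    }
}
```

In a real test, the same key/value pairs would be passed to the Appium Java client when opening the driver session.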
To execute tests remotely, we created a REST service made up of 2 components:
A server which is deployed on a web server (with a UI which displays the list of visible devices)
Local clients (Agents), which run on the machines where the tests will be executed on (in .jar format)
The server waits for client calls and displays the received information on a UI.
The clients poll (once per minute) for the devices connected to the machine they run on which are ready for test execution. This information is sent to the server, which, in turn, displays it.
The information consists of location identifiers (IPs) and the features of the devices which are ready for test execution.
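The status message an Agent sends once a minute can be pictured as below. The exact field layout (IP, device id, feature list, availability flag) is our assumption for illustration, not a fixed protocol of the framework.

```java
import java.util.List;

// Sketch of the periodic status report an Agent might send to the server.
// The field names (ip, udid, features, free) are assumptions for illustration.
public class DeviceReport {

    static String report(String ip, String udid, List<String> features, boolean free) {
        String featureList = features.stream()
                .map(f -> "\"" + f + "\"")
                .reduce((a, b) -> a + ", " + b)
                .orElse("");
        return "{\"ip\": \"" + ip + "\", \"udid\": \"" + udid + "\", "
                + "\"features\": [" + featureList + "], \"free\": " + free + "}";
    }

    public static void main(String[] args) {
        // Hypothetical device attached to the Agent's machine.
        System.out.println(report("192.168.0.12", "emulator-5554",
                List.of("android", "api28"), true));
    }
}
```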
Appium supports remote test execution, as long as an Appium server runs at the destination URL. To do this, the test component calls the REST service, asking: "is there any device with the <…> features which is free for testing?", and, depending on the answer, decides whether it will execute the tests or not.
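The matching step on the server side can be sketched as a lookup over the device pool reported by the Agents. The device representation below is a simplification we assume for the example:

```java
import java.util.List;
import java.util.Optional;

// Sketch of the server-side matching: answer "is there a free device with
// these features?". The Device shape is an assumption for illustration.
public class DeviceMatcher {

    record Device(String ip, List<String> features, boolean free) {}

    // Return the first free device whose feature set covers everything wanted.
    static Optional<Device> findFree(List<Device> pool, List<String> wanted) {
        return pool.stream()
                .filter(Device::free)
                .filter(d -> d.features().containsAll(wanted))
                .findFirst();
    }

    public static void main(String[] args) {
        List<Device> pool = List.of(
                new Device("192.168.0.12", List.of("android", "api28"), true),
                new Device("192.168.0.13", List.of("android", "api26"), false));
        System.out.println(findFree(pool, List.of("android", "api28"))
                .map(Device::ip).orElse("none"));
    }
}
```

A negative answer (an empty `Optional`) is what makes the test component skip or postpone the run.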
In case of a positive answer, the server provides the Java component of the testing framework with the information necessary for test execution, and the latter, in its turn, sends an execution request to the REST service. The REST service starts the Appium server and binds it to the IP where the execution takes place. After the server is started, the tests are run on the device found at the remote location.
Upon complete execution, the Java component of the test framework sends a call to the REST service to stop the Appium server at the remote location, and the results of the execution become visible locally, on the machine where the test run request was initiated.
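The start/stop lifecycle around each run can be sketched as launching an Appium server process per request. The `-p` (port) and `-a` (listen address) flags are real Appium CLI options; the concrete port value and the one-server-per-run assumption are ours, not the framework's documented behaviour.

```java
import java.util.List;

// Sketch of how the REST service could launch an Appium server for a run.
// -p and -a are Appium CLI flags; the port choice is an assumption.
public class AppiumLauncher {

    static List<String> command(int port) {
        return List.of("appium", "-p", String.valueOf(port), "-a", "0.0.0.0");
    }

    public static void main(String[] args) {
        List<String> cmd = command(4723);   // 4723 is Appium's default port
        System.out.println(String.join(" ", cmd));
        // In the real service this would be handed to a ProcessBuilder, e.g.:
        //   Process appium = new ProcessBuilder(cmd).inheritIO().start();
        // and destroyed once the Java test component reports completion.
    }
}
```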
For now, the remote solution supports only one mobile operating system, namely Android. The tests were successfully executed in various locations and on several devices simultaneously.
We plan to offer support for iOS in the near future. We also aim to drive the entire execution from the UI (from starting the execution to the final report), thus eliminating the need for the local machine that initiates the run. We want to keep improving the solution to streamline the process and offer high-quality services.
by Ovidiu Mățan