Support for automated app testing in MobileTogether Designer allows developers to create, manage, and execute automated test cases directly within the development environment.
Automating repetitive testing tasks reduces the chance of human error and increases the reliability of your apps before deployment. Automation also brings consistency that streamlines the QA process, helping ensure that enterprise and app store apps perform as expected in real-world conditions.
MobileTogether offers app testing in combination with integrated, cross-platform app simulation and comprehensive debugging for perfecting apps at every step of the process.
To get started with automated app testing, record a test case by interacting with your enterprise solution or mobile app as a typical user would. Once you press record on the testing toolbar, MobileTogether captures everything as you navigate the UI and perform actions such as clicking buttons, filling out forms, and submitting data. Every interaction is logged in a reusable test script.
Once a test case is recorded, you can play it back to ensure the app looks and behaves consistently across iOS, Android, Windows, and web browsers. It is also important to re-run a test after making changes to the app development project. Such regression testing ensures that updates or enhancements do not break existing functionality.
You can opt to play back a test case using the simulator in MobileTogether Designer or on a connected device such as a smartphone. You can adjust the playback speed to run through the test at the desired pace.
The Manage Test Cases dialog in MobileTogether shows all previously recorded test cases and the associated test runs (i.e., playbacks). This makes it easy to see the results of all test runs, and you can also organize multiple tests into suites and set up recording and playback options for subsequent tests.
Comparing the results of different test runs helps identify potential regressions and maintain the quality of your app or solution over time. When you select two or more test runs to compare, the comparison tool highlights differences, including variations in execution time, actions taken, success rates, and errors encountered. It also shows the resulting state of the controls and data sources.
Comparing a new test with a known successful run is important to verify that changes to the codebase do not introduce bugs.
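The comparison idea above can be illustrated in a generic way: a baseline (known-good) run is diffed against a new run across fields such as actions taken, errors, and execution time. The following Python sketch uses a hypothetical dictionary format invented for illustration; it is not MobileTogether's internal data model, which is managed for you by the Manage Test Cases dialog.

```python
# Generic sketch of regression comparison between two recorded test runs.
# The record format (actions, errors, time_ms) is an assumption for
# illustration only, not MobileTogether's actual storage format.

def compare_runs(baseline, candidate):
    """Return a dict describing how candidate differs from baseline."""
    diffs = {}
    if baseline["actions"] != candidate["actions"]:
        diffs["actions"] = (baseline["actions"], candidate["actions"])
    if baseline["errors"] != candidate["errors"]:
        diffs["errors"] = (baseline["errors"], candidate["errors"])
    # Flag a large swing in execution time (e.g. more than 25% slower)
    if candidate["time_ms"] > baseline["time_ms"] * 1.25:
        diffs["time_ms"] = (baseline["time_ms"], candidate["time_ms"])
    return diffs

baseline = {"actions": 12, "errors": [], "time_ms": 800}
candidate = {"actions": 12, "errors": ["submit failed"], "time_ms": 820}
print(compare_runs(baseline, candidate))  # → {'errors': ([], ['submit failed'])}
```

An empty result means the new run matches the known-good baseline; any non-empty entry points at a potential regression worth investigating.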
For more extensive enterprise testing scenarios, test cases can be deployed to your organization's MobileTogether Server, allowing for testing in production-like environments. Server logs and detailed analysis tools provide insights into how the application will perform in real-world conditions.
You can deploy one or more test cases of the active design to the server. If a test case is made active on the server, it is played back each time the solution is started on a client; in this way, a single test case can be played back on multiple clients. Each playback is stored on the server as a test run and displayed in the Automated Tests dialog on MobileTogether Server.