Contributed by: Shivraj Sabale (Manager, Product Engineering, Clarice Technologies)

With over 7 years of experience managing multiple QA teams, Shivraj has worked across domains ranging from application security and mobile/tablet operating system stack development to analytics and cloud-based services. He has extensive knowledge of test case and defect management and requirements traceability, and has successfully led automation and performance testing efforts using Selenium, PowerShell and JMeter. At Clarice, Shivraj handles multiple QA engagements while acting as a liaison for several teams. He holds a Master's in Software Engineering from La Trobe University and has a research background in process improvement and quality assurance.

 

There was a time when mobile applications could be churned out in a few days and be made available to end users without any formal development cycle or testing.

The days when mobile devices hosted nothing more than 8-bit games and glorified calculators are passé. The growing trend of porting desktop and web applications to mobile devices has made mobile a mainstream computing platform, and mobile devices are now capable of almost every operation that was once possible only on non-mobile devices. It is about time we start thinking about having the right development and quality assurance process in place for this kind of product development.

QA efforts differ according to the nature of the product. For products built for tablets and smartphones, a holistic approach to QA, covering usability, performance and the overall user experience, has to be taken. The goal is no longer just to set up systems for consumers or enterprises to use, but to provide the same functionality at their fingertips, literally.

Resource Management

Although this transition from desktop and server systems to mobile systems is a definite step forward in technology, it takes us back a few years in terms of resource management. All the cadavers of performance bugs we thought we had seen the last of have been unearthed and are now a major worry for most quality assurance teams. Limited computing resources not only stress the overall performance of the stack but also hinder functionality in the most irritating ways possible. Testing the performance of mobile applications, as well as the underlying stack, has become a major consideration when defining the delivery process. Optimizing design artefacts and cutting out unnecessary processor cycles can improve the overall performance of applications. Benchmarking builds for performance and then measuring any delta can be done with open source tools such as monkeyrunner for Android; similar tools are available for iOS and Windows Phone.
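As a simple illustration, a launch-time benchmark of this kind can be driven from a short script. The sketch below uses adb's activity manager (rather than monkeyrunner itself) to time cold starts over several runs; the package and activity names are hypothetical placeholders, and an adb-connected device or emulator is assumed.

```python
# Minimal sketch: benchmark cold-start time of an Android app over several runs
# using adb's activity manager. Package/activity names are hypothetical -- substitute
# your own. Requires adb on PATH and a connected device or emulator.
import re
import subprocess

PACKAGE = "com.example.myapp"     # hypothetical package name
ACTIVITY = ".MainActivity"        # hypothetical launcher activity
RUNS = 5

def launch_time_ms():
    # Force-stop so every run is a cold start, then launch and wait (-W) for completion
    subprocess.run(["adb", "shell", "am", "force-stop", PACKAGE], check=True)
    out = subprocess.run(
        ["adb", "shell", "am", "start", "-W", "-n", f"{PACKAGE}/{ACTIVITY}"],
        capture_output=True, text=True, check=True,
    ).stdout
    match = re.search(r"TotalTime:\s*(\d+)", out)
    return int(match.group(1)) if match else None

times = [t for t in (launch_time_ms() for _ in range(RUNS)) if t is not None]
if times:
    print(f"runs={len(times)} avg={sum(times)/len(times):.0f}ms "
          f"min={min(times)}ms max={max(times)}ms")
```

Running the same measurement on every build gives a baseline against which any performance delta can be tracked.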

Hardware and Software Platforms

One of the major challenges testers face in this domain is the sheer variety of hardware and software platforms, and the fragmentation of a single platform into various versions and flavors adds to the challenge. Emulators come to the rescue here: their optimal use can considerably reduce the stress on hardware resources (physical devices). Hardware dependencies and fall-backs also have to be taken into account. Hardware failures are bound to happen, and the ability of the software to recover gracefully is highly desirable. Not every device that ships to the field can be tested individually, so we have to rely on the promises made by OEMs. However, a well-designed fail-safe mechanism that tolerates some non-standard behavior from the device adds to its overall reliability. Testing such scenarios requires an in-depth study of the platform and the design of the system, and failures and crashes have to be simulated to exercise these fail-safe mechanisms.
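To give a concrete flavor of how such conditions can be simulated, the sketch below sends console commands to a running Android emulator through adb emu to fake a dying battery and a degraded radio link before a functional suite is run. The exact console commands can vary between emulator versions, so treat this as an assumption to verify against your own tooling.

```python
# Minimal sketch: drive the Android emulator console (via "adb emu") to simulate
# degraded hardware conditions -- low battery, no charger, a slow radio link -- so an
# app's fail-safe paths can be exercised. Assumes a running emulator and adb on PATH;
# console command support may differ between emulator versions.
import subprocess

def emu(*args):
    # "adb emu" forwards a command to the emulator's console
    subprocess.run(["adb", "emu", *args], check=True)

# Battery almost flat and no external power
emu("power", "ac", "off")
emu("power", "capacity", "5")

# Throttle the network to EDGE speeds, then cut mobile data entirely
emu("network", "speed", "edge")
emu("gsm", "data", "off")

# ... run the functional test suite here and verify the app degrades gracefully ...

# Restore sane defaults afterwards
emu("gsm", "data", "on")
emu("network", "speed", "full")
emu("power", "ac", "on")
emu("power", "capacity", "100")
```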

Integration Testing

A lot of effort has to be directed at ensuring smooth operation in an ecosystem that integrates not only with various third-party applications but also with cloud services. Integration testing has to be thorough to guarantee this.
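As one minimal illustration of this kind of check, the sketch below probes a hypothetical cloud endpoint the mobile client depends on and verifies the response contract. The URL and field names are placeholders, not part of a real service described in this article.

```python
# Minimal sketch: an integration check that a cloud endpoint the mobile client relies
# on is reachable and returns the expected fields. URL and field names are hypothetical
# placeholders for the real service under test.
import unittest
import requests

BASE_URL = "https://api.example.com/v1"   # hypothetical cloud service endpoint

class CloudSyncIntegrationTest(unittest.TestCase):
    def test_profile_endpoint_contract(self):
        resp = requests.get(f"{BASE_URL}/profile/12345", timeout=10)
        self.assertEqual(resp.status_code, 200)
        body = resp.json()
        # Fields the mobile client relies on; adjust to the real contract
        for field in ("id", "display_name", "last_synced"):
            self.assertIn(field, body)

if __name__ == "__main__":
    unittest.main()
```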

Consistency

Firmware and OS updates on these devices are less frequent than updates to traditional web-based applications, and the chance of receiving detailed, in-depth feedback from users is minimal; often the only feedback you see is in sales figures. This inherently requires the end product to be of top-notch quality before the first release. The device has to be tested for all possible use-case scenarios before a release is approved, because it goes straight into the hands of the end user. Consistency is the key to a good first impression of the product.

 

Conclusion

It is not how many bugs you find that will define the quality of your mobile platform, it is how many bugs your end users will find…

It is not how impressed your designers were with the design, but how your users perceived the workflow…

Lastly, it is not how many applications your platform provided, but how many problems it solved!


