SOASTA, provider of testing platforms for Web and mobile applications, has released a new testing platform for iOS 6 that enables complete automation of performance and functional testing of mobile applications across real, distributed mobile devices. Using the CloudTest Mobile platform, mobile app developers can precisely capture and play back all continuous touch gestures, including pan, pinch, zoom and scroll, on iPhones, iPads and iPods.
According to a press release, the new platform will allow mobile developers to easily automate the testing of their mobile apps on iOS 6. CloudTest’s unique approach of being embedded inside the app allows developers to precisely test their apps as soon as new mobile operating systems are released.
“Starting today, developers are working on iOS 6 versions of their apps and automated functional testing is a real problem for these developers,” stated Tal Broda, VP of engineering at SOASTA. “Today CloudTest provides full support for capture and playback of any app running on iOS 6 including any gesture or any UI element.”
CloudTest Mobile will also support the Appcelerator Titanium 2.0 platform, giving Appcelerator’s 300,000 mobile developers seamless access to CloudTest Mobile’s test automation capabilities.
Tom Lounibos, CEO of SOASTA, said, “Now, with CloudTest Mobile, developers can quickly develop, test and deliver their mobile applications with the quality their users expect. Our support for Titanium 2.0 will make this especially seamless for Appcelerator’s mobile developer community.”
CloudTest Mobile provides cost-effective, precise and comprehensive mobile app testing capabilities on real mobile devices. Device emulators and the optical recognition approaches used by conventional testing solutions do not provide the precision or reliability required for testing this generation of mobile apps. CloudTest Mobile captures the start and end points of each gesture, the navigation between them, and the speed with which the gesture is performed. It uniquely conducts testing from within the mobile app itself, replacing unreliable optical recognition approaches and enabling validations based on variable values and internal app state changes.
Edited by Brooke Neuman