Testing the Internet of Things? It’s time to plan your test strategy

What is your reaction to a technology that the McKinsey Global Institute values at up to $6.2 trillion in economic impact by the year 2025? Of course, I am referring to the “Internet of Things”, or “IoT” as it is commonly called. Though $6.2 trillion may be a long shot, the fact is that IoT has huge market potential and is expected to connect anywhere between 20 and 30 billion devices within the next couple of years.

Some of these devices are consumer-focused, like smartphones; others are industrial, for example, medical equipment that can communicate in real time with providers and insurance companies while monitoring patient health, or cars that stream driving patterns to auto insurance companies, send engine and operational details to service centres, or suggest alternative travel routes by reading traffic patterns. While these changes are real and set to completely alter the way we look at business models and use products, they also pose some real technological challenges, especially in testing and validating the functional accuracy of such complex, highly interdependent, data-intensive and real-time communicating devices.

The world of IoT is all about interconnected devices communicating, analysing terabytes of data, providing information and taking appropriate actions in real time. In short, it is all about smart devices connecting smart cities, communicating with smart buildings and smart industries to provide smart transport, smart energy and smart health, enabling us to live a smart life. Smart buildings could consist of communicating devices that monitor and communicate the status of HVAC (heating, ventilation and air conditioning), lighting, security and building access. Smart energy could include devices that monitor and communicate the status of UPS units, batteries and generators, while smart homes could include devices that monitor and communicate the status of digital cameras, computers, TVs, lighting and alarm systems. Smart health could consist of devices that monitor and communicate the status of MRIs, implants, surgical equipment, lab equipment and other health monitoring devices.

Different types of devices communicate and exchange data simultaneously, relying on a strong analytics engine and acting independently of human intervention. Testing and certifying such a complex environment will be a huge challenge for organisations, so it is critical to implement a fool-proof test strategy. In order to do this, one must consider certain technology factors, including:

  • Pricing model: The standard T&M or fixed-bid pricing model may not be mutually beneficial to the testing team or the client, unless there is already some implementation in place from which to understand the different dynamics. Pricing models based on output, or preferably on outcome, may be good to start with. Output-based pricing can be measured by the number of integration points or use cases tested; outcome-based pricing can be based on meeting product quality SLAs. Alternatively, variable pricing models, with a fixed charge for infrastructure and software licence costs along with an output- or outcome-based charge for the work, should also be considered.
  • Skill set: This is a very critical factor, especially considering the complexity of onboarding team members. They must have adequate experience in different domain-specific business processes, an understanding of different devices and how they communicate, and a good understanding of systems engineering (architecture, algorithms, analytics, etc.). The traditional model of onboarding people with only black-box testing skills will not work.
  • Test environment management: Factors to consider when setting up an effective testing environment are:
    • Procuring the necessary hardware: buying every device is not an option, because there will be too many devices with different versions, and newer models will keep being launched.
    • Integrating the devices to form a network of interconnected systems, for example, a hospital management system connected with your car, home and wearable health monitoring systems.
    • Mimicking the operation of devices in real-world conditions (for example, heat and cold, power outages, electronic interference, etc.).
    • Simulating different network capabilities (broadband, 3G, 4G, 5G, low bandwidth, etc.).
  • Test data management: This is probably one of the most complex needs in the case of IoT based applications. It involves:
    • Creating terabytes of data that mimic the behaviour of multiple domain-specific business processes: banking data, insurance data, hospital management and patient-care data, patient information, demographic data and device-specific data (different devices in a hospital, for example, generate different types of data).
  • IoT standards: There are no industry-accepted standards at this point in time. Until such standards are identified and universally adopted, clear agreement on the standards to be followed must be established. This can include the protocols supported, data exchange formats, data periodicity and device performance standards.
  • Data privacy: This is one of the most critical factors in determining the success of IoT implementations, and it requires significant effort from the testing and certification perspective. Testing must validate compliance with medical, insurance, banking, financial services and a host of other government and industry regulatory frameworks. Think about distributed data being stored on cloud infrastructure as well as on other servers worldwide.
  • M2M & M2H interaction: Machine-to-machine and machine-to-human interactions rely on a machine’s artificial intelligence capabilities and its ability to interpret situations based on data and trigger an appropriate response. Minimal human interference means most actions are performed automatically, giving human intelligence and emotions less time to respond.
  • Self-healing systems: In the interconnected world, systems will be designed to auto-heal (recover) and resume their activities. From a testing perspective, this means designing scenarios that bring down devices and validate the continuity of business processes without failure. This is going to be truly complex: you are dealing with hundreds of thousands of devices, and you will need a very meticulous mechanism to sequence a series of device failures.
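Because procuring every physical device is impractical, much of the test environment above can be approximated in software. The sketch below is a hypothetical illustration (the class and field names are my own, not any particular tool’s API) of mimicking a small device fleet on mixed firmware versions, generating synthetic sensor data, and pushing readings through a simulated lossy, high-latency link:

```python
import random
import time

class MockDevice:
    """Software stand-in for a physical IoT device (hypothetical sketch)."""

    def __init__(self, device_id, firmware="1.0"):
        self.device_id = device_id
        self.firmware = firmware

    def read_sensor(self):
        # Emit a fake temperature reading, as a real device would.
        return {"device": self.device_id,
                "firmware": self.firmware,
                "temp_c": round(random.uniform(18.0, 26.0), 1)}

class FlakyNetwork:
    """Simulates degraded network conditions: latency and packet loss."""

    def __init__(self, latency_s=0.0, loss_rate=0.0):
        self.latency_s = latency_s
        self.loss_rate = loss_rate

    def deliver(self, message):
        time.sleep(self.latency_s)           # simulated link latency
        if random.random() < self.loss_rate:
            return None                       # simulated packet loss
        return message

# A small fleet mixing firmware versions, pushed through a slow, lossy link.
fleet = [MockDevice(f"sensor-{i}", firmware=v)
         for i, v in enumerate(["1.0", "1.0", "2.1"])]
network = FlakyNetwork(latency_s=0.01, loss_rate=0.3)
delivered = [m for m in (network.deliver(d.read_sensor()) for d in fleet)
             if m is not None]
```

In a real project, tuning `latency_s` and `loss_rate` to profiles resembling broadband, 3G or low-bandwidth links gives the network-simulation coverage described above without physical test beds.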
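For the self-healing point in particular, the meticulous mechanism for sequencing device failures can start as a simple plan generator. This hedged sketch (hypothetical device names, illustrative only) enumerates ordered failure sequences that a test harness could then inject while asserting that business processes continue:

```python
import itertools

def failure_sequences(devices, max_down=2):
    """Enumerate ordered sequences of device failures to inject.

    A real harness would drive each sequence against the environment
    and verify business continuity while those devices are down.
    """
    for k in range(1, max_down + 1):
        for combo in itertools.permutations(devices, k):
            yield combo

# Hypothetical device IDs for illustration.
devices = ["gateway-1", "hvac-ctrl", "camera-7"]
plans = list(failure_sequences(devices, max_down=2))
# With 3 devices: 3 single failures + 6 ordered pairs = 9 plans.
```

Exhaustive enumeration clearly does not scale to hundreds of thousands of devices; the point is that even a sampled or prioritised subset of such sequences needs to be generated systematically rather than ad hoc.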

Having carefully considered the above factors, the next important step is to determine the test types. Though no new type of testing is needed, the testing team needs to run different types of tests with varying degrees of detail in order to reach a decent level of test coverage. Here are some of the common tests that are relevant in the context of IoT.

  • Compatibility and interoperability testing (C&I testing): C&I testing focuses on validating data flow between different devices and operating systems, following different messaging protocols, over different types of networks. Validating compatibility standards and interfaces is key to this testing type.
  • Performance testing: No longer restricted to simple load generation for a small set of business scenarios, IoT takes performance testing to a different dimension. When testing large volumes of varying types of data and running business scenarios over different devices and networks, you will have to set up monitors not only to measure response times between different layers in the network, but also to carefully watch system statistics of the devices, including power usage, changes in device temperature, memory usage and disk space usage. Reliability testing, endurance testing, scalability testing, failover and disaster recovery testing are all essential.
  • Analytics and business rules testing: As the success of IoT rests on real-time analysis and the provisioning of real-time responses, testing analytics algorithms and validating thousands of critical business rules will pose a huge challenge. Imagine the outcome if a wrongly implemented business rule administered more than the needed dosage of medicine to a patient, or switched on an air conditioning unit in a smart building during a fire.
  • Application security and data privacy: With pretty much everything we do being connected, and information passing between different systems spread geographically, application security and data privacy testing are unavoidable.
  • Accessibility testing: Smart cities connected to smart buildings and communicating with wearable devices mean you will definitely have to test for accessibility. This will include testing for regulations far more complex than what the current WCAG 2.0 standards define.
  • I18N and L10N: Short for internationalisation and localisation, these are other very important tests. Validation of content delivery that adheres to different geographies, cultures and government regulations is essential.
  • Regulatory compliance testing: Testing is never complete without validating the application’s compliance with regulatory requirements, so this is a crucial testing type.
  • Upgrade testing: You must think about your regression testing strategy for when different devices get upgraded or replaced with better solutions.
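A concrete starting point for C&I testing is checking that every device’s messages conform to the agreed exchange format. The required field names and supported format below are assumptions for illustration, standing in for whatever standard the project has agreed on:

```python
import json

# Hypothetical agreed message contract for this project.
REQUIRED_FIELDS = {"device_id", "ts", "payload"}

def validate_message(raw, fmt="json"):
    """Return the sorted list of required fields missing from a message.

    An empty list means the message conforms to the agreed format.
    """
    if fmt != "json":
        raise ValueError(f"unsupported format: {fmt}")
    msg = json.loads(raw)
    missing = REQUIRED_FIELDS - msg.keys()
    return sorted(missing)

ok = validate_message('{"device_id": "s1", "ts": 1, "payload": {}}')
bad = validate_message('{"device_id": "s1"}')
```

Running such a conformance check across every device type, firmware version and protocol adapter turns the interoperability agreement into an executable test rather than a document.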
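The device-statistics side of performance testing can be expressed as threshold checks over telemetry sampled during a load run. This is a hypothetical monitor (the metric names, devices and limits are made up for illustration); a real run would pull these numbers from the devices themselves:

```python
from statistics import mean

def check_device_stats(samples, limits):
    """Flag devices whose averaged stats exceed configured limits.

    samples: {device_id: [{"temp_c": ..., "mem_pct": ...}, ...]}
    limits:  {metric_name: maximum_allowed_average}
    """
    violations = []
    for device, readings in samples.items():
        for metric, limit in limits.items():
            avg = mean(r[metric] for r in readings)
            if avg > limit:
                violations.append((device, metric, round(avg, 1)))
    return violations

# Illustrative telemetry captured during a load test.
samples = {
    "gateway-1": [{"temp_c": 71.0, "mem_pct": 40},
                  {"temp_c": 75.0, "mem_pct": 42}],
    "sensor-3":  [{"temp_c": 35.0, "mem_pct": 90},
                  {"temp_c": 36.0, "mem_pct": 95}],
}
limits = {"temp_c": 70.0, "mem_pct": 85.0}
alerts = check_device_stats(samples, limits)
# gateway-1 averages 73.0 degrees C (> 70); sensor-3 averages 92.5% memory (> 85).
```

The same pattern extends to power usage and disk space: the hard part in practice is collecting trustworthy samples from constrained devices, not evaluating the thresholds.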
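Business rules testing, such as the dosage example above, amounts to enumerating boundary cases that would be dangerous if a rule were implemented without its safety limits. A minimal sketch, with entirely made-up numbers for illustration:

```python
def dosage_rule(weight_kg, per_kg_mg, max_mg):
    """Compute a weight-based dose, clamped to a hard safety ceiling.

    Hypothetical rule for illustration only; real dosing logic and
    limits come from clinical requirements, not from this sketch.
    """
    dose = weight_kg * per_kg_mg
    return min(dose, max_mg)

# Boundary cases: (weight, per-kg dose, ceiling, expected result).
cases = [
    (70, 2.0, 200, 140),   # normal: 70 kg * 2 mg/kg = 140 mg
    (150, 2.0, 200, 200),  # heavy patient: clamped at the 200 mg ceiling
]
for weight, per_kg, ceiling, expected in cases:
    assert dosage_rule(weight, per_kg, ceiling) == expected
```

Multiplied by thousands of rules and their interactions, such case tables are what make analytics and rules validation a dedicated workstream rather than an afterthought.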

In summary, while IoT is exciting, it places a huge responsibility on the testing team. The complexity created by multiple devices, regulations and networks communicating in real time needs a fool-proof test strategy, supported by a relevant test environment. Testing teams must look for solutions to these challenges: tools to support these tests, and frameworks able to run high volumes of tests automatically and validate the results.

The writing on the wall is clear. Model-driven, highly robust automated scripts that can determine what tests to run based on dynamically changing situations are needed. Also needed are solutions to create and manage test environments (something like a TaaS model) that mimic hardware and other devices, as well as a robust test data management solution. Manual testers with no specialisation or domain knowledge will become redundant unless they re-skill themselves with technology and specialisation.


The article was originally published on M2M Now (Machine to Machine News) on May 25, 2015 and is re-posted here by permission.

Venkata Ramana Lanka

Director - QA, Virtusa. Venkata Ramana Lanka (LRV) is an accomplished software test management professional with an extensive background in software testing, test automation framework design, building domain-specific testing solution accelerators, leading large software testing teams and supporting presales initiatives. LRV is a hands-on manager with a proven ability to direct and improve quality initiatives, reduce defects and improve overall efficiency and productivity. He has in-depth, proven expertise in developing and implementing test strategies and operational procedures. LRV has extensive experience working with multiple commercial and open-source test tools. As a QA thought leader, LRV has written multiple white papers and articles, some of which have been published in various media. He has also spoken and presented papers at various software testing conferences.
