Several organisations have asked me how they can accommodate cross-device/platform testing within the short, iterative cycles of Agile development.
Our conversations start with the usual requests: automation frameworks, parallelisation of automated tests, third-party device labs. But once I've spent some time discussing the overall deployment pipeline, we discover other options are available to us.
Options that don't involve writing code...
The secret here is that I'm not looking at the cross-device testing problem when I'm looking at the deployment pipeline. I'm looking at the flow of work, from concept to cash (Poppendieck), to identify the touch times (where work is being performed), the handovers, queues & the downtime (where work is not being performed).
You've guessed it - I'm creating a value stream map to capture the value-add time (touch time) & non value-add time (handovers, queues & downtime).
Typically, organisations have far more non value-add time than value-add time in their flow of work. In Lean terms, this non value-add time is called "waste", & eliminating waste is one of the seven Lean principles.
In software development, we should aim to reduce the amount of non value-add time before optimising the value-add time - the gains will be greater & far easier to obtain.
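To make the value-add / non value-add split concrete, here's a minimal sketch of how you might summarise a value stream map & calculate flow efficiency (value-add time as a proportion of total lead time). The step names & durations below are entirely hypothetical, not taken from any real client engagement.

```python
# A hypothetical value stream map: (activity, hours, is_value_add).
# Queues & handovers are marked as non value-add (waste).
steps = [
    ("Write code",            8,  True),
    ("Wait for code review", 24,  False),  # queue
    ("Code review",           1,  True),
    ("Handover to QA",        4,  False),  # handover
    ("Wait in test queue",   40,  False),  # queue
    ("Cross-device testing",  6,  True),
]

value_add = sum(hours for _, hours, va in steps if va)
non_value_add = sum(hours for _, hours, va in steps if not va)
lead_time = value_add + non_value_add

# Flow efficiency: the proportion of lead time spent adding value.
efficiency = value_add / lead_time
print(f"Lead time: {lead_time}h, value-add: {value_add}h, "
      f"flow efficiency: {efficiency:.0%}")
```

Even in this contrived map, the queues dwarf the actual work - which is why attacking the waits & handovers first pays off so much faster than speeding up the coding or testing itself.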
Here's a (contrived) example:
After spending a day with a client, we were able to build a value stream map identifying the value-add & non value-add times.
From this map, we were able to identify a raft of experiments to reduce the non value-add times, such as:
- Reduce handoffs of work between teams (both time & number)
- Spread testing throughout the development cycle
- Include testers in backlog analysis (aka backlog refinement) sessions to identify bugs in requirements / Acceptance Criteria & prevent them from being coded into the software
- Challenge stakeholders' requirements - are they correct? Are they needed?
Some of the changes resulted in an immediate reduction in non value-add time & consequently a reduction in lead time.
This reduction in lead time enabled enough breathing space in the delivery cycle to facilitate mob testing of multiple devices by the entire team (both technical & business stakeholders).
Device-specific bugs are now found sooner (i.e. not by the customer!), & business stakeholders are happy. Better still, the experiment has provided the data/evidence needed to convince the execs that hiring expertise to build a framework that automates & parallelises cross-device test execution is a worthwhile expenditure.
Thanks for reading.

Interested in seeing if I can help you with a similar challenge? Check out our testing services.