Automation, robotics and digital transformation will be critical to the journey towards Lab 4.0. This vision of seamless, hands-off routine lab tasks, experiments and data handling is motivating investment in hardware and software that will increase lab efficiency, flexibility and throughput, while reducing costs and failures.
However, integrating all of this lab equipment, and getting systems to communicate with each other, remains a struggle. Even highly automated labs today face managing islands of hardware that share no common language, and carrying out workflows interrupted by the need for error-prone, time-consuming manual tasks.
At a foundational level, this fragmentation arises in part because vendors of lab equipment have traditionally paired their systems with proprietary software that was not designed to talk to that of other suppliers, suggested Pantea Razzaghi, head of design at Automata. And this has been ‘oftentimes intentional,’ she said. ‘Some of the larger players had a monopoly within the industry,’ and it may not have been in their interest to make instrumentation that communicated easily with systems outside of their brand. Ease of integration would smooth the way for customers to switch to competing systems when upgrading, expanding or diversifying their labs.
This lack of interconnectivity means labs often have to maintain software that doesn’t fit the evolving lab environment, and retain equipment whose data requires manual housekeeping before it is useful downstream, Razzaghi pointed out. Labs may even decide to sideline equipment that works perfectly well and does a great job, simply because it remains disconnected from the rest of the lab setup.
Whatever the outcome, this disconnect is likely to be costly and time-consuming, resulting in interrupted workflows and repetitive manual tasks. It’s also likely the format of the data ‘doesn’t fit well’ with that required by the next instrument in the chain. The result can be silos of data that are not standardised or optimised, with no way to maximise their utility, she noted. ‘We may then need a middle layer of translation before that data can be used to its full benefit.’
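To make that ‘middle layer of translation’ concrete, here is a minimal Python sketch of the idea: an adapter that normalises one instrument’s export into a common record format a downstream system could consume. The CSV layout, field names and OD600 measurement are all hypothetical, not any vendor’s actual format.

```python
import csv
import io
import json

# Hypothetical raw export from a plate reader, in a vendor-specific CSV layout.
RAW_EXPORT = """Well,OD600
A1,0.42
A2,0.39
"""

def normalise_plate_reader_csv(raw: str) -> list[dict]:
    """Translate the vendor-specific CSV into a common record format
    that a downstream instrument or LIMS could consume."""
    reader = csv.DictReader(io.StringIO(raw))
    return [
        {"well": row["Well"], "measurement": "OD600", "value": float(row["OD600"])}
        for row in reader
    ]

if __name__ == "__main__":
    print(json.dumps(normalise_plate_reader_csv(RAW_EXPORT), indent=2))
```

One such adapter per instrument, all emitting the same schema, is the essence of the translation layer: downstream tools only ever see the common format.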
Consider where lab functionality is most reliant on human intervention, and somewhere near the top of the list will likely be the need to manually pull data out of devices and transfer it to the next stage. ‘It’s almost comical how manual this process commonly still is,’ Razzaghi commented. ‘We see people literally walking up to an instrument, inserting a USB stick, downloading the data, walking over to a computer and uploading it into that system.’
Much of today’s lab equipment thus has a level of ‘intelligence’, let’s say, analogous to that of the early era of digital cameras, she further suggested. ‘To use these early digital cameras we had to insert a memory card into the camera, take the photo, then take out the memory card, put it into a reader, connect it to a computer, pull the images out of it, and store them on that computer. But today we can just snap a photo on a smartphone and send it directly to someone else, wirelessly and in an instant…’ It’s this sort of ability that we need to bring into the lab space. ‘It’s not just about making scientists more efficient; removing punctuation in processes and the requirement for manual data input, retrieval and transfer will save scientists from having to engage in multiple, repeated manual steps as part of everyday experiments.’
Progressive changes
Fortunately, the philosophy in the vendor space is changing, Razzaghi suggested. An increasing number of what she described as ‘more progressive’ companies are developing systems designed with an open architecture that can more easily be configured to interconnect and communicate. Vendors are also recognising that culture and expectations are changing within labs themselves.
Scientists and lab technicians are increasingly interested in engaging with the different layers of a system’s software, to help devices ‘play together’. As she explained: ‘Automation scientists may come from a scientific background or an engineering background, but today they are interested in extending how they interact with a device. And that means they’re actively looking for new ways to modify a system – whether that’s through drivers or APIs – to orchestrate different instruments to connect and communicate together.’
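To illustrate the kind of driver layer being described, the sketch below shows how a shared interface lets a single orchestration loop drive dissimilar instruments without any vendor-specific code at the top level. Every class and method name here is invented for the example rather than taken from a real vendor API.

```python
from abc import ABC, abstractmethod

class InstrumentDriver(ABC):
    """Shared interface: the orchestrator only ever talks to this."""

    @abstractmethod
    def run(self, protocol: str) -> None: ...

    @abstractmethod
    def fetch_results(self) -> dict: ...

class LiquidHandlerDriver(InstrumentDriver):
    def run(self, protocol: str) -> None:
        print(f"liquid handler: executing {protocol}")

    def fetch_results(self) -> dict:
        return {"status": "dispensed"}

class PlateReaderDriver(InstrumentDriver):
    def run(self, protocol: str) -> None:
        print(f"plate reader: executing {protocol}")

    def fetch_results(self) -> dict:
        return {"status": "read", "wells": 96}

# One loop orchestrates both devices; no vendor-specific code at this level.
for device in (LiquidHandlerDriver(), PlateReaderDriver()):
    device.run("assay_step_1")
    print(device.fetch_results())
```

Each new instrument then only needs its own driver written once; the orchestration logic above never changes.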
This cultural change is also driving a shift in expectations. ‘Similar to consumer markets in other industries, the move within the lab sector is to diversify the options on the market, and particularly to give users far greater flexibility,’ she said. This means suppliers and developers are evolving their own mindset. ‘They’re realising that to survive in this space they have to offer this flexibility – think open APIs – and develop systems with a set of drivers, or tools that give the more advanced users within a lab the option to interact with that software,’ she added.
This interaction may be the responsibility of the organisation’s automation engineer, or automation scientist – ‘who may be few in number and in great demand,’ Razzaghi noted. ‘These are the people who the lab will call on when they want to scale up an assay or experiment, or transition from manual tasks to a partially or even fully automated task.’ But even with open APIs and inbuilt tools, there is still a great deal of work that needs to happen to enable that progression to automation.
Not every organisation will have its own library of drivers or automation tools, so even with a more open system, enabling that device-to-device connectivity and communication can be a major task. ‘It’s still very early in the process for most labs, and it’s going to take time for them to have a robust, reusable library of software tools – a fact which itself opens up another interesting question,’ she pointed out. ‘Should every lab have to do that? Should each lab have to develop its own library of tools and drivers to enable that lab integration?’
As consumers we now expect our technology to be plug and play, and to work with whatever else we’ve got on our home or office networks; we no longer have to download drivers or other integration tools when setting up new systems. So why has the pharma industry lagged behind? It’s partly down to the complexity of the lab environment already described, and partly the diversity of automation and robotics systems now available in the lab sector, Razzaghi commented. And while ‘democratisation is now happening within this space’, closed systems are still commonplace in labs.
Thinking to the future, system developers in both the R&D and manufacturing spaces realise that lowering the barriers to adoption – making systems more ‘amenable’ to integration – will help to attract potential customers. Interestingly, Razzaghi said the lab is becoming a much more stimulating space from the perspective of user experience and interface designers. ‘Whereas there has historically been a huge focus on developing consumer-oriented tools in fields such as gaming design or application design, the lab space is offering great opportunities for designers to help make the world a better place.’
Coming back to that cultural shift within the lab, we can also see that, with scientists today having a far greater understanding of how software works, the imperative is again there for vendors to open up their systems’ configurability. ‘Compared with scientists who were graduating 10-15 years ago, scientists these days are far more knowledgeable about software tools, and how to configure them. Their personal toolset is very different. It’s far more common for scientists who graduate today to be Python-savvy, for example.’
Scientists want to derive more value from their workflows, and to use their time more productively – formulating new projects or writing up papers, for example – rather than spending it on repetitive, manual tasks. ‘So if they can access a tool that has an open API, they are more likely to try to work it to get the system to do what they want.’ This is also generally more cost-effective than hiring in outside help, and will likely be much faster with respect to upstream and downstream connectivity, because the scientists are the ones who know how the lab functions, and what is required to optimise that functionality.
Further challenges
However, Razzaghi acknowledged, the caveat to all this is that it’s never a case of walking into a lab and immediately undertaking a single automation project that will connect everything. And that gap between expectation and reality can, in itself, present a significant problem when labs are looking to undertake some sort of digital transformation or automation exercise. ‘A lab may, for example, have a manual protocol or experimental workflow they want to translate to an automated format. What they may not realise is that the transformation may not be possible as a single step. Here at Automata we understand this gap between expectation and reality, and so we work hard to educate as well as provide the software solutions to get that integration in place.’
Part of Razzaghi’s role is also to teach scientists how to ‘think in an automated manner,’ she explained. ‘When you are doing something manually it’s a linear process. You may have restrictions – for example, waiting for a thermal cycler to finish before you can move to the next step. Or you may only have one liquid handler and it can only be used for a certain task. But labs going through an automation transformation may be able to achieve greater parallelisation of tasks.’ It can thus be possible to save hours of work by optimising processes, but doing so relies on the communication channels and integration between instruments.
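A toy illustration of that parallelisation, with sleep calls standing in for instrument run times and all durations invented: the wash and plate-prep steps do not depend on each other, so once the thermal cycler finishes they can run side by side.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def run_step(name: str, seconds: float) -> str:
    time.sleep(seconds)  # stand-in for an instrument's run time
    return f"{name} done"

start = time.time()
run_step("thermal cycling", 3)  # hard dependency: everything waits on this
# "wash" and "plate prep" are independent of each other, so run them in parallel.
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(run_step, name, secs)
               for name, secs in [("wash", 2), ("plate prep", 2)]]
    print([f.result() for f in futures])
print(f"elapsed ~{time.time() - start:.0f}s, versus ~7s run strictly in sequence")
```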
Factor in tools such as scheduling software, and the lab then becomes even more efficient. ‘You can then schedule your lab resources and leverage applications to calculate your workforce and instrument capacity.’ Supporting automation and interconnectivity thus makes it possible to adapt workflows and processes to maximise efficiency. ‘It gives you a way to adapt that journey map to understand how, by moving different steps, or blocks, around, you’re going to achieve a certain task or workflow more efficiently, whether that efficiency is how much of a reagent is used, the time it takes for the experiment to complete, or when and how many of the lab instruments are required to complete that task.’ Then, of course, it may be possible to calculate the cost benefits associated with each alternative iteration of that task or workflow. ‘And that’s one of the greatest bits of value we can bring to the table,’ Razzaghi stated. ‘Our aim is to really help labs leverage the key instrumentation they already have, as well as implement new robotics hardware, through software.’
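In the same spirit, the core of such a cost comparison can be reduced to a few lines. Every figure below – durations, reagent volumes, unit costs – is an assumption chosen purely for illustration.

```python
# Assumed figures only: two arrangements of the same workflow compared on
# completion time and reagent use, the trade-off a scheduler helps surface.
variants = {
    "sequential": {"hours": 7.0, "reagent_ml": 120},
    "parallelised": {"hours": 5.0, "reagent_ml": 130},  # extra reagent for a second plate
}

REAGENT_COST_PER_ML = 0.80   # assumed unit cost
STAFF_COST_PER_HOUR = 45.00  # assumed loaded labour rate

for name, v in variants.items():
    cost = v["hours"] * STAFF_COST_PER_HOUR + v["reagent_ml"] * REAGENT_COST_PER_ML
    print(f"{name:>12}: {v['hours']}h, {v['reagent_ml']}ml, est. cost ${cost:.2f}")
```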
As well as offering both hardware and software to aid lab integration and automation, Automata has the industry insight and expertise to help labs make a smoother transition. ‘So, we provide the robotic lab bench system, which has different actuation devices. This can be thought of as the device framework. Customers can then integrate their own devices onto that robotic platform, or we can help them through the process of putting together and purchasing a bundle of instrumentation, and then leverage software so they can write the workflow protocols and do the day-to-day runs and data collection.’
Automata partners with the scientists running the experiments, so that everyone in the lab and other stakeholders understand how systems and software work together, and what they are capable of, Razzaghi noted. ‘Importantly, the lab bench system is very much vendor agnostic, so it can fit with a variety of different instruments, and this really helps to connect them together. We have an existing library of drivers and can develop custom drivers to enable that integration.
‘Our software also offers a workflow design tool, which makes it possible to interact with the protocols developed for each individual instrument, and facilitate communication so that you can program flexibility into your procedures,’ Razzaghi continued. ‘It’s then possible to run the workflow through a simulator to help identify where there may be errors, and help to optimise experiments and workflows to make the most of time and generate the best quality outcome.’
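The simulator idea can be sketched as a dry run over a declarative workflow: walk the steps in order and flag any step that consumes labware no earlier step has produced. The step, instrument and labware names below are illustrative, not Automata’s actual schema.

```python
# Walk the workflow in order; flag any step whose inputs nothing has produced yet.
workflow = [
    {"step": "dispense", "instrument": "liquid_handler",
     "consumes": [], "produces": ["plate_1"]},
    {"step": "incubate", "instrument": "incubator",
     "consumes": ["plate_1"], "produces": ["plate_1"]},
    {"step": "read", "instrument": "plate_reader",
     "consumes": ["plate_2"], "produces": ["readings"]},  # error: plate_2 never made
]

def dry_run(steps: list[dict]) -> list[str]:
    available: set[str] = set()
    errors = []
    for s in steps:
        for item in s["consumes"]:
            if item not in available:
                errors.append(f"step '{s['step']}': '{item}' is not available yet")
        available.update(s["produces"])
    return errors

print(dry_run(workflow) or "workflow is consistent")
```

Catching the inconsistent ‘read’ step at design time, rather than mid-run on real samples, is exactly the kind of saving a simulation pass offers.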
So, how might these developments impact on lab function within the next few years? Razzaghi suggested: ‘In five to 10 years, we can imagine labs no longer having to rely heavily on service integrators for a bespoke solution, instead [having] the ability to independently build out their own automation platform using Automata and cutting-edge instruments from the vendor of their choice.’