It has been a long and stealthy takeover, but robots now dominate many leading bioscience laboratories, doing in hours what once took days or weeks. The convergence of automation with nanotechnologies, biomedicine and advanced algorithms promises to take the robotization of medical research much further still.
In May of this year, Ross King, professor of machine intelligence at the UK’s University of Manchester, traveled east to talk to students at the University of Nottingham campus in Ningbo, China. His paper “Robot scientists: Automating biology and chemistry” was a vindication of theories he and colleagues first proposed almost a decade ago.
In China, as he had earlier at Brunel University in London, Prof. King described the two “robot scientists”, Adam and Eve, constructed at Aberystwyth University in Wales. These robots form hypotheses, select efficient experiments to discriminate between them, execute those experiments using laboratory automation equipment, and then analyze the results.
Both Adam and Eve have made actual discoveries.
Adam was developed to investigate the functional genomics of yeast (Saccharomyces cerevisiae), and the robot succeeded in autonomously identifying the genes that encode the yeast’s locally “orphan” enzymes.
In biblical fashion, Adam was followed by Eve, a machine built with similar techniques and tasked with automating and integrating early-stage drug discovery: screening, hit confirmation, and quantitative structure-activity relationship (QSAR) development. Eve uses novel synthetic biology screens that combine the advantages of computational, target-based, and cell-based assays.
Prof. Ross King says:
“Our focus has been on neglected tropical disease, and using Eve, we have discovered lead compounds for malaria, Chagas, African sleeping sickness and other conditions.”
Analytical robots like Adam, Eve or the more advanced products now being developed at centers of excellence – such as at the Fraunhofer Institute for Factory Operation and Automation (IFF) in Magdeburg, Germany – are a far cry from the robotic systems that first entered the lab some three decades ago.
The history of a leading company in the field – Hamilton Robotics – demonstrates the progression:
- From precision syringes in the 1940s
- Through the first semi-automated diluter in 1970
- To the first fully automated workstation for sample preparation in 1980.
Such workstations, which mechanically handle samples under full computer control, meet the core dictionary definition of a robot as “a machine capable of carrying out a complex series of actions automatically.” Their mechanical or physical “work” component also satisfies Karel Čapek’s original “forced labor” definition in his 1920 play R.U.R., the play that introduced the word “robot” to the world.
Liquid handling is one of the four core applications for robotics in the laboratory. The others are:
- Microplate handling: using robots to move plates around a workcell, between stacks and other devices (liquid handlers, readers, incubators, and so on). Advanced microplate robots integrate with third-party instruments to create work cells that automate applications and protocols to almost any level of complexity, as the sketch after this list illustrates.
- Automated biological research systems: robots provide automated handling and reading for various aspects of biological and biochemical research, ranging from flow cytometers to specific molecular biology applications such as PCR preparation and purification, colony picking, and cell culture development.
- Drug discovery screening: the most recent mainstream robotics application allows researchers to run a wide range of cell-based, receptor-based and enzyme-based assays typically used in high-throughput screening (HTS).
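To make the work-cell idea concrete, here is a minimal sketch of how such a protocol might be scripted. Every class and method name here is a hypothetical stand-in for whatever scheduling API a given vendor exposes; it illustrates only the general shape of a plate-handling protocol:

```python
# Hypothetical work-cell protocol: move a microplate through dispense,
# incubate, and read steps. All names are illustrative stand-ins for a
# vendor's scheduling API, not a real product.

class Workcell:
    """Records plate movements and device actions in sequence."""
    def __init__(self):
        self.log = []

    def move_plate(self, plate_id, src, dst):
        self.log.append(f"arm: move {plate_id} from {src} to {dst}")

    def run_step(self, device, action, plate_id):
        self.log.append(f"{device}: {action} ({plate_id})")

def run_assay(cell, plate_id):
    """One pass of a simple assay protocol for a single microplate."""
    cell.move_plate(plate_id, "stack", "liquid_handler")
    cell.run_step("liquid_handler", "dispense 10 uL reagent per well", plate_id)
    cell.move_plate(plate_id, "liquid_handler", "incubator")
    cell.run_step("incubator", "hold 30 min at 37 C", plate_id)
    cell.move_plate(plate_id, "incubator", "reader")
    cell.run_step("reader", "read fluorescence", plate_id)
    cell.move_plate(plate_id, "reader", "stack")

cell = Workcell()
for i in range(3):  # process three plates back to back
    run_assay(cell, f"plate_{i:03d}")
print("\n".join(cell.log))
```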
The laboratory advantages of using robotics seem obvious, starting with the ergonomic benefits of automating tasks that would be tedious, repetitive, injurious or even hazardous for a human.
A robot makes no distinction between the backbreaking low rack a few centimeters off the floor and the one up high, for which a human would need to stand on a chair. Robots can also safely handle toxins and biohazards, or operate in sealed or climate-controlled areas that we would find unbearable.
Laboratories originally embraced robotics because it seemed to offer an escape from the “quantity or quality” dilemma – the constant need to trade off speed for accuracy.
Robots, by contrast, seemed able to repeat an operation indefinitely with a precision that never varied and could be controlled exactly.
However, in practice, and particularly with high throughput screening, some limitations began to emerge. These included:
- Long design and implementation time
- Protracted transfer from manual to automated methods
- Unstable robotic operation, and
- Limited error recovery abilities.
Furthermore, the need to reduce steps in robotic processes tended to encourage the use of less accurate homogeneous assays over the heterogeneous ones that most companies would prefer.
Early 21st-century adoption of Allegro and other technologies based on assembly-line techniques overcame many of these problems by passing microplates down a line to consecutive processing modules, each performing just one step of the assay. Throughput could be multiplied by packing more wells into each plate, with the 96-well microplate giving way to 384- and now 1,536-well formats.
The new capability of robots to screen such high-density plates unsupervised paved the way for the quantitative high-throughput screening (qHTS) paradigm, in which each library compound is tested at multiple concentrations.
Maximum efficiency and miniaturization gave qHTS the theoretical capacity to carry out cell-based and biochemical assays across libraries of more than 100,000 compounds, testing between 700,000 and 2 million sample wells within a few hours.
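The arithmetic behind those well counts is straightforward. A back-of-the-envelope sketch, using the figures quoted above; the 7- and 20-point dose series are assumptions chosen to reproduce the article’s range:

```python
# Back-of-the-envelope qHTS capacity, using the figures quoted above.
# The 7- and 20-point dose series are assumed values that reproduce
# the 700,000 to 2 million well range.
compounds = 100_000
wells_per_plate = 1_536

for doses in (7, 20):
    wells = compounds * doses              # one well per compound per dose
    plates = -(-wells // wells_per_plate)  # ceiling division
    print(f"{doses:>2} doses: {wells:,} wells, about {plates:,} plates")

# Output:
#  7 doses: 700,000 wells, about 456 plates
# 20 doses: 2,000,000 wells, about 1,303 plates
```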
However, few companies actually need to screen that many compounds in-house each day, given the associated costs of consumables – assay reagents, cell cultures, microplates and pipet tips – as well as the time spent on data handling and analysis.
When you add in the investment overheads for associated infrastructure, robotics can seem like a rich kid’s toy.
During the first decade of the 21st century, growing numbers of contract companies doing high-throughput screening (HTS) offered assay development and screening, data analysis, and other library support.
The use of such contract robotics labs became a lot more popular after they stopped demanding royalty payments on any discovery. Such labs trade on the ability to offer ultra-fast turnaround times, running 24/7 on high-capacity HTS robotic workstations.
Some pharma and biotech companies began to outsource primary screening, keeping the higher-value, more proprietary secondary screening in-house to give their teams higher hit rates. However, even these approaches are being overtaken by new technology.
Essentially, high-throughput screening is the shotgun approach to research – using robotics to throw many thousands of chemical compounds against a target pathogen to see whether its cell growth accelerates, stops, or is eliminated. The capacity is awesome, but the costs are high and the success rate per compound screened is low.
A more sophisticated robotics-enabled paradigm is high-content screening (HCS) – a “rifle” approach that applies molecular specificity based on fluorescence and takes advantage of more sophisticated reagent classes.
High-content screening has the ability to multiplex, along with image analysis coupled to data management, data mining, and data visualization. All these help researchers focus on biological and genomic information and make far more targeted decisions on which assays to run.
The latest technology takes this targeting still further. Hudson Robotics recently announced what it terms high-efficiency screening (HES) for small molecules and antibodies.
High-efficiency screening uses a proprietary algorithm to compile a shortlist of library samples that will be screened. This is then passed on to a robotic workstation where the molecules are cherry-picked and screened in the appropriate assay.
Any molecules found to be active are used to refine the model, and the process is repeated until the user has both a list of active molecules and a final model that can be used to search additional compound collections and guide the synthesis of optimized analogs.
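Hudson’s algorithm is proprietary, but the loop described above is recognizably an active-learning cycle. A minimal sketch of that cycle follows; train_model, score_molecule and run_assay are hypothetical stand-ins for the real model and the robotic workstation, not Hudson’s actual method:

```python
# Generic active-learning screening loop of the kind described above.
# All three helper functions are hypothetical placeholders.
import random

def train_model(actives):
    """Placeholder: a real system would fit a QSAR/ML model on the hits."""
    return set(actives)

def score_molecule(model, mol):
    """Placeholder score; higher means more likely to be active."""
    return random.random() + (1.0 if mol in model else 0.0)

def run_assay(shortlist, true_actives):
    """Stand-in for the robot cherry-picking and screening the shortlist."""
    return [m for m in shortlist if m in true_actives]

def hes_loop(library, true_actives, batch=100, rounds=5):
    actives, tested = set(), set()
    model = train_model(actives)
    for _ in range(rounds):
        # 1. The algorithm compiles a shortlist of untested samples.
        candidates = [m for m in library if m not in tested]
        shortlist = sorted(candidates,
                           key=lambda m: score_molecule(model, m),
                           reverse=True)[:batch]
        # 2. The robot screens the shortlist; hits refine the model.
        hits = run_assay(shortlist, true_actives)
        actives.update(hits)
        tested.update(shortlist)
        model = train_model(actives)
    # Both outputs matter: the confirmed actives and the final model.
    return actives, model
```

The key design point is the feedback edge: each round of robotic screening enlarges the training set, so the shortlist gets progressively better targeted, which is how such a system can find most actives after screening only a small fraction of the library.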
In preliminary testing against known compound databases, Hudson says its high-efficiency screening consistently identified the majority of known inhibitors of ten different biological targets after screening under 10% of a library containing some 80,000 diverse molecules.
Three decades in from the first laboratory use of robotics, it seems clear that the technology is still in its infancy. Robots may seem pervasive in today’s biomedical research, but they have a long way to evolve.
For one thing, robots cannot easily coexist with humans, needing to work in safely enclosed areas. The Fraunhofer Institute has been studying this problem and has developed LISA, a prototype mobile lab assistant with touch-sensitive “skin” and heat sensors to stop her bumping into humans and vice versa.
But even LISA is likely to look as clunky as the Wright Flyer once biomedicine, 3D printing and nanotechnologies really come into play. A glimpse of the possibilities is offered by the robotic inchworm pioneered at Columbia University.
Biobots like these, or the DNA spiders developed at New York University and the University of Michigan are little more than fascinating, if rather scary, toys at the moment. But they point to a future where robotics moves beyond the research lab into the operating room – or even down into the molecular realm.