Three megatrends are rapidly changing the landscape of robotics and automation in the laboratory and the life science industry from both a technical and a commercial perspective. Each of these trends has a significant impact on the industry on its own, but their interaction is influencing design methods and componentry, as well as the expectations those designs inspire. These trends are the aging of the population, the rise of personalized medicine, and the outsourcing of instrument design and manufacturing.
We have all heard that the Baby Boomers are marching toward retirement, and we have questioned how that will affect healthcare costs and society in general. In the life sciences in particular, however, we know this generation will influence the use of—and need for—robotics in clinical, diagnostic, and pharmacological laboratories.
The 60-and-older population uses three to five times the healthcare services that younger generations use. That group is expected to grow by 33 percent between 2010 and 2020. Furthermore, it is expected to jump from 18 percent of the total population to 22 percent in that 10-year span, according to the Administration for Community Living, U.S. Department of Health and Human Services.
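The two cited figures can be cross-checked with simple arithmetic: if the 60-and-older group grows 33 percent while its share of the population rises from 18 to 22 percent, the growth of the total population over the decade is implied. The sketch below is illustrative arithmetic only, not additional data.

```python
# Consistency check of the cited demographic figures (illustrative only).
share_2010 = 0.18       # 60+ share of total population in 2010
share_2020 = 0.22       # 60+ share of total population in 2020
growth_60_plus = 1.33   # 33 percent growth of the 60+ group

# If the 60+ group grows 33% and ends at a 22% share, the total
# population must have grown by (share_2010 * growth) / share_2020.
total_growth = share_2010 * growth_60_plus / share_2020

print(f"Implied total population growth: {(total_growth - 1) * 100:.1f}%")
```

The figures are mutually consistent with total population growth of roughly 9 percent over the decade, in line with U.S. census-era estimates.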
With that growing population demanding the lion’s share of healthcare, the volume of clinical and diagnostic testing will increase dramatically, straining the capability of laboratories and their staffs. Additionally, the growing demand to develop new drugs to treat the patient pool will fuel capacity expansion in drug discovery.
The pressure to contain healthcare costs has been increasing, and the changing demographics will likely drive up that containment pressure exponentially. This takes automation from a “nice to have” to a “must do.” Laboratory robotics offers the promise of increased efficiencies, improved throughput times, and reduced costs, much as the moving assembly line did for Henry Ford in the automobile industry. And as the demand for new drugs to treat the aging population skyrockets, robotics will help the drug discovery process keep pace through around-the-clock research.
Before such automation, scientists were hampered by manual testing. Today’s automated high-throughput screening empowers scientists with access to an abundance of data—with little or no manual interaction.
The engines behind automation’s faster pace are higher throughput machines powered by servomotors. Stepper motors have been the technology of choice since the inception of the laboratory automation space, and although they continue to be the correct solution for many applications, more applications are requiring the greater speeds and high dynamic response of servos.
Many instrument manufacturers are also replacing mechanical drive trains with linear motors in robotics and other automation. Linear motors offer superior smoothness and accuracy while enabling higher speed, lower maintenance, less downtime, precision performance, and longer overall life. For example, new compact motors are often integrated into centrifuges and instruments requiring a very small footprint, as they can be built right into the machine.
The increasing volume of testing and the specificity of tests require equipment that takes up less room. That is why standard positioning devices are getting smaller. In fact, the overall footprint of standard motion products continues to shrink in an effort to provide solutions to original equipment manufacturers (OEMs) as they shrink their instrument sizes.
Where the form factor of standard products does not fit this new world, customized actuators can also be integrated into instruments. Solutions are still being engineered to fit directly into existing instruments to serve as the structure of the motion system where it makes sense.
Personalized medicine itself is a megatrend that is at the intersection of a number of major technology trends. According to Wikipedia, personalized medicine is a “medical model that proposes the customization of healthcare—with medical decisions, practices, and/or products being tailored to the individual patient.” Personalized medicine employs genetic testing to help select the most appropriate therapy based on the genetic makeup of the patient.
The cost of sequencing an individual's genetic code continues to decrease and is now approaching US $1,000. That falling cost is accompanied by new developments in sequencing ribonucleic acid (RNA), allowing doctors to better differentiate external environmental impacts on health from individual genetic tendencies. This gives a better perspective on the individual's overall health. It is becoming possible to understand an individual's genetic makeup, so that responsiveness to certain drugs and therapies can be predicted, allowing the best individualized care.
“Big data” refers to the collection and use of data sets that are so large that just a few years ago, they were impossible to manage with traditional computing and data processing applications. The collection and application of decoded genetics fits into this definition, and in fact, data management has been a factor in the cost and commercial viability of this tool.
Big data has also facilitated collecting huge numbers of test samples during the drug discovery process, allowing a real understanding of how drugs will perform given a genetic disposition. Coupled with a high level of automation in the testing process, the number of iterative samples is growing exponentially.
The real benefits of patient-specific cellular therapies have not yet progressed beyond experimental trials, but they are showing such great promise that it is difficult not to get excited about how they could transform medicine.
According to James Price, the president and chief executive of the Canadian Stem Cell Foundation, stem cell therapy will transform health care and how we treat devastating diseases. “We are on the verge of breakthroughs that will fundamentally alter the treatment of illnesses such as diabetes, cancer, heart disease, and neurological diseases,” he believes.
Because manufacture of patient-specific cellular therapies is a discrete process compared to the batch manufacturing processes typically employed in the pharmaceutical manufacturing environment, it will drive a mindset change from “scale up” (increase batch sizes) to “scale out” (increase capacity at the bottleneck).
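The "scale up" versus "scale out" distinction can be made concrete with a toy throughput model. All numbers below are hypothetical and only illustrate the contrast: scaling up stretches one instrument's batch size (with diminishing returns as batch time grows), while scaling out replicates identical capacity at the bottleneck.

```python
# Toy model contrasting "scale up" (bigger batches on one line) with
# "scale out" (replicating identical units at the bottleneck).
# All numbers are hypothetical.

def scale_up_throughput(batch_size: int, batch_time_hr: float) -> float:
    """Samples per hour for one instrument running a given batch size."""
    return batch_size / batch_time_hr

def scale_out_throughput(units: int, unit_rate: float) -> float:
    """Samples per hour for several identical instruments in parallel."""
    return units * unit_rate

# Scale up: doubling the batch also lengthens the batch cycle time.
base = scale_up_throughput(batch_size=96, batch_time_hr=2.0)      # 48 samples/hr
bigger = scale_up_throughput(batch_size=192, batch_time_hr=3.5)   # ~55 samples/hr

# Scale out: add a second identical instrument at the bottleneck step.
parallel = scale_out_throughput(units=2, unit_rate=base)          # 96 samples/hr

print(f"scale up: {bigger:.1f}/hr, scale out: {parallel:.1f}/hr")
```

For discrete, patient-specific processes, each therapy is one unit of work, so capacity grows linearly with added instruments rather than with batch size, which is why the mindset shifts to scaling out.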
What is the impact?
Modularity and scalability really mean lean automation. In this approach, instruments run unattended with some level of operator load and unload. This minimizes capital costs. Increasing capacity is a simple matter of adding another instrument to the system.
Because today's laboratory robots must run around the clock, often in remote locations that lack the on-site maintenance teams many factories have, mean time between failures (MTBF) is critical. Using industry-proven designs with thousands of hours of operation greatly reduces the risks associated with a new design.
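MTBF feeds directly into steady-state availability, the fraction of time an instrument is operational: availability = MTBF / (MTBF + MTTR), where MTTR is mean time to repair. The sketch below uses hypothetical numbers; a long MTTR (typical when no maintenance team is on site) magnifies the cost of every failure, and for modules in series the availabilities multiply.

```python
# Steady-state availability from MTBF and MTTR (hypothetical numbers).
# For modules in series, the system is up only when every module is up,
# so their availabilities multiply.

def availability(mtbf_hr: float, mttr_hr: float) -> float:
    """Fraction of time a module is operational."""
    return mtbf_hr / (mtbf_hr + mttr_hr)

# A remote lab without on-site maintenance: assume repairs take 72 hours.
robot = availability(mtbf_hr=5000, mttr_hr=72)
pipettor = availability(mtbf_hr=8000, mttr_hr=72)

system = robot * pipettor
print(f"System availability: {system:.3%}")
```

With these assumed figures the two-module system is available roughly 97.7 percent of the time, which is why proven, high-MTBF designs matter so much more where repairs are slow.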
More and more instrument OEMs are choosing to outsource some or all of their instrument design and manufacturing. That is because during the "Great Recession," many organizations cut deeply into their engineering talent. Today, amid a jobless recovery, they have turned to outsourcing as a tool to get products released.
Increasingly, speed to market is a major factor in their success, and outsourcing either the entire design or key subsystems allows more resources to be applied for a shorter design cycle. In fact, it is becoming more common for the majority of an OEM’s revenue to come from sales of the consumable as opposed to sales of the instrument. This is why OEM innovation is now more focused on developing revenue-generating consumables.
OEMs must now choose how they define their core competency and what they want to outsource. The answers range from outsourcing just the design to outsourcing just the manufacturing to all options in between. If an OEM chooses to maintain control over the system-level design, it typically wants suppliers with engineering capabilities to do submodule design. By sourcing modular subsystems from a trusted supplier, companies can ensure they get fully qualified, plug-in systems that reduce the overall risk throughout the life cycle of the instrument.
There is a greater physical separation between OEM design teams and OEM manufacturing teams, whether for geographical or organizational reasons. That is why it is critical to have a partner who operates as an extension of the instrument development team. This reduces the OEM’s overall engineering resource requirement. With more manufacturing moving to offshore contract manufacturers, the ideal partner should be able to engineer where the OEM is engineering and manufacture where the OEM is manufacturing.
Powerful market forces are driving exciting changes in the use and design of laboratory automation and robotics. New technologies will change healthcare as we know it, bringing costs down to a point of widespread viability. The instrument manufacturers in this space are having to reduce their development cycles, improve their product capabilities, and increase their overall reliability. Moreover, they are relying more on partners to help them achieve their goals. Ideally, these automation partners will provide them with a portfolio of robust standard products for rapid early development efforts, the engineering capabilities to add value, technologies to optimize the product design for specific applications, and the overall process capability to act as an extension of the OEM’s team both in development and production.
A version of this article was originally published in InTech magazine.
Source: ISA News