In an effort to enhance soldier lethality, Army researchers are developing biorecognition receptors that perform consistently in multi-domain environments and can collect real-time assessments of soldier health and performance.

Dr. Matt Coppock, chemist and team lead at ARL, stands in his research laboratory at ARL headquarters in Adelphi, MD, where he is developing biorecognition receptors that perform consistently in multi-domain environments and can collect real-time assessments of soldier health and performance. (Credit: U.S. Army Photo by Jhi Scott)

“The Army will need to be more adaptive, more expeditionary, and have a near-zero logistic demand while optimizing individual to squad execution in multifaceted operational environments,” says Dr. Matt Coppock, chemist and team lead for the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, the Army's corporate research laboratory known as ARL. “It can be envisioned that real-time health and performance monitoring, as well as sensing current and emerging environmental threats, could be a key set of tools to make this possible.”

ARL scientists, in collaboration with researchers from the California Institute of Technology and Indi Molecular, Inc., developed a protein catalyzed capture, or PCC, agent technology that improves on previous generations of receptors and could enable the monitoring of personal and environmental data from soldiers in the field. The research, funded by the Army's Institute for Collaborative Biotechnologies since 2012, is presented in a comprehensive review article for Chemical Reviews.

“PCC technology has demonstrated improvements in receptor stability, adaptability and manufacturability over standard antibody receptors, and supports the Soldier Lethality Cross-Functional Team as a potentially viable technology to monitor Soldier performance via relevant biomarkers collectable from wearable sensors,” Coppock says.

Biological receptors are integrated into a biosensor to selectively capture a target of interest from a complex mixture such as blood, sweat or saliva, producing the measurable effect in the sensor, he says.

“Without the receptor, it would be impossible to know you are detecting what you want to detect,” Coppock says.

Antibodies collected from animals injected with the target of interest are typically used as receptors in biological sensors due to their high binding strength and selectivity for the target.

“The gold standard receptor work is based around antibodies, which are fantastic at target capture and selectivity, but their detection capabilities are somewhat limited due to their instability, limited shelf life and batch-to-batch performance variation,” Coppock says.

The research team developed an alternative approach.

“As an alternative, peptide-based receptors are smaller, simpler to produce, inexpensive and much more robust to environmental stresses, while still retaining the desirable binding properties of an antibody,” Coppock says.

The team's receptors retain nearly all of their activity after being heated at 90 °C for one hour, whereas many antibodies become completely inactive within minutes when heated above 70 °C.

“We utilize an entirely synthetic approach to receptor development, which allows for much more control over the incorporation of unique building blocks to guarantee stability and permit straightforward modifications for sensor integration,” Coppock says.

The team built a complete infrastructure that allows it to develop the synthetic, peptide-based receptors in house at the laboratory, on demand and in whatever quantities are needed, using widely available peptide synthesizers.

“All aspects of the technology have progressed throughout the collaboration culminating in a high-throughput development methodology,” Coppock says. “These aspects included rethinking targeting strategies, upgrading library constructs, automating screening steps and simultaneously characterizing the performance of up to 100 different peptide sequences.”

Researchers are now capable of fully developing a receptor from start to finish in as little as two to three weeks once the target of interest is identified, he says. This process previously took about five to six months.

“Current Army programs are ongoing to determine important biological markers correlating to soldier health and performance, so establishing the capability for rapid development of receptors will allow the lab to keep up with and advance the biomarker discovery and analysis,” Coppock says.

The team noted that research continues on speeding up the design and selection of new reagents so that these sensors are easier to manufacture on demand. In addition, ARL is currently coordinating efforts to address sensing needs in human performance and food and water safety.

Other potential applications of these receptors include environmental bio-threat surveillance, health diagnostics and therapeutics, which could significantly impact the warfighter.

This technology has drawn considerable interest across the Army science and technology community, including researchers at the CCDC Soldier Center, the CCDC Chemical Biological Center and the Army Medical Command. To date, this collaborative research has matured biological receptors from Technology Readiness Level 2, or TRL-2, to TRL-4, including successful integration into multiple assay platforms for ruggedized biological sensing in austere environments and reagent transition to the CCDC CBC.

This article was written by CCDC Army Research Laboratory Public Affairs.