The PAMSys Voice™ wearable sensor incorporates bidirectional microphones for synchronized audio data capture, as well as an accelerometer and gyroscope to record chest motion data. (Credit: BioSensics)

BioSensics, Newton, MA, has received a $3 million, three-year award from the U.S. National Institutes of Health (NIH). The grant will support the development of a groundbreaking wearable device designed to track speech activities and biomarkers during daily routines.

The PAMSys Voice™ wearable sensor incorporates bidirectional microphones for synchronized audio data capture, as well as an accelerometer and gyroscope to record chest motion data. The innovative patented device seamlessly integrates audio and motion monitoring, making it a unique wearable sensor for remote tracking of speech activities and biomarkers, coupled with precision actigraphy and fall detection capabilities.

The NIH grant is aimed at advancing the development of PAMSys Voice™ and validating its use as a robust tool for the early identification of cognitive decline and the remote tracking of cognitive-motor function in individuals with dementia and Alzheimer's disease. The project involves collaboration with Drs. Bijan Najafi and Michele York from Baylor College of Medicine.

BioSensics develops wearable sensors and digital health technologies for clinical trials and research, remote patient monitoring, and health assessments. Founded in 2007 by three scientists from Harvard, BioSensics has created new paradigms in using wearable sensors in healthcare and advanced the medical alert industry by creating technologies that are now used by thousands of older adults.

The company has received over $50 million in research and development program support from NIH. In 2022, BioSensics' remote measure technologies were selected by the NIH for use in clinical trials involving people with rare diseases.

A limited number of PAMSys Voice™ sensors will be available for research purposes in early 2024. If you wish to explore possibilities for collaboration and partnership, contact Ashkan Vaziri at BioSensics.