Flexible Strain Sensors Based on an Interlayer Synergistic Effect of Nanomaterials for Continuous and Noninvasive Blood Pressure Monitoring
ACS Applied Materials & Interfaces (2024)
Taiyuan Univ Technol | Univ Texas El Paso
Abstract
The continuous, noninvasive monitoring of human blood pressure (BP) through accurate detection of pulse waves places extremely stringent requirements on the sensitivity and stability of flexible strain sensors. In this study, a new ultrasensitive flexible strain sensor based on an interlayer synergistic effect was fabricated by drop-casting and drying silver nanowire and graphene films on polydimethylsiloxane substrates, and it was successfully applied to continuous BP monitoring. The strain sensor exhibited ultrahigh sensitivity with a maximum gauge factor of 34357.2 (∼700% sensitivity enhancement over other major sensors), a satisfactory response time (∼85 ms), a wide strain range (12%), and excellent stability. An interlayer fracture mechanism was proposed to elucidate the working principle of the strain sensor. Real-time BP values can be obtained by analyzing the relationship between BP and the pulse transit time. To verify the strain sensor for real-time BP monitoring, it was compared with a conventional electrocardiogram-photoplethysmograph method and a commercial cuff-based device; it showed measurement results similar to the BP values from both methods, with differences of only 0.693, 0.073, and 0.566 mmHg in systolic BP, diastolic BP, and mean arterial pressure, respectively. Furthermore, the reliability of the strain sensors was validated by testing 20 human subjects for more than 50 min. This ultrasensitive strain sensor provides a new pathway for continuous and noninvasive BP monitoring.
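The two quantitative ideas in the abstract can be illustrated with a minimal sketch: the gauge factor GF = (ΔR/R0)/ε, the standard sensitivity metric for resistive strain sensors, and a pulse-transit-time (PTT) to BP mapping. The linear model and its coefficients `a` and `b` below are hypothetical placeholders; the paper derives the actual BP–PTT relationship from per-subject calibration, which is not reproduced here.

```python
def gauge_factor(delta_r_over_r0: float, strain: float) -> float:
    """Gauge factor GF = (ΔR/R0) / ε for a resistive strain sensor.

    delta_r_over_r0: relative resistance change ΔR/R0 (dimensionless)
    strain: applied strain ε (dimensionless, e.g. 0.01 for 1%)
    """
    return delta_r_over_r0 / strain


def bp_from_ptt(ptt_s: float, a: float = -100.0, b: float = 150.0) -> float:
    """Estimate BP (mmHg) from pulse transit time (s) with an
    illustrative linear model BP ≈ a * PTT + b.

    The coefficients a and b are hypothetical; real systems calibrate
    them per subject against a cuff reference.
    """
    return a * ptt_s + b


# A relative resistance change of 343.572 at 1% strain gives
# GF = 343.572 / 0.01 = 34357.2, matching the paper's maximum value.
print(gauge_factor(343.572, 0.01))
```

This shows why even small strains from an arterial pulse produce large, easily measurable resistance changes at such a gauge factor; the BP–PTT step then converts beat-to-beat timing into a continuous pressure estimate.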
Key words
flexible strain sensor, interlayer synergistic effect, continuous blood pressure monitoring, wearable sensors, graphene