Intra-aortic and Intra-caval Balloon Pump Devices in Experimental Non-traumatic Cardiac Arrest and Cardiopulmonary Resuscitation
Journal of Cardiovascular Translational Research (2022), SCI Zone 3
Abstract
Intra-aortic balloon pump (IABP) use during CPR has been scarcely studied, and an intra-caval balloon pump (ICBP) may decrease backward venous flow during CPR. Mechanical chest compressions (MCC) were initiated after 10 min of cardiac arrest in anesthetized pigs. After 5 min of MCC, IABP (n = 6) or ICBP (n = 6) support was initiated. The MCC device and the IABP/ICBP ran at slightly different frequencies, inducing a progressive phase shift between their pressure peaks. IABP inflation 0.15 s before the MCC significantly increased mean arterial pressure (MAP) and carotid blood flow (CBF) compared with inflation 0.10 s after the MCC and with MCC alone. Coronary perfusion pressure significantly increased with IABP inflation 0.25 s before the MCC compared with inflation at the MCC. ICBP inflation before the MCC significantly increased MAP and CBF compared with inflation after the MCC, but not compared with MCC alone. These results show the potential of the IABP in CPR when optimally synchronized with MCC.
Graphical abstract: The effect of timing of intra-aortic balloon pump (IABP) inflation during mechanical chest compressions (MCC) on hemodynamics. Data from 12 anesthetized pigs.
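The progressive phase shift mentioned in the abstract is worth unpacking: because the compression device and the balloon pump run at slightly different rates, the inflation time drifts a few milliseconds per cycle relative to the compression peak, so a single run samples every inflation-to-compression timing. Below is a minimal Python sketch of that arithmetic; the rates used (102 and 100 per minute) are illustrative assumptions, not values reported in the paper.

```python
# Minimal sketch of the frequency-detuning idea: the MCC device and the
# balloon pump run at slightly different rates, so each balloon inflation
# drifts relative to the nearest compression peak and, over time, samples
# every possible inflation timing. Rates below are illustrative
# assumptions, not values reported in the paper.

MCC_RATE_PER_MIN = 102.0   # hypothetical chest-compression rate
PUMP_RATE_PER_MIN = 100.0  # hypothetical balloon inflation rate

mcc_period = 60.0 / MCC_RATE_PER_MIN    # seconds between compression peaks
pump_period = 60.0 / PUMP_RATE_PER_MIN  # seconds between balloon inflations

# The inflation time drifts by this much relative to the compression peak
# on every pump cycle.
drift_per_cycle = pump_period - mcc_period

# Time for the relative timing to sweep one full compression cycle,
# i.e. for every offset (before, at, after the peak) to occur once.
cycles_per_sweep = mcc_period / abs(drift_per_cycle)
sweep_time = cycles_per_sweep * pump_period

print(f"drift per cycle : {drift_per_cycle * 1000:+.1f} ms")
print(f"full phase sweep: {sweep_time:.1f} s")

# Offset of each inflation from the nearest compression peak, folded into
# (-mcc_period/2, +mcc_period/2]; negative = inflation precedes the peak.
for k in range(0, 51, 10):
    offset = (k * pump_period) % mcc_period
    if offset > mcc_period / 2:
        offset -= mcc_period
    print(f"inflation {k:2d}: {offset * 1000:+7.1f} ms from compression peak")
```

With these assumed rates the offset advances by about 12 ms per inflation, so the full range of timings (including offsets such as 0.25 s or 0.15 s before the compression and 0.10 s after it) recurs roughly every 30 s of resuscitation.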
Key words
Heart arrest, Cardiopulmonary resuscitation, Counterpulsation, Hemodynamics