Comparative Study of Joint Estimation of State of Charge (SOC) and State of Health (SOH) of Lithium-ion Batteries Based on Different Tree Models
Journal of Electrical and Electronic Engineering (2024)
School of Materials Science and Chemical Engineering | School of Electrical and Electronic Engineering
Abstract
Accurate estimation of State of Health (SOH) and State of Charge (SOC) is a prerequisite for the safe use of energy storage batteries and helps further improve energy utilization efficiency. Data-driven methods are efficient, accurate, and do not rely on precise battery models, making them a popular direction in battery state estimation research. However, the relationships between variables in lithium-ion battery datasets are mostly nonlinear, which strongly affects model prediction. Existing models also suffer from heavy computation, strong dependence on data, and long run times. This paper proposes a joint online SOC-SOH estimation method based on tree-model algorithms to address these problems. Using NASA battery sample data, the study explores how SOC varies with discharge voltage and temperature under different SOH levels. Random Forest Regression (RFR), Gradient Boosted Decision Trees (GBDT), and XGBoost are then combined to estimate SOC and SOH based on these variation patterns. Experimental results show that the R² scores of the XGBoost algorithm exceed 0.995 for both SOC and SOH prediction, indicating good adaptability and feasibility.
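The tree-model approach the abstract describes can be sketched as a supervised regression problem: fit an ensemble on features such as discharge voltage, temperature, and SOH, then score with R². The sketch below uses scikit-learn's RFR and GBDT implementations on synthetic stand-in data (the feature construction, value ranges, and the nonlinear SOC relationship are all assumptions for illustration, not the paper's NASA dataset or its XGBoost configuration; an `xgboost.XGBRegressor` would follow the same `fit`/`predict` pattern).

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 2000
# Synthetic stand-ins for the measured quantities named in the abstract
voltage = rng.uniform(2.7, 4.2, n)   # discharge voltage [V]
temp = rng.uniform(20.0, 45.0, n)    # cell temperature [degC]
soh = rng.uniform(0.7, 1.0, n)       # state of health (fraction)
# Hypothetical nonlinear SOC relationship plus measurement noise
soc = (voltage - 2.7) / 1.5 * soh - 0.002 * (temp - 25.0) \
      + rng.normal(0.0, 0.01, n)

X = np.column_stack([voltage, temp, soh])
X_tr, X_te, y_tr, y_te = train_test_split(X, soc, random_state=0)

# Fit each tree ensemble and report its R^2 on held-out data
scores = {}
for name, model in {
    "RFR": RandomForestRegressor(n_estimators=200, random_state=0),
    "GBDT": GradientBoostingRegressor(random_state=0),
}.items():
    model.fit(X_tr, y_tr)
    scores[name] = r2_score(y_te, model.predict(X_te))
print(scores)
```

Because the synthetic target is a smooth function of the features with small noise, both ensembles reach a high R² here; on real cycling data the scores depend on feature engineering and cell-to-cell variability.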