
Park, S., Pozzi, A., Whitmeyer, M., Perez, H., Joe, W. T., Raimondo, D. M., Moura, S., "Reinforcement learning-based fast charging control strategy for li-ion batteries" (contributed paper), in Proc. 4th IEEE Conference on Control Technology and Applications (CCTA 2020), Canada, 24-26 August 2020, IEEE, 2020, pp. 100-107. DOI: 10.1109/CCTA41146.2020.9206314 [https://hdl.handle.net/10807/193662]

Reinforcement learning-based fast charging control strategy for li-ion batteries

Pozzi, Andrea (second author)
2020

Abstract

One of the most crucial challenges faced by the Li-ion battery community is minimum-time charging without damaging the cells. This amounts to solving a large-scale nonlinear optimal control problem based on a battery model. Within this context, several model-based techniques have been proposed in the literature. However, the effectiveness of such strategies is significantly limited by model complexity and uncertainty. Additionally, it is difficult to track aging-related parameters and re-tune the model-based control policy accordingly. To overcome these limitations, in this paper we propose a fast-charging strategy, subject to safety constraints, that relies on a model-free reinforcement learning framework. In particular, we focus on a policy-gradient-based actor-critic algorithm, namely deep deterministic policy gradient (DDPG), in order to handle continuous state and action spaces. The validity of the proposal is assessed in simulation with a reduced electrochemical model acting as the real plant. Finally, we highlight the online adaptability of the proposed strategy to variations in the environment parameters, with consideration of state reduction.
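The DDPG scheme named in the abstract couples a deterministic actor with a Q-function critic and stabilizes learning with target networks and soft updates. The following is a minimal sketch of a single DDPG update step, not the authors' implementation: the linear actor/critic (toy stand-ins for deep networks), the state/action dimensions, the learning rates, and the fictitious minibatch are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: state = [SoC, temperature], action = charging current
S_DIM, A_DIM = 2, 1
GAMMA, TAU, LR = 0.99, 0.005, 1e-2

# Linear actor a = tanh(s @ Wa) and linear critic Q = [s, a] @ wq
Wa = rng.normal(scale=0.1, size=(S_DIM, A_DIM))
wq = rng.normal(scale=0.1, size=(S_DIM + A_DIM, 1))
Wa_t, wq_t = Wa.copy(), wq.copy()          # target networks start as copies

def actor(s, W):
    return np.tanh(s @ W)                  # bounded action, e.g. normalized current

def critic(sa, w):
    return sa @ w

def ddpg_step(s, a, r, s2, Wa, wq, Wa_t, wq_t):
    # Critic update: minimize TD error against the *target* networks
    a2 = actor(s2, Wa_t)
    y = r + GAMMA * critic(np.hstack([s2, a2]), wq_t)   # TD target
    sa = np.hstack([s, a])
    td = critic(sa, wq) - y
    wq = wq - LR * (sa.T @ td) / len(s)                 # gradient of 0.5 * td**2

    # Actor update: deterministic policy gradient, dQ/da * da/dWa
    a_pi = actor(s, Wa)
    dQ_da = wq[S_DIM:].T                                # constant for a linear critic
    dtanh = 1.0 - a_pi**2
    Wa = Wa + LR * (s.T @ (dQ_da * dtanh)) / len(s)

    # Soft target-network updates
    Wa_t = TAU * Wa + (1 - TAU) * Wa_t
    wq_t = TAU * wq + (1 - TAU) * wq_t
    return Wa, wq, Wa_t, wq_t

# One minibatch of fictitious transitions (s, a, r, s')
s = rng.random((32, S_DIM)); a = rng.uniform(-1, 1, (32, A_DIM))
r = rng.random((32, 1));     s2 = rng.random((32, S_DIM))
Wa, wq, Wa_t, wq_t = ddpg_step(s, a, r, s2, Wa, wq, Wa_t, wq_t)
```

The slowly-moving target networks (blended with rate TAU) are what make the TD target quasi-stationary; in the fast-charging setting, safety constraints would additionally be encoded in the reward or enforced by clipping the action.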
English
CCTA 2020 - 4th IEEE Conference on Control Technology and Applications
Canada
Contributed paper
24-Aug-2020
26-Aug-2020
978-1-7281-7140-1
Institute of Electrical and Electronics Engineers Inc.
Files associated with this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10807/193662
Citations
  • Scopus: 25
  • Web of Science: 18