Smart City Gnosys

Smart city article details

Title Optimization Regenerative Braking In Electric Vehicles Using Q-Learning For Improving Decision-Making In Smart Cities
ID_Doc 40692
Authors Suanpang P.; Jamjuntr P.
Year 2025
Published Decision Making: Applications in Management and Engineering, 8, 1
DOI http://dx.doi.org/10.31181/dmame8120251329
Abstract The growing prevalence of electric vehicles (EVs) in urban settings underscores the need for advanced decision-making frameworks designed to optimise energy efficiency and improve overall vehicle performance. Regenerative braking, a critical technology in EVs, facilitates energy recovery during deceleration, thereby enhancing efficiency and extending driving range. This study presents an innovative Q-learning-based approach to refine regenerative braking control strategies, aiming to maximise energy recovery, ensure passenger comfort through smooth braking, and maintain safe driving distances. The proposed system leverages real-time feedback on driving patterns, road conditions, and vehicle performance, enabling the Q-learning agent to autonomously adapt its braking strategy for optimal outcomes. By employing Q-learning, the system demonstrates the ability to learn and adjust to dynamic driving environments, progressively enhancing decision-making capabilities. Extensive simulations conducted within a smart city framework revealed substantial improvements in energy efficiency and notable reductions in energy consumption compared to conventional braking systems. The optimisation process incorporated a state space comprising vehicle speed, distance to the preceding vehicle, battery charge level, and road conditions, alongside an action space permitting dynamic braking adjustments. The reward function prioritised maximising energy recovery while minimising jerk and ensuring safety. Simulation outcomes indicated that the Q-learning-based system surpassed traditional control methods, achieving a 15.3% increase in total energy recovered (132.8 kWh), enhanced passenger comfort (jerk reduced to 7.6 m/s³), and a 13% reduction in braking distance. These findings underscore the system's adaptability across varied traffic scenarios. Broader implications include integration into smart city infrastructures, where the adaptive algorithm could enhance real-time traffic management, fostering sustainable urban mobility. Furthermore, the improved energy efficiency reduces overall energy consumption, extends EV range, and decreases charging frequency, aligning with global sustainability objectives. The framework also holds potential for future EV applications, such as adaptive cruise control, autonomous driving, and vehicle-to-grid (V2G) systems. © 2025 Regional Association for Security and crisis management. All rights reserved.
Author Keywords Decision Making; Optimization; Multi-Agent Reinforcement Learning; Smart Grid Management; Vehicle-to-Grid Systems
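
To make the abstract's problem setup concrete, the following is a minimal Q-learning sketch of the kind of braking controller it describes. The discretised state tuple (speed, gap to the preceding vehicle, battery state of charge, road condition), the five braking-intensity actions, and the reward weights are illustrative assumptions for this sketch, not the parameters reported in the paper.

import random
from collections import defaultdict

# Assumed action set: discrete regenerative-braking intensity levels (0 = coast, 1 = full regen).
ACTIONS = [0.0, 0.25, 0.5, 0.75, 1.0]

class BrakingAgent:
    def __init__(self, alpha=0.1, gamma=0.95, epsilon=0.1):
        # Q-table: state tuple -> one value per braking action.
        self.q = defaultdict(lambda: [0.0] * len(ACTIONS))
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def act(self, state):
        # Epsilon-greedy choice over braking intensities.
        if random.random() < self.epsilon:
            return random.randrange(len(ACTIONS))
        values = self.q[state]
        return values.index(max(values))

    def update(self, state, action, reward, next_state):
        # Standard one-step Q-learning update.
        best_next = max(self.q[next_state])
        td_target = reward + self.gamma * best_next
        self.q[state][action] += self.alpha * (td_target - self.q[state][action])

def braking_reward(energy_recovered_kwh, jerk_m_s3, gap_m, min_safe_gap_m=10.0):
    # Assumed reward shaping: favour energy recovery, penalise jerk and unsafe following gaps.
    safety_penalty = 100.0 if gap_m < min_safe_gap_m else 0.0
    return 1.0 * energy_recovered_kwh - 0.2 * abs(jerk_m_s3) - safety_penalty

In use, agent.act(state) would pick a braking level each control step and agent.update(...) would apply the one-step Q-learning rule after observing the resulting energy recovery, jerk, and gap; the weights in braking_reward are placeholders to show the trade-off structure, not calibrated values.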


Similar Articles


Id | Similarity | Authors | Title | Published
40393 | 0.913 | Suanpang P.; Jamjuntr P. | Optimal Electric Vehicle Battery Management Using Q-Learning For Sustainability | Sustainability (Switzerland), 16, 16 (2024)
40790 | 0.865 | Suanpang P.; Jamjuntr P. | Optimizing Electric Vehicle Charging Recommendation In Smart Cities: A Multi-Agent Reinforcement Learning Approach | World Electric Vehicle Journal, 15, 2 (2024)
4034 | 0.861 | Wang E.; Memar F.H.; Korzelius S.; Sadek A.W.; Qiao C. | A Reinforcement Learning Approach To Cav And Intersection Control For Energy Efficiency | Proceedings - 2022 5th International Conference on Connected and Autonomous Driving, MetroCAD 2022 (2022)
8368 | 0.857 | Dutta P.; Boulanger A.; Anderson R.; Wu L. | An Innovative Approach To Vehicle Electrification For Smart Cities | Handbook of Research on Social, Economic, and Environmental Sustainability in the Development of Smart Cities (2015)
24056 | 0.857 | Aldossary M. | Enhancing Urban Electric Vehicle (Ev) Fleet Management Efficiency In Smart Cities: A Predictive Hybrid Deep Learning Framework | Smart Cities, 7, 6 (2024)
2234 | 0.855 | Chalaki B.; Malikopoulos A.A. | A Hysteretic Q-Learning Coordination Framework For Emerging Mobility Systems In Smart Cities | 2021 European Control Conference, ECC 2021 (2021)
26972 | 0.852 | Breschi V.; Ravazzi C.; Strada S.; Dabbene F.; Tanelli M. | Fostering The Mass Adoption Of Electric Vehicles: A Network-Based Approach | IEEE Transactions on Control of Network Systems, 9, 4 (2022)
51245 | 0.852 | Raja G.; Saravanan G.; Prathiba S.B.; Akhtar Z.; Ali Khowaja S.; Dev K. | Smart Navigation And Energy Management Framework For Autonomous Electric Vehicles In Complex Environments | IEEE Internet of Things Journal, 10, 21 (2023)
23309 | 0.85 | Laroui M.; Dridi A.; Afifi H.; Moungla H.; Marot M.; Cherif M.A. | Energy Management For Electric Vehicles In Smart Cities: A Deep Learning Approach | 2019 15th International Wireless Communications and Mobile Computing Conference, IWCMC 2019 (2019)