Smart City Gnosys

Smart city article details

Title Multiagent Deep Reinforcement Learning Based Energy Efficient Resource Management Scheme For RIS Assisted D2D Users In 6G-Aided Smart Cities Environment
ID_Doc 38480
Authors Vishnoi V.; Budhiraja I.; Garg D.; Garg S.; Choi B.J.; Hossain M.S.
Year 2025
Published Alexandria Engineering Journal, 116
DOI http://dx.doi.org/10.1016/j.aej.2024.12.004
Abstract Device-to-device communication (D2D-C) is one of the promising technologies for the sixth-generation (6G) environment, because it enhances end-user throughput, energy efficiency (EE), and the network's quality of service (QoS) even when users are in complex networks or high-traffic zones of smart cities. However, in D2D-C, different links share the same subchannels (SCs), which causes considerable interference to cellular links. Moreover, ultra-massive connectivity (UMC) is a significant challenge in this environment. To overcome these obstacles, we pair the unmanned aerial vehicle (UAV) with power-domain non-orthogonal multiple access (PD-NOMA) technology to improve coverage and connectivity while reducing interference. We also use reconfigurable intelligent surfaces (RISs) for propagation between the UAV and D2D pairs (D2DPs); because RISs require little energy, the EE increases. We then propose a methodology for energy-efficient resource allocation from RIS-assisted, NOMA-enabled UAVs to underlaying D2D users. To achieve this goal, we first use the Markov decision process (MDP) to cast the formulated problem in a form amenable to reinforcement learning. Since the network is complex, with large state and action spaces, a multi-agent, priority-sampling-based, decentralized and coordinated dueling deep Q-network (PS-DC-DDQN) technique is proposed. To reduce complexity, resource-allocation and power data are shared only among neighboring agents in a decentralized and coordinated manner. Moreover, to optimize the RIS phase shifts, a centralized DDQN (C-DDQN) algorithm is recommended to reduce power consumption. Simulation results demonstrate that the proposed PS-DC-DDQN algorithm achieves 7.3%, 17.07%, and 29.26% higher EE than the state-of-the-art FA-DDQN, DDQN, and DQN techniques, respectively. © 2024 The Authors
Author Keywords 6G; D2D-C; EE; NOMA; RIS; Smart cities
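
The abstract names two core learning ingredients: a dueling deep Q-network and priority (prioritized) sampling from a replay buffer, with one agent per D2D pair. The sketch below is a minimal, hypothetical illustration of those two ingredients only; it is not the authors' PS-DC-DDQN, and the state/action dimensions, network widths, learning rate, and the placeholder reward (stood in for an energy-efficiency measure) are assumptions for illustration. The multi-agent coordination among neighboring agents and the C-DDQN phase-shift controller from the abstract are omitted.

```python
# Hypothetical sketch (not the paper's code): dueling Q-network + prioritized replay.
import random
import numpy as np
import torch
import torch.nn as nn

class DuelingQNet(nn.Module):
    """Dueling architecture: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)."""
    def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.value = nn.Linear(hidden, 1)              # state value V(s)
        self.advantage = nn.Linear(hidden, n_actions)  # action advantages A(s,a)

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        h = self.feature(state)
        v, a = self.value(h), self.advantage(h)
        return v + a - a.mean(dim=1, keepdim=True)

class PrioritizedReplay:
    """Replay buffer that samples transitions in proportion to |TD error|^alpha."""
    def __init__(self, capacity: int = 10_000, alpha: float = 0.6):
        self.capacity, self.alpha = capacity, alpha
        self.buffer, self.priorities = [], []

    def push(self, transition, td_error: float = 1.0):
        # New transitions get a default priority; full PER would refresh it
        # with the latest |TD error| after each update.
        if len(self.buffer) >= self.capacity:
            self.buffer.pop(0); self.priorities.pop(0)
        self.buffer.append(transition)
        self.priorities.append((abs(td_error) + 1e-6) ** self.alpha)

    def sample(self, batch_size: int):
        probs = np.asarray(self.priorities) / sum(self.priorities)
        idx = np.random.choice(len(self.buffer), size=batch_size, p=probs)
        return [self.buffer[i] for i in idx], idx

# Illustrative usage: one gradient step for a single D2D-pair agent. The random
# transitions stand in for environment interaction, and the scalar reward is a
# placeholder for an energy-efficiency-style objective (rate per unit power).
if __name__ == "__main__":
    state_dim, n_actions, gamma = 8, 4, 0.99  # assumed sizes
    online = DuelingQNet(state_dim, n_actions)
    target = DuelingQNet(state_dim, n_actions)
    target.load_state_dict(online.state_dict())
    optimizer = torch.optim.Adam(online.parameters(), lr=1e-3)
    buf = PrioritizedReplay()

    for _ in range(64):  # fill the buffer with dummy (s, a, r, s') tuples
        s, s2 = np.random.rand(state_dim), np.random.rand(state_dim)
        buf.push((s, random.randrange(n_actions), np.random.rand(), s2))

    batch, _ = buf.sample(32)
    s, a, r, s2 = map(np.array, zip(*batch))
    s = torch.tensor(s, dtype=torch.float32)
    s2 = torch.tensor(s2, dtype=torch.float32)
    a = torch.tensor(a, dtype=torch.int64).unsqueeze(1)
    r = torch.tensor(r, dtype=torch.float32)

    q = online(s).gather(1, a).squeeze(1)          # Q(s, a) from the online net
    with torch.no_grad():
        q_next = target(s2).max(dim=1).values      # bootstrap from the target net
    loss = nn.functional.mse_loss(q, r + gamma * q_next)
    optimizer.zero_grad(); loss.backward(); optimizer.step()
    print(f"TD loss: {loss.item():.4f}")
```

In the decentralized, coordinated setting the abstract describes, each D2D-pair agent would run a network like this locally and exchange resource-allocation and power information only with neighboring agents; that coordination layer is not modeled here.
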


Similar Articles


Id | Similarity | Authors | Title | Published
23217 | 0.886 | Wang X.; Li J.; Wu J.; Guo L.; Ning Z. | Energy Efficiency Optimization Of IRS And UAV-Assisted Wireless Powered Edge Networks | IEEE Journal on Selected Topics in Signal Processing, 18, 7 (2024)
23253 | 0.883 | Diamanti M.; Tsampazi M.; Tsiropoulou E.E.; Papavassiliou S. | Energy Efficient Multi-User Communications Aided By Reconfigurable Intelligent Surfaces And UAVs | Proceedings - 2021 IEEE International Conference on Smart Computing, SMARTCOMP 2021 (2021)
40866 | 0.878 | Darem A.A.; Alkhaldi T.M.; Alhashmi A.A.; Mansouri W.; Alghawli A.S.A.; Al-Hadhrami T. | Optimizing Resource Allocation For Enhanced Urban Connectivity In LEO-UAV-RIS Networks | Journal of King Saud University - Computer and Information Sciences, 36, 10 (2024)
16152 | 0.877 | Kim J.; Park S.; Jung S.; Cordeiro C. | Cooperative Multi-UAV Positioning For Aerial Internet Service Management: A Multi-Agent Deep Reinforcement Learning Approach | IEEE Transactions on Network and Service Management, 21, 4 (2024)
44892 | 0.868 | Ul Abideen Tariq Z.; Baccour E.; Erbad A.; Hamdi M. | Reinforcement Learning-Based Anti-Jamming Solution For Aerial RIS-Aided Dense Dynamic Multi-User Environments | 20th International Wireless Communications and Mobile Computing Conference, IWCMC 2024 (2024)
40943 | 0.867 | Sehito N.; Shouyi Y.; Alshahrani H.M.; Alamgeer M.; Dutta A.K.; Alsubai S.; Nkenyereye L.; Dhanaraj R.K. | Optimizing User Association, Power Control, And Beamforming For 6G Multi-IRS Multi-UAV NOMA Communications In Smart Cities | IEEE Transactions on Consumer Electronics, 70, 3 (2024)
9694 | 0.866 | Dhuheir M.; Erbad A.; Al-Fuqaha A.; Hamdaoui B.; Guizani M. | AoI-Aware Intelligent Platform For Energy And Rate Management In Multi-UAV Multi-RIS System | IEEE Transactions on Network and Service Management (2025)
28813 | 0.866 | Seid A.M.; Abishu H.N.; Erbad A.; Guizani M. | HDFRL-Empowered Energy Efficient Resource Allocation For Aerial MEC-Enabled Smart City Cyber Physical System In 6G | 2023 International Wireless Communications and Mobile Computing, IWCMC 2023 (2023)
5700 | 0.866 | Oubbati O.S.; Alotaibi J.; Alromithy F.; Atiquzzaman M.; Altimania M.R. | A UAV-UGV Cooperative System: Patrolling And Energy Management For Urban Monitoring | IEEE Transactions on Vehicular Technology (2025)
46706 | 0.865 | Tariq Z.U.A.; Baccour E.; Erbad A.; Hamdi M.; Guizani M. | RL-Based Adaptive UAV Swarm Formation And Clustering For Secure 6G Wireless Communications In Dynamic Dense Environments | IEEE Access, 12 (2024)