The growing demand for computation-intensive and delay-sensitive services in Internet of Things (IoT) networks is constrained by the limited computing capacity and battery life of user devices, as well as by bandwidth limitations on shared communication channels. Mobile-edge computing (MEC) has emerged as a promising solution to these resource limitations by offloading tasks to edge servers. However, many existing offloading approaches achieve limited performance gains because communication channels become overloaded when multiple users offload simultaneously. To address these issues, this research develops an energy-efficient task offloading framework for multi-IoT, multi-server edge computing systems. The framework integrates a load-balancing algorithm for optimal device distribution, a compression layer to reduce data-transmission overhead, and a deep reinforcement learning technique that dynamically makes offloading and compression decisions. The proposed solution jointly formulates load balancing, task offloading, compression, and communication allocation to minimize the energy consumption of the entire system. Given the NP-hard nature of this problem, an efficient deep-learning-based technique is developed to obtain a near-optimal solution. Finally, experimental results show that the model achieves significant energy savings, with reductions of up to 63.96% and 61.87% in the local-execution and offloading scenarios, respectively, under low channel bandwidth availability. These findings confirm the effectiveness of the proposed solution in enhancing system efficiency and scalability in real-world MEC environments. © 2025 Elsevier B.V. All rights reserved.