The proliferation of autonomous vehicles (AVs) demands efficient data transfer and low latency for optimal performance in next-generation (5G and beyond) cellular networks. This paper addresses this challenge by proposing a deep learning (DL) technique using a Bidirectional Long Short-Term Memory (BiLSTM) model to predict traffic rates for AVs in a dynamic fog computing environment integrated with multi-cloud services. We compare the BiLSTM model's accuracy with that of a traditional unidirectional LSTM model, focusing on the impact of batch size on prediction performance. Simulation results demonstrate that the BiLSTM model significantly outperforms the unidirectional LSTM in forecasting accuracy. Furthermore, we identify an optimal batch size that yields the best prediction performance.
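To illustrate the kind of comparison the abstract describes, the following is a minimal sketch, not the authors' implementation: it builds a BiLSTM and a unidirectional LSTM regressor for one-step traffic-rate forecasting and sweeps candidate batch sizes. The framework (TensorFlow/Keras), layer width, look-back window, epoch count, batch-size grid, and the synthetic data are all illustrative assumptions; the paper does not specify these details here.

```python
# Sketch only: compare a BiLSTM and a unidirectional LSTM regressor
# across batch sizes for one-step traffic-rate forecasting.
# All hyperparameters and the synthetic data below are assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW = 20    # assumed look-back window (time steps per sample)
FEATURES = 1   # assumed univariate traffic-rate series


def build_model(bidirectional: bool) -> tf.keras.Model:
    """Return a small recurrent regressor; BiLSTM if `bidirectional` is True."""
    rnn = layers.LSTM(64)  # assumed hidden size
    model = models.Sequential([
        tf.keras.Input(shape=(WINDOW, FEATURES)),
        layers.Bidirectional(rnn) if bidirectional else rnn,
        layers.Dense(1),  # predicted traffic rate at the next time step
    ])
    model.compile(optimizer="adam", loss="mse", metrics=["mae"])
    return model


# Toy data standing in for the AV traffic-rate time series.
x = np.random.rand(1000, WINDOW, FEATURES).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

# Sweep batch sizes for both architectures and report validation error.
for batch_size in (16, 32, 64, 128):
    for name, bidir in (("LSTM", False), ("BiLSTM", True)):
        model = build_model(bidir)
        hist = model.fit(x, y, epochs=5, batch_size=batch_size,
                         validation_split=0.2, verbose=0)
        print(f"{name}, batch={batch_size}: "
              f"val MAE={hist.history['val_mae'][-1]:.4f}")
```

On real traffic traces, the same loop would surface both effects reported in the abstract: the accuracy gap between the bidirectional and unidirectional models, and the sensitivity of that accuracy to the chosen batch size.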