This study addresses the need for real-world compatible data to support the application of deep reinforcement learning (DRL) to vehicle routing. Despite advances in DRL algorithms, their practical deployment in vehicle routing is hindered by the scarcity of suitable real-world datasets: existing methodologies often rely on simplistic distance metrics that fail to capture the complexities of real-world routing scenarios. To address this gap, we present a novel approach for generating real-world compatible data tailored explicitly to DRL-based vehicle routing models. Our methodology centers on a spatial data extraction and curation tool that extracts geocoded locations from diverse urban environments, encompassing both planned and unplanned areas. The tool refines the extracted location data to account for the distinctive characteristics of these environments, and integrates specialized distance metrics and location demands to construct vehicle routing graphs that reflect real-world conditions. Comprehensive experiments on varied real-world testbeds show that our approach produces datasets closely aligned with the requirements of DRL-based vehicle routing models. Each dataset instance is structured as a graph containing location, distance, and demand information, and each graph is stored independently to facilitate efficient access and manipulation. The findings underscore the adaptability and reliability of our methodology in handling the intricacies of real-world routing, marking a significant step toward the practical application of DRL techniques to real-world vehicle routing problems.
Copyright: © 2024 Ali, Saleem. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
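The abstract describes each dataset instance as a graph carrying location, distance, and demand information, with every graph stored independently. The sketch below illustrates one plausible representation of such an instance, assuming a Python-based pipeline; the class name VRPGraphInstance, its field names, and the JSON file layout are illustrative assumptions, not the paper's actual schema.

```python
from dataclasses import dataclass, asdict
import json
from pathlib import Path
from typing import List


@dataclass
class VRPGraphInstance:
    """One vehicle routing graph: geocoded nodes, pairwise distances, and demands.

    Hypothetical layout -- the paper's actual schema may differ.
    """
    node_coords: List[List[float]]      # [lat, lon] per location (depot assumed at index 0)
    distance_matrix: List[List[float]]  # specialized (e.g., road-network) distances, not plain Euclidean
    demands: List[float]                # demand per location; depot demand is 0
    vehicle_capacity: float             # capacity shared by all vehicles

    def save(self, path: Path) -> None:
        """Store each graph in its own file so instances can be accessed independently."""
        path.write_text(json.dumps(asdict(self)))

    @classmethod
    def load(cls, path: Path) -> "VRPGraphInstance":
        return cls(**json.loads(path.read_text()))


if __name__ == "__main__":
    # Tiny illustrative instance: a depot and two customers with made-up values.
    instance = VRPGraphInstance(
        node_coords=[[24.8607, 67.0011], [24.8700, 67.0300], [24.8500, 67.0200]],
        distance_matrix=[
            [0.0, 3400.0, 2100.0],
            [3400.0, 0.0, 2800.0],
            [2100.0, 2800.0, 0.0],
        ],
        demands=[0.0, 5.0, 3.0],
        vehicle_capacity=20.0,
    )
    instance.save(Path("graph_0001.json"))
    print(VRPGraphInstance.load(Path("graph_0001.json")).demands)
```

Keeping one instance per file, as in this sketch, matches the abstract's point about independent storage: a DRL training loop can sample or shard graphs without loading the whole dataset into memory.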