
Python-driven workflow for AV/Toll cooperation modes#24

Open
scholarhaozheng wants to merge 10 commits into Xuan-1998:cuda-latest from scholarhaozheng:feature/python-integration

Conversation


@scholarhaozheng scholarhaozheng commented Oct 24, 2025

(Note for reviewers: this PR is a summary of all the work. Per XJ's request, it has been split into two new PRs for easier review and merging.)

Hi XJ,
This PR integrates the Python-driven workflow to simulate and analyze three cooperation modes for AV companies and Tolls.
The main scripts added are:

  • optimizer_with_optuna_MSA_S3.py: Used for parameter tuning with Optuna.
  • optimizer_MSA_S3_new.py: The main script for running the experiments.
  • final_analysis_0930.py: Script for analyzing the results.
  Note: optimizer_MSA_S3_new_lane_sensitivity.py is also included. This is a new script and is still a work in progress.


Copilot AI left a comment


Pull Request Overview

This PR integrates a Python-driven workflow for simulating and analyzing three cooperation modes between AV companies and Tolls. The implementation includes parameter tuning via Optuna, experiment runners for different scenarios, and comprehensive result analysis scripts.

Key changes:

  • Added four main Python scripts for optimization, sensitivity analysis, and visualization of toll optimization experiments
  • Extended C++ simulator with support for AV/HV disaggregation, toll fee handling, and multiple cooperation scenarios
  • Implemented dynamic lane-changing behavior based on cost-benefit analysis for both AV and HV vehicles
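The cost-benefit lane-change rule described above can be sketched as follows. This is a minimal illustration, not the C++/CUDA implementation: the function name is hypothetical, and the roles of the LC_ALPHA/LC_GAMMA parameters are inferred from the scenario configuration shown later in this PR.

```python
def should_change_lane(current_lane_cost, target_lane_cost, lc_alpha, lc_gamma):
    """Decide whether a vehicle changes lanes via a cost-benefit test.

    lc_alpha scales the vehicle's sensitivity to the generalized-cost gap
    between lanes; lc_gamma acts as a hesitation threshold that the scaled
    benefit must exceed before a change is triggered.
    """
    benefit = lc_alpha * (current_lane_cost - target_lane_cost)
    return benefit > lc_gamma

# A high-sensitivity, low-hesitation AV (alpha=3.0, gamma=1.5) switches for a
# smaller cost gap than a low-sensitivity, high-hesitation HV (alpha=1.0, gamma=0.5).
```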

Reviewed Changes

Copilot reviewed 27 out of 38 changed files in this pull request and generated 4 comments.

Summary per file:

  • visualize_evolution.py: New script for visualizing the dynamic evolution of tolls and congestion across iterations
  • optimizer_with_optuna_MSA_S3.py: Parameter-tuning script using the Optuna framework for hyperparameter optimization
  • optimizer_MSA_S3_run_try_travel_time_new.py: Main experiment runner for missing S3 scenarios
  • optimizer_MSA_S3_new_lane_sensitivity.py: Work-in-progress script for lane sensitivity analysis
  • final_analysis_0930.py: Comprehensive analysis and visualization of experiment results
  • LivingCity/traffic/b18TrafficSimulator.cpp: Core C++ changes: AV/HV tracking, toll-based routing, generalized cost functions
  • LivingCity/traffic/b18CUDA_trafficSimulator.cu: CUDA kernel updates for cost-based lane changing and vehicle tracking
  • command_line_options.ini: Configuration updates for new AV/toll parameters
Comments suppressed due to low confidence (2)

optimizer_baseline.py:1

  • The inline comment '// -1' is misleading as the default value is now 1000, not -1. This comment should be updated or removed to reflect the actual default value.
import pandas as pd

LivingCity/traffic/b18TrafficSimulator.cpp:1

  • There's an extra closing brace and semicolon ,}; after the last parameter. This should be just }; without the comma.
#pragma once


toll_update_df.to_csv(temp_toll_filepath, index=False)

config_path_in_sim_dir = os.path.join(SIMULATOR_WORKING_DIR, CONFIG_FILE_NAME)
update_ini_file_safely(config_path_in_sim_dir, params_for_ini)

Copilot AI Oct 24, 2025


The function update_ini_file_safely expects a single toll_filepath string parameter but is being passed a dictionary params_for_ini. This will cause a runtime error. The function signature should be updated to accept a dictionary or the call site should be corrected.
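One way to reconcile the signature with the call site is to make the function accept a dict, as `params_for_ini` suggests. The sketch below is a hedged illustration, not the PR's actual implementation: it assumes the file uses configparser-style sections, and the `"General"` section name is an assumption to be adjusted to the real command_line_options.ini layout.

```python
import configparser
import os
import tempfile

def update_ini_file_safely(config_path, params, section="General"):
    """Update several keys in an .ini file atomically.

    Accepts a dict of parameters (matching the params_for_ini call site)
    instead of a single toll_filepath string. The section name "General"
    is an assumption about the .ini layout.
    """
    parser = configparser.ConfigParser()
    parser.read(config_path)
    if not parser.has_section(section):
        parser.add_section(section)
    for key, value in params.items():
        parser.set(section, key, str(value))
    # Write to a temp file in the same directory, then atomically replace,
    # so a crash mid-write never leaves a truncated config behind.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(config_path) or ".")
    with os.fdopen(fd, "w") as tmp:
        parser.write(tmp)
    os.replace(tmp_path, config_path)
```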

Comment on lines +34 to +43
"LC_ALPHA_AV": 1.5, "LC_GAMMA_AV": 0.8, # 低敏感度、高犹豫度的AV
"LC_ALPHA_HV": 1.0, "LC_GAMMA_HV": 0.5, # 低敏感度、高犹豫度的HV
},
"AV-High_HV-Low": {
"LC_ALPHA_AV": 3.0, "LC_GAMMA_AV": 1.5, # 高敏感度、低犹豫度的AV (原始值)
"LC_ALPHA_HV": 1.0, "LC_GAMMA_HV": 0.5, # 低敏感度、高犹豫度的HV
},
"AV-High_HV-High": {
"LC_ALPHA_AV": 3.0, "LC_GAMMA_AV": 1.5, # 高敏感度、低犹豫度的AV (原始值)
"LC_ALPHA_HV": 2.0, "LC_GAMMA_HV": 1.0, # 高敏感度、低犹豫度的HV

Copilot AI Oct 24, 2025


Comments are written in Chinese, which may not be accessible to all developers. Consider translating these to English (e.g., 'Low sensitivity, high hesitation for AV').

Suggested change
"LC_ALPHA_AV": 1.5, "LC_GAMMA_AV": 0.8, # 低敏感度、高犹豫度的AV
"LC_ALPHA_HV": 1.0, "LC_GAMMA_HV": 0.5, # 低敏感度、高犹豫度的HV
},
"AV-High_HV-Low": {
"LC_ALPHA_AV": 3.0, "LC_GAMMA_AV": 1.5, # 高敏感度、低犹豫度的AV (原始值)
"LC_ALPHA_HV": 1.0, "LC_GAMMA_HV": 0.5, # 低敏感度、高犹豫度的HV
},
"AV-High_HV-High": {
"LC_ALPHA_AV": 3.0, "LC_GAMMA_AV": 1.5, # 高敏感度、低犹豫度的AV (原始值)
"LC_ALPHA_HV": 2.0, "LC_GAMMA_HV": 1.0, # 高敏感度、低犹豫度的HV
"LC_ALPHA_AV": 1.5, "LC_GAMMA_AV": 0.8, # Low sensitivity, high hesitation for AV
"LC_ALPHA_HV": 1.0, "LC_GAMMA_HV": 0.5, # Low sensitivity, high hesitation for HV
},
"AV-High_HV-Low": {
"LC_ALPHA_AV": 3.0, "LC_GAMMA_AV": 1.5, # High sensitivity, low hesitation for AV (original value)
"LC_ALPHA_HV": 1.0, "LC_GAMMA_HV": 0.5, # Low sensitivity, high hesitation for HV
},
"AV-High_HV-High": {
"LC_ALPHA_AV": 3.0, "LC_GAMMA_AV": 1.5, # High sensitivity, low hesitation for AV (original value)
"LC_ALPHA_HV": 2.0, "LC_GAMMA_HV": 1.0, # High sensitivity, low hesitation for HV

void b18GetTollProportionsCUDA(std::vector<uint>& toll_steps, std::vector<uint>& total_steps, size_t num_edges) {
if (num_edges == 0) return;

// 调整CPU端向量的大小以接收数据

Copilot AI Oct 24, 2025


Comment is in Chinese ('Adjust CPU-side vector size to receive data'). Consider translating to English for consistency with the rest of the codebase.

Suggested change
// 调整CPU端向量的大小以接收数据
// Adjust CPU-side vector size to receive data

@@ -0,0 +1 @@
import pandas as pdimport geopandas as gpdfrom shapely.geometry import LineStringimport os"""create_geojson.pyThis utility script reads the raw node and edge CSV files from the LPSimsimulation data and generates a GeoJSON file representing the road network'sgeometry. This GeoJSON file can then be used for spatial analysis and plotting.This script only needs to be run once."""# --- Configuration ---# Please ensure these paths point to your actual data files.# I have pre-filled them based on the paths you provided.LIVINGCITY_DIR = 'LivingCity'NETWORK_DATA_PATH = os.path.join(LIVINGCITY_DIR, 'berkeley_2018', 'full_network') # Adjusted to match .iniNODES_CSV_PATH = os.path.join(NETWORK_DATA_PATH, 'nodes.csv')EDGES_CSV_PATH = os.path.join(NETWORK_DATA_PATH, 'edges.csv')# Output filenameGEOJSON_OUTPUT_PATH = "network.geojson"def create_network_geojson(): """ Reads node and edge data and creates a GeoJSON file. """ print("--- Starting GeoJSON Generation ---") # 1. Check if input files exist if not os.path.exists(NODES_CSV_PATH) or not os.path.exists(EDGES_CSV_PATH): print(f"ERROR: Cannot find input files.") print(f" - Checked for nodes at: {NODES_CSV_PATH}") print(f" - Checked for edges at: {EDGES_CSV_PATH}") print("Please verify the paths in the script.") return # 2. Load nodes data (ID, Longitude, Latitude) print(f"Reading nodes from '{NODES_CSV_PATH}'...") nodes_df = pd.read_csv(NODES_CSV_PATH) # Create a dictionary for fast coordinate lookup: {node_id: (longitude, latitude)} # The 'index' column from nodes.csv corresponds to 'u' and 'v' in edges.csv node_coords = {row['index']: (row['x'], row['y']) for index, row in nodes_df.iterrows()} print(f"Loaded {len(node_coords)} node coordinates.") # 3. Load edges data print(f"Reading edges from '{EDGES_CSV_PATH}'...") edges_df = pd.read_csv(EDGES_CSV_PATH) print(f"Loaded {len(edges_df)} edges.") # 4. 
Create LineString geometries for each edge print("Creating geometries for each edge (this might take a moment)...") def create_linestring(row): # Get coordinates for the start (u) and end (v) nodes of the edge start_node_id = row['u'] end_node_id = row['v'] start_coord = node_coords.get(start_node_id) end_coord = node_coords.get(end_node_id) if start_coord and end_coord: return LineString([start_coord, end_coord]) return None # Apply the function to each row of the edges dataframe edges_df['geometry'] = edges_df.apply(create_linestring, axis=1) # 5. Create the GeoDataFrame print("Constructing GeoDataFrame...") # Select only the necessary columns for the final file # 'uniqueid' is the crucial column that links simulation results to geometry gdf = gpd.GeoDataFrame(edges_df[['uniqueid', 'geometry']], geometry='geometry') gdf.set_crs("EPSG:4326", inplace=True) # Set coordinate system to WGS84 (lat/lon) # 6. Save to GeoJSON file print(f"Saving GeoDataFrame to '{GEOJSON_OUTPUT_PATH}'...") gdf.to_file(GEOJSON_OUTPUT_PATH, driver='GeoJSON') print("\n--- SUCCESS! ---") print(f"GeoJSON network file has been created at '{GEOJSON_OUTPUT_PATH}'.") print("You can now use this file in the main analysis script.")if __name__ == "__main__": create_network_geojson() No newline at end of file

Copilot AI Oct 24, 2025


The entire file content is on a single line with no proper line breaks. This file needs proper formatting with line breaks after each statement to be readable and maintainable.

Suggested change
import pandas as pd
import geopandas as gpd
from shapely.geometry import LineString
import os

"""
create_geojson.py

This utility script reads the raw node and edge CSV files from the LPSim
simulation data and generates a GeoJSON file representing the road network's
geometry. This GeoJSON file can then be used for spatial analysis and plotting.
This script only needs to be run once.
"""

# --- Configuration ---
# Please ensure these paths point to your actual data files.
# I have pre-filled them based on the paths you provided.
LIVINGCITY_DIR = 'LivingCity'
NETWORK_DATA_PATH = os.path.join(LIVINGCITY_DIR, 'berkeley_2018', 'full_network')  # Adjusted to match .ini
NODES_CSV_PATH = os.path.join(NETWORK_DATA_PATH, 'nodes.csv')
EDGES_CSV_PATH = os.path.join(NETWORK_DATA_PATH, 'edges.csv')

# Output filename
GEOJSON_OUTPUT_PATH = "network.geojson"


def create_network_geojson():
    """Reads node and edge data and creates a GeoJSON file."""
    print("--- Starting GeoJSON Generation ---")

    # 1. Check if input files exist
    if not os.path.exists(NODES_CSV_PATH) or not os.path.exists(EDGES_CSV_PATH):
        print("ERROR: Cannot find input files.")
        print(f" - Checked for nodes at: {NODES_CSV_PATH}")
        print(f" - Checked for edges at: {EDGES_CSV_PATH}")
        print("Please verify the paths in the script.")
        return

    # 2. Load nodes data (ID, Longitude, Latitude)
    print(f"Reading nodes from '{NODES_CSV_PATH}'...")
    nodes_df = pd.read_csv(NODES_CSV_PATH)
    # Create a dictionary for fast coordinate lookup: {node_id: (longitude, latitude)}
    # The 'index' column from nodes.csv corresponds to 'u' and 'v' in edges.csv
    node_coords = {row['index']: (row['x'], row['y']) for index, row in nodes_df.iterrows()}
    print(f"Loaded {len(node_coords)} node coordinates.")

    # 3. Load edges data
    print(f"Reading edges from '{EDGES_CSV_PATH}'...")
    edges_df = pd.read_csv(EDGES_CSV_PATH)
    print(f"Loaded {len(edges_df)} edges.")

    # 4. Create LineString geometries for each edge
    print("Creating geometries for each edge (this might take a moment)...")

    def create_linestring(row):
        # Get coordinates for the start (u) and end (v) nodes of the edge
        start_coord = node_coords.get(row['u'])
        end_coord = node_coords.get(row['v'])
        if start_coord and end_coord:
            return LineString([start_coord, end_coord])
        return None

    # Apply the function to each row of the edges dataframe
    edges_df['geometry'] = edges_df.apply(create_linestring, axis=1)

    # 5. Create the GeoDataFrame
    print("Constructing GeoDataFrame...")
    # Select only the necessary columns for the final file.
    # 'uniqueid' is the crucial column that links simulation results to geometry.
    gdf = gpd.GeoDataFrame(edges_df[['uniqueid', 'geometry']], geometry='geometry')
    gdf.set_crs("EPSG:4326", inplace=True)  # Set coordinate system to WGS84 (lat/lon)

    # 6. Save to GeoJSON file
    print(f"Saving GeoDataFrame to '{GEOJSON_OUTPUT_PATH}'...")
    gdf.to_file(GEOJSON_OUTPUT_PATH, driver='GeoJSON')

    print("\n--- SUCCESS! ---")
    print(f"GeoJSON network file has been created at '{GEOJSON_OUTPUT_PATH}'.")
    print("You can now use this file in the main analysis script.")


if __name__ == "__main__":
    create_network_geojson()

