Add collector for Cheltenham Borough Council #135

Open
moley-bot[bot] wants to merge 2 commits into main from collector/CheltenhamBoroughCouncil-issue-131-1769857489

Conversation


moley-bot[bot] commented Jan 31, 2026

Summary

This PR adds a new bin collection data collector for Cheltenham Borough Council.

  • Implements ICollector interface
  • Adds integration tests
  • Successfully tested with example postcode from issue

Closes #131

Test Summary

 ==================== Test Summary ====================
 
 --------------------- Collector ----------------------
 
 Cheltenham Borough Council
 
 ------------------- Addresses (17) -------------------
 
 - 1 SPENSER ROAD CHELTENHAM CHELTENHAM GL51 7DU, GL51 7DU, 51363
 - 3 SPENSER ROAD CHELTENHAM CHELTENHAM GL51 7DU, GL51 7DU, 51374
 - 3 A SPENSER ROAD CHELTENHAM CHELTENHAM GL51 7DU, GL51 7DU, 5597
 - 5 SPENSER ROAD CHELTENHAM CHELTENHAM GL51 7DU, GL51 7DU, 51376
 - 7 SPENSER ROAD CHELTENHAM CHELTENHAM GL51 7DU, GL51 7DU, 51377
 - ...
 
 --------------------- Bin Types ----------------------
 
 - Food Waste (Green Caddy)
 - Garden Waste (Brown)
 - General Waste (Green)
 - Mixed Dry Recycling (Green Box) (Green Box)
 - Paper, Glass & Cardboard Recycling (Blue Bag) (Blue Bag)
 
 ------------------- Bin Days (78) --------------------
 
 - 03/02/2026 (3 bins):
   - Mixed Dry Recycling (Green Box) (Green Box)
   - Paper, Glass & Cardboard Recycling (Blue Bag) (Blue Bag)
   - Food Waste (Green Caddy)
 
 - 09/02/2026 (1 bins):
   - Garden Waste (Brown)
 
 - 10/02/2026 (2 bins):
   - General Waste (Green)
   - Food Waste (Green Caddy)
 
 - 17/02/2026 (3 bins):
   - Food Waste (Green Caddy)
   - Mixed Dry Recycling (Green Box) (Green Box)
   - Paper, Glass & Cardboard Recycling (Blue Bag) (Blue Bag)
 
 - 23/02/2026 (1 bins):
   - Garden Waste (Brown)
 
 - 24/02/2026 (2 bins):
   - General Waste (Green)
   - Food Waste (Green Caddy)
 
 - 03/03/2026 (3 bins):
   - Food Waste (Green Caddy)
   - Mixed Dry Recycling (Green Box) (Green Box)
   - Paper, Glass & Cardboard Recycling (Blue Bag) (Blue Bag)
 
 - 09/03/2026 (1 bins):
   - Garden Waste (Brown)
 
 - 10/03/2026 (2 bins):
   - General Waste (Green)
   - Food Waste (Green Caddy)
 
 - 17/03/2026 (3 bins):
   - Mixed Dry Recycling (Green Box) (Green Box)
   - Paper, Glass & Cardboard Recycling (Blue Bag) (Blue Bag)
   - Food Waste (Green Caddy)
 - ...
 
 ======================================================

Generated automatically by Moley-Bot using Codex CLI

  Formatted by Moley-Bot
@BadgerHobbs
Owner

/gemini review


gemini-code-assist[bot] left a comment


Code Review

This pull request introduces a new collector for Cheltenham Borough Council. While comprehensive, it has several security and reliability issues. The code is vulnerable to denial-of-service (DoS) attacks because ParseJsonp and GetWeekStartDate do not handle malformed external responses. Missing URL encoding in the client-side requests also poses a functionality risk (e.g., postcodes containing spaces) and a security risk (potential parameter injection). The implementation is also complex: the GetBinDays method exceeds the recommended length, and there is duplicated code along with several style guide deviations. Refactoring suggestions include extracting helper methods for the session/task logic, inlining single-use constants, and tidying the collection and loop style.

Comment on lines +167 to +711
public GetBinDaysResponse GetBinDays(Address address, ClientSideResponse? clientSideResponse)
{
// Prepare client-side request for creating a session
if (clientSideResponse == null)
{
var clientSideRequest = new ClientSideRequest
{
RequestId = 1,
Url = $"{_auroraBaseUrl}/RequestSession?userName=guest%20CBC&password=&script={_scriptPath}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request for retrieving workflow tasks
else if (clientSideResponse.RequestId == 1)
{
var responseJson = ParseJsonp(clientSideResponse.Content);
var sessionId = responseJson.RootElement.GetProperty("Session").GetProperty("SessionId").GetString()!;

var clientSideRequest = new ClientSideRequest
{
RequestId = 2,
Url = $"{_auroraBaseUrl}/GetWorkflow?sessionId={sessionId}&workflowId=wastestreet&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata =
{
{ "sessionId", sessionId },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to open the script map
else if (clientSideResponse.RequestId == 2)
{
var workflowJson = ParseJsonp(clientSideResponse.Content);
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];

var taskIds = workflowJson.RootElement.GetProperty("Tasks").EnumerateArray().ToDictionary(
task => task.GetProperty("$type").GetString()!,
task => task.GetProperty("Id").GetString()!
);

var clientSideRequest = new ClientSideRequest
{
RequestId = 3,
Url = $"{_auroraBaseUrl}/OpenScriptMap?sessionId={sessionId}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata =
{
{ "sessionId", sessionId },
{ "restoreTaskId", taskIds["StatMap.Aurora.RestoreStateTask, StatMapService"] },
{ "saveTaskId", taskIds["StatMap.Aurora.SaveStateTask, StatMapService"] },
{ "clearTaskId", taskIds["StatMap.Aurora.ClearResultSetTask, StatMapService"] },
{ "visibilityTaskId", taskIds["StatMap.Aurora.ChangeLayersVisibilityTask, StatMapService"] },
{ "drillTaskId", taskIds["StatMap.Aurora.DrillDownTask, StatMapService"] },
{ "fetchTaskId", taskIds["StatMap.Aurora.FetchResultSetTask, StatMapService"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to restore state
else if (clientSideResponse.RequestId == 3)
{
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
var restoreTaskId = clientSideResponse.Options.Metadata["restoreTaskId"];

var job = """
{
"Task": {
"$type": "StatMap.Aurora.RestoreStateTask, StatMapService"
}
}
""";

var clientSideRequest = new ClientSideRequest
{
RequestId = 4,
Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={restoreTaskId}&job={job}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "sessionId", sessionId },
{ "saveTaskId", clientSideResponse.Options.Metadata["saveTaskId"] },
{ "clearTaskId", clientSideResponse.Options.Metadata["clearTaskId"] },
{ "visibilityTaskId", clientSideResponse.Options.Metadata["visibilityTaskId"] },
{ "drillTaskId", clientSideResponse.Options.Metadata["drillTaskId"] },
{ "fetchTaskId", clientSideResponse.Options.Metadata["fetchTaskId"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to save state
else if (clientSideResponse.RequestId == 4)
{
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
var saveTaskId = clientSideResponse.Options.Metadata["saveTaskId"];

var job = """
{
"Task": {
"$type": "StatMap.Aurora.SaveStateTask, StatMapService"
}
}
""";

var clientSideRequest = new ClientSideRequest
{
RequestId = 5,
Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={saveTaskId}&job={job}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "sessionId", sessionId },
{ "clearTaskId", clientSideResponse.Options.Metadata["clearTaskId"] },
{ "visibilityTaskId", clientSideResponse.Options.Metadata["visibilityTaskId"] },
{ "drillTaskId", clientSideResponse.Options.Metadata["drillTaskId"] },
{ "fetchTaskId", clientSideResponse.Options.Metadata["fetchTaskId"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to clear the result set
else if (clientSideResponse.RequestId == 5)
{
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
var clearTaskId = clientSideResponse.Options.Metadata["clearTaskId"];

var job = """
{
"Task": {
"$type": "StatMap.Aurora.ClearResultSetTask, StatMapService"
}
}
""";

var clientSideRequest = new ClientSideRequest
{
RequestId = 6,
Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={clearTaskId}&job={job}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "sessionId", sessionId },
{ "visibilityTaskId", clientSideResponse.Options.Metadata["visibilityTaskId"] },
{ "drillTaskId", clientSideResponse.Options.Metadata["drillTaskId"] },
{ "fetchTaskId", clientSideResponse.Options.Metadata["fetchTaskId"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to ensure layers are visible
else if (clientSideResponse.RequestId == 6)
{
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
var visibilityTaskId = clientSideResponse.Options.Metadata["visibilityTaskId"];

var job = """
{
"Task": {
"$type": "StatMap.Aurora.ChangeLayersVisibilityTask, StatMapService"
}
}
""";

var clientSideRequest = new ClientSideRequest
{
RequestId = 7,
Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={visibilityTaskId}&job={job}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "sessionId", sessionId },
{ "drillTaskId", clientSideResponse.Options.Metadata["drillTaskId"] },
{ "fetchTaskId", clientSideResponse.Options.Metadata["fetchTaskId"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to resolve the selected location
else if (clientSideResponse.RequestId == 7)
{
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];

var clientSideRequest = new ClientSideRequest
{
RequestId = 8,
Url = $"{_auroraBaseUrl}/FindLocation?sessionId={sessionId}&locationId={address.Uid}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "sessionId", sessionId },
{ "drillTaskId", clientSideResponse.Options.Metadata["drillTaskId"] },
{ "fetchTaskId", clientSideResponse.Options.Metadata["fetchTaskId"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to drill into the selected location
else if (clientSideResponse.RequestId == 8)
{
var responseJson = ParseJsonp(clientSideResponse.Content);
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
var drillTaskId = clientSideResponse.Options.Metadata["drillTaskId"];

var location = responseJson.RootElement.GetProperty("Locations").EnumerateArray().First();
var queryX = location.GetProperty("X").GetDouble();
var queryY = location.GetProperty("Y").GetDouble();

var job = $$"""
{
"QueryX": {{queryX.ToString(CultureInfo.InvariantCulture)}},
"QueryY": {{queryY.ToString(CultureInfo.InvariantCulture)}},
"Task": {
"Type": "StatMap.Aurora.DrillDownTask, StatMapService"
}
}
""";

var clientSideRequest = new ClientSideRequest
{
RequestId = 9,
Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={drillTaskId}&job={job}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "sessionId", sessionId },
{ "fetchTaskId", clientSideResponse.Options.Metadata["fetchTaskId"] },
{ "queryX", queryX.ToString(CultureInfo.InvariantCulture) },
{ "queryY", queryY.ToString(CultureInfo.InvariantCulture) },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request to fetch the result set
else if (clientSideResponse.RequestId == 9)
{
var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
var fetchTaskId = clientSideResponse.Options.Metadata["fetchTaskId"];
var queryX = clientSideResponse.Options.Metadata["queryX"];
var queryY = clientSideResponse.Options.Metadata["queryY"];

var job = $$"""
{
"QueryX": {{queryX}},
"QueryY": {{queryY}},
"Task": {
"Type": "StatMap.Aurora.FetchResultSetTask, StatMapService"
},
"ResultSetName": "inspection"
}
""";

var clientSideRequest = new ClientSideRequest
{
RequestId = 10,
Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={fetchTaskId}&job={job}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Prepare client-side request for the calendar feed and store service configuration
else if (clientSideResponse.RequestId == 10)
{
var fetchJson = ParseJsonp(clientSideResponse.Content);
var table = fetchJson.RootElement
.GetProperty("TaskResult")
.GetProperty("DistanceOrderedSet")
.GetProperty("ResultSet")
.GetProperty("Tables")
.EnumerateArray()
.First();

var columnNames = table.GetProperty("ColumnDefinitions")
.EnumerateArray()
.Select(column => column.GetProperty("ColumnName").GetString()!)
.ToList();

var record = table.GetProperty("Records").EnumerateArray().First().EnumerateArray().ToList();

var recordData = columnNames.Zip(record).ToDictionary(
pair => pair.First,
pair => pair.Second.ToString().Trim()
);

var clientSideRequest = new ClientSideRequest
{
RequestId = 11,
Url = _calendarUrl,
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
Options = new ClientSideOptions
{
Metadata = new()
{
{ "refuseDay", recordData["New_Refuse_Day_internal"] },
{ "refuseWeek", recordData["Refuse_Week_External"] },
{ "recyclingDay", recordData["New_Recycling_Round"] },
{ "recyclingWeek", recordData["Amended_Recycling_Round"] },
{ "foodDay", recordData["New_Food_Day"] },
{ "foodWeek", recordData["New_Food_Week_Internal"] },
{ "gardenDay", recordData["Garden_Bin_Crew"] },
{ "gardenWeek", recordData["Garden_Bin_Week"] },
},
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;
}
// Process bin days using the calendar feed and service configuration
else if (clientSideResponse.RequestId == 11)
{
var week1Start = GetWeekStartDate(clientSideResponse.Content, "Week 1");
var week2Start = GetWeekStartDate(clientSideResponse.Content, "Week 2");
var endDate = DateOnly.FromDateTime(DateTime.Now.AddMonths(12));

var refuseDates = BuildCollectionDates(
clientSideResponse.Options!.Metadata["refuseDay"],
clientSideResponse.Options.Metadata["refuseWeek"],
week1Start,
week2Start,
endDate
);

var recyclingDates = BuildCollectionDates(
clientSideResponse.Options.Metadata["recyclingDay"],
clientSideResponse.Options.Metadata["recyclingWeek"],
week1Start,
week2Start,
endDate
);

var foodDates = BuildCollectionDates(
clientSideResponse.Options.Metadata["foodDay"],
clientSideResponse.Options.Metadata["foodWeek"],
week1Start,
week2Start,
endDate
);

var refuseBins = ProcessingUtilities.GetMatchingBins(_binTypes, "Refuse");
var recyclingBins = ProcessingUtilities.GetMatchingBins(_binTypes, "Recycling");
var foodBins = ProcessingUtilities.GetMatchingBins(_binTypes, "Food");
var gardenBins = ProcessingUtilities.GetMatchingBins(_binTypes, "Garden");

// Iterate through each collection date, and create a bin day entry
var binDays = new List<BinDay>();
foreach (var date in refuseDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = refuseBins,
});
}

// Iterate through each recycling date, and create a bin day entry
foreach (var date in recyclingDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = recyclingBins,
});
}

// Iterate through each food waste date, and create a bin day entry
foreach (var date in foodDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = foodBins,
});
}

var gardenWeek = clientSideResponse.Options.Metadata["gardenWeek"];
if (!string.IsNullOrWhiteSpace(clientSideResponse.Options.Metadata["gardenDay"])
&& !string.IsNullOrWhiteSpace(gardenWeek))
{
var gardenDates = BuildCollectionDates(
clientSideResponse.Options.Metadata["gardenDay"],
gardenWeek,
week1Start,
week2Start,
endDate
);

// Iterate through each garden waste date, and create a bin day entry
foreach (var date in gardenDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = gardenBins,
});
}
}

var getBinDaysResponse = new GetBinDaysResponse
{
BinDays = ProcessingUtilities.ProcessBinDays(binDays),
};

return getBinDaysResponse;
}

// Throw exception for invalid request
throw new InvalidOperationException("Invalid client-side request.");
}


high

The GetBinDays method is extremely long and complex, spanning over 500 lines with 11 distinct steps. This violates the style guide's core principle of keeping collectors simple and ideally under 400 lines total.

The primary source of complexity is the repetitive logic for the ExecuteTaskJob calls in steps 4 through 7. This logic can be extracted into a helper method to significantly reduce code duplication and improve readability.

Consider creating a helper method like this:

private ClientSideRequest CreateTaskJobRequest(int nextRequestId, ClientSideResponse clientSideResponse, string taskIdKey, string taskType)
{
    var sessionId = clientSideResponse.Options!.Metadata["sessionId"];
    var taskId = clientSideResponse.Options.Metadata[taskIdKey];

    // Two dollar signs so {{taskType}} is an interpolation hole while single braces stay literal JSON.
    var job = $$"""
    {
        "Task": {
            "$type": "{{taskType}}"
        }
    }
    """;

    var newMetadata = new Dictionary<string, string>(clientSideResponse.Options.Metadata);
    newMetadata.Remove(taskIdKey);

    return new ClientSideRequest
    {
        RequestId = nextRequestId,
        Url = $"{_auroraBaseUrl}/ExecuteTaskJob?sessionId={sessionId}&taskId={taskId}&job={job}&callback=_jqjsp",
        Method = "GET",
        Headers = new() { { "user-agent", Constants.UserAgent } },
        Options = new ClientSideOptions { Metadata = newMetadata },
    };
}

Using this helper would make each step much more concise.

References
  1. Collectors should be kept simple, ideally between 200-400 lines. Collectors over 500 lines are a code smell indicating unnecessary complexity. (link)
  2. Truly duplicated logic (used 2-3+ times) should be extracted into helper methods to reduce duplication. (link)

Comment on lines +718 to +720
var startIndex = content.IndexOf('(');
var endIndex = content.LastIndexOf(')');
var jsonContent = content[(startIndex + 1)..endIndex];


security-medium medium

The ParseJsonp method is vulnerable to a denial-of-service (DoS) attack. It attempts to extract JSON content by finding the first '(' and last ')' characters without validating their existence or relative positions. If the content (which originates from the untrusted clientSideResponse.Content) is malformed—for example, if it lacks parentheses or if the closing parenthesis appears before the opening one—the method will throw an ArgumentOutOfRangeException during the string slicing operation at line 720. This will cause the request to fail with a 500 error.

		var startIndex = content.IndexOf('(');
		var endIndex = content.LastIndexOf(')');

		if (startIndex == -1 || endIndex == -1 || endIndex <= startIndex)
		{
			throw new InvalidOperationException("Invalid JSONP response.");
		}

		var jsonContent = content[(startIndex + 1)..endIndex];

Comment on lines +836 to +843
var startLine = lines.FirstOrDefault(line => line.StartsWith("DTSTART", StringComparison.OrdinalIgnoreCase));

if (startLine == null)
{
continue;
}

var dateText = startLine.Split(':', 2)[1].Trim();


security-medium medium

The GetWeekStartDate method is vulnerable to a denial-of-service (DoS) attack. At line 843, it assumes that the startLine (derived from the untrusted clientSideResponse.Content) always contains a colon character. If an attacker provides a malformed ICS feed where a DTSTART line lacks a colon, Split(':', 2) will return an array with only one element. Accessing index [1] will then throw an IndexOutOfRangeException, causing the request to crash with a 500 error.

			var startLine = lines.FirstOrDefault(line => line.StartsWith("DTSTART", StringComparison.OrdinalIgnoreCase));

			if (startLine == null || !startLine.Contains(':'))
			{
				continue;
			}

			var dateText = startLine.Split(':', 2)[1].Trim();

var clientSideRequest = new ClientSideRequest
{
RequestId = 2,
Url = $"{_auroraBaseUrl}/FindLocation?sessionId={sessionId}&address={postcode}&limit=100&callback=_jqjsp",


security-medium medium

Multiple URLs constructed for ClientSideRequest objects (e.g., at lines 114, 441, 489, and 536) use string interpolation with untrusted data such as postcode, address.Uid, sessionId, and job without performing URL encoding.

  1. Functionality Risk: Postcodes containing spaces (which is standard for UK postcodes) will result in invalid URLs, likely causing requests to fail.
  2. Security Risk (Parameter Injection): An attacker can inject additional query parameters by including characters like & or = in the input. This could be used to manipulate the requests made to the third-party council API, such as injecting a malicious callback parameter for a JSONP-based XSS attack on the council's domain.

Recommendation: Use System.Web.HttpUtility.UrlEncode for all untrusted variables inserted into query strings.
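
As a rough sketch of that recommendation (the helper name is illustrative, not code from this PR; the URL shape and parameters are taken from the quoted FindLocation request), each untrusted value would be encoded before interpolation:

using System.Web;

// Hypothetical helper: builds the FindLocation URL with every untrusted value encoded,
// so a space in "GL51 7DU" or a stray '&'/'=' cannot break or extend the query string.
private string BuildFindLocationUrl(string sessionId, string postcode)
{
	var encodedSessionId = HttpUtility.UrlEncode(sessionId);
	var encodedPostcode = HttpUtility.UrlEncode(postcode); // "GL51 7DU" -> "GL51+7DU"

	return $"{_auroraBaseUrl}/FindLocation" +
		$"?sessionId={encodedSessionId}" +
		$"&address={encodedPostcode}" +
		"&limit=100&callback=_jqjsp";
}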

Comment on lines +170 to +188
if (clientSideResponse == null)
{
var clientSideRequest = new ClientSideRequest
{
RequestId = 1,
Url = $"{_auroraBaseUrl}/RequestSession?userName=guest%20CBC&password=&script={_scriptPath}&callback=_jqjsp",
Method = "GET",
Headers = new()
{
{ "user-agent", Constants.UserAgent },
},
};

var getBinDaysResponse = new GetBinDaysResponse
{
NextClientSideRequest = clientSideRequest,
};

return getBinDaysResponse;


medium

The logic to request a new session is duplicated at the beginning of both GetAddresses (lines 85-103) and GetBinDays. This violates the DRY principle and the style guide's recommendation to extract duplicated logic.

Please create a shared helper method for this logic, as shown in the style guide for logic used in both GetAddresses and GetBinDays.

/// <summary>
/// Creates the initial client-side request for a new session.
/// </summary>
private ClientSideRequest CreateSessionRequest(int requestId)
{
	return new ClientSideRequest
	{
		RequestId = requestId,
		Url = $"{_auroraBaseUrl}/RequestSession?userName=guest%20CBC&password=&script={_scriptPath}&callback=_jqjsp",
		Method = "GET",
		Headers = new()
		{
			{ "user-agent", Constants.UserAgent },
		},
	};
}

You can then call this helper from both GetAddresses and GetBinDays.

			var clientSideRequest = CreateSessionRequest(1);

			var getBinDaysResponse = new GetBinDaysResponse
			{
				NextClientSideRequest = clientSideRequest,
			};

			return getBinDaysResponse;
References
  1. Extract truly duplicated logic (used 2-3+ times) into helper methods. The style guide provides an example for creating an initial request helper used in both GetAddresses and GetBinDays. (link)

Colour = BinColour.Green,
Keys = [ "Food" ],
Type = BinType.Caddy,
},


low

The last object initializer in the _binTypes collection is missing a trailing comma. The style guide requires trailing commas for all multi-line initializers to improve maintainability and create cleaner diffs.

		},
References
  1. Always use trailing commas in multi-line initializers to make future diffs cleaner and reduce merge conflicts. (link)

/// <summary>
/// The Google Calendar ICS feed for Cheltenham's week schedule.
/// </summary>
private const string _calendarUrl = "https://calendar.google.com/calendar/ical/v7oettki6t1pi2p7s0j6q6121k%40group.calendar.google.com/public/full.ics";


low

This constant _calendarUrl is only used once within the file (at line 578). According to the style guide, constants should only be created for values used two or more times. For single-use values, it's better to inline them directly where they are used.
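
For illustration, inlining would mean using the value directly at its single call site; the request shape below is taken from the quoted calendar step, with the Options metadata omitted for brevity:

var clientSideRequest = new ClientSideRequest
{
	RequestId = 11,
	Url = "https://calendar.google.com/calendar/ical/v7oettki6t1pi2p7s0j6q6121k%40group.calendar.google.com/public/full.ics",
	Method = "GET",
	Headers = new()
	{
		{ "user-agent", Constants.UserAgent },
	},
	// Options/Metadata omitted for brevity.
};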

References
  1. Constants should only be created for values that are used 2 or more times. Values used only once should be inlined directly. (link)

Comment on lines +643 to +699
// Iterate through each collection date, and create a bin day entry
var binDays = new List<BinDay>();
foreach (var date in refuseDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = refuseBins,
});
}

// Iterate through each recycling date, and create a bin day entry
foreach (var date in recyclingDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = recyclingBins,
});
}

// Iterate through each food waste date, and create a bin day entry
foreach (var date in foodDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = foodBins,
});
}

var gardenWeek = clientSideResponse.Options.Metadata["gardenWeek"];
if (!string.IsNullOrWhiteSpace(clientSideResponse.Options.Metadata["gardenDay"])
&& !string.IsNullOrWhiteSpace(gardenWeek))
{
var gardenDates = BuildCollectionDates(
clientSideResponse.Options.Metadata["gardenDay"],
gardenWeek,
week1Start,
week2Start,
endDate
);

// Iterate through each garden waste date, and create a bin day entry
foreach (var date in gardenDates)
{
binDays.Add(new BinDay
{
Address = address,
Date = date,
Bins = gardenBins,
});
}
}


low

The logic for creating BinDay objects is repeated in four separate foreach loops for refuse, recycling, food, and garden waste. This can be made more concise and less repetitive.

Consider refactoring this into a single loop that iterates over a collection of tuples or a small helper method. This would improve readability and reduce code duplication.

Example refactoring:

var binDays = new List<BinDay>();
var services = new[]
{
    (refuseDates, refuseBins),
    (recyclingDates, recyclingBins),
    (foodDates, foodBins)
};

foreach (var (dates, bins) in services)
{
    foreach (var date in dates)
    {
        binDays.Add(new BinDay { Address = address, Date = date, Bins = bins });
    }
}

// Handle garden waste separately due to its conditional nature
// ...
References
  1. Keep collectors simple and focused. Reducing code duplication, even for smaller patterns, contributes to overall simplicity and maintainability. (link)

/// <summary>
/// Parses a day of week string that may be abbreviated.
/// </summary>
private static DayOfWeek ParseDayOfWeek(string dayOfWeekText)
Owner

Helper method only used once.

/// <summary>
/// The Google Calendar ICS feed for Cheltenham's week schedule.
/// </summary>
private const string _calendarUrl = "https://calendar.google.com/calendar/ical/v7oettki6t1pi2p7s0j6q6121k%40group.calendar.google.com/public/full.ics";
Owner

Single referenced consts.

var addresses = new List<Address>();
foreach (var location in locations)
{
var description = location.GetProperty("Description").GetString()!
Owner

Could use regex instead.
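
A possible shape of what that could look like (the pattern and the clean-up performed are assumptions, since the original string handling is not quoted here):

using System.Text.RegularExpressions;

var description = location.GetProperty("Description").GetString()!;

// Collapse repeated whitespace, then pull a UK-style postcode out of the description.
var normalised = Regex.Replace(description, @"\s+", " ").Trim();
var postcode = Regex.Match(normalised, @"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}").Value;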

}

/// <inheritdoc/>
public GetBinDaysResponse GetBinDays(Address address, ClientSideResponse? clientSideResponse)
Owner

11 requests? This smells of over-complexity.

var queryY = clientSideResponse.Options.Metadata["queryY"];

var job = $$"""
{
Owner

Incorrect indentation.
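
If this refers to the raw string literal, C# strips the leading whitespace that lines up with the closing quotes, so the job JSON can be indented with the surrounding block without changing the produced text. A sketch using the values from the quoted step:

var job = $$"""
	{
		"QueryX": {{queryX}},
		"QueryY": {{queryY}},
		"Task": {
			"Type": "StatMap.Aurora.FetchResultSetTask, StatMapService"
		},
		"ResultSetName": "inspection"
	}
	""";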

},
new()
{
Name = "Mixed Dry Recycling (Green Box)",
Owner

Don't include bin type and color in names.
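
A sketch of the naming this asks for, using one of the entries above (the Keys value and BinType.Box are assumptions based on the patterns visible elsewhere in this review):

new()
{
	// Colour and container live in their own properties, so the name carries only the service.
	Name = "Mixed Dry Recycling",
	Colour = BinColour.Green,
	Keys = [ "Recycling" ],
	Type = BinType.Box,
},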

@BadgerHobbs added the "new collector" label on Feb 1, 2026