Easy Parallel and Batch Ops w/ Microsoft Dataverse SDK
Repository: https://github.com/nicknow/Nicknow.DataverseOps NuGet Package: Nicknow.DataverseOps
A .NET class library that simplifies parallel and batch operations using the Microsoft Dataverse SDK. DataverseOps enables high-performance data operations by executing multiple requests concurrently, significantly reducing processing time for bulk operations.
I created this library and put it out as a NuGet package to make my life easier. I'm regularly asked for source code that handles parallel operations with the Dataverse SDK (usually after asking why something isn't being done in parallel). Instead of having to dig up the last code I worked with and try to make it generic enough to be useful, I wanted to be able to point folks to a simple library. That is all this solution is intended to provide: a place where the code is generic enough to be useful and easily accessible, either by taking the source code or by including the NuGet package in your solution.
This solution is a personal project of mine. It's released under the MIT License. You are welcome to do anything and everything with it within the rules of the MIT License.
I consider this to still be in an advanced beta state. I may or may not get around to calling it officially v1.0 at some point; it really depends on when it is thoroughly tested and has more automated testing. That said, I've tested it fairly extensively. This documentation, especially the samples, is also a work in progress. My current goals are to add more automated tests and to implement progress reporting using IProgress<T>, which is needed for end-user applications.
- Parallel Execution: Execute multiple Dataverse requests simultaneously with configurable parallelism
- Asynchronous Support: Full async/await support with cancellation token handling
- Batch Operations: Support for ExecuteMultiple requests with automatic batching
- Comprehensive Results: Detailed success/error tracking with timing data and reference values
- Flexible Logging: Optional integration with Microsoft.Extensions.Logging
- Generic Design: Works with any OrganizationRequest/Response types
- Performance Monitoring: Optional timing data capture for performance analysis
- Interface-Based Architecture: Support for the IServiceClient interface for better testability and dependency injection
Install the DataverseOps NuGet package:
dotnet add package Nicknow.DataverseOps

Or via Package Manager Console:
Install-Package Nicknow.DataverseOps

The solution is built for .NET Standard 2.1, so it should be usable in most projects.
It is not supported for use in a Dataverse Plugin.
using DataverseOps;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;
// Initialize your Dataverse ServiceClient
var serviceClient = new ServiceClient("your-connection-string");
// Create the CUDOperations helper for common operations
var cudOps = new CUDOperations(serviceClient, maxParallelOperations: 10);
// Create multiple entities in parallel
var entities = new List<Entity>
{
new Entity("account") { ["name"] = "Company A" },
new Entity("account") { ["name"] = "Company B" },
new Entity("account") { ["name"] = "Company C" }
};
// Execute creates in parallel
var result = cudOps.CreateEntities(entities, referenceAttribute: "name");
// Check results
Console.WriteLine($"Total: {result.TotalProcessed}, Success: {result.SuccessCount}, Errors: {result.ErrorCount}");using DataverseOps;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;
using Microsoft.Extensions.Logging;
// Setup with logging and timing data
var logger = LoggerFactory.Create(builder => builder.AddConsole()).CreateLogger<Program>();
var serviceClient = new ServiceClient("your-connection-string");
// Initialize CUDOperations with custom settings
var cudOps = new CUDOperations(
serviceClient: serviceClient,
maxParallelOperations: 8, // Maximum concurrent operations
logger: logger, // Optional logging
captureTimingData: true // Capture performance metrics
);
// Prepare entities to create
var accountsToCreate = new List<Entity>();
for (int i = 1; i <= 100; i++)
{
var account = new Entity("account");
account["name"] = $"Bulk Account {i}";
account["accountnumber"] = $"ACC-{i:D4}";
account["telephone1"] = $"555-{i:D4}";
accountsToCreate.Add(account);
}
// Execute parallel creates with reference tracking
var createResult = cudOps.CreateEntities(
entities: accountsToCreate,
referenceAttribute: "name" // Use 'name' field for tracking individual results
);
// Process results
Console.WriteLine($"Create Operation Summary:");
Console.WriteLine($"- Total Processed: {createResult.TotalProcessed}");
Console.WriteLine($"- Successful: {createResult.SuccessCount}");
Console.WriteLine($"- Failed: {createResult.ErrorCount}");
Console.WriteLine($"- Duration: {createResult.Duration.TotalSeconds:F2} seconds");
// Handle successful results
foreach (var success in createResult.SuccessResults)
{
Console.WriteLine($"Created: {success.ReferenceValue} - ID: {success.Response?.id}");
// Access timing data if captured
if (success.TimingData != null)
{
Console.WriteLine($" Execution time: {success.TimingData.SdkTransactionTimeMilliseconds.TotalMilliseconds}ms");
}
}
// Handle errors
foreach (var error in createResult.ErrorResults)
{
Console.WriteLine($"Failed: {error.ReferenceValue} - Error: {error.ErrorMessage}");
}

// Prepare entities to update (assuming you have existing entity IDs)
var accountsToUpdate = new List<Entity>();
var existingAccountIds = new List<Guid> { /* your existing account IDs */ };
foreach (var accountId in existingAccountIds)
{
var account = new Entity("account", accountId);
account["description"] = $"Updated on {DateTime.Now:yyyy-MM-dd}";
account["creditlimit"] = new Money(50000);
accountsToUpdate.Add(account);
}
// Execute parallel updates
var updateResult = cudOps.UpdateEntities(
entities: accountsToUpdate,
referenceAttribute: "accountid"
);
Console.WriteLine($"Updated {updateResult.SuccessCount} of {updateResult.TotalProcessed} accounts");// Delete by entity references
var entityReferencesToDelete = new List<EntityReference>
{
new EntityReference("account", Guid.Parse("guid1")),
new EntityReference("account", Guid.Parse("guid2")),
new EntityReference("contact", Guid.Parse("guid3"))
};
var deleteResult = cudOps.DeleteEntities(entityReferencesToDelete);
// Or delete by entity name and IDs
var accountIdsToDelete = new List<Guid> { /* account IDs to delete */ };
var deleteByIdsResult = cudOps.DeleteEntities("account", accountIdsToDelete);
Console.WriteLine($"Deleted {deleteResult.SuccessCount} entities successfully");DataverseOps now provides full asynchronous support for Execute operations:
using System.Threading;
// Create cancellation token for operation control
var cancellationTokenSource = new CancellationTokenSource();
var cancellationToken = cancellationTokenSource.Token;
// Initialize the core parallel executor for async operations
var executor = new ExecuteRequestsInParallel(
serviceClient: serviceClient,
maxParallelOperations: 5,
logger: logger,
captureTimingData: true
);
// Prepare create requests
var createRequests = accountsToCreate.Select(entity => new CreateRequest { Target = entity });
// Execute asynchronously with cancellation support
var asyncResult = await executor.ExecuteRequestsAsync<CreateRequest, CreateResponse>(
requests: createRequests,
referenceKeySelector: req => req.Target["name"]?.ToString() ?? string.Empty,
cancellationToken: cancellationToken
);
Console.WriteLine($"Async execution completed: {asyncResult.SuccessCount} successful, {asyncResult.ErrorCount} failed");// Execute large datasets asynchronously with batching
var largeEntitySet = new List<Entity>();
for (int i = 1; i <= 1000; i++)
{
var contact = new Entity("contact");
contact["firstname"] = $"Contact";
contact["lastname"] = $"Number {i}";
contact["emailaddress1"] = $"contact{i}@example.com";
largeEntitySet.Add(contact);
}
var createRequests = largeEntitySet.Select(entity => new CreateRequest { Target = entity });
// Execute in batches asynchronously
var asyncBatchResult = await executor.ExecuteRequestsAsync<CreateRequest, CreateResponse>(
requests: createRequests,
requestsPerBatch: 100,
referenceKeySelector: req => req.Target["emailaddress1"]?.ToString() ?? string.Empty,
continueOnError: true,
cancellationToken: cancellationToken
);
Console.WriteLine($"Async batch operation completed:");
Console.WriteLine($"- Total Entities: {asyncBatchResult.TotalProcessed}");
Console.WriteLine($"- Duration: {asyncBatchResult.Duration.TotalMinutes:F2} minutes");Asynchronous operations are not yet supported in the CUDOperations. That will come in a future update.
For very large datasets, batch operations can be more efficient by grouping requests into ExecuteMultiple calls:
// Create 1000 entities using batches of 100
var largeEntitySet = new List<Entity>();
for (int i = 1; i <= 1000; i++)
{
var contact = new Entity("contact");
contact["firstname"] = $"Contact";
contact["lastname"] = $"Number {i}";
contact["emailaddress1"] = $"contact{i}@example.com";
largeEntitySet.Add(contact);
}
// Execute in batches with parallel processing
var batchResult = cudOps.CreateEntities(
entities: largeEntitySet,
requestsPerBatch: 100, // 100 requests per ExecuteMultiple call
referenceAttribute: "emailaddress1",
continueOnError: true // Continue processing even if some batches fail
);
Console.WriteLine($"Batch Operation Results:");
Console.WriteLine($"- Total Entities: {batchResult.TotalProcessed}");
Console.WriteLine($"- Successful: {batchResult.SuccessCount}");
Console.WriteLine($"- Failed: {batchResult.ErrorCount}");
Console.WriteLine($"- Total Duration: {batchResult.Duration.TotalMinutes:F2} minutes");
// Access batch-level results if needed
if (batchResult.BatchResults != null)
{
Console.WriteLine($"- Successful Batches: {batchResult.BatchResults.Count(b => b.IsSuccess)}");
Console.WriteLine($"- Failed Batches: {batchResult.BatchResults.Count(b => !b.IsSuccess)}");
}

// Delete operations now support batching with enhanced error handling
var entityReferencesToDelete = new List<EntityReference>();
for (int i = 0; i < 500; i++)
{
entityReferencesToDelete.Add(new EntityReference("account", Guid.NewGuid()));
}
// Delete with batching support
var batchDeleteResult = cudOps.DeleteEntities(
entityReferences: entityReferencesToDelete,
requestsPerBatch: 50,
continueOnError: true // Continue with remaining batches even if some fail
);
// Or delete by entity name and IDs with batching
var accountIds = Enumerable.Range(1, 200).Select(_ => Guid.NewGuid()).ToList();
var batchDeleteByIdsResult = cudOps.DeleteEntities(
entityLogicalName: "account",
entityIds: accountIds,
requestsPerBatch: 25,
continueOnError: true
);
Console.WriteLine($"Batch delete completed: {batchDeleteResult.SuccessCount} successful deletions");DataverseOps works with any OrganizationRequest/Response types:
using Microsoft.Xrm.Sdk.Messages;
// Initialize the core parallel executor for custom requests
var executor = new ExecuteRequestsInParallel(
serviceClient: serviceClient,
maxParallelOperations: 5,
logger: logger,
captureTimingData: true
);
// Example: Parallel WhoAmI requests (for demonstration)
var whoAmIRequests = Enumerable.Range(1, 10)
.Select(i => new WhoAmIRequest())
.ToList();
// Execute custom requests in parallel
var whoAmIResult = executor.ExecuteRequests<WhoAmIRequest, WhoAmIResponse>(
requests: whoAmIRequests,
referenceKeySelector: req => $"WhoAmI_{Guid.NewGuid()}" // Custom reference generator
);
// Process results
foreach (var result in whoAmIResult.SuccessResults)
{
Console.WriteLine($"User ID: {result.Response?.UserId}");
}

The Dataverse SDK class ServiceClient does not have an interface and is not mockable for testing purposes. In most scenarios you do not need to mock ServiceClient directly; instead you use it as a concrete implementation of another interface, such as IOrganizationService. That approach does not work for DataverseOps, which requires the ServiceClient.Clone method, and that method is not exposed by any interface that ServiceClient implements. To make DataverseOps testable we have implemented an interface, IServiceClient, and a concrete implementation, ServiceClientWrapper, which implements IServiceClient along with the other interfaces that ServiceClient implements. This allows you to write tests (as shown in DataverseOps.Test) and use dependency injection (as shown below).
The ExecuteRequestsInParallel and CUDOperations constructors are overloaded and will accept either a ServiceClient object or any object implementing IServiceClient.
DataverseOps now supports the IServiceClient interface for better testability:
using DataverseOps.ServiceClientWrapper;
// Using interface for dependency injection
public class DataService
{
private readonly IServiceClient _serviceClient;
private readonly CUDOperations _cudOps;
public DataService(IServiceClient serviceClient, ILogger<DataService> logger)
{
_serviceClient = serviceClient;
_cudOps = new CUDOperations(serviceClient, maxParallelOperations: 8, logger: logger);
}
public async Task<ExecutionResult<CreateResponse>> CreateAccountsAsync(
IEnumerable<Entity> accounts,
CancellationToken cancellationToken = default)
{
// Use the async methods for better scalability
var executor = new ExecuteRequestsInParallel(_serviceClient);
var requests = accounts.Select(entity => new CreateRequest { Target = entity });
return await executor.ExecuteRequestsAsync<CreateRequest, CreateResponse>(
requests: requests,
referenceKeySelector: req => req.Target["name"]?.ToString() ?? string.Empty,
cancellationToken: cancellationToken
);
}
}
// In your DI container setup
services.AddScoped<IServiceClient>(provider =>
new ServiceClientWrapper(new ServiceClient("connection-string")));
services.AddScoped<DataService>();

// Configure for maximum resilience
var resilientOps = new CUDOperations(
serviceClient: serviceClient,
maxParallelOperations: 4, // Lower parallelism for stability
logger: logger,
captureTimingData: true
);
var mixedEntities = new List<Entity>
{
new Entity("account") { ["name"] = "Valid Account" },
new Entity("account") { ["name"] = "" }, // This might fail validation
new Entity("invalidtable") { ["name"] = "Invalid Table" }, // This will fail
new Entity("account") { ["name"] = "Another Valid Account" }
};
// Execute with error handling
var mixedResult = resilientOps.CreateEntities(
entities: mixedEntities,
requestsPerBatch: 2,
referenceAttribute: "name",
continueOnError: true // Continue processing despite errors
);
// Detailed error analysis
Console.WriteLine($"Execution Summary:");
Console.WriteLine($"Success Rate: {(double)mixedResult.SuccessCount / mixedResult.TotalProcessed:P1}");
if (mixedResult.HasErrors)
{
Console.WriteLine($"Error Details:");
foreach (var error in mixedResult.ErrorResults)
{
Console.WriteLine($"- Reference: {error.ReferenceValue}");
Console.WriteLine($" Error: {error.ErrorMessage}");
Console.WriteLine($" Transaction ID: {error.InternalTransactionId}");
// Access full exception details if needed
if (error.Exception != null)
{
Console.WriteLine($" Exception Type: {error.Exception.GetType().Name}");
}
}
}
// Analyze batch-level errors if using batching
if (mixedResult.BatchResults != null)
{
var failedBatches = mixedResult.BatchResults.Where(b => !b.IsSuccess);
foreach (var failedBatch in failedBatches)
{
Console.WriteLine($"Failed Batch: {failedBatch.ReferenceValue}");
Console.WriteLine($" Error: {failedBatch.ErrorMessage}");
}
}

// Enable detailed performance monitoring
var performanceOps = new CUDOperations(
serviceClient: serviceClient,
maxParallelOperations: 10,
logger: logger,
captureTimingData: true // Enable timing data capture
);
var testEntities = Enumerable.Range(1, 50)
.Select(i => new Entity("account") { ["name"] = $"Performance Test {i}" })
.ToList();
var perfResult = performanceOps.CreateEntities(testEntities, "name");
// Analyze performance metrics
var timingData = perfResult.SuccessResults
.Where(r => r.TimingData != null)
.Select(r => r.TimingData!.SdkTransactionTimeMilliseconds)
.ToList();
if (timingData.Any())
{
Console.WriteLine($"Performance Metrics:");
Console.WriteLine($"- Average Request Time: {timingData.Average():F2}ms");
Console.WriteLine($"- Fastest Request: {timingData.Min():F2}ms");
Console.WriteLine($"- Slowest Request: {timingData.Max():F2}ms");
Console.WriteLine($"- Total Parallel Duration: {perfResult.Duration.TotalMilliseconds:F2}ms");
Console.WriteLine($"- Estimated Sequential Time: {timingData.Sum():F2}ms");
}

CUDOperations: Convenience wrapper for common Create, Update, Upsert, and Delete operations.
Constructors:
CUDOperations(ServiceClient serviceClient, int maxParallelOperations = 8, ILogger? logger = null, bool captureTimingData = false)
CUDOperations(IServiceClient serviceClient, int maxParallelOperations = 8, ILogger? logger = null, bool captureTimingData = false)

Key Methods:
- CreateEntities(IEnumerable<Entity>, string referenceAttribute = "") - Parallel creates
- CreateEntities(IEnumerable<Entity>, int requestsPerBatch, string referenceAttribute = "", bool continueOnError = true) - Batch creates
- UpdateEntities(IEnumerable<Entity>, string referenceAttribute = "") - Parallel updates
- UpdateEntities(IEnumerable<Entity>, int requestsPerBatch, string referenceAttribute = "", bool continueOnError = true) - Batch updates
- UpsertEntities(IEnumerable<Entity>, string referenceAttribute = "") - Parallel upserts
- UpsertEntities(IEnumerable<Entity>, int requestsPerBatch, string referenceAttribute = "", bool continueOnError = true) - Batch upserts
- DeleteEntities(IEnumerable<EntityReference>) - Parallel deletes
- DeleteEntities(IEnumerable<EntityReference>, int requestsPerBatch, bool continueOnError = true) - Batch deletes
- DeleteEntities(string entityLogicalName, IEnumerable<Guid> entityIds) - Parallel deletes by IDs
- DeleteEntities(string entityLogicalName, IEnumerable<Guid> entityIds, int requestsPerBatch, bool continueOnError = true) - Batch deletes by IDs
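Upsert isn't demonstrated in the samples above, so here is a minimal, illustrative sketch of the parallel UpsertEntities overload. It assumes the account table has an alternate key defined on accountnumber (the key and values are hypothetical; adjust to your schema):

// Sketch: parallel upserts matched on an alternate key (assumes an alternate key on accountnumber)
var accountsToUpsert = new List<Entity>();
for (int i = 1; i <= 10; i++)
{
    var account = new Entity("account");
    account.KeyAttributes["accountnumber"] = $"ACC-{i:D4}"; // match existing rows by the alternate key
    account["name"] = $"Upserted Account {i}";
    accountsToUpsert.Add(account);
}
var upsertResult = cudOps.UpsertEntities(accountsToUpsert, referenceAttribute: "name");
Console.WriteLine($"Upserted {upsertResult.SuccessCount} of {upsertResult.TotalProcessed} accounts");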
ExecuteRequestsInParallel: Core engine for executing any OrganizationRequest types in parallel.
Constructors:
ExecuteRequestsInParallel(ServiceClient serviceClient, int maxParallelOperations = 8, ILogger? logger = null, bool captureTimingData = false)
ExecuteRequestsInParallel(IServiceClient serviceClient, int maxParallelOperations = 8, ILogger? logger = null, bool captureTimingData = false)

Key Methods:
- ExecuteRequests<TRequest, TResponse>(IEnumerable<TRequest>, Func<TRequest, string>? referenceKeySelector = null)
- ExecuteRequests<TRequest, TResponse>(IEnumerable<TRequest>, int requestsPerBatch, Func<TRequest, string>? referenceKeySelector = null, bool continueOnError = true)
- ExecuteRequestsAsync<TRequest, TResponse>(IEnumerable<TRequest>, Func<TRequest, string>? referenceKeySelector = null, CancellationToken cancellationToken = default)
- ExecuteRequestsAsync<TRequest, TResponse>(IEnumerable<TRequest>, int requestsPerBatch, Func<TRequest, string>? referenceKeySelector = null, bool continueOnError = true, CancellationToken cancellationToken = default)
ExecutionResult<TResponse>: Contains comprehensive results from parallel execution.
Properties:
- TotalProcessed - Total number of requests processed
- SuccessCount - Number of successful requests
- ErrorCount - Number of failed requests
- Duration - Total execution time
- Results - List of individual results
- SuccessResults - Filtered successful results
- ErrorResults - Filtered error results
- BatchResults - List of batch-level results (when using batching)
SingleExecutionResult<TResponse>: Result of a single request execution.
Properties:
- IsSuccess - Whether the request succeeded
- Response - The response object (if successful)
- ErrorMessage - Error message (if failed)
- ReferenceValue - Custom reference value for tracking
- TimingData - Performance timing information (if enabled)
- InternalTransactionId - Unique identifier for tracking
IServiceClient: Interface for Dataverse service client operations, enabling better testability and dependency injection.
Key Features:
- Implements IOrganizationService and IOrganizationServiceAsync2
- Provides Clone() methods for thread-safe operations
- Supports both synchronous and asynchronous execution
- Includes additional methods like CreateAndReturnAsync()
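Because IServiceClient is an interface, it can be stubbed in unit tests. The following is a minimal, hypothetical sketch using Moq; it assumes Clone() returns an IServiceClient and that synchronous operations flow through IOrganizationService.Execute (see the DataverseOps.Test project for the library's actual test approach):

using DataverseOps;
using DataverseOps.ServiceClientWrapper;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Messages;
using Moq;

// Stubbed IServiceClient: every Execute returns a CreateResponse with a new ID,
// and Clone() hands back the same stub for each parallel worker (an assumption
// about how the executor uses Clone()).
var mockClient = new Mock<IServiceClient>();
mockClient.Setup(c => c.Clone()).Returns(mockClient.Object);
var fakeResponse = new CreateResponse();
fakeResponse.Results["id"] = Guid.NewGuid(); // CreateResponse.id reads Results["id"]
mockClient.Setup(c => c.Execute(It.IsAny<OrganizationRequest>())).Returns(fakeResponse);

var testOps = new CUDOperations(mockClient.Object, maxParallelOperations: 2);
var testResult = testOps.CreateEntities(new[] { new Entity("account") { ["name"] = "Test" } }, "name");
Console.WriteLine($"Mocked run: {testResult.SuccessCount} succeeded, {testResult.ErrorCount} failed");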
The reference-value options (the referenceKeySelector and referenceAttribute parameters) are designed to solve a problem in parallel operations where you end up with a large collection of responses but cannot correlate them back to the requests. This is most commonly an issue when creating records, but it can also occur when using alternate keys for updates or upserts.
Imagine you have a List<Entity> object with 1000 account records to be created. Dataverse's response contains only the primary key (accountid), so you have no way of knowing which item in the request list corresponds to which item in the response list. Assuming you have some attribute/field on the record that is unique within the list, you can have that value included in the SingleExecutionResult<TResponse> object as ReferenceValue.
If you don't need or care about this information, leave referenceKeySelector or referenceAttribute null and it will be skipped.
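A minimal sketch of that correlation, assuming accountnumber is unique within the list (illustrative values, reusing the cudOps instance from the earlier samples):

// Use a unique attribute as the reference value so each response can be matched
// back to the entity that produced it.
var toCreate = Enumerable.Range(1, 1000)
    .Select(i => new Entity("account")
    {
        ["name"] = $"Account {i}",
        ["accountnumber"] = $"ACC-{i:D5}" // unique within this list
    })
    .ToList();

var created = cudOps.CreateEntities(toCreate, referenceAttribute: "accountnumber");

// Map each unique reference value to the new primary key (accountid) returned by Dataverse.
var idByAccountNumber = created.SuccessResults
    .ToDictionary(r => r.ReferenceValue, r => r.Response!.id);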
Logging is optional. If you don't pass an ILogger object then logging will be skipped. Any object that implements Microsoft.Extensions.Logging.ILogger is supported.
- Use synchronous methods for console applications or when you don't need to free up threads
- Use asynchronous methods for web applications, services, or when you need better scalability
- Always use CancellationToken with async methods for proper cancellation support (see the sketch after this list)
- Set continueOnError: true for batch operations unless you need all-or-nothing behavior
- Check both individual results and batch results when using batching
- Use TimingData to identify performance bottlenecks
- Start with maxParallelOperations: 8 and adjust based on your Dataverse environment's capacity
- Enable captureTimingData during development and testing; disable it in production for (minimally) better performance
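A minimal sketch of time-boxed cancellation, reusing the executor from the async example above. Whether a cancelled run throws OperationCanceledException or returns partial results depends on the library, so treat the catch block as illustrative:

// Cancel the run automatically if it takes longer than 5 minutes.
using var cts = new CancellationTokenSource(TimeSpan.FromMinutes(5));
var pendingRequests = Enumerable.Range(1, 250)
    .Select(i => new CreateRequest { Target = new Entity("account") { ["name"] = $"Timed Account {i}" } });
try
{
    var timedResult = await executor.ExecuteRequestsAsync<CreateRequest, CreateResponse>(
        requests: pendingRequests,
        referenceKeySelector: r => r.Target["name"]?.ToString() ?? string.Empty,
        cancellationToken: cts.Token);
    Console.WriteLine($"Completed: {timedResult.SuccessCount} of {timedResult.TotalProcessed}");
}
catch (OperationCanceledException)
{
    Console.WriteLine("The operation was cancelled before it finished.");
}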
Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
Copyright © 2025 Nicolas Nowinski. All Rights Reserved. This project is licensed under the MIT License. See the LICENSE.MD file for details.