Bulk Insert in Batches into a SQL Database from a .NET Web Application

DotNet Full Stack Dev
7 min read · Mar 26, 2024


Bulk insertion of data into a SQL database is a common requirement in many applications, especially when dealing with large datasets. In this blog post, we’ll explore a step-by-step approach to perform bulk insertions from a .NET web application into a SQL database. We’ll use C# code snippets to illustrate each step and demonstrate how to insert data in batches to optimize performance.

Embark on a journey of continuous learning and exploration with DotNet-FullStack-Dev. Uncover more at https://dotnet-fullstack-dev.blogspot.com, or reach out for further information.

Step 1: Prepare Data for Bulk Insertion:

Before performing bulk insertion, ensure that the data to be inserted is properly formatted and available in the application. For demonstration purposes, let’s assume we have a list of objects representing records to be inserted into the database.

List<MyDataObject> dataList = GetDataForBulkInsertion();

Step 2: Establish Connection to SQL Database:

Next, establish a connection to the SQL database using ADO.NET or Entity Framework. Ensure that the connection string is configured correctly in the application’s configuration file.

string connectionString = "YourConnectionString";
using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Perform bulk insertion
}

Step 3: Configure SqlBulkCopy for Bulk Insertion:

Next, configure the bulk copy operation, specifying the destination table name and any column mappings. For bulk insertion, we’ll use SQL Server’s SqlBulkCopy class, which efficiently loads data into a SQL Server table.

using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
{
    bulkCopy.DestinationTableName = "YourTableName";
    // Define column mappings if necessary
    // bulkCopy.ColumnMappings.Add("SourceColumn", "DestinationColumn");
    // Specify batch size
    bulkCopy.BatchSize = 10;
    // Write data to SQL Server (from a DataTable or an IDataReader)
    bulkCopy.WriteToServer(dataTable);
}

Step 4: Perform Bulk Insertion in Batches:

To optimize performance and avoid memory issues, insert data in batches. Split the data into smaller chunks and perform bulk insertion for each batch.

int batchSize = 10;
for (int i = 0; i < dataList.Count; i += batchSize)
{
    IEnumerable<MyDataObject> batch = dataList.Skip(i).Take(batchSize);
    DataTable dataTable = ConvertToDataTable(batch);
    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
    {
        bulkCopy.DestinationTableName = "YourTableName";
        // Define column mappings if necessary
        // bulkCopy.ColumnMappings.Add("SourceColumn", "DestinationColumn");
        // Specify batch size
        bulkCopy.BatchSize = batchSize;
        // Write data to SQL Server
        bulkCopy.WriteToServer(dataTable);
    }
}
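One detail the loop above handles implicitly is the final partial batch: Skip/Take never reads past the end of the list, so the last chunk simply contains whatever is left. As a quick standalone illustration (no database needed, names are illustrative), here is how 25 items split with a batch size of 10:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class BatchDemo
{
    // Splits a list into consecutive chunks of at most batchSize items.
    public static List<List<T>> SplitIntoBatches<T>(List<T> items, int batchSize)
    {
        var batches = new List<List<T>>();
        for (int i = 0; i < items.Count; i += batchSize)
        {
            // Take stops at the end of the list, so the final batch
            // may be smaller than batchSize.
            batches.Add(items.Skip(i).Take(batchSize).ToList());
        }
        return batches;
    }

    public static void Main()
    {
        var batches = SplitIntoBatches(Enumerable.Range(1, 25).ToList(), 10);
        Console.WriteLine(string.Join(", ", batches.Select(b => b.Count)));
        // prints "10, 10, 5"
    }
}
```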

Step 5: Handle Errors and Close Connection:

Handle any potential errors during bulk insertion and close the database connection after the operation is completed.

try
{
    // Perform bulk insertion
}
catch (Exception ex)
{
    // Handle or log the exception
}
finally
{
    // Close connection
    connection.Close();
}

When performing bulk insertions, it’s essential to monitor the process and handle any potential failures gracefully. Here’s how you can handle various scenarios:

  1. Monitoring Bulk Insertion Success or Failure:

After each batch insertion, you can check whether it succeeded or whether any errors occurred. The SqlBulkCopy class provides a SqlRowsCopied event that can be used to track the number of rows copied; note that this event fires only after the number of rows specified by the NotifyAfter property has been copied, so NotifyAfter must be set to a positive value. You can also catch exceptions during the bulk insertion process.

bulkCopy.NotifyAfter = 10; // SqlRowsCopied fires every 10 rows
bulkCopy.SqlRowsCopied += (sender, e) =>
{
    Console.WriteLine($"{e.RowsCopied} rows copied successfully.");
};

try
{
    // Perform bulk insertion
}
catch (Exception ex)
{
    Console.WriteLine($"Bulk insertion failed with error: {ex.Message}");
}

2. Handling Batch Failure:

If one batch fails during bulk insertion, you can catch the exception and decide how to proceed. You may choose to log the error, retry the insertion, or skip the batch altogether.

try
{
    // Perform bulk insertion for a batch
}
catch (Exception ex)
{
    Console.WriteLine($"Batch insertion failed with error: {ex.Message}");
    // Decide how to handle the failure (e.g., retry, skip, log)
}

3. Reinserting a Failed Batch or Record:

If a batch or record fails during insertion, you can attempt to reinsert it based on your error handling strategy. You may need to identify the failed batch or record and retry the insertion process.

// Example of retrying insertion for a failed batch
try
{
    // Retry bulk insertion for the failed batch
}
catch (Exception ex)
{
    Console.WriteLine($"Retry insertion for batch failed with error: {ex.Message}");
    // Handle the failure accordingly
}

// Example of retrying insertion for a failed record
try
{
    // Retry insertion for the failed record within the batch
}
catch (Exception ex)
{
    Console.WriteLine($"Retry insertion for record failed with error: {ex.Message}");
    // Handle the failure accordingly
}

Let’s consider a scenario where we are bulk inserting data from a list of objects into a SQL Server database using SqlBulkCopy in C#. We'll handle errors gracefully and retry the insertion for failed batches or records.

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;
using System.Linq;

public class Program
{
    private const string ConnectionString = "YourConnectionString";
    private const string TableName = "YourTableName";
    private const int BatchSize = 10;

    public static void Main()
    {
        List<MyDataObject> dataList = GetDataForBulkInsertion();
        InsertDataInBatches(dataList);
    }

    private static List<MyDataObject> GetDataForBulkInsertion()
    {
        // Generate sample data for bulk insertion
        return Enumerable.Range(1, 100)
            .Select(i => new MyDataObject { Id = i, Name = "Name " + i })
            .ToList();
    }

    private static void InsertDataInBatches(List<MyDataObject> dataList)
    {
        using (SqlConnection connection = new SqlConnection(ConnectionString))
        {
            connection.Open();

            for (int i = 0; i < dataList.Count; i += BatchSize)
            {
                IEnumerable<MyDataObject> batch = dataList.Skip(i).Take(BatchSize);
                DataTable dataTable = ConvertToDataTable(batch);

                using (SqlTransaction transaction = connection.BeginTransaction())
                {
                    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
                    {
                        bulkCopy.DestinationTableName = TableName;
                        bulkCopy.BatchSize = BatchSize;

                        // Column mappings if necessary
                        // bulkCopy.ColumnMappings.Add("SourceColumn", "DestinationColumn");

                        // Event handler for rows copied (fires only if NotifyAfter is set)
                        bulkCopy.NotifyAfter = BatchSize;
                        bulkCopy.SqlRowsCopied += (sender, e) =>
                        {
                            Console.WriteLine($"{e.RowsCopied} rows copied successfully.");
                        };

                        try
                        {
                            bulkCopy.WriteToServer(dataTable);
                            transaction.Commit();
                        }
                        catch (Exception ex)
                        {
                            Console.WriteLine($"Error occurred: {ex.Message}");
                            transaction.Rollback(); // Roll back the transaction on error
                            // Log or handle the error accordingly
                        }
                    }
                }
            }
        }
    }

    private static DataTable ConvertToDataTable(IEnumerable<MyDataObject> data)
    {
        DataTable dataTable = new DataTable();
        dataTable.Columns.Add("Id", typeof(int));
        dataTable.Columns.Add("Name", typeof(string));

        foreach (MyDataObject item in data)
        {
            dataTable.Rows.Add(item.Id, item.Name);
        }

        return dataTable;
    }
}

public class MyDataObject
{
    public int Id { get; set; }
    public string Name { get; set; }
}

In this example:

  • We have a Main method where we generate sample data for bulk insertion.
  • The InsertDataInBatches method handles bulk insertion in batches, with error handling and transaction management.
  • Inside the bulk insertion loop, each batch of data is converted to a DataTable, and then inserted using SqlBulkCopy.
  • Error handling is done using try-catch blocks, and transactions ensure data consistency by rolling back in case of errors.
  • Event handlers are used to track the number of rows copied during insertion.

This approach ensures that bulk insertion is performed efficiently while handling errors and maintaining data integrity.

Let’s modify the code to include retry logic for failed batches during bulk insertion.

private static void InsertDataInBatches(List<MyDataObject> dataList)
{
    using (SqlConnection connection = new SqlConnection(ConnectionString))
    {
        connection.Open();

        for (int i = 0; i < dataList.Count; i += BatchSize)
        {
            IEnumerable<MyDataObject> batch = dataList.Skip(i).Take(BatchSize);
            DataTable dataTable = ConvertToDataTable(batch);

            bool insertionSuccessful = false;
            const int maxRetryAttempts = 3; // Number of retry attempts
            int retryAttempts = maxRetryAttempts;

            while (!insertionSuccessful && retryAttempts > 0)
            {
                using (SqlTransaction transaction = connection.BeginTransaction())
                {
                    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
                    {
                        bulkCopy.DestinationTableName = TableName;
                        bulkCopy.BatchSize = BatchSize;

                        // Event handler for rows copied (fires only if NotifyAfter is set)
                        bulkCopy.NotifyAfter = BatchSize;
                        bulkCopy.SqlRowsCopied += (sender, e) =>
                        {
                            Console.WriteLine($"{e.RowsCopied} rows copied successfully.");
                        };

                        try
                        {
                            bulkCopy.WriteToServer(dataTable);
                            transaction.Commit();
                            insertionSuccessful = true; // Mark insertion as successful
                        }
                        catch (Exception ex)
                        {
                            Console.WriteLine($"Error occurred: {ex.Message}");
                            transaction.Rollback(); // Roll back the transaction on error
                            retryAttempts--; // One fewer attempt remaining
                        }
                    }
                }
            }

            if (!insertionSuccessful)
            {
                Console.WriteLine($"Batch insertion failed after {maxRetryAttempts} attempts.");
                // Log or handle the failed batch accordingly
            }
        }
    }
}

In this modified code:

  • We introduce a retryAttempts variable to control the number of retry attempts for a failed batch.
  • The insertion loop will attempt to insert the batch until it succeeds or reaches the maximum retry attempts.
  • If the insertion is unsuccessful after retrying, appropriate error handling or logging can be performed.
  • This ensures that failed batches are retried for a certain number of attempts before giving up, improving the chances of successful bulk insertion.
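The retry loop above reattempts the batch immediately, which rarely helps with transient failures such as deadlocks or timeouts; a short, growing delay between attempts usually works better. Below is a minimal, database-free sketch of that pattern, where the flaky operation and its simulated failures stand in for the real bulk copy call (all names here are illustrative):

```csharp
using System;
using System.Threading;

public static class RetryDemo
{
    // Runs the given action, retrying up to maxAttempts times with a
    // delay that doubles after each failure (simple exponential backoff).
    public static bool RunWithRetry(Action action, int maxAttempts, int initialDelayMs)
    {
        int delayMs = initialDelayMs;
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            try
            {
                action();
                return true; // success
            }
            catch (Exception ex)
            {
                Console.WriteLine($"Attempt {attempt} failed: {ex.Message}");
                if (attempt == maxAttempts) return false; // out of attempts
                Thread.Sleep(delayMs);
                delayMs *= 2; // back off before the next attempt
            }
        }
        return false;
    }

    public static void Main()
    {
        int calls = 0;
        // Simulated operation that fails twice, then succeeds.
        Action flakyOperation = () =>
        {
            calls++;
            if (calls < 3) throw new InvalidOperationException("transient error");
        };

        bool ok = RunWithRetry(flakyOperation, maxAttempts: 3, initialDelayMs: 10);
        Console.WriteLine(ok ? "Batch inserted." : "Batch failed permanently.");
        // prints "Batch inserted." after two failed attempts
    }
}
```

In the real bulk insertion loop, the action would wrap the WriteToServer and Commit calls, and the rollback would happen inside the catch as before.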

If the bulk insertion fails even after the specified number of retry attempts, you might consider implementing additional error handling or escalation procedures. Here are some options:

  1. Log and Alert: Log the failure details and trigger an alert/notification to inform the appropriate stakeholders about the issue. This could involve sending an email, creating a ticket in a bug tracking system, or triggering an alert in a monitoring tool.
  2. Manual Intervention: If the failure persists despite retry attempts, manual intervention may be required. A developer or operations team member can investigate the issue further, identify the root cause, and take corrective action.
  3. Automated Recovery: Implement automated recovery mechanisms to handle persistent failures. This could involve automatically adjusting the parameters of the bulk insertion process, such as batch size or timeout settings, or triggering alternative processing pathways.
  4. Rollback and Retry: In cases where partial data has been inserted before the failure, you may need to rollback the entire transaction and retry the bulk insertion process from the beginning. This ensures data consistency and integrity.
  5. Error Handling Improvements: Analyze the reasons for failure and consider making improvements to error handling mechanisms, such as implementing more robust retry logic, optimizing database configurations, or addressing underlying issues in the data or environment.
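Option 3 can be as simple as shrinking the batch size when a batch keeps failing: splitting a failed batch in half and retrying each half isolates a single bad row without abandoning the whole load. Here is a minimal, database-free sketch of that idea; the insert delegate and the "bad row" check stand in for a real bulk copy call and a real constraint violation (all names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class AdaptiveBatchDemo
{
    // Tries to insert a batch; on failure, splits it in half and recurses,
    // so a single bad row ends up isolated in a batch of size 1.
    public static void InsertWithSplitting(List<int> batch, Action<List<int>> insert, List<int> failedRows)
    {
        try
        {
            insert(batch);
        }
        catch (Exception)
        {
            if (batch.Count == 1)
            {
                failedRows.Add(batch[0]); // cannot split further; record the bad row
                return;
            }
            int half = batch.Count / 2;
            InsertWithSplitting(batch.Take(half).ToList(), insert, failedRows);
            InsertWithSplitting(batch.Skip(half).ToList(), insert, failedRows);
        }
    }

    public static void Main()
    {
        var inserted = new List<int>();
        var failed = new List<int>();
        // Simulated insert: rejects any batch containing the "bad" row 7.
        Action<List<int>> insert = batch =>
        {
            if (batch.Contains(7)) throw new InvalidOperationException("constraint violation");
            inserted.AddRange(batch);
        };

        InsertWithSplitting(Enumerable.Range(1, 10).ToList(), insert, failed);
        Console.WriteLine($"Inserted {inserted.Count} rows, {failed.Count} failed ({string.Join(",", failed)})");
        // prints "Inserted 9 rows, 1 failed (7)"
    }
}
```

The same shape works with SqlBulkCopy: the delegate would write a DataTable within a transaction, and the rows collected in failedRows would be logged or routed to a quarantine table for manual review.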

Conclusion

Performing bulk insertions from a .NET web application into a SQL database can significantly improve performance and efficiency, especially when dealing with large datasets. By following the step-by-step approach outlined in this blog post and utilizing C# code snippets, developers can seamlessly implement bulk insertion functionality in their applications while effectively handling errors to ensure robust data management and database performance.

The response to persistent bulk insertion failures depends on the specific requirements, criticality of the data, and available resources for investigation and resolution. It’s essential to have clear escalation procedures and mechanisms in place to address such scenarios effectively and minimize the impact on the application and its users.
