Convert CSV to JSON with a .NET REST API (csv2json)

Learn to convert CSV to JSON using a .NET REST API (csv2json). This guide covers project setup, CsvHelper parsing, streaming for large files, error handling, testing, and deployment considerations with practical code examples.

MyDataTables
MyDataTables Team
5 min read
Quick Answer

According to MyDataTables, you can implement a robust CSV to JSON endpoint in a .NET REST API by parsing CSV rows with CsvHelper and serializing to JSON with System.Text.Json. Use streaming to handle large files, support configurable delimiters and encodings, and return a JSON array of records for easy consumption.

Overview and design goals

This section outlines the architecture for a robust CSV2JSON endpoint in a .NET REST API. The goal is to keep the API simple to consume, maintainable, and scalable. Core choices include ASP.NET Core for the REST surface, CsvHelper for resilient CSV parsing, and System.Text.Json for fast serialization. A streaming approach reduces memory pressure when handling large CSV files, while configurable delimiter and encoding ensure compatibility with diverse data sources. According to MyDataTables, a clean separation of concerns—controller handling requests, a CSV service for parsing, and a serializer for JSON output—yields predictable behavior and easier testing.

C#
// Program.cs (minimal hosting)
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

var app = builder.Build();
app.MapControllers();
app.Run();

Bootstrap the project and install dependencies

Set up a minimal ASP.NET Core web API project and add CsvHelper to parse CSV data efficiently. This section demonstrates the necessary CLI commands to bootstrap the project, install the CSV parsing package, and prepare the environment for the CSV2JSON endpoint. The approach favors a clean, testable structure with a dedicated controller and a small service that can be swapped for more advanced parsing if needed.

Bash
echo "# Csv2JsonApi" > README.md
dotnet new webapi -n Csv2JsonApi
cd Csv2JsonApi

# Add the CSV parsing library
dotnet add package CsvHelper

Implement the CSV2JSON endpoint

Create a controller that accepts a CSV file via HTTP POST, uses CsvHelper to parse each record, and returns a JSON array. The example handles a configurable delimiter and basic validation. This approach is straightforward to test locally and can be extended with authentication, input validation, and error handling as needed.

C#
using Microsoft.AspNetCore.Mvc;
using CsvHelper;
using CsvHelper.Configuration;
using System.Globalization;
using System.Text.Json;

namespace Csv2JsonApi.Controllers;

[ApiController]
[Route("api/[controller]")]
public class CsvController : ControllerBase
{
    [HttpPost("csv2json")]
    public IActionResult CsvToJson([FromForm] IFormFile file, [FromQuery] string delimiter = ",")
    {
        if (file == null || file.Length == 0)
            return BadRequest("No file provided.");

        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
            Delimiter = delimiter,
            BadDataFound = null
        };

        using var stream = file.OpenReadStream();
        using var reader = new StreamReader(stream);
        using var csv = new CsvReader(reader, config);

        var records = new List<dynamic>();
        foreach (var row in csv.GetRecords<dynamic>())
        {
            records.Add(row);
        }

        // Serialize once and return the raw JSON; wrapping the JSON string
        // in Ok(...) would serialize it a second time as a quoted string.
        var json = JsonSerializer.Serialize(records);
        return Content(json, "application/json");
    }
}

Streaming for large CSVs to JSON

For large CSV files, streaming minimizes peak memory and improves responsiveness. The endpoint streams a JSON array as the CSV is parsed, emitting each row as a JSON object. This example uses a writer to sequentially emit chunks, avoiding full materialization of the dataset in memory. It also demonstrates graceful handling of encoding and delimiters.

C#
using Microsoft.AspNetCore.Mvc;
using CsvHelper;
using CsvHelper.Configuration;
using System.Globalization;
using System.Text.Json;

namespace Csv2JsonApi.Controllers;

[ApiController]
[Route("api/[controller]")]
public class CsvControllerStreaming : ControllerBase
{
    [HttpPost("csv2json/stream")]
    public async Task StreamCsvToJson([FromForm] IFormFile file, [FromQuery] string delimiter = ",")
    {
        if (file == null || file.Length == 0)
        {
            // An async Task action cannot return an IActionResult,
            // so write the error response directly.
            Response.StatusCode = StatusCodes.Status400BadRequest;
            await Response.WriteAsync("No file provided.");
            return;
        }

        Response.ContentType = "application/json";

        await using var stream = file.OpenReadStream();
        using var reader = new StreamReader(stream);

        var config = new CsvConfiguration(CultureInfo.InvariantCulture)
        {
            Delimiter = delimiter,
            HasHeaderRecord = true,
            BadDataFound = null
        };
        using var csv = new CsvReader(reader, config);

        // Emit a JSON array one row at a time; no full materialization.
        await using var writer = new StreamWriter(Response.Body);
        await writer.WriteAsync("[");
        bool first = true;
        foreach (var row in csv.GetRecords<dynamic>())
        {
            if (!first) await writer.WriteAsync(",");
            var chunk = JsonSerializer.Serialize(row);
            await writer.WriteAsync(chunk);
            await writer.FlushAsync();
            first = false;
        }
        await writer.WriteAsync("]");
        await writer.FlushAsync();
    }
}

Validation, error handling, and edge cases

The endpoint should validate input, handle common CSV issues, and provide meaningful error messages. This block demonstrates safeguarding against missing files, empty headers, and unsupported delimiters. It also suggests best practices for encoding (UTF-8) and retry strategies when upstream data is unreliable. Always return structured error responses and consider using a lightweight service layer to encapsulate parsing logic.

C#
using Microsoft.AspNetCore.Mvc;
using CsvHelper;
using CsvHelper.Configuration;
using System.Globalization;

[ApiController]
[Route("api/[controller]")]
public class ValidatorController : ControllerBase
{
    [HttpPost("validate-csv")]
    public IActionResult Validate([FromForm] IFormFile file, [FromQuery] string delimiter = ",")
    {
        if (file == null || file.Length == 0)
            return BadRequest("CSV file is required.");

        try
        {
            var config = new CsvConfiguration(CultureInfo.InvariantCulture)
            {
                Delimiter = delimiter,
                HasHeaderRecord = true,
                BadDataFound = null
            };

            using var stream = file.OpenReadStream();
            using var reader = new StreamReader(stream);
            using var csv = new CsvReader(reader, config);

            // Force a header read to validate structure.
            csv.Read();
            csv.ReadHeader();
            var headers = csv.HeaderRecord; // csv.Context.HeaderRecord in CsvHelper < 20
            if (headers == null || headers.Length == 0)
                return BadRequest("CSV must have a header row.");

            return Ok(new { headers });
        }
        catch (Exception ex)
        {
            return BadRequest(new { error = ex.Message });
        }
    }
}
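To make the "structured error responses" advice concrete, here is a minimal sketch of a service-layer result type. The `CsvParseResult` and `CsvError` names are illustrative, not part of the controllers above:

```csharp
using System.Collections.Generic;

// Hypothetical result types a CSV parsing service could return, so the
// controller can translate failures into a structured 400 body instead
// of a raw text message.
public sealed record CsvError(int Row, string Message);

public sealed record CsvParseResult(
    IReadOnlyList<IDictionary<string, string>> Records,
    IReadOnlyList<CsvError> Errors)
{
    // Parsing succeeded only if no row-level errors were collected.
    public bool Ok => Errors.Count == 0;
}
```

A controller can then return `Ok(result.Records)` when `result.Ok` and `BadRequest(new { errors = result.Errors })` otherwise, giving clients machine-readable row numbers and messages.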

Testing, examples, and payloads

Testing the API locally provides confidence before deployment. This block includes practical curl examples for both standard and delimiter-flexible CSV files, plus guidance for using Postman or http clients. It shows how to attach a file payload, switch delimiters, and observe the JSON response. It also demonstrates how to mock a simple CSV in a test project.

Bash
# Sample CSV content saved to file.csv
cat > file.csv <<'CSV'
id,name,age
1,Alice,30
2,Bob,25
CSV

# Basic test
curl -F "file=@file.csv" http://localhost:5000/api/csv/csv2json

# Test with semicolon delimiter
cat > file_semicolon.csv <<'CSV'
id;name;age
3;Carol;28
CSV
curl -F "file=@file_semicolon.csv" 'http://localhost:5000/api/csv/csv2json?delimiter=;'
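For the in-process side, here is a sketch of mocking a simple CSV in a test project, as mentioned above. This assumes an xUnit test project with the CsvHelper package referenced; the class and test names are illustrative:

```csharp
using System.Globalization;
using System.IO;
using System.Linq;
using CsvHelper;
using Xunit;

public class CsvParsingTests
{
    [Fact]
    public void Parses_InMemory_Csv_Into_Records()
    {
        // No file on disk: feed CsvHelper from an in-memory string,
        // mirroring what the endpoint does with the uploaded stream.
        const string csvText = "id,name,age\n1,Alice,30\n2,Bob,25\n";
        using var reader = new StringReader(csvText);
        using var csv = new CsvReader(reader, CultureInfo.InvariantCulture);

        var records = csv.GetRecords<dynamic>().ToList();

        Assert.Equal(2, records.Count);
        Assert.Equal("Alice", (string)records[0].name);
    }
}
```

The same pattern extends to delimiter and bad-data cases by swapping in a `CsvConfiguration`, so parsing behavior is covered without spinning up the HTTP host.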

Deployment considerations, security, and performance

When moving to production, consider containerizing the API, enabling authentication, and enforcing input size limits. Also plan for monitoring: track request duration, error rates, and memory usage. Use streaming to minimize memory footprints for large CSVs, and pin dependencies to a known-safe version. This block closes with deployment tips and common pitfalls to avoid in real-world environments.

DOCKERFILE
# Dockerfile (example)
FROM mcr.microsoft.com/dotnet/aspnet:6.0 AS base
WORKDIR /app
EXPOSE 80

FROM mcr.microsoft.com/dotnet/sdk:6.0 AS build
WORKDIR /src
COPY . .
RUN dotnet publish -c Release -o /app/publish Csv2JsonApi/Csv2JsonApi.csproj

FROM base AS final
WORKDIR /app
# Copy publish output into /app so the ENTRYPOINT path resolves
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "Csv2JsonApi.dll"]
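The input size limits mentioned above can be enforced in Program.cs. This is a sketch; the 10 MB cap is an arbitrary example value to tune for your workload:

```csharp
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.AspNetCore.Http.Features;
using Microsoft.Extensions.DependencyInjection;

var builder = WebApplication.CreateBuilder(args);
builder.Services.AddControllers();

// Cap multipart form uploads (IFormFile) at ~10 MB.
builder.Services.Configure<FormOptions>(o =>
    o.MultipartBodyLengthLimit = 10 * 1024 * 1024);

// Cap the raw request body at the Kestrel level as well.
builder.WebHost.ConfigureKestrel(k =>
    k.Limits.MaxRequestBodySize = 10 * 1024 * 1024);

var app = builder.Build();
app.MapControllers();
app.Run();
```

Per-endpoint overrides are also possible with the `[RequestSizeLimit]` attribute on an action, which is useful if only the CSV upload route should accept large bodies.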

Steps

Estimated time: 45-60 minutes

  1. Bootstrap project

     Create a new ASP.NET Core Web API project and install CsvHelper. This establishes the foundation for the CSV2JSON endpoint and ensures you have a testable local environment.

     Tip: Keep the repository clean with a separate folder for API modules.

  2. Implement endpoint

     Add a CsvController with an action that accepts a file and parses it into JSON. Include delimiter support and basic validation.

     Tip: Prefer a service layer to separate concerns.

  3. Enable streaming (optional)

     Add a streaming endpoint to handle large files without loading the entire dataset into memory.

     Tip: Test with very large CSVs to validate memory usage.

  4. Test locally

     Run the API and test with curl or Postman, validating both standard and edge-case inputs.

     Tip: Capture and inspect error responses for robustness.

  5. Prepare for deployment

     Containerize or deploy to your preferred hosting and enable security, logging, and monitoring.

     Tip: Set sensible request size limits and enable authentication.
Pro Tip: Use streaming for CSV to JSON when files approach megabytes or larger to prevent OOM errors.
Warning: Do not load the whole CSV into memory; prefer per-row processing and streaming JSON output.
Note: Ensure CSV encoding is UTF-8 and handle BOM if present.
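The BOM note above is handled by letting StreamReader detect and strip the byte order mark; a minimal self-contained sketch:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text;

// Simulate an uploaded CSV whose bytes start with a UTF-8 BOM (EF BB BF).
byte[] bom = { 0xEF, 0xBB, 0xBF };
byte[] body = Encoding.UTF8.GetBytes("id,name\n1,Alice\n");
using var stream = new MemoryStream(bom.Concat(body).ToArray());

// detectEncodingFromByteOrderMarks: true consumes the BOM, so the first
// header field is "id" rather than "\uFEFFid".
using var reader = new StreamReader(stream, Encoding.UTF8,
    detectEncodingFromByteOrderMarks: true);
string header = reader.ReadLine()!;
Console.WriteLine(header.StartsWith("id")); // prints True
```

Without BOM detection, the invisible `\uFEFF` prefix ends up in the first header name and silently breaks header-based column lookups.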

Prerequisites

Required

  • .NET SDK (for the dotnet CLI commands used in this guide)

Optional

  • REST API testing tool (Postman) or curl
  • UTF-8 encoding awareness

Commands

  • Create API project: dotnet new webapi -n Csv2JsonApi (choose --no-https if your environment requires it)
  • Add CSV parsing package: dotnet add package CsvHelper (keep it updated with the latest compatible version)
  • Run the API: dotnet run --project Csv2JsonApi/Csv2JsonApi.csproj (test locally on port 5000/7071)
  • Test endpoint with curl: curl -F 'file=@path/to/file.csv' http://localhost:5000/api/csv/csv2json (specify the delimiter if needed, e.g. ?delimiter=;)

People Also Ask

What libraries are recommended for CSV parsing in .NET?

CsvHelper is a widely used, robust library for CSV parsing in .NET. It handles headers, delimiters, and data types well, enabling reliable conversion to JSON. For simple needs you can also implement a lightweight parser, but CsvHelper reduces edge-case bugs.


How can I handle very large CSV files without exhausting memory?

Streaming the CSV to JSON is essential for large files. Process each row and emit JSON incrementally rather than loading all rows into memory. The streaming endpoint demonstrates this approach with a sequential JSON writer.


Can I customize JSON output field names?

Yes. You can map CSV headers to specific property names using attributes or a custom mapping in CsvHelper. If you return dictionaries or dynamic objects, you can rename keys during serialization.

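As a sketch of the mapping approach, a CsvHelper ClassMap can bind CSV headers to differently named C# properties, which then become the JSON field names on serialization. The `Person` and `PersonMap` names are illustrative:

```csharp
using CsvHelper.Configuration;

public class Person
{
    public int Id { get; set; }
    public string FullName { get; set; } = "";
    public int Age { get; set; }
}

// Binds CSV header names to the properties above; the property names
// are what System.Text.Json emits in the output.
public sealed class PersonMap : ClassMap<Person>
{
    public PersonMap()
    {
        Map(p => p.Id).Name("id");
        Map(p => p.FullName).Name("name"); // CSV "name" -> JSON "FullName"
        Map(p => p.Age).Name("age");
    }
}
```

In the endpoint, register the map before reading (`csv.Context.RegisterClassMap<PersonMap>();`) and switch from `GetRecords<dynamic>()` to `GetRecords<Person>()`.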

What about error handling and validation?

Validate inputs, check for a header row, and provide structured error responses. Use try-catch blocks and return meaningful HTTP status codes to guide clients in fixing requests.


Is authentication required for the endpoint?

Authentication is a separate concern. In production, protect the endpoint with OAuth2 or API keys and implement rate limiting and logging for auditability.


Main Points

  • Use CsvHelper to parse CSV reliably
  • Enable streaming to scale with file size
  • Return JSON array for straightforward consumption
  • Configure delimiter to support diverse CSV formats