PowerShell Convert JSON to CSV: A Practical Guide
Learn a robust PowerShell workflow to convert JSON to CSV, including flattening nested fields, handling large files, and exporting clean CSV outputs. Practical examples and best practices for data analysts and developers.
Overview: Why PowerShell is ideal for JSON to CSV
PowerShell provides a lightweight, scriptable path to transform JSON data into a tabular CSV format using built-in cmdlets like ConvertFrom-Json and Export-Csv. This approach is especially appealing to data analysts and developers who want to stay within a single scripting environment without pulling in external libraries. According to MyDataTables, native PowerShell capabilities reduce external dependencies and offer consistent behavior across Windows, macOS, and Linux environments. This section sets the stage for practical, repeatable workflows that handle flat JSON objects and lays the groundwork for handling more complex structures later in the guide.
$json = Get-Content -Path "data.json" -Raw | ConvertFrom-Json
$json | Export-Csv -Path "output.csv" -NoTypeInformation

For readers who need deterministic column ordering, consider applying Select-Object before exporting to enforce a stable schema. This base pattern is the starting point for most JSON-to-CSV tasks and scales well with modest JSON sizes.
Understanding JSON structures and CSV schemas
Before you convert, you must map JSON fields to CSV columns. JSON can be an array of objects, a single object, or a nested structure. A simple array like:
[
{"id":1, "name":"Alice", "email":"[email protected]"},
{"id":2, "name":"Bob", "email":"[email protected]"}
]

maps naturally to a CSV with columns id, name, and email. If your JSON contains nested objects (for example, address or metadata) or arrays, you’ll need to flatten or normalize those fields before exporting. The following approach demonstrates a flat example and explains how to think about mapping rules to CSV columns.
$data = Get-Content -Path "input.json" -Raw | ConvertFrom-Json
$data | Export-Csv -Path "output.csv" -NoTypeInformation
- Mapping: JSON field names become CSV column headers, assuming consistent keys across records.
- Considerations: Missing keys translate to empty cells; non-string values are coerced to strings in the CSV.
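To make missing keys explicit and lock down the header order, you can enforce a fixed schema with Select-Object before exporting. A minimal sketch, with illustrative file and field names:

```powershell
# Enforce a fixed column set and order; any record lacking a key
# simply gets an empty cell under that header.
$data = Get-Content -Path "input.json" -Raw | ConvertFrom-Json
$data |
    Select-Object id, name, email |
    Export-Csv -Path "output.csv" -NoTypeInformation
```

Because Select-Object creates the property even when the source record lacks it, every row ends up with the same three columns, which downstream tools can rely on.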
Basic conversion: flat JSON
Flat JSON structures (arrays of objects with uniform keys) are the easiest case to start with. You can treat each object as a row and each key as a column. The following example demonstrates loading a file and exporting it directly. This pattern is robust for straightforward datasets and serves as a baseline for more complex transformations.
# Load a JSON array and convert directly to CSV
$records = Get-Content -Path "patients.json" -Raw | ConvertFrom-Json
$records | Export-Csv -Path "patients.csv" -NoTypeInformation

If your JSON lacks a consistent set of keys across records, normalize the fields first or supply a fixed schema with Select-Object to enforce columns. This ensures the final CSV aligns with downstream systems that expect certain headers.
Handling nested data: flattening and normalization
Real-world JSON often contains nested objects and arrays. To produce a clean CSV, you typically flatten the structure or generate additional rows for multi-valued fields. The example below shows how to flatten a nested roles array into a single comma-delimited string while keeping core fields intact. This keeps output readable and compatible with traditional spreadsheet tooling.
$employees = Get-Content -Path "employees.json" -Raw | ConvertFrom-Json
$flat = $employees | ForEach-Object {
    [PSCustomObject]@{
        id    = $_.id
        name  = $_.name
        email = $_.email
        # Join a multi-valued roles array into one comma-delimited cell
        # (the ternary operator requires PowerShell 7+)
        roles = ($_.roles -is [System.Array]) ? ($_.roles -join ",") : $_.roles
    }
}
$flat | Export-Csv -Path "employees.csv" -NoTypeInformation

- Flattening tips: If a nested object has several fields, map only the ones you actually need, to avoid overly wide CSVs.
- Optional: You can also expand a nested object, e.g., address.city, into separate columns with a similar approach.
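Building on the pattern above, here is a minimal sketch that expands a nested address object into separate columns. The file name and nested field names (address.city, address.zip) are illustrative assumptions, not part of the earlier examples:

```powershell
# Expand a hypothetical nested address object into dedicated columns.
# If a record has no address, the city/zip cells are simply empty.
$people = Get-Content -Path "people.json" -Raw | ConvertFrom-Json
$people | ForEach-Object {
    [PSCustomObject]@{
        id   = $_.id
        name = $_.name
        city = $_.address.city
        zip  = $_.address.zip
    }
} | Export-Csv -Path "people.csv" -NoTypeInformation
```

The same dotted-path access works for deeper nesting; just keep the output column names flat and unambiguous.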
Working with large JSON files and streaming considerations
Large JSON files pose a memory challenge because ConvertFrom-Json loads the entire payload into memory. When dealing with thousands to millions of records, adopt strategies that minimize peak memory usage. One common pattern is to process line-delimited JSON (NDJSON) or manage chunks of an array manually. The NDJSON approach reads one JSON object per line and exports incrementally, reducing peak memory.
# NDJSON approach: each line is a standalone JSON object
Get-Content -Path "data.ndjson" | ForEach-Object { $_ | ConvertFrom-Json } | Export-Csv -NoTypeInformation -Path "output.csv" -Append

If your data is truly a single massive JSON array, consider pre-splitting the file or using a streaming parser from a .NET library, then piping results to CSV. In practice, most PowerShell scripts handle reasonably sized datasets through chunked processing or by filtering fields to reduce memory load. For extremely large datasets, a dedicated data processing tool may be preferable, but PowerShell remains a viable option for many workflows.
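The chunked-processing idea mentioned above can be sketched as follows. Note the hedge: this still parses the full array once, so it bounds the export batches rather than the parse; the file name and chunk size are illustrative:

```powershell
# Chunked export: parse once, then append to CSV in batches so rows
# stream to disk instead of buffering in one giant pipeline.
$chunkSize = 5000
Remove-Item -Path "big.csv" -ErrorAction Ignore   # avoid appending to stale output
$all = Get-Content -Path "big.json" -Raw | ConvertFrom-Json
for ($i = 0; $i -lt $all.Count; $i += $chunkSize) {
    $end = [Math]::Min($i + $chunkSize, $all.Count) - 1
    $all[$i..$end] | Export-Csv -Path "big.csv" -NoTypeInformation -Append
}
```

For a true constant-memory pipeline, NDJSON input as shown earlier remains the simpler option.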
Creating a reusable function for JSON-to-CSV conversion
Reusable code improves consistency and reduces drift across projects. The following function encapsulates the parsing, optional field selection, and exporting logic. You can reuse this function in multiple scripts or import it as a module.
function Convert-JsonToCsv {
    param(
        [Parameter(Mandatory)][string]$InputPath,
        [Parameter(Mandatory)][string]$OutputPath,
        [string[]]$Fields = $null
    )
    $data = Get-Content -Path $InputPath -Raw | ConvertFrom-Json
    if ($Fields) {
        $data | Select-Object -Property $Fields | Export-Csv -Path $OutputPath -NoTypeInformation
    } else {
        $data | Export-Csv -Path $OutputPath -NoTypeInformation
    }
}

Usage:

Convert-JsonToCsv -InputPath "input.json" -OutputPath "output.csv" -Fields @("id","name","email","roles")

This function provides a single entry point for multiple datasets and helps enforce a consistent schema across automation pipelines.
End-to-end example: a complete, reusable script
Below is a compact end-to-end example that leverages the reusable function to convert a plain JSON file into CSV with a defined field set. It demonstrates configuration, invocation, and a basic validation step to sanity-check the result.
# End-to-end script: read input.json, write output.csv, only selected fields
$InputPath = "input.json"
$OutputPath = "output.csv"
$Selected = @("id", "name", "email", "roles")
Convert-JsonToCsv -InputPath $InputPath -OutputPath $OutputPath -Fields $Selected
# Simple validation: show first few lines of the result
Import-Csv -Path $OutputPath | Select-Object -First 5 | Format-Table -AutoSize

This pattern is portable across teams and projects, and you can extend it to incorporate error handling, logging, or integration with CI pipelines. The approach emphasizes clarity, repeatability, and minimal dependencies, aligning with pragmatic data workflows.
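The error-handling extension mentioned above can be sketched with a try/catch wrapper. This is a minimal example assuming the Convert-JsonToCsv function defined earlier is in scope; the exit code and messages are illustrative:

```powershell
# Promote non-terminating errors (e.g., missing input file) to
# terminating ones so the catch block actually fires.
$ErrorActionPreference = 'Stop'
try {
    Convert-JsonToCsv -InputPath "input.json" -OutputPath "output.csv"
    Write-Host "Conversion succeeded: output.csv"
}
catch {
    Write-Error "JSON-to-CSV conversion failed: $($_.Exception.Message)"
    exit 1
}
```

In a CI pipeline, the non-zero exit code lets the surrounding job fail fast instead of silently producing an empty CSV.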
