PowerShell Import-Csv Foreach: Practical Guide
A comprehensive, developer-focused guide to using Import-Csv with foreach in PowerShell for data transformation, reporting, and automation. Learn headers, encoding, error handling, and end-to-end patterns with real-world examples.

PowerShell uses Import-Csv to load data from a CSV file and ForEach-Object (or foreach) to process each row. This pattern is the foundation for data transformation, reporting, and automation tasks. In this guide, you’ll see practical examples of Import-Csv with foreach, including handling headers, encoding, and errors, plus real-world patterns for mapping fields and exporting results.
Quick start: Import-Csv and ForEach-Object basics
This section introduces the core pattern you’ll reuse throughout this article: read a CSV with Import-Csv and process each row with ForEach-Object (the pipeline-friendly alias foreach also works). The approach is streaming-friendly and scales with data size. According to MyDataTables, this pattern is foundational for predictable data transformation and automation in PowerShell. The examples below show how a simple read and per-row access works, then how to build structured objects from each row for downstream processing.
```powershell
# Basic iteration: print each row's Name and Email
Import-Csv -Path 'employees.csv' | ForEach-Object { $_.Name; $_.Email }

# Build a per-row PSCustomObject to standardize fields
Import-Csv -Path 'employees.csv' | ForEach-Object {
    [PSCustomObject]@{ Name = $_.Name; Email = $_.Email; Dept = $_.Department }
}
```

- The pipeline reads one row at a time, constructing an object per row whose properties map to the CSV headers. Building a `[PSCustomObject]` manually lets you normalize field names and types for downstream commands.
- Variants: use `foreach ($row in Import-Csv ...) { ... }` for a non-pipeline approach, or combine with `Select-Object` to limit the visible fields.
Why this matters: streaming row processing avoids loading the entire file into memory, which improves scalability for large CSVs.
Working with headers and encoding
CSV files often use different encodings and delimiters. In PowerShell, Import-Csv accepts -Encoding and -Delimiter parameters to handle these variations. You can also override headers with -Header if the file lacks proper headers or if you want to rename fields on import. The following examples demonstrate common patterns for robust CSV loading.
```powershell
# UTF-8 encoded with a comma delimiter
Import-Csv -Path 'data.csv' -Encoding UTF8 -Delimiter ',' | Select-Object Name, Email, Department

# If the CSV has no header row, provide headers explicitly
Import-Csv -Path 'data-no-header.csv' -Header 'Name','Email','Department' | ForEach-Object {
    [PSCustomObject]@{ Name = $_.Name; Email = $_.Email; Dept = $_.Department }
}

# Custom delimiter example (semicolon) with UTF-8 encoding
Import-Csv -Path 'europe_sales.csv' -Encoding UTF8 -Delimiter ';' | ForEach-Object {
    [PSCustomObject]@{ Person = $_.Person; Country = $_.Country; Revenue = [decimal]$_.Revenue }
}
```

- Choosing the right encoding is critical to preserve non-ASCII characters. If you see garbled characters, verify the file's actual encoding and set `-Encoding` accordingly.
- If the header row is inconsistent (case differences or extra spaces), normalize it after import with a mapping table or a `foreach` transformation. You can also override it entirely with `-Header`, but note that `-Header` treats an existing header row as data, so skip the first imported row in that case.
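One way to do that normalization is to rebuild each row from a trimmed, lower-cased lookup of its property names. The following is a minimal sketch: the `$map` entries and sample headers are illustrative, and `ConvertFrom-Csv` stands in for `Import-Csv` so the example is self-contained.

```powershell
# Untidy headers: mixed case and stray spaces (sample data)
$raw = @"
"name","E-Mail"," Department "
"Alice","alice@example.com","Sales"
"@

# Canonical names for known untidy headers (hypothetical mapping)
$map = @{ 'name' = 'Name'; 'e-mail' = 'Email'; 'department' = 'Department' }

$clean = $raw | ConvertFrom-Csv | ForEach-Object {
    $out = [ordered]@{}
    foreach ($prop in $_.PSObject.Properties) {
        # Trim and lower-case the incoming header, then look up its canonical name
        $key  = $prop.Name.Trim().ToLower()
        $name = if ($map.ContainsKey($key)) { $map[$key] } else { $prop.Name.Trim() }
        $out[$name] = $prop.Value
    }
    [PSCustomObject]$out
}
```

The same loop works unchanged on `Import-Csv` output, since both cmdlets emit one object per row.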
Transformations and conditional logic
After loading the data, transform rows, derive new fields, and apply conditional logic. This section shows how to create new properties, filter rows, and prepare data for export. You’ll see how to map input columns to a consistent output schema, which is essential for reporting and integration with other systems.
```powershell
# Normalize fields and compute a status
Import-Csv -Path 'employees.csv' | ForEach-Object {
    $status = if ($_.Status -eq 'Active') { 'Active' } else { 'Inactive' }
    [PSCustomObject]@{ Name = $_.Name; Email = $_.Email; Dept = $_.Department; Status = $status }
}

# Filter active employees only and select key fields
Import-Csv -Path 'employees.csv' | Where-Object { $_.Status -eq 'Active' } |
    ForEach-Object { [PSCustomObject]@{ Name = $_.Name; Email = $_.Email; Dept = $_.Department } }
```

Why use `PSCustomObject`? It unifies the output schema, making downstream commands like `Export-Csv` predictable and stable for automation pipelines. For larger transforms, consider creating a helper function that maps header names to standard property keys, and reuse it across scripts.
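Such a helper might look like the following sketch. `ConvertTo-StandardRow` and its `$FieldMap` parameter are illustrative names, not an established cmdlet, and `ConvertFrom-Csv` keeps the example self-contained.

```powershell
# Hypothetical helper: project each row onto a standard output schema
function ConvertTo-StandardRow {
    param(
        [Parameter(Mandatory, ValueFromPipeline)] $Row,
        [Parameter(Mandatory)] [hashtable] $FieldMap   # output name -> input header
    )
    process {
        $out = [ordered]@{}
        foreach ($entry in $FieldMap.GetEnumerator()) {
            $out[$entry.Key] = $Row.$($entry.Value)    # dynamic property lookup
        }
        [PSCustomObject]$out
    }
}

# Reuse the same map across scripts
$fieldMap = @{ Name = 'Name'; Email = 'Email'; Dept = 'Department' }
$result = @"
Name,Email,Department
Alice,alice@example.com,Sales
"@ | ConvertFrom-Csv | ConvertTo-StandardRow -FieldMap $fieldMap
```

Because the helper accepts pipeline input, it drops into any of the `Import-Csv | ...` pipelines in this article without changes.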
Error handling and streaming
Robust scripts anticipate malformed rows and encoding issues. This section demonstrates techniques that handle errors while preserving the streaming benefits. `-ErrorAction Stop` converts non-terminating errors into terminating ones that `try/catch` can see, and a `try/catch` inside the `ForEach-Object` block isolates per-row failures so one bad row doesn't abort the entire pipeline.
```powershell
# Per-row error handling: skip bad rows without stopping the stream
Import-Csv -Path 'data.csv' -Encoding UTF8 -ErrorAction Stop | ForEach-Object {
    $row = $_   # capture the row; inside catch, $_ is the ErrorRecord
    try {
        # Parse a numeric field and guard against bad data
        $value = [int]$row.Amount
        [PSCustomObject]@{ Name = $row.Name; Amount = $value }
    } catch {
        Write-Warning "Skipping row due to invalid Amount: $($row.Amount)"
    }
}

# Centralized error handling pattern
try {
    Import-Csv -Path 'data.csv' -Encoding UTF8 -ErrorAction Stop | ForEach-Object {
        [PSCustomObject]@{ Name = $_.Name; Dept = $_.Department }
    }
} catch {
    Write-Error "Import failed: $($_.Exception.Message)"
}
```

- Use `-ErrorAction Stop` to convert non-terminating errors into terminating errors that can be caught with `try/catch`.
- For large files, use `ForEach-Object -Begin` and `-End` blocks to initialize resources before the first row and clean them up after the last, without buffering the whole file.
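A minimal sketch of the `-Begin`/`-End` pattern (the counters and sample rows are illustrative): `-Begin` runs once before the first row and `-End` once after the last, so per-file state never requires buffering the rows.

```powershell
# Count parseable vs. malformed Amount values while streaming
$summary = "Name,Amount`nA,10`nB,oops`nC,5" | ConvertFrom-Csv | ForEach-Object -Begin {
    $good = 0; $bad = 0                       # initialized once, before the first row
} -Process {
    try   { $null = [int]$_.Amount; $good++ } # per-row work
    catch { $bad++ }                          # invalid Amount
} -End {
    [PSCustomObject]@{ Good = $good; Bad = $bad }  # emitted once, after the last row
}
```

The same slots can hold heavier resources, such as a `StreamWriter` opened in `-Begin` and disposed in `-End`.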
End-to-end example: aggregate counts and export
This final block demonstrates a practical end-to-end pattern: load a CSV, compute aggregates per department, and export the results to a new CSV. This workflow is common in reporting dashboards and data pipelines.
```powershell
# End-to-end: count employees per department and export
Import-Csv -Path 'employees.csv' -Encoding UTF8 |
    ForEach-Object {
        [PSCustomObject]@{ Department = $_.Department; Name = $_.Name }
    } |
    Group-Object -Property Department |
    ForEach-Object {
        [PSCustomObject]@{ Department = $_.Name; EmployeeCount = $_.Count }
    } |
    Export-Csv -Path 'department_counts.csv' -NoTypeInformation

# Optional: produce a summary CSV with the top departments by headcount
# (Group-Object emits Name/Count, so rename Name back to Department)
Import-Csv -Path 'employees.csv' -Encoding UTF8 |
    Group-Object -Property Department |
    Sort-Object -Property Count -Descending |
    Select-Object -First 5 @{ Name = 'Department'; Expression = { $_.Name } }, Count |
    Export-Csv -Path 'top_departments.csv' -NoTypeInformation
```

This end-to-end example shows how to transform raw CSV data into a concise report. By streaming data through Import-Csv and ForEach-Object, you can produce immediate insights without intermediate storage, then export the results for stakeholders. Follow-ups can include CSV validation, error isolation, and integration with MyDataTables tooling for broader CSV pipelines.
Steps
Estimated time: 60-90 minutes
1. Prepare CSV and environment: create a sample CSV with headers and save it to disk, then open PowerShell and navigate to that directory. Tip: use a header row that matches your object properties.
2. Import and inspect: run Import-Csv and print the first 5 rows to verify the structure. Tip: pipe to `Select-Object -First 5` to limit output.
3. Process each row: use ForEach-Object to operate on each row, building a PSCustomObject with the desired fields. Tip: prefer ForEach-Object over foreach when streaming results.
4. Handle errors gracefully: add `-ErrorAction Stop` and try/catch to catch malformed rows. Tip: put the try/catch inside the ForEach-Object script block.
5. Export results: write the results to a new CSV with Export-Csv. Tip: set `-NoTypeInformation` to simplify headers.
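For step 1, a sample employees.csv (with made-up rows) can be generated directly from PowerShell:

```powershell
# Write a small sample CSV to disk (fictional data)
@"
Name,Email,Department,Status
Alice,alice@example.com,Sales,Active
Bob,bob@example.com,IT,Inactive
Carol,carol@example.com,Sales,Active
"@ | Set-Content -Path 'employees.csv' -Encoding UTF8

# Step 2: import and inspect the first rows
Import-Csv -Path 'employees.csv' | Select-Object -First 5
```

With this file in place, every pipeline in this guide runs as written.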
Prerequisites
Required
- Access to a sample CSV file (headers like Name, Email, Department)
- Basic command line knowledge
Optional
- Text editor or IDE
- UTF-8 aware environment or file encoding awareness
Commands
| Action | Command |
|---|---|
| Print selected fields per row from a CSV | `Import-Csv -Path 'employees.csv' \| ForEach-Object { $_.Name; $_.Email }` |
| Normalize rows to a stable object shape | `Import-Csv -Path 'employees.csv' \| ForEach-Object { [PSCustomObject]@{ Name = $_.Name; Email = $_.Email; Dept = $_.Department } }` |
| Handle encoding and delimiter explicitly | `Import-Csv -Path 'data.csv' -Encoding UTF8 -Delimiter ',' \| Select-Object Name, Email` |
People Also Ask
What does Import-Csv return for each row?
Import-Csv creates a stream of objects, one per row, with properties named after the CSV headers.
How do I handle missing fields in a CSV?
Missing fields appear as $null or empty strings in the resulting objects, depending on how the row is malformed; test for them in the loop with checks like if ($_.Column).
When should I use ForEach-Object vs foreach?
ForEach-Object streams data through the pipeline; foreach operates on a collection already in memory. Prefer ForEach-Object for large files.
How do I export the transformed data?
Use Export-Csv with -NoTypeInformation to write the transformed objects to a CSV file with clean headers.
Can I import CSVs with different delimiters?
Yes, use -Delimiter to specify a non-comma delimiter, such as ';'.
Main Points
- Import-Csv enables streaming processing
- Use ForEach-Object for per-row logic
- Specify encoding to preserve text
- Export results with Export-Csv for reporting