
Guided: C# 14 Data Processing with LINQ

Build a LINQ-powered analytics console in C# that parses a CSV sales dataset and analyzes both flat and nested collections (invoices with line items). You will implement reports using filtering, grouping, aggregation, ordering, and shaping—without writing explicit loops—validated by step-by-step tests.

Lab Info
Level
Beginner
Last updated
May 04, 2026
Duration
40m

Table of Contents
  1. Challenge

    Introduction: Data Processing with LINQ

    Introduction

    In this guided lab you'll build a small analytics layer for a fictional sales business using C# 14 and LINQ. You'll start with raw CSV text and finish with composable, reusable reports that rank regions, products, and customers by revenue.

    What You'll Learn

    • Parse raw text into strongly typed records so your queries can stay clean.
    • Filter and order sequences with Where, OrderBy, and ThenBy.
    • Group and aggregate data with GroupBy, Sum, and Count.
    • Work with nested collections using SelectMany and per-item aggregation.
    • Compose small query methods into an end-to-end analytics pipeline.

    How The Lab Works

    • The lab is organized into five hands-on steps, each with one or two tasks.
    • The starter code lives in the Analytics project. You'll mostly edit files under Analytics/Data, Analytics/Reports, and the domain models in Analytics/Domain.
    • Each task is validated by automated tests in the Analytics.Tests project. The lab's check buttons run the relevant test class for you.
    • A Program.cs demo runner is provided so you can execute your code end-to-end as you go.

    Project Layout

    • Analytics/Domain/ – record types like SaleRecord, Invoice, and RegionSummary.
    • Analytics/Data/ – data sources (SalesDataSource, InvoiceDataSource) and the CSV parser you will implement.
    • Analytics/Reports/ – the LINQ-based query and report methods you will implement.
    • Analytics/Program.cs – a small demo program that calls your code so you can see real output.

    Running Your Code

    From the terminal pane, you can build and run the demo program at any time to see your work in action:

    dotnet run --project Analytics
    

    When you're ready, head to Step 1 to start parsing the sales data.

    info> If you get stuck, you can refer to the provided solution code for each task, available in the solution folder.

    This lab experience was developed by the Pluralsight team using Forge, an internally developed AI tool utilizing Gemini technology. All sections were verified by human experts for accuracy prior to publication. For issue reporting, please contact us.

  2. Challenge

    Step 1: Load and Validate Flat Sales Data

    In this step you'll turn raw CSV text into a strongly typed in-memory dataset. LINQ shines when you can treat your data as a sequence of rich objects rather than strings.

    Key ideas:

    • Start by shaping data into a model (record/class) that represents a row.
    • Keep parsing and validation separate from reporting so queries remain simple.
    • Make outputs deterministic (stable ordering) to keep tests and reporting predictable.

    Try It Out

    Once both tasks pass, run the demo program from the terminal to confirm sales data loads correctly:

    dotnet run --project Analytics
    

    You should see a line like Loaded sales records: N printed to the console.
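    The kind of parser this step asks for can be sketched roughly as below. The `SaleRecord` here is a hypothetical stand-in for the lab's domain type, and the column layout (OrderId, Region, Product, Amount) is an assumption, not the lab's actual schema:

    ```csharp
    using System;
    using System.Globalization;
    using System.Linq;

    // Assumed columns: OrderId,Region,Product,Amount — the lab's real schema may differ.
    var csv = "OrderId,Region,Product,Amount\nA2,West,Gadget,10.50\nA1,East,Widget,3.00";

    var records = csv
        .Split('\n', StringSplitOptions.RemoveEmptyEntries)
        .Skip(1)                                   // drop the header row
        .Select(line => line.Split(','))
        .Select(f => new SaleRecord(
            f[0], f[1], f[2],
            decimal.Parse(f[3], CultureInfo.InvariantCulture)))
        .OrderBy(r => r.OrderId)                   // deterministic output order
        .ToArray();

    Console.WriteLine($"Loaded sales records: {records.Length}");

    // Hypothetical stand-in for the lab's domain record.
    public record SaleRecord(string OrderId, string Region, string Product, decimal Amount);
    ```

    Note the `CultureInfo.InvariantCulture` in the amount parse: without it, the same CSV can parse differently (or fail) on machines whose locale uses a comma as the decimal separator.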

  3. Challenge

    Step 2: Filtering and Ordering without Loops

    Filtering is where most analytics pipelines begin. In LINQ, filtering is typically done with Where, and you can compose multiple filters by chaining operators.

    Key ideas:

    • Prefer small, reusable query methods that return IEnumerable<T>.
    • Apply ordering at boundaries where you need stable output (reports, tests).
    • Make inclusive/exclusive boundaries explicit in filtering logic, especially with dates.
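    A minimal sketch of these ideas follows. The helper names echo the lab's FilterSalesByDateRange and FilterSalesByMinimumAmount, but the signatures and the `Sale` shape here are assumptions:

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Linq;

    var sales = new List<Sale>
    {
        new(new DateOnly(2026, 1, 15), "West", 120m),
        new(new DateOnly(2026, 2, 1),  "East", 40m),
        new(new DateOnly(2026, 1, 20), "East", 200m),
    };

    // Inclusive start, exclusive end — state the boundary rule explicitly.
    static IEnumerable<Sale> InRange(IEnumerable<Sale> src, DateOnly start, DateOnly end) =>
        src.Where(s => s.Date >= start && s.Date < end);

    static IEnumerable<Sale> AtLeast(IEnumerable<Sale> src, decimal min) =>
        src.Where(s => s.Amount >= min);

    // Filters compose by chaining; apply ordering only at the output boundary.
    var january = AtLeast(InRange(sales, new DateOnly(2026, 1, 1), new DateOnly(2026, 2, 1)), 100m)
        .OrderBy(s => s.Date)
        .ThenBy(s => s.Region)
        .ToList();

    foreach (var s in january)
        Console.WriteLine($"{s.Date:yyyy-MM-dd} {s.Region} {s.Amount}");

    public record Sale(DateOnly Date, string Region, decimal Amount);
    ```

    Because each helper returns `IEnumerable<Sale>`, the filters stay lazy and compose freely; only the final `ToList()` forces evaluation.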
  4. Challenge

    Step 3: Grouping and Aggregating the Flat Dataset

    Grouping transforms a list of transactions into summaries. This is where you'll use GroupBy plus aggregations like Sum and Count.

    Key ideas:

    • Choose a grouping key that matches the business question (Region, Product).
    • Shape results into dedicated report records so they're easy to test and display.
    • Order descending by key performance metrics (like revenue) to create rankings.

    Try It Out

    Run the demo program to view your region summary printed to the console:

    dotnet run --project Analytics
    

    You should see a Region summary: section with totals per region.
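    The grouping pattern this step builds can be sketched as follows; `RegionSummary` mirrors the lab's report record in name only, and the fields shown are assumptions:

    ```csharp
    using System;
    using System.Linq;

    var sales = new[]
    {
        new Sale("West", 100m), new Sale("East", 250m),
        new Sale("West", 50m),  new Sale("East", 25m),
    };

    // Group by the business key, then aggregate each group into a summary record.
    var summaries = sales
        .GroupBy(s => s.Region)
        .Select(g => new RegionSummary(g.Key, g.Count(), g.Sum(s => s.Amount)))
        .OrderByDescending(r => r.Revenue)    // descending by revenue => a ranking
        .ToList();

    foreach (var r in summaries)
        Console.WriteLine($"{r.Region}: {r.Orders} orders, {r.Revenue} revenue");

    public record Sale(string Region, decimal Amount);
    public record RegionSummary(string Region, int Orders, decimal Revenue);
    ```

    Shaping each group into a dedicated record (rather than an anonymous type) keeps the result easy to return from a method, test, and display.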

  5. Challenge

    Step 4: Nested Object Analytics (Invoices and Line Items)

    Many business domains store data in nested structures: an invoice has line items; an order has order lines. LINQ can handle these with nested aggregation and flattening.

    Key ideas:

    • Aggregate nested collections with Sum across child items.
    • Flatten nested sequences with SelectMany when you need to group across parents.
    • Keep calculations (Quantity * UnitPrice) in one well-named place to avoid mistakes.

    Try It Out

    Run the demo program to verify your invoice totals are calculated correctly:

    dotnet run --project Analytics
    

    Look for the Invoice totals: section, which lists each invoice's InvoiceId, customer, and subtotal.
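    Both nested-aggregation patterns from this step can be sketched together; the `Invoice` and `LineItem` shapes below are assumed stand-ins for the lab's domain types:

    ```csharp
    using System;
    using System.Linq;

    var invoices = new[]
    {
        new Invoice("INV-1", "Acme",   new[] { new LineItem("Widget", 2, 5m), new LineItem("Gadget", 1, 12m) }),
        new Invoice("INV-2", "Globex", new[] { new LineItem("Widget", 3, 5m) }),
    };

    // One well-named place for the line calculation, so Quantity * UnitPrice isn't scattered.
    static decimal LineTotal(LineItem item) => item.Quantity * item.UnitPrice;

    // Per-invoice subtotal: aggregate the nested collection with Sum.
    foreach (var inv in invoices)
        Console.WriteLine($"{inv.InvoiceId} {inv.Customer}: {inv.Items.Sum(LineTotal)}");

    // Cross-invoice product revenue: flatten with SelectMany, then group.
    var byProduct = invoices
        .SelectMany(inv => inv.Items)
        .GroupBy(i => i.Product)
        .Select(g => (Product: g.Key, Revenue: g.Sum(LineTotal)))
        .OrderByDescending(p => p.Revenue)
        .ToList();

    public record LineItem(string Product, int Quantity, decimal UnitPrice);
    public record Invoice(string InvoiceId, string Customer, LineItem[] Items);
    ```

    The distinction to notice: summing inside each invoice needs no flattening, while grouping by product across all invoices does, which is exactly what `SelectMany` provides.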

  6. Challenge

    Step 5: Compose an End-to-End Analytics Pipeline

    Now you'll connect the pieces: parse -> filter -> aggregate -> rank. The goal is to build reusable building blocks that compose into higher-level reports.

    Key ideas:

    • Build small pure functions that can be combined.
    • Avoid side effects (like printing) in query methods.
    • Reuse existing report methods rather than duplicating logic.

    Try It Out

    With every task complete, run the full demo one more time to see your end-to-end pipeline in action:

    dotnet run --project Analytics
    

    The program loads sales and invoices, filters and aggregates them, and prints region and invoice reports built from the components you implemented.
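    The parse -> filter -> aggregate -> rank composition can be sketched with two pure building blocks; the function names and `Sale` shape here are illustrative assumptions, not the lab's actual API:

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Linq;

    var sales = new[]
    {
        new Sale("West", 120m), new Sale("East", 40m),
        new Sale("North", 90m), new Sale("East", 200m),
    };

    // Pure building blocks: no printing, no mutation — sequence in, sequence out.
    static IEnumerable<Sale> AtLeast(IEnumerable<Sale> src, decimal min) =>
        src.Where(s => s.Amount >= min);

    static IEnumerable<(string Region, decimal Revenue)> RegionRevenue(IEnumerable<Sale> src) =>
        src.GroupBy(s => s.Region)
           .Select(g => (Region: g.Key, Revenue: g.Sum(s => s.Amount)))
           .OrderByDescending(r => r.Revenue);

    // Compose: filter -> aggregate -> rank (Top N). Printing stays out here, at the edge.
    var top2 = RegionRevenue(AtLeast(sales, 50m)).Take(2).ToList();
    foreach (var (region, revenue) in top2)
        Console.WriteLine($"{region}: {revenue}");

    public record Sale(string Region, decimal Amount);
    ```

    Because the query methods are side-effect free, the same `RegionRevenue` can back both the full region summary and the Top N view without duplicating logic.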

  7. Challenge

    Conclusion

    Congratulations

    You've completed the Data Processing with LINQ lab. You took a raw CSV string and a set of nested invoice objects and turned them into a clean, composable analytics layer—without writing a single manual loop.

    What You Built

    • A typed CSV parser that produces SaleRecord objects from raw text.
    • A duplicate-detection check using GroupBy over OrderId.
    • Reusable filters (FilterSalesByDateRange, FilterSalesByMinimumAmount) with deterministic ordering.
    • Aggregation reports (BuildRegionSummaries, BuildTopProducts) using GroupBy, Sum, and Count.
    • Nested-collection analytics (ComputeInvoiceTotals, customer revenue roll-up) using SelectMany and per-invoice aggregation.
    • An end-to-end pipeline that composes filters and aggregation into a single business report, plus a Top N regions view.

    Concepts You Practiced

    • Modeling data with records before querying it.
    • Filtering and ordering with Where, OrderBy, and ThenBy.
    • Grouping and aggregating with GroupBy, Sum, and Count.
    • Flattening hierarchical data with SelectMany.
    • Composing small, pure LINQ methods into larger reports.
    • Keeping output deterministic so it's easy to test and reason about.

    Run It Again

    Feel free to run the demo program any time to see your finished pipeline produce real output:

    dotnet run --project Analytics
    

    Where to Go Next

    • Explore additional LINQ operators like Aggregate, Join, GroupJoin, and Zip.
    • Try swapping the in-memory data sources for a database using LINQ-to-Entities (Entity Framework Core).
    • Investigate IQueryable<T> to understand how LINQ providers translate queries to other backends.

    Great work—your LINQ toolbox is now ready for real-world data processing.

About the author

Pluralsight’s AI authoring technology is designed to accelerate the creation of hands-on, technical learning experiences. Serving as a first-pass content generator, it produces structured lab drafts aligned to learning objectives defined by Pluralsight’s Curriculum team. Each lab is then enhanced by our Content team, who configure the environments, refine instructions, and conduct rigorous technical and quality reviews. The result is a collaboration between artificial intelligence and human expertise, where AI supports scale and efficiency, and Pluralsight experts ensure accuracy, relevance, and instructional quality, helping learners build practical skills with confidence.
