Unit Testing EF Core Code with In-Memory Database

When I started working with Entity Framework Core (EF Core), unit testing wasn’t something I gave much attention to. I was more focused on building features that worked. But as my projects grew bigger, especially in systems where business rules were deeply tied to the database layer, I realized that not having tests was painful. Fixing one issue sometimes broke another.

So, in one of my projects, I decided to take testing seriously. That’s where I discovered EF Core’s in-memory provider. At first, it felt like magic: “Wait, I can test database logic without having a real SQL Server?” But as I used it more, I learned its strengths and a few gotchas that every developer should know.

In this post, I’ll share how I use the EF Core in-memory database for unit testing, what kinds of problems it helps solve, and where you should be careful. My goal is not to sound like documentation but to tell you how this actually works.

Why Unit Testing EF Core Matters

If you’ve worked on business-heavy systems, you know that database logic can get complicated. Sometimes, it’s not just about fetching data. You might have:

  • Queries that apply complex filters based on user input.
  • Business rules like “only include active items where expiry date hasn’t passed”.
  • Transactions that update multiple tables.

You want to make sure all those logic paths behave correctly every time you change your code.

The problem is, you can’t always run tests against a real database. It’s slow, you might mess up production data, and maintaining a clean test environment is a headache.

That’s where EF Core’s InMemory provider comes in. It lets you simulate a database in memory: fast, lightweight, and isolated for each test.
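The provider ships as a separate NuGet package. Assuming a typical .NET test project (the project layout here is up to you), adding it looks like:

```shell
# Add the EF Core in-memory provider to your test project
dotnet add package Microsoft.EntityFrameworkCore.InMemory
```

If you later switch to SQLite-based in-memory tests (covered further down), you’d add `Microsoft.EntityFrameworkCore.Sqlite` as well.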

Setting Up a Simple Example

Let’s imagine we have a simple auction system. We have an Auction entity and a DbContext called AuctionDbContext.

C#
using Microsoft.EntityFrameworkCore;
using System;

public class Auction
{
    public int Id { get; set; }
    public string ItemName { get; set; } = string.Empty;
    public decimal StartingPrice { get; set; }
    public bool IsActive { get; set; }
    public DateTime CreatedAt { get; set; }
}

public class AuctionDbContext : DbContext
{
    public AuctionDbContext(DbContextOptions<AuctionDbContext> options) : base(options) { }

    public DbSet<Auction> Auctions { get; set; }
}

Now, in your test project (I usually use xUnit or NUnit), you can configure EF Core to use an InMemoryDatabase like this:

C#
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;
using Xunit;

public class AuctionTests
{
    private AuctionDbContext GetDbContext()
    {
        var options = new DbContextOptionsBuilder<AuctionDbContext>()
            .UseInMemoryDatabase(databaseName: Guid.NewGuid().ToString()) // unique per test
            .Options;

        return new AuctionDbContext(options);
    }

    [Fact]
    public void Should_Add_Auction_Successfully()
    {
        // Arrange
        var context = GetDbContext();
        var auction = new Auction
        {
            ItemName = "Vintage Watch",
            StartingPrice = 1000,
            IsActive = true,
            CreatedAt = DateTime.UtcNow
        };

        // Act
        context.Auctions.Add(auction);
        context.SaveChanges();

        // Assert
        var result = context.Auctions.FirstOrDefault(x => x.ItemName == "Vintage Watch");
        Assert.NotNull(result);
        Assert.Equal(1000, result.StartingPrice);
    }
}

In the code above, each test creates its own in-memory database. This ensures that your tests don’t affect each other.

When I first learned this trick, I was surprised how fast the tests ran: no SQL Server, no migrations, no connection strings. Just pure EF Core logic running in memory.

Simulating Queries and Business Rules

Of course, real-world queries are rarely that simple. Let’s say you have logic that filters auctions by status and date.

C#
using System;
using System.Collections.Generic;
using System.Linq;

public class AuctionService
{
    private readonly AuctionDbContext _context;

    public AuctionService(AuctionDbContext context)
    {
        _context = context;
    }

    public IEnumerable<Auction> GetActiveAuctions(DateTime fromDate)
    {
        return _context.Auctions
            .Where(x => x.IsActive && x.CreatedAt >= fromDate)
            .OrderBy(x => x.CreatedAt)
            .ToList();
    }
}

Then you can test it like this:

C#
[Fact]
public void Should_Return_Active_Auctions_After_Specific_Date()
{
    var context = GetDbContext();
    context.Auctions.AddRange(
        new Auction { ItemName = "Item A", IsActive = true, CreatedAt = DateTime.UtcNow.AddDays(-1) },
        new Auction { ItemName = "Item B", IsActive = true, CreatedAt = DateTime.UtcNow.AddDays(-5) },
        new Auction { ItemName = "Item C", IsActive = false, CreatedAt = DateTime.UtcNow }
    );
    context.SaveChanges();

    var service = new AuctionService(context);

    var result = service.GetActiveAuctions(DateTime.UtcNow.AddDays(-2)).ToList();

    Assert.Single(result);
    Assert.Equal("Item A", result.First().ItemName);
}

When I used this pattern in one of my actual projects, it helped catch a bug where the developer forgot to filter by IsActive. The test failed right away, which saved us from pushing a bug to production.

Testing with Relationships

If you have related entities, the in-memory provider still works fine. Let’s extend our model:

C#
public class Bid
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
    public int AuctionId { get; set; }
    public Auction Auction { get; set; } = null!;
}

We add it to our DbContext:

C#
public DbSet<Bid> Bids { get; set; }

Now let’s test that we can get the highest bid per auction:

C#
using System.Linq;

public class BidService
{
    private readonly AuctionDbContext _context;

    public BidService(AuctionDbContext context)
    {
        _context = context;
    }

    public decimal GetHighestBid(int auctionId)
    {
        return _context.Bids
            .Where(x => x.AuctionId == auctionId)
            .OrderByDescending(x => x.Amount)
            .Select(x => x.Amount)
            .FirstOrDefault();
    }
}

And our test:

C#
[Fact]
public void Should_Return_Highest_Bid_For_Auction()
{
    var context = GetDbContext();

    var auction = new Auction { ItemName = "Antique Vase", IsActive = true, CreatedAt = DateTime.UtcNow };
    context.Auctions.Add(auction);
    context.SaveChanges();

    context.Bids.AddRange(
        new Bid { AuctionId = auction.Id, Amount = 500 },
        new Bid { AuctionId = auction.Id, Amount = 750 },
        new Bid { AuctionId = auction.Id, Amount = 600 }
    );
    context.SaveChanges();

    var service = new BidService(context);
    var result = service.GetHighestBid(auction.Id);

    Assert.Equal(750, result);
}

In one of my fintech projects, we had a similar setup with “transactions” and “settlements”. Testing those relationships used to require a database with sample data. After switching to in-memory tests, I could simulate multiple scenarios quickly without touching a real DB.

Gotchas and Limitations

While the InMemory provider is great, there are some things to watch out for.

  1. It doesn’t enforce relational rules. For example, if you forget to set a foreign key, it won’t throw an exception like SQL Server would. So, if you rely on foreign key constraints, you won’t catch those errors here.
  2. LINQ translations can behave differently. The in-memory provider evaluates everything in memory. It doesn’t go through SQL translation like a real provider. This means queries that would fail on SQL might still work in InMemory.
  3. For more realistic tests, use SQLite in-memory. If you want to test relational behavior, you can use SQLite in-memory mode instead. It’s a bit slower but closer to production.

Example setup for SQLite (note that a SQLite in-memory database only lives as long as its connection, so you open the connection yourself, keep it open for the duration of the test, and call context.Database.EnsureCreated() to build the schema, since there are no migrations):

C#
using Microsoft.Data.Sqlite;

var connection = new SqliteConnection("Filename=:memory:");
connection.Open(); // keep open: the database is dropped when the connection closes

var options = new DbContextOptionsBuilder<AuctionDbContext>()
    .UseSqlite(connection)
    .Options;

Combining Unit Tests with Repository or Service Layers

In my projects, I usually wrap EF Core queries inside repositories or services. So instead of testing EF directly, I test my business logic that depends on EF.

Here’s a quick example:

C#
using System.Collections.Generic;
using System.Linq;

public interface IAuctionRepository
{
    void Add(Auction auction);
    IEnumerable<Auction> GetActiveAuctions();
}

public class AuctionRepository : IAuctionRepository
{
    private readonly AuctionDbContext _context;

    public AuctionRepository(AuctionDbContext context)
    {
        _context = context;
    }

    public void Add(Auction auction)
    {
        _context.Auctions.Add(auction);
        _context.SaveChanges();
    }

    public IEnumerable<Auction> GetActiveAuctions()
    {
        return _context.Auctions.Where(x => x.IsActive).ToList();
    }
}

Then your test might look like:

C#
[Fact]
public void Should_Return_Only_Active_Auctions()
{
    var context = GetDbContext();
    var repo = new AuctionRepository(context);

    context.Auctions.AddRange(
        new Auction { ItemName = "A", IsActive = true },
        new Auction { ItemName = "B", IsActive = false }
    );
    context.SaveChanges();

    var result = repo.GetActiveAuctions();

    Assert.Single(result);
    Assert.Equal("A", result.First().ItemName);
}

This pattern keeps the focus on your business rules instead of EF internals.
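A side benefit of hiding EF Core behind IAuctionRepository is that higher-level tests don’t need a DbContext at all: you can swap in a hand-rolled fake. This is just a minimal sketch built on the interface above (the FakeAuctionRepository name is mine, not from any library):

C#
using System.Collections.Generic;
using System.Linq;

// A hypothetical in-memory fake; useful when a test only cares about
// business logic above the repository, not EF Core behavior.
public class FakeAuctionRepository : IAuctionRepository
{
    private readonly List<Auction> _auctions = new();

    public void Add(Auction auction) => _auctions.Add(auction);

    public IEnumerable<Auction> GetActiveAuctions() =>
        _auctions.Where(x => x.IsActive).ToList();
}

With a fake like this, tests for services that consume IAuctionRepository run without touching EF Core at all, which keeps them even faster and simpler.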


Final Thoughts

Unit testing EF Core code using the InMemory database saved me a lot of time and headaches in real projects. It’s not perfect, but it’s good enough for most business logic validation.

If your project depends heavily on database behavior (like foreign keys or complex SQL functions), you can combine this with SQLite-based tests. But for most use cases, the InMemory provider is fast, simple, and reliable.

I still remember the first time I saw all my tests pass after refactoring a big chunk of EF Core queries; that confidence was worth every minute spent setting up tests.


Assi Arai