The Problem
You’re building a CSV export feature for your admin panel. Users can download product catalogs with thousands of rows. You write this:
$products = Product::with(['category', 'brand', 'variants'])->get();

foreach ($products as $product) {
    $row = [
        $product->name,
        $product->category->name,
        $product->brand->name,
        $product->variants->count(),
    ];
    fputcsv($file, $row);
}
The job runs for 2 minutes, then crashes with “Allowed memory size exhausted.” You’re loading 10,000 Eloquent models with all their relationships into memory.
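To see why plain rows are so much cheaper, here is a rough standalone comparison in plain PHP (no Laravel involved; `HeavyModel` is a hypothetical stand-in for the bookkeeping an Eloquent model carries, not the real class):

```php
<?php
// Hypothetical stand-in: an Eloquent model tracks at least two copies of
// its attributes (current and original) plus a relation cache.
class HeavyModel {
    public array $attributes;
    public array $original;
    public array $relations = [];
    public function __construct(array $attrs) {
        $this->attributes = $attrs;
        $this->original   = $attrs;
    }
}

$attrs = ['name' => 'Widget', 'category' => 'Tools', 'brand' => 'Acme'];

// 10,000 model-like objects: every iteration allocates a fresh object.
$before = memory_get_usage();
$models = [];
for ($i = 0; $i < 10000; $i++) {
    $models[] = new HeavyModel($attrs);
}
$modelBytes = memory_get_usage() - $before;

// 10,000 plain rows: no per-row object machinery.
$before = memory_get_usage();
$rows = [];
for ($i = 0; $i < 10000; $i++) {
    $rows[] = $attrs;
}
$rowBytes = memory_get_usage() - $before;

printf("model-like objects: %d bytes, plain rows: %d bytes\n",
    $modelBytes, $rowBytes);
```

The exact numbers depend on the PHP version, but the object-per-row approach always costs more, and real Eloquent models are far heavier than this toy class.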
The Fix
Use the query builder directly. Fetch only the columns you need:
DB::table('products')
    ->leftJoin('categories', 'products.category_id', '=', 'categories.id')
    ->leftJoin('brands', 'products.brand_id', '=', 'brands.id')
    ->select([
        'products.name as product_name',
        'categories.name as category_name',
        'brands.name as brand_name',
        DB::raw('(SELECT COUNT(*) FROM variants WHERE variants.product_id = products.id) as variant_count'),
    ])
    ->orderBy('products.id') // chunk() requires an orderBy clause
    ->chunk(1000, function ($rows) use ($file) {
        foreach ($rows as $row) {
            fputcsv($file, [
                $row->product_name,
                $row->category_name,
                $row->brand_name,
                $row->variant_count,
            ]);
        }
    });
Why It Works
- Chunking: processes 1,000 rows at a time instead of loading all 10,000 at once
- Plain rows: lightweight stdClass objects with no Eloquent model overhead (attributes, casts, accessors, relationships)
- Selective columns: fetches only what the CSV needs, not every database column
- Manual joins: one query instead of a separate query per eager-loaded relationship
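Under the hood, chunking is just offset pagination in a loop. A minimal sketch of the mechanism in plain PHP, with a hypothetical `fetchPage()` standing in for the real `SELECT ... LIMIT ... OFFSET ...` query (this is also why Laravel's `chunk()` insists on an `orderBy`: without a stable order, pages can overlap or skip rows):

```php
<?php
// Fake data source standing in for the database table.
$allRows = range(1, 10000);

// Hypothetical stand-in for "SELECT ... LIMIT $size OFFSET $offset".
function fetchPage(array $table, int $page, int $size): array {
    return array_slice($table, ($page - 1) * $size, $size);
}

$page = 1;
$size = 1000;
$processed = 0;

// This loop mirrors what chunk(1000, $callback) does internally:
// fetch one page, hand it to the callback, stop after a short page.
do {
    $rows = fetchPage($allRows, $page, $size);
    foreach ($rows as $row) {
        $processed++; // here you'd fputcsv() each row
    }
    $page++;
} while (count($rows) === $size);

echo $processed, "\n"; // 10000
```

At any moment only one page of 1,000 rows is in memory; the previous page is freed before the next is fetched.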
Trade-Offs
You lose Eloquent’s conveniences (accessors, casts, automatic timestamps). But for batch exports and reports, raw performance matters more than elegant code.
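One concrete consequence: query-builder rows bypass Eloquent's casts, so values like booleans and timestamps come back as raw database strings and need converting by hand. A short sketch (the column names are illustrative, not from the example above):

```php
<?php
// With Eloquent casts, $product->is_active would already be a bool and
// $product->created_at a Carbon instance. A raw row gives you strings:
$row = (object) [
    'is_active'  => '1',                   // TINYINT arrives as a string
    'created_at' => '2024-05-01 12:30:00', // DATETIME arrives as a string
];

// Cast manually where the CSV needs typed or reformatted values.
$isActive  = (bool) $row->is_active;
$createdAt = (new DateTimeImmutable($row->created_at))->format('Y-m-d');

echo $createdAt, "\n"; // 2024-05-01
```

For a handful of columns this is a few extra lines; for an export touched by many teams, it is worth a comment explaining why the casts are manual.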
Save Eloquent for interactive UIs where you’re displaying 10-50 records at a time. For bulk operations that touch thousands of rows, drop down to the query builder.