Use LazyCollection for Memory-Efficient API Pagination
When working with external APIs that return large datasets, memory management becomes critical. Loading thousands of records into memory at once can quickly exhaust your PHP memory limit.
Laravel’s LazyCollection solves this elegantly using PHP generators to yield items one at a time instead of loading everything upfront.
The Problem: Memory Bloat
```php
// ❌ Bad: Loads all products into memory
public function syncProducts(): Collection
{
    $products = $this->apiClient->fetchAllProducts();

    return collect($products); // Could be 10,000+ items
}
```
This approach loads the entire API response into memory. With 10,000 products, you’re looking at hundreds of megabytes of RAM consumption before you’ve even started processing.
The Solution: LazyCollection with Generators
```php
// ✅ Good: Yields items one at a time
public function syncProducts(): LazyCollection
{
    return LazyCollection::make(function () {
        yield from $this->apiClient->fetchAllProducts();
    });
}

// Process efficiently
$this->syncProducts()->each(function ($product) {
    Product::updateOrCreate(
        ['external_id' => $product['id']],
        ['name' => $product['name'], 'price' => $product['price']]
    );
});
```
With LazyCollection, memory usage stays constant regardless of dataset size. Each item is processed and discarded before the next one is loaded.
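The flat-memory behavior is visible even in plain PHP, since LazyCollection is a thin wrapper around generators. A minimal sketch (no Laravel required; the ~1KB items and counts are illustrative, not measured Laravel figures):

```php
<?php
// Eager: builds the whole array before returning it, so every
// item is held in memory at once.
function eagerItems(int $count): array
{
    $items = [];
    for ($i = 0; $i < $count; $i++) {
        $items[] = str_repeat('x', 1024); // ~1KB per item
    }
    return $items;
}

// Lazy: yields one item at a time; nothing is retained between
// iterations, so peak memory stays roughly constant.
function lazyItems(int $count): \Generator
{
    for ($i = 0; $i < $count; $i++) {
        yield str_repeat('x', 1024);
    }
}

$before = memory_get_usage();
foreach (lazyItems(50_000) as $item) {
    // process and discard
}
$lazyGrowth = memory_get_usage() - $before;

// $lazyGrowth stays tiny, while eagerItems(50_000) would hold
// roughly 50MB of strings alive at its peak.
```

The same contrast is what the table below shows at API scale: the eager version's footprint grows with the dataset, the lazy version's does not.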
Real-World Impact
| Approach | 10K Records | 100K Records |
|---|---|---|
| Collection | ~512MB | ~5GB (crashes) |
| LazyCollection | ~32MB | ~32MB |
When to Use LazyCollection
- API pagination – Syncing data from external services
- Database exports – Generating large CSV files
- Batch processing – Working with millions of records
- File parsing – Reading large log files or CSV imports
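The file-parsing case follows the same pattern. A sketch in plain PHP (the CSV path is hypothetical; in Laravel you would typically wrap the generator in `LazyCollection::make()` exactly as shown earlier):

```php
<?php
// Stream a large CSV row by row instead of loading it whole.
function readCsvRows(string $path): \Generator
{
    $handle = fopen($path, 'r');

    try {
        while (($row = fgetcsv($handle)) !== false) {
            yield $row;
        }
    } finally {
        fclose($handle); // Runs even if the consumer stops early
    }
}

// Usage: only one row is in memory at any point.
foreach (readCsvRows('/tmp/products.csv') as $row) {
    // process $row
}
```

The `try`/`finally` matters with generators: if the caller breaks out of the loop early, the generator is destroyed and `finally` still closes the file handle.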
Advanced: Chunked API Calls
If your API supports pagination, combine LazyCollection with chunking:
```php
public function fetchAllProducts(): LazyCollection
{
    return LazyCollection::make(function () {
        $page = 1;

        do {
            $response = Http::get('/api/products', ['page' => $page]);
            $products = $response->json('data') ?? []; // Guard against a missing payload

            foreach ($products as $product) {
                yield $product;
            }

            $page++;
        } while (!empty($products));
    });
}
```
This pattern requests one page at a time, yields each item, then fetches the next page only when needed.
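To see the on-demand behavior in isolation, here is the same do/while loop in plain PHP, with a hypothetical `fetchPage()` standing in for the HTTP call (the five-product dataset and page size of 2 are made up for illustration):

```php
<?php
// Fake paginated API: 5 products served in pages of 2.
function fetchPage(int $page): array
{
    $all = [['id' => 1], ['id' => 2], ['id' => 3], ['id' => 4], ['id' => 5]];

    return array_slice($all, ($page - 1) * 2, 2);
}

// Same shape as the LazyCollection callback above: fetch a page,
// yield its items, and only request the next page when the
// consumer asks for more.
function allProducts(): \Generator
{
    $page = 1;

    do {
        $products = fetchPage($page);
        yield from $products;
        $page++;
    } while (!empty($products));
}

$ids = [];
foreach (allProducts() as $product) {
    $ids[] = $product['id'];
}
// $ids === [1, 2, 3, 4, 5]; the first empty page ends the loop.
```

If the consumer stops early (say, `->take(3)` on the wrapping LazyCollection), the remaining pages are never requested at all.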
Key Takeaway
When dealing with large external datasets, LazyCollection isn’t just an optimization: it’s often the difference between a working sync script and a memory-exhaustion crash. Use it whenever you’re processing data where the full dataset size is unknown or potentially large.