Blog

  • Laravel Table Names: When Singular Breaks Your Queries

    You write a raw query in Laravel. It works in staging. You push to production, and suddenly: Table ‘database.products’ doesn’t exist.

    Plot twist: The table is called product (singular). Welcome to the world of legacy databases.

    Laravel’s Plural Assumption

    Laravel’s Eloquent ORM follows a convention: table names are plural, model names are singular.

    // Model: Product
    // Expected table: products (plural)
    
    class Product extends Model
    {
        // Laravel auto-assumes 'products' table
    }

    This works great… until you inherit a database where someone used singular table names.

    The Problem with Raw Queries

    When you use Eloquent methods, Laravel uses $model->getTable(), which you can override:

    class Product extends Model
    {
        protected $table = 'product'; // Override to singular
    }
    
    // Eloquent queries work fine
    Product::where('status', 'active')->get();

    But raw queries don’t use getTable():

    // ❌ Breaks if table is actually 'product' (singular)
    DB::table('products')->where('status', 'active')->get();
    
    // ❌ Also breaks
    DB::select("SELECT * FROM products WHERE status = ?", ['active']);

    The Fix: Use Model Table Names

    Always reference the model’s table name dynamically:

    // ✅ Uses the model's $table property
    $table = (new Product)->getTable();
    
    DB::table($table)->where('status', 'active')->get();
    
    // ✅ Or inline
    DB::table((new Product)->getTable())
        ->where('status', 'active')
        ->get();

    For raw SQL strings:

    $table = (new Product)->getTable();
    
    DB::select("SELECT * FROM {$table} WHERE status = ?", ['active']);

    Now if the table name changes (migration, refactor, database merge), you update it once in the model.
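
    If the table name shows up in many raw queries, a small static helper keeps the call sites tidy. This is a hypothetical convenience method, not something Laravel ships with:

    class Product extends Model
    {
        protected $table = 'product';

        // Hypothetical helper, not a built-in Laravel method
        public static function tableName(): string
        {
            return (new static)->getTable();
        }
    }

    DB::select('SELECT * FROM ' . Product::tableName() . ' WHERE status = ?', ['active']);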

    Why This Happens

    Common scenarios where table names don’t match Laravel conventions:

    • Legacy databases: Built before Laravel, different naming standards
    • Multi-framework codebases: Database shared between Laravel and another app
    • Database naming policies: Company standards enforce singular table names
    • Third-party integrations: External systems dictate schema

    Bonus: Check All Your Models

    Want to see which models override table names?

    grep -r "protected \$table" app/Models/

    Or in Tinker:

    collect(glob(app_path('Models/*.php')))
        // A fresh Tinker session hasn't autoloaded your models, so scan the directory
        ->map(fn ($file) => 'App\\Models\\' . basename($file, '.php'))
        ->filter(fn ($class) => is_subclass_of($class, \Illuminate\Database\Eloquent\Model::class))
        ->mapWithKeys(fn ($class) => [$class => (new $class)->getTable()])
        ->toArray();

    The Takeaway

    Never hardcode table names in queries. Use getTable() to respect model configuration. When the table name changes, your queries won’t break.

    Future-you debugging at 2 AM will thank you.

  • Laravel Collections: Use put() for Automatic Deduplication

    You’re fetching products from multiple overlapping API requests. Maybe paginated results, maybe different filters that return some of the same items. You push them all into a collection and end up with duplicates.

    There’s a better way than calling unique() at the end.

    The Problem with Push

    Here’s the naive approach:

    $results = collect([]);
    
    // Loop through multiple API calls
    foreach ($queries as $query) {
        $response = Http::get('/api/products', $query);
        
        foreach ($response->json('data') as $product) {
            $results->push($product);
        }
    }
    
    // Now de-duplicate
    $results = $results->unique('id');

    This works, but you’re storing duplicates in memory the entire time, then filtering them out at the end. With large datasets, that’s wasteful.

    Use put() with Keys

    Instead of push(), use put() with the product ID as the key:

    $results = collect([]);
    
    foreach ($queries as $query) {
        $response = Http::get('/api/products', $query);
        
        foreach ($response->json('data') as $product) {
            // Key by product ID — duplicates auto-overwrite
            $results->put($product['id'], $product);
        }
    }
    
    // No need for unique() — collection is already deduplicated

    Now when the same product ID appears twice, the second one overwrites the first. Automatic deduplication during collection, not after.

    Why This Is Better

    • Less memory: Each unique product ID is stored only once, instead of every duplicate piling up until unique() runs
    • No post-processing: Skip the unique() call entirely
    • Latest data wins: If data changes between API calls, you keep the freshest version
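
    A quick Tinker-style illustration of the overwrite behavior (the values are made up):

    $c = collect();

    $c->put(42, ['id' => 42, 'name' => 'Stale name']);
    $c->put(42, ['id' => 42, 'name' => 'Fresh name']);

    $c->count(); // 1
    $c->get(42); // ['id' => 42, 'name' => 'Fresh name'], the later write wins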

    Don’t Need the Keys?

    If you need a sequential array at the end (without the ID keys), just call values():

    $results = $results->values();
    
    // Now indexed [0, 1, 2, ...] instead of [123, 456, ...]

    Works for Any Unique Identifier

    This pattern works for any deduplication scenario:

    // Dedup users by email
    $users->put($user['email'], $user);
    
    // Dedup orders by order number
    $orders->put($order['order_number'], $order);
    
    // Dedup products by SKU
    $products->put($product['sku'], $product);

    When push() Is Better

    Use push() when:

    • You want duplicates (like tracking multiple events)
    • You need insertion order preserved without keys
    • Items don’t have a natural unique identifier

    Otherwise, put() with keys is your friend.

    The Takeaway

    Stop using push() + unique(). Use put() with a unique key for automatic deduplication during collection. Cleaner code, less memory, same result.

  • Dry-Run Mode with Transaction Rollback

    You’re about to run a one-off script that updates 10,000 database records. You’ve tested it on staging. You’ve code-reviewed it. But you still want to see exactly what it’ll do in production before committing.

    Enter: the dry-run transaction pattern.

    The Problem with Fake Dry-Runs

    Most “dry-run” modes look like this:

    if ($dryRun) {
        $this->info("Would update order #{$order->id}");
    } else {
        $order->update(['status' => 'processed']);
    }

    The issue? You’re not running the real code. If there’s a bug in the actual update logic — a constraint violation, a triggered event that fails, a missing column — you won’t find out until you run it for real.

    The Transaction Rollback Trick

    Instead, run the actual code path, but wrap it in a transaction and roll it back:

    use Illuminate\Support\Facades\DB;
    
    class ProcessOrdersCommand extends Command
    {
        public function handle()
        {
            $dryRun = $this->option('dry-run');
            
            if ($dryRun) {
                $this->warn('🔍 DRY-RUN MODE — changes will be rolled back');
                DB::beginTransaction();
            }
            
            try {
                $orders = Order::pending()->get();
                
                foreach ($orders as $order) {
                    $order->update(['status' => 'processed']);
                    $this->info("✓ Processed order #{$order->id}");
                }
                
                if ($dryRun) {
                    DB::rollback();
                    $this->info('✅ Dry-run complete — no changes saved');
                } else {
                    $this->info('✅ All changes committed');
                }
                
            } catch (\Exception $e) {
                if ($dryRun) {
                    DB::rollback();
                }
                throw $e;
            }
        }
    }

    What You Get

    With this pattern:

    • Real execution: Every query runs, every constraint is checked
    • Event triggers fire: Model observers, jobs, notifications — all execute
    • Nothing persists: Rollback undoes all database changes
    • Safe preview: See exactly what would happen, but commit nothing

    Bonus: Track What Changed

    Want to see before/after values?

    $changes = [];
    
    foreach ($orders as $order) {
        $original = $order->toArray();
        $order->update(['status' => 'processed']);
        $changes[] = [
            'id' => $order->id,
            'before' => $original['status'],
            'after' => $order->status
        ];
    }
    
    if ($dryRun) {
        $this->table(['ID', 'Before', 'After'], 
            array_map(fn($c) => [$c['id'], $c['before'], $c['after']], $changes)
        );
    }

    Now your dry-run shows a summary table of what would change.

    When Not to Use This

    This pattern doesn’t help with:

    • External API calls: Transaction rollback won’t undo HTTP requests
    • File operations: Transactions don’t cover filesystem changes
    • Email/notifications: Side effects outside the database still fire

    For those, you still need conditional logic or feature flags.
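
    A minimal sketch of that guard, assuming the Order model has a customer relation and a hypothetical OrderProcessedNotification:

    foreach ($orders as $order) {
        $order->update(['status' => 'processed']); // rolled back in dry-run mode

        // Rollback won't undo a sent notification, so gate it on the flag
        if (! $dryRun) {
            $order->customer->notify(new OrderProcessedNotification($order));
        }
    }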

    The Takeaway

    Dry-run via transaction rollback tests your real code path. If it works in dry-run, it works for real. No surprises, no “but it worked in staging” excuses.

    Add --dry-run to every risky command. Your production database will thank you.

  • Stop Swallowing Exceptions in PHP

    You inherit a codebase. Every API call wrapped in try-catch. Every exception swallowed. Every error returns an empty collection.

    And nobody knows when things break.

    The Anti-Pattern

    Here’s what I found in a legacy integration:

    public function getProducts(): Collection
    {
        try {
            $response = Http::get($this->apiUrl . '/products');
            return collect($response->json('data'));
        } catch (Exception $e) {
            // "Handle" the error by pretending it didn't happen
            return collect([]);
        }
    }

    What happens when the API is down? Nothing. The method returns empty. The caller thinks there are no products. The error disappears into the void.

    Weeks later, you’re debugging why users see blank pages. The real error? A 500 from the API three weeks ago that nobody noticed because the logs were clean.

    Let It Fail

    The fix is brutal: delete the try-catch.

    public function getProducts(): Collection
    {
        // throw() turns 4xx/5xx responses into exceptions instead of silent failures
        $response = Http::get($this->apiUrl . '/products')->throw();
        return collect($response->json('data'));
    }

    Now when the API fails, the exception bubbles up. Your error tracking (Sentry, Bugsnag, whatever) catches it. You get alerted. You fix it.

    But What About Graceful Degradation?

    If you genuinely want to fail gracefully, make it explicit:

    use Illuminate\Http\Client\ConnectionException;
    use Illuminate\Http\Client\RequestException;

    public function getProducts(): Collection
    {
        try {
            $response = Http::timeout(5)
                ->get($this->apiUrl . '/products')
                ->throw(); // without throw(), a 4xx/5xx response never becomes a RequestException

            return collect($response->json('data'));
        } catch (RequestException|ConnectionException $e) {
            // Log it so you know it happened (timeouts surface as ConnectionException)
            Log::warning('Product API failed', [
                'error' => $e->getMessage(),
                'url' => $this->apiUrl,
            ]);

            // Return cached/stale data as fallback
            return Cache::get('products.fallback', collect([]));
        }
    }

    Now you’re:

    • Logging the failure (visibility)
    • Using a fallback strategy (resilience)
    • Not silently lying to callers (honesty)

    When to Catch

    Catch exceptions when you have a recovery strategy:

    • Retry logic: API hiccup? Try again.
    • Fallback data: Cache, defaults, partial results.
    • User-facing context: Transform technical errors into user messages.

    If you’re just catching to return empty, you’re hiding problems.
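
    For the retry case, Laravel’s HTTP client has it built in; a minimal sketch:

    // Retry up to 3 times, waiting 100 ms between attempts.
    // If every attempt fails, a RequestException is thrown and bubbles up as usual.
    $response = Http::retry(3, 100)
        ->timeout(5)
        ->get($this->apiUrl . '/products');

    return collect($response->json('data'));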

    Laravel’s Global Exception Handler

    Laravel already has a system for this. Exceptions bubble to App\Exceptions\Handler, where you can:

    // app/Exceptions/Handler.php
    public function register()
    {
        $this->reportable(function (RequestException $e) {
            // Log to Sentry/Slack/etc
        });
        
        $this->renderable(function (RequestException $e) {
            return response()->json([
                'error' => 'External service unavailable'
            ], 503);
        });
    }

    Now all API failures follow the same pattern. No more scattered try-catch blocks pretending everything’s fine.

    The Takeaway

    Empty catch blocks are lies. If you can’t handle the error meaningfully, let it bubble. Your future self — the one debugging at 2 AM — will thank you.

  • Laravel Queue Jobs: Stop Using $this in Closures

    You dispatch a Laravel job. It runs fine locally. You push to production, and suddenly: CallbackNotFound errors everywhere.

    The culprit? A closure inside your job that references $this->someService.

    The Problem

    Here’s what breaks:

    class ProcessOrder implements ShouldQueue
    {
        use SerializesModels;
        
        public function __construct(
            private PaymentService $paymentService
        ) {}
        
        public function handle()
        {
            $orders = Order::pending()->get();
            
            // This closure references $this
            $orders->each(function ($order) {
                // 💥 BOOM after serialization
                $this->paymentService->charge($order);
            });
        }
    }

    When Laravel serializes the job for the queue, closures can’t capture $this properly. The callback becomes unresolvable after deserialization, and your job dies silently or throws cryptic errors.

    The Fix

    Stop referencing $this inside closures. Use app() to resolve the service fresh from the container:

    class ProcessOrder implements ShouldQueue
    {
        use SerializesModels;
        
        public function handle()
        {
            $orders = Order::pending()->get();
            
            // ✅ Resolve fresh from container
            $orders->each(function ($order) {
                app(PaymentService::class)->charge($order);
            });
        }
    }

    Now the closure is self-contained. Every time it runs, it pulls the service from the container. No serialization issues, no broken callbacks.
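
    If you’d rather skip the service locator call, type-hint the service on handle() instead (Laravel injects handle() dependencies from the container) and capture the local variable:

    public function handle(PaymentService $paymentService)
    {
        $orders = Order::pending()->get();

        // ✅ Captures a local variable, not $this
        $orders->each(function ($order) use ($paymentService) {
            $paymentService->charge($order);
        });
    }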

    Why It Works

    Queue jobs get serialized and stored as a payload (database, Redis, SQS, etc.): a JSON envelope wrapping the PHP-serialized job object. When the worker picks up the job:

    1. Laravel deserializes the job class
    2. Runs your handle() method
    3. Executes any closures inside

    But PHP can’t serialize object references inside closures. Using app() defers service resolution until the closure actually runs — after deserialization.

    Bonus: Same Rule for Collection Transforms

    This isn’t just a queue problem. Any time you serialize data with closures (caching collections, API responses, etc.), the same rule applies:

    // ❌ Breaks if serialized
    $cached = $items->map(fn($item) => $this->transformer->format($item));
    
    // ✅ Safe
    $cached = $items->map(fn($item) => app(Transformer::class)->format($item));

    The Takeaway

    Never use $this-> inside closures in queue jobs. Resolve services from the container with app() or pass them as closure parameters instead.

    Your queue workers will thank you.

  • Before/After API Testing: Compare Bytes, Not Just Objects

    When refactoring code that talks to external APIs, how do you know you didn’t break something subtle?

    Compare both the parsed response AND the raw wire format:

    // Test old implementation
    $oldClient = new OldApiClient($config);
    $oldParsed = $oldClient->fetchData($params);
    $oldRaw = $oldClient->getLastRawResponse();
    
    // Test new implementation  
    $newClient = new NewApiClient($config);
    $newParsed = $newClient->fetchData($params);
    $newRaw = $newClient->getLastRawResponse();
    
    // Compare both
    assert($oldParsed == $newParsed);  // Functional behavior
    assert($oldRaw === $newRaw);       // Wire-level compatibility

    Why both? Because:

    • Parsed objects verify functional correctness
    • Raw responses catch encoding issues, header differences, whitespace handling
    • Some APIs are picky about request formatting even if the parsed result looks identical
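
    The getLastRawResponse() helper above isn’t a framework feature; it’s just the client holding on to the raw body. A minimal sketch, assuming the clients wrap Laravel’s HTTP facade and a hypothetical base_url config key:

    class NewApiClient
    {
        private ?string $lastRawResponse = null;

        public function __construct(private array $config) {}

        public function fetchData(array $params): array
        {
            $response = Http::get($this->config['base_url'] . '/data', $params);

            // Keep the untouched wire format for byte-level comparison
            $this->lastRawResponse = $response->body();

            return $response->json();
        }

        public function getLastRawResponse(): ?string
        {
            return $this->lastRawResponse;
        }
    }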

    This approach caught one case where the new code added HTTP headers the old code didn’t send, and another where namespace handling differed slightly. Both “worked”, but matching the exact wire format made the deployment far safer.

    When refactoring integrations, test the bytes on the wire, not just the objects in memory.

  • Keep DTOs Thin — Move Logic to Services

    When building API integrations, it’s tempting to put helper methods on your DTOs. I learned this creates problems.

    Started with “fat” DTOs:

    class Product {
        public function getUniqueVariants() { ... }
        public function isVariantAvailable($type) { ... }
        public function getDisplayType() { ... }
    }
    

    Seemed convenient — call $product->getUniqueVariants() anywhere. But issues emerged:

    1. Serialization problems: DTOs with methods are harder to cache/serialize
    2. Testing complexity: need to mock entire DTO structures just to test one method
    3. Coupling: business logic tied to the external API’s data structure

    The fix — move all logic to a service class:

    class ApiClient {
        public function getUniqueVariants(Product $product) { ... }
        public function isVariantAvailable(Product $product, string $type) { ... }
    }
    

    Now DTOs are pure data containers — just public properties or readonly classes. Business logic lives in services where it belongs.
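
    A thin Product DTO ends up as little more than this (PHP 8.1+ readonly promoted properties; the fields are illustrative):

    final class Product
    {
        public function __construct(
            public readonly string $sku,
            public readonly string $name,
            /** @var Variant[] */
            public readonly array $variants = [],
        ) {}
    }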

    Feels less convenient at first (passing DTOs as arguments), but far more maintainable. DTOs stay simple and serializable. Services become testable in isolation.

  • Auto-Generate DTOs from JSON API Responses

    Working with third-party APIs means writing a lot of DTOs. So I built a quick code generator that turns JSON fixtures into typed PHP classes.

    The workflow:

    1. Capture fixtures: add HTTP middleware to dump API responses to JSON files
    2. Analyze schema: scan fixtures, merge similar structures, infer types
    3. Generate DTOs: output properly typed classes with serializer annotations

    php artisan generate:dtos \
      --output=app/Services/SDK/ThirdParty \
      --namespace="App\\Services\\SDK\\ThirdParty"
    

    The generator handles:

    • Type inference from JSON values (string/int/float/bool/null)
    • Nullability detection: if a field is missing in any fixture, it’s nullable
    • Nested objects: creates separate classes for complex types
    • Array types: uses Str::singular() to name item classes (products → Product)
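
    The type-inference step is conceptually tiny. A stripped-down sketch (PHP 8.1 for array_is_list(); the real generator also merges types across fixtures and tracks nullability):

    function inferPhpType(mixed $value): string
    {
        return match (true) {
            is_int($value)    => 'int',
            is_float($value)  => 'float',
            is_bool($value)   => 'bool',
            is_string($value) => 'string',
            // A lone null just means "nullable"; the merge step decides the base type
            is_null($value)   => 'null',
            is_array($value)  => array_is_list($value) ? 'array' : 'object',
            default           => 'mixed',
        };
    }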

    This saved hours of manual DTO writing. The key insight: API responses are the source of truth. Let the actual data structure define your types, not the other way around.

  • HTTP Response throw() Makes Your Error Handling Unreachable

    Found dead code in an API client today. Classic mistake with Laravel’s HTTP client.

    The code looked reasonable at first glance:

    $response = Http::post($url, $data)->throw();
    
    if ($response->failed()) {
        throw $response->serverError() 
            ? new ServerException() 
            : $response->toException();
    }
    

    Spot the problem? The throw() method already throws an exception if the request fails. That if ($response->failed()) block will never execute — it’s unreachable.

    Laravel’s throw() is basically:

    if ($this->failed()) {
        throw $this->toException();
    }
    return $this;
    

    So you either use throw() for automatic exception handling, OR check failed() manually. Not both.

    The fix: if you need custom exception logic, don’t use throw():

    $response = Http::post($url, $data);
    
    if ($response->failed()) {
        throw $response->serverError() 
            ? new CustomServerException() 
            : new CustomClientException();
    }
    

    Small mistake, but good reminder to understand what framework helpers actually do under the hood.

  • Laravel Collections: flip() for Instant Lookups

    Here’s a Laravel Collection performance trick: use flip() to turn expensive searches into instant lookups.

    I was looping through a date range, checking if each date existed in a collection. Using contains() works, but it’s O(n) for each check — if you have 100 dates to check against 50 items, that’s 5,000 comparisons.

    // Slow: O(n) per check
    $available = collect(['2026-01-25', '2026-01-26', '2026-01-27']);
    if ($available->contains($date)) { ... }
    

    The fix — flip() converts values to keys, making lookups O(1):

    // Fast: O(1) per check
    $available = collect(['2026-01-25', '2026-01-26'])
        ->flip(); // Now: ['2026-01-25' => 0, '2026-01-26' => 1]
    
    if ($available->has($date)) { ... }
    

    For small datasets, the difference is negligible. But checking hundreds or thousands of items? This tiny change saves significant processing time. Costs nothing to implement.
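
    In context, the date-range loop looks something like this ($availableDates and the date range are made up; CarbonPeriod just gives an iterable range of days):

    use Carbon\CarbonPeriod;

    // Assume $availableDates is a collection of 'Y-m-d' strings, as above
    $lookup = $availableDates->flip(); // values become keys, so has() is O(1)

    foreach (CarbonPeriod::create('2026-01-01', '2026-01-31') as $day) {
        if ($lookup->has($day->toDateString())) {
            // this date is available
        }
    }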