Optimizing Data Delivery with Laravel Streaming Responses

Laravel’s streaming response feature is a game-changer when it comes to handling large datasets. Instead of building the entire response in memory, it sends data in small increments as it’s generated, which keeps peak memory usage low and lets the client start receiving bytes almost immediately.

Let’s dive into a real-world example to see how this works. Imagine you need to export every order in your database as a CSV file. With a streaming response, you can do this without exhausting memory or keeping your users waiting while the whole file is assembled.

Here’s a snippet of code that shows how you can stream a large data export in Laravel:

```php
namespace App\Http\Controllers;

use App\Models\Order;

class ExportController extends Controller
{
    public function exportOrders()
    {
        return response()->stream(function () {
            // Send the CSV header row first so the download starts immediately
            echo "Order ID,Customer,Total,Status,Date\n";
            ob_flush();
            flush();

            // Process orders in chunks to maintain memory efficiency
            Order::query()
                ->with('customer')
                ->orderBy('created_at', 'desc')
                ->chunk(500, function ($orders) {
                    foreach ($orders as $order) {
                        echo sprintf(
                            "%s,%s,%.2f,%s,%s\n",
                            $order->id,
                            str_replace(',', ' ', $order->customer->name),
                            $order->total,
                            $order->status,
                            $order->created_at->format('Y-m-d H:i:s')
                        );
                        ob_flush();
                        flush();
                    }
                });
        }, 200, [
            'Content-Type' => 'text/csv',
            'Content-Disposition' => 'attachment; filename="orders.csv"',
            'X-Accel-Buffering' => 'no', // disable nginx proxy buffering
        ]);
    }
}
```
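
To wire this up, register a route that points at the controller method. The URI and middleware below are just illustrative; adapt them to your app:

```php
use App\Http\Controllers\ExportController;
use Illuminate\Support\Facades\Route;

// Hypothetical route; place in routes/web.php and adjust as needed
Route::get('/orders/export', [ExportController::class, 'exportOrders'])
    ->middleware('auth');
```

Because the response is streamed, the browser shows the download dialog as soon as the first chunk arrives, even while later chunks are still being queried.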

By using streaming responses like this, you can handle large datasets efficiently, keep memory usage low, and provide instant feedback to your users. It’s a win-win for everyone involved!
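
One more tip: if you don’t want to set the download headers by hand, Laravel’s `streamDownload()` helper wraps the same pattern and adds the `Content-Disposition` header for you. Here’s a minimal sketch using the same hypothetical `Order` model (flush calls omitted for brevity):

```php
return response()->streamDownload(function () {
    echo "Order ID,Customer,Total,Status,Date\n";

    // Same chunked query as the full example above
    Order::query()->with('customer')->chunk(500, function ($orders) {
        foreach ($orders as $order) {
            echo sprintf(
                "%s,%s,%.2f,%s,%s\n",
                $order->id,
                str_replace(',', ' ', $order->customer->name),
                $order->total,
                $order->status,
                $order->created_at->format('Y-m-d H:i:s')
            );
        }
    });
}, 'orders.csv', ['X-Accel-Buffering' => 'no']);
```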
