Building a Quick CSV Export Command in Laravel
When working with Laravel applications, you might find yourself needing to export data to a CSV file. Whether you're building a reporting feature or simply exporting data for analysis, it's essential to implement a solution that is both efficient and straightforward.
Here's a recommended approach for creating a quick CSV export command in Laravel. This method uses Laravel's chunking functionality to handle large datasets gracefully, writing directly to the output stream with PHP's fputcsv function.
Why This Approach?
- Efficiency: By leveraging Laravel's chunking, you process the data in smaller batches, reducing memory usage.
- Scalability: You can export large datasets without running into memory issues.
- Simplicity: Using fputcsv ensures that CSV formatting is handled correctly without needing additional libraries.
Code Example
use App\Models\User;
use Illuminate\Database\Eloquent\Collection;

$chunk = 500; // Number of records per chunk
$output = fopen('php://stdout', 'wb+'); // Open a stream to stdout

// Write the CSV headers
fputcsv($output, ['email', 'name']);

// Query the data and write to CSV in chunks
User::select(['email', 'name'])
    ->chunk($chunk, function (Collection $users) use ($output) {
        $users->each(
            fn ($user) => fputcsv($output, [$user->email, $user->name])
        );
    });
Breaking Down the Code
1. Define the Chunk Size
The $chunk variable determines how many records are processed in each iteration. Adjust this number based on your dataset size and memory availability.
$chunk = 500;
2. Open a Stream to StdOut
Using php://stdout, you can write directly to the output stream. This is particularly useful for commands that generate CSV exports, as the output can be piped directly to a file or another process.
$output = fopen('php://stdout', 'wb+');
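If you would rather write straight to a file on disk instead of stdout, the same code works with a different stream. The path below is only an illustration and assumes the exports directory under storage already exists:
// Hypothetical alternative: open a file under storage/ instead of stdout.
// storage_path() is Laravel's helper for the storage directory; the
// exports/ subdirectory is assumed to exist already.
$output = fopen(storage_path('exports/users.csv'), 'w');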
3. Write the CSV Headers
Before writing the data rows, define the CSV headers with fputcsv.
fputcsv($output, ['email', 'name']);
4. Query the Data in Chunks
Use Laravel's chunk method to process the data in smaller batches. This prevents memory exhaustion when dealing with large datasets.
User::select(['email', 'name'])
    ->chunk($chunk, function (Collection $users) use ($output) {
        $users->each(
            fn ($user) => fputcsv($output, [$user->email, $user->name])
        );
    });
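If the users table can change while the export is running, one variation worth considering is Laravel's chunkById method, which paginates by the primary key instead of by offset and so avoids skipping or repeating rows. This is only a sketch; note that chunkById needs the id column included in the select:
// Sketch using chunkById instead of chunk; the id column must be selected
// so Laravel can track the last processed key between batches.
User::select(['id', 'email', 'name'])
    ->chunkById($chunk, function (Collection $users) use ($output) {
        $users->each(
            fn ($user) => fputcsv($output, [$user->email, $user->name])
        );
    });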
5. Write Each Row
For each user in the chunk, use fputcsv to format and write their data as a CSV row.
fn($user) => fputcsv($output, [$user->email, $user->name]);
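As an optional tweak, fputcsv accepts a separator as its third argument, which can be handy if a downstream tool expects, say, semicolon-delimited output:
// Optional: pass a custom separator as the third argument to fputcsv
fn ($user) => fputcsv($output, [$user->email, $user->name], ';');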
Use Case: Artisan Command
You can encapsulate this functionality within an Artisan command, allowing you to execute the export with a single terminal command. Here's how you can structure your Artisan command:
<?php

namespace App\Console\Commands;

use App\Models\User;
use Illuminate\Console\Command;
use Illuminate\Database\Eloquent\Collection;

class ExportUsersToCsv extends Command
{
    protected $signature = 'export:users';

    protected $description = 'Export users to a CSV file';

    public function handle()
    {
        $chunk = 500;
        $output = fopen('php://stdout', 'wb+');

        // Write the CSV headers
        fputcsv($output, ['email', 'name']);

        // Query the data and write to CSV in chunks
        User::select(['email', 'name'])
            ->chunk($chunk, function (Collection $users) use ($output) {
                $users->each(
                    fn ($user) => fputcsv($output, [$user->email, $user->name])
                );
            });
    }
}
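Because the command writes to stdout, running php artisan export:users prints the CSV to the terminal, and you can redirect it into a file with something like php artisan export:users > users.csv. In recent Laravel versions, commands placed in app/Console/Commands are registered automatically; in older setups you may need to register the command in your console kernel.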
Conclusion
This approach to CSV export is efficient, scalable, and simple to implement. By leveraging Laravel's chunking and PHP's native fputcsv, you ensure your application can handle even large datasets without performance issues.