How to Make Parallel HTTP Requests in Laravel (and When You Should)

When dealing with external APIs, one of the most common bottlenecks is network latency. Imagine your Laravel app needs to fetch data from multiple third-party endpoints: calling them one by one can quickly become a performance killer.
Fortunately, Laravel’s HTTP client has a built-in way to send multiple requests in parallel using request pools. This allows you to execute all the requests at once and wait for all of them to complete, drastically improving response times.
The Problem: Sequential Requests Are Slow
Let’s say you need to call three different APIs to build a dashboard:
$user = Http::get('https://api.example.com/user');         // blocks until complete
$posts = Http::get('https://api.example.com/posts');       // starts only after $user returns
$comments = Http::get('https://api.example.com/comments'); // starts only after $posts returns
Each request waits for the previous one to finish before starting the next.
If each takes 500ms, the total time becomes 1.5 seconds, and that's before any processing.
Wouldn’t it be great if all three ran at the same time? That’s where concurrent HTTP requests come in.
Making Requests in Parallel with Http::pool()
Laravel provides the pool() method to dispatch multiple HTTP requests concurrently.
Here’s how it looks in action:
use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;
$responses = Http::pool(fn (Pool $pool) => [
    $pool->get('https://api.example.com/user'),
    $pool->get('https://api.example.com/posts'),
    $pool->get('https://api.example.com/comments'),
]);
$user = $responses[0]->json();
$posts = $responses[1]->json();
$comments = $responses[2]->json();
All three requests are sent together.
If each API call takes 500ms, the total time is roughly 500ms instead of 1.5 seconds: a 3× speed improvement.
Accessing Responses by Name
If you want more readable access, you can name each request:
$responses = Http::pool(fn (Pool $pool) => [
    $pool->as('user')->get('https://api.example.com/user'),
    $pool->as('posts')->get('https://api.example.com/posts'),
    $pool->as('comments')->get('https://api.example.com/comments'),
]);
$user = $responses['user']->json();
$posts = $responses['posts']->json();
$comments = $responses['comments']->json();
This makes your code easier to maintain and understand, especially when you’re pooling many requests.
Adding Headers or Other Options
You can’t chain global options like withHeaders() or middleware() directly on the Http::pool() call.
Instead, you must define them individually for each request:
$headers = ['X-API-Key' => 'secret'];
$responses = Http::pool(fn (Pool $pool) => [
    $pool->withHeaders($headers)->get('https://api.example.com/user'),
    $pool->withHeaders($headers)->get('https://api.example.com/posts'),
    $pool->withHeaders($headers)->get('https://api.example.com/comments'),
]);
Each request inside the pool can have its own headers, timeouts, or authentication settings.
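As a sketch of what that can look like, here is a pool where each request carries its own options (the token and credentials below are hypothetical placeholders):

```php
use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

$responses = Http::pool(fn (Pool $pool) => [
    // Named request authenticated with a bearer token (placeholder value)
    $pool->as('user')->withToken('secret-token')->get('https://api.example.com/user'),

    // Shorter timeout for a less critical endpoint
    $pool->as('posts')->timeout(3)->get('https://api.example.com/posts'),

    // Basic auth for a service that requires it (placeholder credentials)
    $pool->as('comments')->withBasicAuth('user', 'pass')->get('https://api.example.com/comments'),
]);
```

Because each entry is its own pending request, you can mix and match any of the usual HTTP client options per endpoint.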
When to Use Parallel Requests
Parallel requests are not always the answer, but they shine in specific scenarios:
1. Aggregating Multiple APIs
When building dashboards, analytics pages, or reports that rely on data from multiple endpoints.
2. Microservices Communication
If your app communicates with multiple internal services, pooling can reduce the overall request time drastically.
3. Fetching Related Data
For example, fetching product details, stock info, and reviews from different APIs at once.
4. Background Jobs
When processing data asynchronously in queued jobs or scheduled tasks that call external APIs.
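To illustrate the background-job scenario, a minimal sketch of pooling inside a queued job (the job class name and endpoints are hypothetical) might look like:

```php
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Http\Client\Pool;
use Illuminate\Support\Facades\Http;

class SyncExternalData implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function handle(): void
    {
        // Both requests run concurrently inside the queue worker
        $responses = Http::pool(fn (Pool $pool) => [
            $pool->as('products')->get('https://api.example.com/products'),
            $pool->as('stock')->get('https://api.example.com/stock'),
        ]);

        // Process the results once every request has completed
        // ...
    }
}
```

The pooling works the same way in a worker as in a web request; the benefit is simply that the job finishes sooner and frees the worker for the next job.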
Conclusion
Laravel’s Http::pool() makes concurrency simple and elegant: no need to spin up multiple processes or manage Guzzle manually.
Parallel requests can cut your request time significantly, improve user experience, and make your integrations feel snappier. Just use them wisely where they truly make an impact.