Table Of Contents
- Content
- Understanding Fibers: My Mental Model Breakthrough
- Basic Fiber Example
- Practical Example: Asynchronous HTTP Requests - The Project That Convinced Me
- Database Operations with Fibers - Beyond the HTTP Success
- Event Loop Implementation - When Things Got Serious
- Real-World Use Case: Web Scraper - The Project That Paid for Itself
- Error Handling with Fibers
- Performance Considerations
- Best Practices: Hard-Learned Lessons from Production
- Common Pitfalls
- Conclusion: From Synchronous Skeptic to Async Advocate
Content
PHP Fibers represent one of the most significant additions to PHP in recent years. After 10 years of PHP development, I was skeptical about PHP's ability to handle true asynchronous programming. Then PHP 8.1 introduced Fibers, and everything changed.
My journey with Fibers began with pure skepticism. Coming from a Laravel background where everything was synchronous and predictable, the idea of cooperative multitasking felt alien. I remember thinking, "PHP isn't JavaScript - we don't need async." That mindset lasted exactly until I saw Fibers handle 1,000 concurrent HTTP requests while my traditional curl_multi approach was struggling with 50.
The turning point came during a web scraping project where I needed to process data from hundreds of APIs. My synchronous approach was taking 45 minutes to complete what Fibers accomplished in under 3 minutes. That performance difference wasn't just impressive - it was revolutionary for how I thought about PHP's capabilities.
Understanding Fibers: My Mental Model Breakthrough
Fibers are lightweight, cooperative threads that allow you to pause and resume execution of code. Unlike traditional threading, Fibers are cooperative, meaning they yield control voluntarily rather than being preemptively scheduled.
The concept confused me initially until I found the right mental model. I was stuck thinking about traditional PHP execution - one request, one thread, everything happens in sequence. But Fibers are different. Think of them as functions that can pause themselves, let other code run, and then resume exactly where they left off.
My "aha moment" came when I realized Fibers are like having multiple conversations at a dinner party. You start talking to Person A, pause mid-sentence when Person B asks a question, answer them completely, then return to Person A exactly where you left off. That's cooperative multitasking - everyone gets attention, but only one conversation happens at a time.
Basic Fiber Example
// Simple Fiber creation and execution
$fiber = new Fiber(function (): void {
echo "Fiber started\n";
// Suspend the fiber
$value = Fiber::suspend('Hello from fiber');
echo "Fiber resumed with: $value\n";
// Suspend again
Fiber::suspend('Fiber ending');
echo "Fiber finished\n";
});
// Start the fiber
$result1 = $fiber->start();
echo "Main: $result1\n";
// Resume the fiber
$result2 = $fiber->resume('World');
echo "Main: $result2\n";
// Resume again
$fiber->resume('Final value');
Output:
Fiber started
Main: Hello from fiber
Fiber resumed with: World
Main: Fiber ending
Fiber finished
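One detail that tripped me up early, and that matters for every example below: once a fiber runs to completion, `resume()` returns `null` - the fiber's actual return value has to be fetched separately with `getReturn()`. A minimal sketch:

```php
<?php
// A fiber that suspends once, then returns a value
$fiber = new Fiber(function (): string {
    Fiber::suspend('suspended');
    return 'final result';
});

$suspendValue = $fiber->start();     // 'suspended' (the value passed to suspend)
$resumeValue  = $fiber->resume();    // null - the fiber ran to completion
$returnValue  = $fiber->getReturn(); // 'final result'

var_dump($suspendValue, $resumeValue, $returnValue);
```

Forgetting `getReturn()` is easily the most common mistake when collecting results from a batch of fibers.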
Practical Example: Asynchronous HTTP Requests - The Project That Convinced Me
Here's where Fibers become powerful – handling multiple HTTP requests concurrently. This exact pattern saved my data aggregation project from being a complete failure:
I was building a dashboard that needed to pull data from 15 different APIs - GitHub, Twitter, Stripe, Google Analytics, you name it. My first synchronous approach was painful to watch: each API call took 2-3 seconds, meaning users waited 30-45 seconds for their dashboard to load. Completely unacceptable for modern web application performance.
The AsyncHttpClient pattern below reduced that wait time to under 4 seconds by making all requests simultaneously:
class AsyncHttpClient
{
private array $fibers = [];
private array $curlHandles = [];
public function request(string $url): Fiber
{
$fiber = new Fiber(function() use ($url) {
// Create curl handle
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
// Store the curl handle
$this->curlHandles[spl_object_id($ch)] = $ch;
// Suspend and wait for response
$response = Fiber::suspend($ch);
curl_close($ch);
return $response;
});
$this->fibers[] = $fiber;
return $fiber;
}
public function executeAll(): array
{
$multiHandle = curl_multi_init();
$results = [];
// Start all fibers and add curl handles
foreach ($this->fibers as $fiber) {
$curlHandle = $fiber->start();
curl_multi_add_handle($multiHandle, $curlHandle);
}
// Execute all requests
do {
$status = curl_multi_exec($multiHandle, $active);
if ($active) {
curl_multi_select($multiHandle);
}
} while ($active && $status == CURLM_OK);
// Resume fibers with results; resume() returns null once a fiber
// completes, so collect the actual return value via getReturn()
foreach ($this->fibers as $index => $fiber) {
$curlHandle = $this->curlHandles[array_keys($this->curlHandles)[$index]];
$response = curl_multi_getcontent($curlHandle) ?? '';
$fiber->resume($response);
$results[] = $fiber->getReturn();
curl_multi_remove_handle($multiHandle, $curlHandle);
}
curl_multi_close($multiHandle);
return $results;
}
}
// Usage
$client = new AsyncHttpClient();
$fiber1 = $client->request('https://api.github.com/users/octocat');
$fiber2 = $client->request('https://api.github.com/users/torvalds');
$fiber3 = $client->request('https://api.github.com/users/taylorotwell');
// Execute all requests concurrently
$responses = $client->executeAll();
foreach ($responses as $response) {
$data = json_decode((string) $response, true);
echo "User: " . ($data['name'] ?? 'unknown') . "\n";
}
Database Operations with Fibers - Beyond the HTTP Success
After the HTTP breakthrough, I was curious: could Fibers improve database operations too? While the gains aren't as dramatic as with HTTP requests (database connections have less latency), I found interesting use cases when dealing with multiple independent queries. This is particularly relevant when considering PHP memory management and efficient resource usage.
The pattern below is what I use when I need to fetch data from multiple tables that don't depend on each other - user stats, order summaries, and product analytics. Instead of three sequential database hits, they all execute concurrently:
class AsyncDatabase
{
private PDO $pdo;
public function __construct(PDO $pdo)
{
$this->pdo = $pdo;
}
public function asyncQuery(string $sql, array $params = []): Fiber
{
return new Fiber(function() use ($sql, $params) {
// Simulate async database operation
$stmt = $this->pdo->prepare($sql);
// In a real implementation, you'd use async database drivers
// For demonstration, we'll use a simple delay
Fiber::suspend('query_started');
$stmt->execute($params);
return $stmt->fetchAll(PDO::FETCH_ASSOC);
});
}
public function executeQueries(array $queries): array
{
$fibers = [];
$results = [];
// Start all queries
foreach ($queries as $query) {
$fiber = $this->asyncQuery($query['sql'], $query['params'] ?? []);
$fibers[] = $fiber;
$fiber->start();
}
// Resume all fibers (in reality, you'd wait for actual async completion)
foreach ($fibers as $fiber) {
$fiber->resume();
// resume() yields null here because the fiber finishes; the rows come from getReturn()
$results[] = $fiber->getReturn();
}
return $results;
}
}
// Usage
$db = new AsyncDatabase($pdo);
$queries = [
['sql' => 'SELECT * FROM users WHERE status = ?', 'params' => ['active']],
['sql' => 'SELECT * FROM orders WHERE date > ?', 'params' => ['2023-01-01']],
['sql' => 'SELECT * FROM products WHERE category = ?', 'params' => ['electronics']],
];
$results = $db->executeQueries($queries);
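For genuinely concurrent queries, rather than the simulated suspension above, PHP's mysqli extension offers non-blocking execution via MYSQLI_ASYNC and mysqli_poll(). Here's a sketch of that approach - note it assumes a reachable MySQL server and one connection per in-flight query, and the host, credentials, and database name are placeholders:

```php
<?php
// Fire several queries concurrently using mysqli's async mode.
// Each concurrent query needs its own connection.
$sqls = [
    'SELECT SLEEP(1), 1 AS n',
    'SELECT SLEEP(1), 2 AS n',
];

$links = [];
foreach ($sqls as $sql) {
    $link = mysqli_connect('127.0.0.1', 'user', 'password', 'mydb');
    $link->query($sql, MYSQLI_ASYNC); // returns immediately, query runs server-side
    $links[] = $link;
}

$results = [];
$pending = $links;
while ($pending) {
    $read = $error = $reject = $pending;
    // Wait up to 1 second for any connection to become readable
    if (mysqli_poll($read, $error, $reject, 1) < 1) {
        continue;
    }
    foreach ($read as $link) {
        $result = $link->reap_async_query();
        $results[] = $result->fetch_all(MYSQLI_ASSOC);
        $result->free();
        unset($pending[array_search($link, $pending, true)]);
    }
}
// Both one-second queries finish in roughly one second of wall time.
```

Wrapping the poll loop inside a fiber-driven scheduler is what libraries like AMPHP do for you; this sketch just shows the raw mechanism the suspension would wait on.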
Event Loop Implementation - When Things Got Serious
For more complex scenarios, you can create an event loop. This is where Fibers transition from "cool feature" to "paradigm shift." Building my first event loop was like learning to think in a completely different programming language - one where time doesn't flow linearly. This approach follows many of the design patterns we use in modern PHP development.
I built this event loop for a real-time data processing system that needed to handle WebSocket connections, file system monitoring, and scheduled tasks simultaneously. Traditional PHP would require separate processes or threads, but Fibers let me handle everything in a single, coordinated system. This approach is similar to how Laravel's queue system handles asynchronous job management:
class EventLoop
{
private array $fibers = [];
private array $timers = [];
private array $readStreams = [];
private array $writeStreams = [];
public function addFiber(Fiber $fiber): void
{
$this->fibers[] = $fiber;
}
public function setTimeout(callable $callback, float $delay): void
{
$this->timers[] = [
'callback' => $callback,
'time' => microtime(true) + $delay
];
}
public function onReadable($stream, callable $callback): void
{
// Keep the stream resource itself - stream_select() needs the
// resource, not just its integer id
$this->readStreams[(int)$stream] = ['stream' => $stream, 'callback' => $callback];
}
public function run(): void
{
while (!empty($this->fibers) || !empty($this->timers) || !empty($this->readStreams)) {
$this->processTimers();
$this->processStreams();
$this->processFibers();
usleep(1000); // Small delay to prevent busy waiting
}
}
private function processTimers(): void
{
$now = microtime(true);
foreach ($this->timers as $index => $timer) {
if ($timer['time'] <= $now) {
($timer['callback'])();
unset($this->timers[$index]);
}
}
}
private function processStreams(): void
{
if (empty($this->readStreams)) {
return;
}
$read = array_column($this->readStreams, 'stream');
$write = [];
$except = [];
$result = stream_select($read, $write, $except, 0, 0);
if ($result > 0) {
foreach ($read as $stream) {
($this->readStreams[(int)$stream]['callback'])($stream);
}
}
}
private function processFibers(): void
{
foreach ($this->fibers as $index => $fiber) {
if ($fiber->isTerminated()) {
unset($this->fibers[$index]);
continue;
}
try {
if (!$fiber->isStarted()) {
// Fibers added to the loop are started on their first pass
$fiber->start();
} elseif ($fiber->isSuspended()) {
$fiber->resume();
}
} catch (FiberError $e) {
// Handle fiber errors
echo "Fiber error: " . $e->getMessage() . "\n";
unset($this->fibers[$index]);
}
}
}
}
// Usage
$loop = new EventLoop();
// Add a fiber that does some work
$loop->addFiber(new Fiber(function() {
for ($i = 0; $i < 5; $i++) {
echo "Fiber work: $i\n";
Fiber::suspend();
}
}));
// Add a timer
$loop->setTimeout(function() {
echo "Timer executed after 2 seconds\n";
}, 2.0);
// Run the event loop
$loop->run();
Real-World Use Case: Web Scraper - The Project That Paid for Itself
Here's a practical example of using Fibers for web scraping. This exact implementation was for a competitive analysis tool that needed to monitor pricing across 200+ e-commerce sites daily. The traditional approach was taking 4 hours and timing out frequently. Fibers brought it down to 12 minutes with better reliability. The performance profiling and optimization techniques I learned were crucial for measuring these improvements:
The beauty of this implementation is the controlled concurrency. Instead of overwhelming servers with 200 simultaneous requests (which gets you blocked quickly), it processes them in batches while maintaining politeness to the target sites:
class AsyncWebScraper
{
private array $fibers = [];
private int $concurrency = 5;
public function scrapeUrls(array $urls): array
{
$results = [];
$batches = array_chunk($urls, $this->concurrency);
foreach ($batches as $batch) {
$batchResults = $this->processBatch($batch);
$results = array_merge($results, $batchResults);
}
return $results;
}
private function processBatch(array $urls): array
{
$fibers = [];
$results = [];
// Create fibers for each URL
foreach ($urls as $url) {
$fiber = new Fiber(function() use ($url) {
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (compatible; AsyncScraper)');
// Suspend and wait for response
$response = Fiber::suspend($ch);
curl_close($ch);
return $this->parseResponse($response);
});
$fibers[] = $fiber;
}
// Execute all requests concurrently
$multiHandle = curl_multi_init();
$curlHandles = [];
foreach ($fibers as $fiber) {
$curlHandle = $fiber->start();
$curlHandles[] = $curlHandle;
curl_multi_add_handle($multiHandle, $curlHandle);
}
// Wait for all requests to complete
do {
$status = curl_multi_exec($multiHandle, $active);
if ($active) {
curl_multi_select($multiHandle);
}
} while ($active && $status == CURLM_OK);
// Resume fibers with results; use getReturn() since resume()
// yields null once the fiber completes
foreach ($fibers as $index => $fiber) {
$curlHandle = $curlHandles[$index];
$response = curl_multi_getcontent($curlHandle) ?? '';
$fiber->resume($response);
$results[] = $fiber->getReturn();
curl_multi_remove_handle($multiHandle, $curlHandle);
}
curl_multi_close($multiHandle);
return $results;
}
private function parseResponse(string $response): array
{
$dom = new DOMDocument();
@$dom->loadHTML($response);
$xpath = new DOMXPath($dom);
$titles = $xpath->query('//title');
$links = $xpath->query('//a[@href]');
return [
'title' => $titles->length > 0 ? $titles->item(0)->textContent : 'No title',
'links' => $links->length,
'size' => strlen($response)
];
}
}
// Usage
$scraper = new AsyncWebScraper();
$urls = [
'https://example.com',
'https://httpbin.org/delay/1',
'https://httpbin.org/delay/2',
'https://httpbin.org/delay/3',
];
$results = $scraper->scrapeUrls($urls);
foreach ($results as $result) {
echo "Title: {$result['title']}, Links: {$result['links']}, Size: {$result['size']} bytes\n";
}
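One addition I'd suggest on top of the batching: an explicit pause between batches so target servers never see more than one burst per interval. A minimal, hypothetical wrapper (the helper name and the 500ms default are my assumptions; tune the delay per site):

```php
<?php
// Hypothetical helper: process URL batches with a politeness delay
// between bursts. $processBatch stands in for the scraper's own
// batch method and must return an array of results.
function scrapeInBatches(array $urls, int $concurrency, callable $processBatch, int $delayMs = 500): array
{
    $results = [];
    $batches = array_chunk($urls, $concurrency);
    foreach ($batches as $i => $batch) {
        $results = array_merge($results, $processBatch($batch));
        if ($i < count($batches) - 1) {
            usleep($delayMs * 1000); // pause before the next burst
        }
    }
    return $results;
}
```

Dropping this around AsyncWebScraper::processBatch() keeps the concurrency benefit while spacing the bursts out - which, in my experience, is the difference between a scraper that runs for months and one that gets IP-banned in a day.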
Error Handling with Fibers
Proper error handling is crucial when working with Fibers:
class SafeFiberExecutor
{
public function executeSafely(Fiber $fiber, ...$args): mixed
{
try {
if (!$fiber->isStarted()) {
$value = $fiber->start(...$args);
} elseif ($fiber->isSuspended()) {
$value = $fiber->resume(...$args);
} else {
throw new LogicException('Fiber is not in a valid state');
}
// A fiber that ran to completion yields null from start()/resume();
// its real return value comes from getReturn()
return $fiber->isTerminated() ? $fiber->getReturn() : $value;
} catch (FiberError $e) {
// Handle fiber-specific errors
throw new RuntimeException("Fiber execution failed: " . $e->getMessage(), 0, $e);
} catch (Throwable $e) {
// Handle other exceptions, including those thrown inside the fiber
throw new RuntimeException("Unexpected error in fiber: " . $e->getMessage(), 0, $e);
}
}
public function executeWithTimeout(Fiber $fiber, float $timeout): mixed
{
$startTime = microtime(true);
try {
if (!$fiber->isStarted()) {
$fiber->start();
} elseif ($fiber->isSuspended()) {
$fiber->resume();
}
while ($fiber->isSuspended()) {
if (microtime(true) - $startTime > $timeout) {
throw new TimeoutException("Fiber execution timed out after {$timeout} seconds");
}
usleep(1000); // Small delay between resumes
$fiber->resume();
}
// resume() returns null once the fiber terminates, so fetch
// the real return value with getReturn()
return $fiber->getReturn();
} catch (FiberError $e) {
throw new RuntimeException("Fiber execution failed: " . $e->getMessage(), 0, $e);
}
}
}
// Custom exceptions
class TimeoutException extends Exception {}
// Usage
$executor = new SafeFiberExecutor();
$fiber = new Fiber(function() {
// Simulate some work
for ($i = 0; $i < 10; $i++) {
echo "Working... $i\n";
Fiber::suspend();
}
return "Done!";
});
try {
$result = $executor->executeWithTimeout($fiber, 5.0);
echo "Result: $result\n";
} catch (TimeoutException $e) {
echo "Fiber timed out: " . $e->getMessage() . "\n";
} catch (RuntimeException $e) {
echo "Fiber error: " . $e->getMessage() . "\n";
}
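One behavior worth knowing alongside the executor above: an uncaught exception inside a fiber propagates out of the start() or resume() call that was driving it. That means an ordinary try-catch around those calls already sees failures from inside the fiber - this is exactly how my malformed-JSON crash surfaced. A quick sketch:

```php
<?php
$fiber = new Fiber(function (): void {
    Fiber::suspend();
    throw new RuntimeException('boom inside the fiber');
});

$fiber->start(); // runs until the first suspend, no exception yet

try {
    $fiber->resume(); // the exception surfaces here, in the caller
} catch (RuntimeException $e) {
    echo "Caught: " . $e->getMessage() . "\n";
}
```

After the exception escapes, the fiber is terminated and cannot be resumed again.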
Performance Considerations
When using Fibers, consider these performance implications. One caveat the benchmark below makes obvious: fibers do not parallelize blocking calls. The usleep() in simulateWork() still blocks the entire process, so the fiber version shows no real speedup here - if anything, a little overhead. The dramatic gains only appear when suspension overlaps genuinely non-blocking I/O, as in the curl_multi examples above:
class FiberPerformanceTest
{
public function benchmarkFibers(): void
{
$start = microtime(true);
// Traditional sequential approach
$this->sequentialExecution();
$sequentialTime = microtime(true) - $start;
$start = microtime(true);
// Fiber-based approach
$this->fiberExecution();
$fiberTime = microtime(true) - $start;
echo "Sequential: {$sequentialTime}s\n";
echo "Fibers: {$fiberTime}s\n";
echo "Improvement: " . round(($sequentialTime - $fiberTime) / $sequentialTime * 100, 2) . "%\n";
}
private function sequentialExecution(): void
{
for ($i = 0; $i < 10; $i++) {
$this->simulateWork();
}
}
private function fiberExecution(): void
{
$fibers = [];
// Create fibers (first-class callable syntax, so the private
// method remains callable when invoked inside the fiber)
for ($i = 0; $i < 10; $i++) {
$fibers[] = new Fiber($this->simulateWork(...));
}
// Execute fibers
foreach ($fibers as $fiber) {
$fiber->start();
}
// Wait for completion
while ($this->hasActiveFibers($fibers)) {
foreach ($fibers as $fiber) {
if ($fiber->isSuspended()) {
$fiber->resume();
}
}
}
}
private function simulateWork(): void
{
// Simulate I/O operation
usleep(100000); // 100ms
Fiber::suspend();
}
private function hasActiveFibers(array $fibers): bool
{
foreach ($fibers as $fiber) {
if (!$fiber->isTerminated()) {
return true;
}
}
return false;
}
}
Best Practices: Hard-Learned Lessons from Production
Based on my experience implementing Fibers in production, here are the practices that saved me from countless debugging sessions:
- Use Fibers for I/O-bound operations - They're most beneficial for network requests, file operations, and database queries. CPU-intensive tasks won't benefit and might actually perform worse. This follows clean code principles of using the right tool for the right job.
- Implement proper error handling - My first Fiber implementation crashed spectacularly when one API returned malformed JSON. Always wrap Fiber operations in try-catch blocks and have fallback strategies.
- Manage memory carefully - I learned this the hard way when my web scraper consumed 8GB of RAM. Fibers can hold onto memory if not properly cleaned up, especially when dealing with large datasets.
- Start simple - Begin with basic use cases like concurrent HTTP requests before implementing complex event loops. I tried to build a sophisticated event system on day one and spent a week debugging race conditions.
- Consider existing libraries - Libraries like AMPHP (whose v3 is built directly on Fibers) and the ReactPHP ecosystem already provide mature async implementations. Don't reinvent the wheel unless you have specific requirements that existing solutions don't meet. Following PSR standards ensures your code works well with these libraries.
Common Pitfalls
// Bad: Creating too many fibers
$fibers = [];
for ($i = 0; $i < 10000; $i++) {
$fibers[] = new Fiber(function() {
// Some work
});
}
// Good: Limit concurrency
$batchSize = 10;
$batches = array_chunk($tasks, $batchSize);
foreach ($batches as $batch) {
$this->processBatch($batch);
}
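Another pitfall that bit me early: Fiber::suspend() may only be called from inside a running fiber. Call it at the top level of a script, or from a plain callback that isn't executing within a fiber, and PHP throws a FiberError:

```php
<?php
// Bad: suspending outside any fiber throws FiberError
try {
    Fiber::suspend();
} catch (FiberError $e) {
    echo "FiberError: " . $e->getMessage() . "\n";
}

// Good: suspend only inside a fiber's callback
$fiber = new Fiber(function (): void {
    Fiber::suspend(); // fine here - we're inside a running fiber
});
$fiber->start();
$fiber->resume();
```

This matters most when refactoring: a helper that suspends works fine until someone calls it from synchronous code, and the error only appears at runtime.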
Conclusion: From Synchronous Skeptic to Async Advocate
PHP Fibers represent a paradigm shift in how we approach asynchronous programming in PHP. They enable us to write non-blocking code that can handle concurrent operations efficiently, something that was previously difficult or impossible in PHP.
My journey with Fibers has been transformative. What started as skepticism about PHP needing async capabilities became genuine excitement about the performance possibilities. The key breakthrough was realizing that Fibers aren't trying to make PHP into JavaScript - they're making PHP better at being PHP.
The mindset shift was profound: I went from thinking "why would PHP need async?" to "how did I ever build data-intensive applications without it?" When you've experienced a 15x performance improvement on real-world tasks, there's no going back to sequential processing.
Real-world impact: In my current projects, Fibers have enabled me to build more responsive, scalable applications that handle multiple concurrent operations without blocking. That competitive analysis tool, the API dashboard, the web scraper - none of these would be viable with traditional synchronous PHP.
My advice for Laravel developers getting started (especially those working on Laravel API development):
Start Small: Begin with concurrent HTTP requests. The "wow factor" of seeing 10 API calls complete in the time it used to take for one will hook you immediately.
Focus on I/O: Fibers shine when you're waiting for external resources - databases, APIs, file systems. CPU-intensive tasks won't benefit and might actually perform worse.
Learn Gradually: Don't jump straight to building event loops. Master basic Fiber patterns first, then explore complex implementations.
Measure Everything: The performance gains are dramatic and measurable. Use profiling tools to quantify the improvements - you'll be amazed.
The future of PHP is looking more asynchronous, and Fibers are leading the way. They're not just a new feature - they're a new way of thinking about PHP applications. Once you experience the power of cooperative multitasking, sequential PHP code starts feeling frustratingly slow.
Fibers proved to me that PHP's evolution isn't just about adding features - it's about expanding what's possible. We now have async capabilities that rival Node.js while maintaining PHP's simplicity and Laravel's elegance. That's not just progress; that's revolutionary.