Goroutine Scheduling with Channels for Signaling
This example demonstrates how to use channels to signal goroutines, influencing their execution and providing synchronization points.
Understanding Channels for Goroutine Communication
Channels in Go are a powerful mechanism for communication and synchronization between goroutines. They provide a safe and reliable way to pass data and signals between concurrent functions. By using channels, we can orchestrate the execution of goroutines and create dependencies between them, effectively influencing their scheduling.
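Before the worker pool example, here is a minimal sketch of signaling with a channel: one goroutine closes a channel to announce that it has finished, and another blocks on a receive until that happens. The name done and the struct{} element type are just conventions chosen for this sketch.

package main

import "fmt"

func main() {
    // done carries no data; closing it is the signal.
    done := make(chan struct{})

    go func() {
        fmt.Println("background work running")
        close(done) // signal completion to the waiting goroutine
    }()

    <-done // blocks until the goroutine closes the channel
    fmt.Println("background work finished")
}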
Code Example
This code demonstrates a worker pool pattern. A fixed number of worker goroutines consume jobs from a jobs channel and send results on a results channel. The main goroutine sends the jobs and then closes the jobs channel to signal to the workers that no more work is coming. A separate goroutine waits for all workers to finish and then closes the results channel, which lets the main goroutine's loop over the results terminate. The workers effectively schedule themselves by pulling work from the shared channel, and the interleaving of the output varies from run to run because the Go scheduler decides which worker picks up each job. The channel's buffer size controls how many jobs can be queued without blocking the sender; the number of workers is what bounds the amount of concurrent work. This is a common and recommended pattern for bounded concurrency in Go.
package main

import (
    "fmt"
    "sync"
)

// worker receives job numbers from jobs, doubles them, and sends the
// result on results. It signals completion through the WaitGroup.
func worker(id int, jobs <-chan int, results chan<- int, wg *sync.WaitGroup) {
    defer wg.Done()
    for j := range jobs {
        fmt.Printf("worker:%d started job:%d\n", id, j)
        // Simulate some work
        // time.Sleep(time.Second)
        fmt.Printf("worker:%d finished job:%d\n", id, j)
        results <- j * 2
    }
}

func main() {
    numJobs := 5
    numWorkers := 3

    jobs := make(chan int, numJobs)
    results := make(chan int, numJobs)

    var wg sync.WaitGroup

    // Start the worker pool.
    for i := 1; i <= numWorkers; i++ {
        wg.Add(1)
        go worker(i, jobs, results, &wg)
    }

    // Send jobs, then close the jobs channel so the workers'
    // range loops terminate once the queue is drained.
    for i := 1; i <= numJobs; i++ {
        jobs <- i
    }
    close(jobs)

    // Close results only after every worker has finished.
    go func() {
        wg.Wait()
        close(results)
    }()

    // Collect results; the loop ends when results is closed.
    for r := range results {
        fmt.Println("Result:", r)
    }
}
Concepts Behind the Snippet
sync.WaitGroup: used to wait for a collection of goroutines to finish. wg.Add registers each worker before it starts, wg.Done marks it as finished, and wg.Wait blocks until all of them have finished.
Directional channel types: the parameters jobs <-chan int (receive-only) and results chan<- int (send-only) document and enforce how the worker may use each channel.
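For reference, a stripped-down sketch of sync.WaitGroup on its own, independent of channels (the loop bound of 3 and the message text are arbitrary):

package main

import (
    "fmt"
    "sync"
)

func main() {
    var wg sync.WaitGroup
    for i := 1; i <= 3; i++ {
        wg.Add(1) // register one goroutine before starting it
        go func(n int) {
            defer wg.Done() // mark this goroutine as finished
            fmt.Println("goroutine", n, "done")
        }(i)
    }
    wg.Wait() // block until every Add has a matching Done
}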
Real-Life Use Case
This pattern is widely used in web servers, image processing pipelines, and other concurrent applications where tasks can be divided into smaller, independent units of work. For example, a web server can use a worker pool to handle incoming requests, ensuring that the server remains responsive even under heavy load.
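As a hedged sketch of the web-server case: instead of a literal pool of worker goroutines, a buffered channel can act as a counting semaphore so that only a limited number of requests do the expensive part at once. The route /work, the limit of 4, and the handler body are illustrative assumptions, not a prescribed API.

package main

import (
    "fmt"
    "log"
    "net/http"
)

// sem acts as a counting semaphore: at most 4 requests run the
// expensive section concurrently (the limit of 4 is arbitrary).
var sem = make(chan struct{}, 4)

func handler(w http.ResponseWriter, r *http.Request) {
    sem <- struct{}{}        // acquire a slot (blocks when all 4 are in use)
    defer func() { <-sem }() // release the slot when done

    // ... expensive work would go here ...
    fmt.Fprintln(w, "processed")
}

func main() {
    http.HandleFunc("/work", handler) // "/work" is a hypothetical route
    log.Fatal(http.ListenAndServe(":8080", nil))
}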
Interview Tip
Be prepared to explain the worker pool pattern and how channels are used for communication and synchronization between goroutines. Understand the difference between buffered and unbuffered channels and when to use each type.
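A small sketch contrasting the two kinds of channels (the values sent are arbitrary): on an unbuffered channel a send blocks until a receiver is ready, while a buffered channel accepts sends up to its capacity without a waiting receiver.

package main

import "fmt"

func main() {
    // Unbuffered: a send blocks until another goroutine receives.
    unbuffered := make(chan int)
    go func() { unbuffered <- 1 }() // sending from main alone would deadlock
    fmt.Println(<-unbuffered)

    // Buffered: up to 2 sends complete without a waiting receiver.
    buffered := make(chan int, 2)
    buffered <- 1
    buffered <- 2
    fmt.Println(<-buffered, <-buffered)
}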
When to Use Them
Use channels for signaling when you need to coordinate the execution of goroutines and create dependencies between them. The worker pool pattern is a good choice when you have a fixed number of resources (e.g., CPU cores) and want to limit the number of concurrent tasks.
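For the fixed-resources case, one common (but not mandatory) choice is to derive the worker count from runtime.NumCPU(); sizing the job buffer to match the pool, as below, is an arbitrary default rather than a requirement.

package main

import (
    "fmt"
    "runtime"
)

func main() {
    // Size the worker pool to the number of logical CPUs.
    numWorkers := runtime.NumCPU()
    jobs := make(chan int, numWorkers) // buffer sized to the pool is one reasonable default
    fmt.Println("workers:", numWorkers, "job buffer:", cap(jobs))
}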
Memory Footprint
The memory footprint of channels depends on the type of data they carry and their buffer size. Larger buffer sizes will consume more memory.
FAQ
What happens if I send a value to a closed channel?
Sending to a closed channel causes a panic.

What happens if I receive from a closed channel?
Once a closed channel has been drained of any buffered values, a receive returns the zero value of the channel's type and a false value for the 'ok' part of the receive operation. The 'ok' value indicates whether the channel is still open (see the sketch after this FAQ).

How do I prevent deadlocks when using channels?
Ensure that there is always a goroutine ready to receive from every channel that another goroutine sends on, and avoid creating circular dependencies between goroutines.
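A short sketch of the closed-channel behaviour described in the first two answers:

package main

import "fmt"

func main() {
    ch := make(chan int, 1)
    ch <- 42
    close(ch) // any send on ch after this point would panic

    v, ok := <-ch      // the buffered value is still delivered
    fmt.Println(v, ok) // 42 true

    v, ok = <-ch       // channel is closed and drained
    fmt.Println(v, ok) // 0 false: zero value, ok == false
}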