- Speeds up processing by up to the number of CPUs
- Protects data from destructive operations
## Usage
```ruby
require 'parallel'

data = something_big()

# by default runs one process per CPU
results = Parallel.map(data) do |chunk|
  expensive_computation chunk
end

# run in exactly 4 processes
results = Parallel.map(data, :in_processes => 4) do |chunk|
  expensive_computation chunk
end

# same with threads (no speedup through multiple CPUs,
# but a speedup for blocking operations such as network or disk IO)
results = Parallel.map(data, :in_threads => 4) do |chunk|
  blocking_computation chunk
end

# each works the same way, but does not collect results
Parallel.each(data) { |chunk| ..same.. }
```
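The `:in_threads` variant helps because Ruby's interpreter lock is released during blocking calls, so IO waits can overlap even though Ruby code itself runs on one CPU. As a rough illustration of that idea (a stdlib-only sketch, not the gem's actual implementation; `threaded_map` is a hypothetical name), a thread-based map can look like this:

```ruby
# Minimal thread-based parallel map: results keep their input order,
# and blocking work in the block overlaps across threads.
def threaded_map(items, threads: 4)
  results = Array.new(items.size)
  queue = Queue.new
  items.each_with_index { |item, i| queue << [item, i] }
  workers = Array.new(threads) do
    Thread.new do
      loop do
        # non-blocking pop; raises ThreadError when the queue is empty,
        # which we use as the signal to stop this worker
        item, i = queue.pop(true) rescue break
        results[i] = yield(item)
      end
    end
  end
  workers.each(&:join)
  results
end

nums = [1, 2, 3, 4]
# sleep stands in for blocking IO; the four waits overlap instead of serializing
out = threaded_map(nums) { |n| sleep 0.1; n * 2 }
# out == [2, 4, 6, 8]
```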
Go parallel!