Hi, I found that if the queue contains a lot of records (in my case ≈250,000 items), indexing is incredibly slow. This line causes the trouble:
doc = collection.find_and_modify(:update => {"$set" => {:run_at => Time.now.utc + queue.retry_interval, :error => nil, :lock => lock}}, :query => conditions, :limit => queue.batch_size, :sort => [[:priority, Mongo::DESCENDING], [:run_at, Mongo::ASCENDING]])
It's in Sunspot::IndexQueue::Entry::MongoImpl#next_batch!.
Currently it updates each document separately. I think this could be improved by setting :run_at for the whole batch of records in a single multi-document update instead of setting it individually for each record.
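To illustrate the idea, here is a minimal sketch using a tiny in-memory stand-in for the collection (FakeCollection, multi_set, and next_batch! below are illustrative names, not the gem's or the driver's API): select the batch ids first, then claim them all with one $set-style write instead of one find_and_modify per document. With the real driver this would correspond to a single collection.update(..., :multi => true) call.

```ruby
require 'time'
require 'securerandom'

# In-memory stand-in for a Mongo collection, only to demonstrate the
# batching idea; it counts writes so the difference is visible.
class FakeCollection
  attr_reader :docs, :write_count

  def initialize(docs)
    @docs = docs
    @write_count = 0
  end

  # Apply the same fields to every matching doc in ONE write,
  # analogous to a multi-document update with $set.
  def multi_set(selector_ids, fields)
    @write_count += 1
    @docs.each do |doc|
      doc.merge!(fields) if selector_ids.include?(doc[:_id])
    end
  end
end

# Simulated next_batch!: pick batch_size ids by priority (descending) and
# run_at (ascending), then lock the whole batch with a single write.
def next_batch!(collection, batch_size, retry_interval, lock)
  candidates = collection.docs.sort_by { |d| [-d[:priority], d[:run_at]] }
  ids = candidates.first(batch_size).map { |d| d[:_id] }
  collection.multi_set(ids,
                       :run_at => Time.now.utc + retry_interval,
                       :error  => nil,
                       :lock   => lock)
  collection.docs.select { |d| d[:lock] == lock }
end

docs = Array.new(5) { |i| { :_id => i, :priority => i % 2, :run_at => Time.now.utc, :lock => nil } }
coll  = FakeCollection.new(docs)
batch = next_batch!(coll, 3, 60, SecureRandom.hex(8))
puts batch.size        # 3 (docs claimed)
puts coll.write_count  # 1 (one write for the whole batch, not three)
```

The point is the write count: claiming N documents costs one round trip instead of N, which is where the slowdown with ≈250,000 queued items comes from.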