python - Celery add tasks to queue when ready
I am new to Celery and I am trying to do distributed processing with it. Let's say I have the following task in my tasks.py:
    @celery.task
    def generate_sorts(params, meta_data_dict):
        '''This will generate some sorts.'''
        pass
And I am doing the following as part of the distributed processing:

    from celery.task.sets import TaskSet

    taskset = TaskSet(
        tasks.generate_sorts.subtask(args=(params, meta_data_dict))
        for meta_data_dict in chunk_generator)
    print "Dispatching tasks"
    taskset_result = taskset.apply_async()
    print "Waiting for results"
    results = taskset_result.join_native()
    print "Results:"
    # process the results
Now chunk_generator is basically a generator pattern which goes to the database and fetches some metadata. My problem is that the tasks all pile up before anything is added to the task queue: the generator takes about 30 minutes to fetch all the metadata, and no work is dispatched until it has finished. I know that is how TaskSet works. What I am looking for is an alternative, i.e. a way to do the equivalent of the following in a distributed fashion:
    pool.imap_unordered(generate_sorts, chunk_generator)
The above starts executing generate_sorts as soon as the generator yields a result. In other words, is there an alternative to TaskSet where I can enqueue each job as soon as the generator yields it, instead of waiting for the generator to fetch everything before any work can start? A minimal local sketch of the behaviour I am after follows.
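This is what that sketch looks like with plain multiprocessing (do_work and slow_generator are just stand-ins for my real task and metadata generator):

    import time
    from multiprocessing import Pool

    def do_work(item):
        # Stand-in for generate_sorts.
        return item * 2

    def slow_generator():
        # Stand-in for chunk_generator: yields one item at a time,
        # slowly, instead of producing everything up front.
        for i in range(10):
            time.sleep(1)
            yield i

    if __name__ == '__main__':
        pool = Pool(4)
        # Each item is handed to a worker as soon as it is yielded;
        # results come back in completion order.
        for result in pool.imap_unordered(do_work, slow_generator()):
            print result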
Just start the tasks immediately with delay() and add each AsyncResult to a ResultSet:

    from celery.result import ResultSet

    result_set = ResultSet([])
    for meta_data_dict in chunk_generator:
        # Submit the task to the queue as soon as the
        # generator yields its metadata.
        result = tasks.generate_sorts.delay(params, meta_data_dict)
        result_set.add(result)
    print "Waiting for results"
    results = result_set.join_native()
    print "Results:"
    # do something with the results
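With delay() each task message is published to the broker the moment the generator yields, so workers start on the first chunk while chunk_generator is still fetching the rest, and join_native() at the end only waits for whatever is still in flight. If you want progress feedback while you are still submitting, ResultSet also has completed_count(); a rough sketch (the progress line itself is just illustrative):

    from celery.result import ResultSet

    result_set = ResultSet([])
    for meta_data_dict in chunk_generator:
        result_set.add(tasks.generate_sorts.delay(params, meta_data_dict))
        # completed_count() counts tasks that have already succeeded,
        # so this shows progress while submission is still going on.
        print "%d/%d tasks done" % (result_set.completed_count(),
                                    len(result_set.results))
    results = result_set.join_native()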