Using Microsoft Flow Parallelism to Process CDS Records
Do you need to process a lot of Common Data Service records as fast as possible? Then you should consider setting the Concurrency Control degree of parallelism as high as your service will support. In an earlier blog I tested updating 15,000 Account records: it took 168 minutes to retrieve the records and update two text fields. In that test I left the default setting for how many loop iterations run at a time.
For the next experiment I went into the flow, opened the settings for the Apply to each action, turned on Concurrency Control, and moved the degree of parallelism slider all the way to 50.
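If you peek at the code behind the flow, turning on Concurrency Control adds a runtimeConfiguration block to the underlying Foreach action in the workflow definition. Here is a minimal sketch of what that can look like with the degree of parallelism set to 50; the action names and the inner update action are illustrative placeholders, not the exact definition from my flow:

```json
{
  "Apply_to_each": {
    "type": "Foreach",
    "foreach": "@outputs('List_records')?['body/value']",
    "runtimeConfiguration": {
      "concurrency": {
        "repetitions": 50
      }
    },
    "actions": {
      "Update_a_record": {
        "type": "OpenApiConnection",
        "inputs": {}
      }
    }
  }
}
```

The repetitions value is what the slider in the Apply to each settings pane controls.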
Now if we take a look at the flow run, we can see that what previously took nearly 3 hours executed in 17 minutes, simply by allowing multiple loop iterations to run at the same time.
So instead of processing roughly 88 records per minute, the flow processed about 882 records per minute (15,000 records in 17 minutes). Wow! That is a significant throughput improvement.
Note: This does come with some risk. If the extra parallelism pushes a service past its allowed API call rate, the throttled calls will generate flow errors and can cause the flow run to fail. Check out this doc to learn more about Concurrency, looping, and debatching limits.
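One way to soften that risk, offered here as a sketch rather than something covered above: connector actions inside the loop expose a Retry Policy setting, which in the workflow definition appears as a retryPolicy object on the action's inputs. An exponential retry policy gives throttled calls a chance to succeed on a later attempt instead of failing the run outright. The exact action shape varies by connector, so treat this as illustrative:

```json
{
  "Update_a_record": {
    "type": "OpenApiConnection",
    "inputs": {
      "retryPolicy": {
        "type": "exponential",
        "count": 5,
        "interval": "PT10S"
      }
    }
  }
}
```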