Before running the Azure ML Batch Execution Task, you can load source data from your local systems for the Azure ML prediction web service to use during execution.
The Azure ML Storage destination is used to load data to the Azure blob storage file that will be used during the prediction service batch execution.
You simply configure the Azure ML Storage Destination to map your local source data to the Azure ML file that is expected during execution, and go. The component takes care of creating the file and uploading it to Azure blob storage for you.
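Behind the scenes, this amounts to serializing your rows to a CSV file and uploading it as a blob. A minimal sketch of the same idea in Python, assuming the `azure-storage-blob` package is installed and using placeholder container and blob names:

```python
import csv
import io


def rows_to_csv(header, rows):
    """Serialize in-memory rows to the CSV text the prediction service reads."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(header)
    writer.writerows(rows)
    return buf.getvalue()


def upload_input_blob(csv_text, connection_string, container, blob_name):
    """Upload the CSV to Azure blob storage (requires azure-storage-blob)."""
    from azure.storage.blob import BlobClient  # assumption: package installed

    blob = BlobClient.from_connection_string(connection_string, container, blob_name)
    blob.upload_blob(csv_text, overwrite=True)


if __name__ == "__main__":
    text = rows_to_csv(["id", "amount"], [[1, 9.5], [2, 3.2]])
    # upload_input_blob(text, "<connection string>", "mlinput", "scoring-input.csv")
```

The SSIS component handles all of this for you from the mapping you configure; the sketch is only meant to show what "creating the file and uploading it" involves.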
Azure ML Storage Source
After the Azure ML Batch Execution Task runs, the prediction / scoring web service writes a file containing the results of the prediction / scoring process.
The Azure ML Storage Source is used to download this file and output it as source data that can be used in your data flow.
This allows you to get the results of a scoring process immediately and continue processing the data according to your business rules.
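Conceptually, the source component downloads that results file and turns it into rows you can act on. A small sketch of that step, with a hypothetical score column name and a made-up business rule as the follow-on processing:

```python
import csv
import io


def parse_scored_results(csv_text):
    """Parse the scored-output CSV downloaded from blob storage into row dicts."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def high_scores(rows, column, threshold):
    """Example business rule: keep only rows whose score meets a threshold."""
    return [r for r in rows if float(r[column]) >= threshold]


if __name__ == "__main__":
    # "score" is a placeholder; the actual column names come from your experiment.
    rows = parse_scored_results("id,score\n1,0.9\n2,0.2\n")
    keepers = high_scores(rows, "score", 0.5)
```

In SSIS you would instead wire the Azure ML Storage Source into a data flow and apply the same rule with a Conditional Split; the Python is just the shape of the data movement.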
Azure ML Batch Execution Task
Each Azure ML prediction / scoring web service has an endpoint that can be used to kick off a batch execution process.
The Azure ML Batch Execution Task gives you the power to start the execution process right from SSIS. You no longer need to run this process manually.
The Azure ML Batch Execution Task can be used with the Azure ML Storage Destination and Azure ML Storage Source to insert data before a batch execution and then retrieve the results of a batch execution.
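For reference, the classic Azure ML batch execution service (BES) is a REST endpoint: you POST a job request that points at your input and output blobs. The URL and body shapes below are illustrative, assumed from the classic BES API; confirm them against your own service's API help page before relying on them:

```python
def bes_job_url(region, workspace_id, service_id):
    """Build a classic Azure ML batch-execution (BES) jobs URL.

    Assumed shape of the classic BES endpoint; your service's API help
    page shows the exact URL to use.
    """
    return (
        f"https://{region}.services.azureml.net/workspaces/"
        f"{workspace_id}/services/{service_id}/jobs?api-version=2.0"
    )


def bes_job_body(storage_conn, input_blob, output_blob):
    """Build the job request body referencing the input and output blobs."""
    return {
        "Input": {
            "ConnectionString": storage_conn,
            "RelativeLocation": input_blob,
        },
        "Outputs": {
            "output1": {
                "ConnectionString": storage_conn,
                "RelativeLocation": output_blob,
            }
        },
        "GlobalParameters": {},
    }


if __name__ == "__main__":
    url = bes_job_url("ussouthcentral", "<workspace-id>", "<service-id>")
    body = bes_job_body("<connection string>", "scoring-input.csv", "scoring-output.csv")
    # POST `body` to `url` with your API key as a Bearer token to submit the job,
    # then poll the returned job id until it completes.
```

The SSIS task wraps this submit-and-poll cycle for you, which is exactly why pairing it with the storage destination and source gives a complete upload, score, and download pipeline.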
*You must have an Azure ML account to use this task.