- DocumentDB Sharding Using Azure Data Factory
Well, I am not sure exactly, but if we are migrating data from any source to DocumentDB and want to shard the data across multiple collections based on the partition key, we could specify those details in the pipeline JSON, something like that.
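To illustrate the idea above, here is a rough sketch of a Copy activity with a DocumentDB (Cosmos DB) sink; the dataset and activity names are placeholders, and the actual distribution of documents is done server-side against the collection's partition key, so the pipeline only needs to write documents that contain that property:

```json
{
    "name": "CopyToDocumentDB",
    "type": "Copy",
    "inputs": [ { "referenceName": "SourceDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "CosmosCollectionDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "BlobSource" },
        "sink": {
            "type": "DocumentDbCollectionSink",
            "writeBehavior": "insert"
        }
    }
}
```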
- Cannot start trigger Azure Data Factory from PowerShell
The issue seems to be in the ARM template: the trigger payload has a property called 'pipeline', but for a scheduled trigger it should be 'pipelines'. Refer to the JSON of the trigger in the UI, and you will see that it should be an array of pipelines, not a single pipeline.
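For reference, a schedule trigger definition with the `pipelines` array looks roughly like this (the trigger, pipeline, and recurrence values here are placeholders):

```json
{
    "name": "DailyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",
                "interval": 1,
                "startTime": "2019-07-19T00:00:00Z",
                "timeZone": "UTC"
            }
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyPipeline",
                    "type": "PipelineReference"
                },
                "parameters": {}
            }
        ]
    }
}
```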
- Integration Runtime on Azure VM - social.msdn.microsoft.com
I have a question that I feel should be easy for those of you more experienced in Azure. My network admin and I have been in a disagreement about setting up an Integration Runtime for Data Factory on an Azure VM.
- fileName for SAS Blob Storage - social.msdn.microsoft.com
I need to copy some JSON files on a daily basis from a blob storage to a data lake store using Data Factory. The folder structure and file names in the blob storage are as follows. The same folder structure and names have to be maintained in the data lake store as well.
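Assuming a Copy activity from Blob storage to Data Lake Store, setting `copyBehavior` to `PreserveHierarchy` on the sink keeps the relative folder structure and file names intact; a sketch with made-up dataset names:

```json
{
    "name": "CopyJsonBlobToADLS",
    "type": "Copy",
    "inputs": [ { "referenceName": "BlobJsonDataset", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "AdlsJsonDataset", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": { "type": "BlobSource", "recursive": true },
        "sink": {
            "type": "AzureDataLakeStoreSink",
            "copyBehavior": "PreserveHierarchy"
        }
    }
}
```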
- How to flatten arrays in Mapping Data Flow?
The Mapping Data Flow within Azure Data Factory is advertised as a powerful tool. However, a simple flattening of an array, which would take only a couple of lines in Python or any other programming language, seems to be next to impossible. Attached is an example of a file in Avro format.
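For what it's worth, the Flatten transformation in Mapping Data Flow handles this case; in data flow script it compiles to a `foldDown(unroll(...))` call, roughly like the sketch below (the source schema and column names are invented for illustration, so treat this as an approximation rather than exact syntax):

```
source(output(
        orderId as string,
        items as (sku as string, qty as integer)[]
    )) ~> Source1
Source1 foldDown(unroll(items),
    mapColumn(
        orderId,
        sku = items.sku,
        qty = items.qty
    )) ~> FlattenItems
```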
- Copy a pipeline from one Data Factory to another
Thanks for your reply. Yes, I have looked at the ARM template, but my Data Factory has quite a few pipelines and there doesn't seem to be an option to select just the one particular pipeline.
- (ADF V2) Pre-Copy script to delete data on Oracle DB fails when no ...
I am attempting to clean up data in an Oracle sink table using the "pre-copy script" option in the sink definition
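The pre-copy script belongs in the sink section of the Copy activity; a minimal sketch, assuming an Oracle sink and a made-up staging table name. One thing worth checking when such a script fails is a trailing semicolon, which Oracle tends to reject in a plain SQL statement:

```json
"sink": {
    "type": "OracleSink",
    "preCopyScript": "DELETE FROM STAGE_ORDERS WHERE LOAD_DATE = TRUNC(SYSDATE)"
}
```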