I have a scenario in which a folder must be created dynamically through Data Factory, meaning its name has to be generated at runtime. The folder is created inside a root folder, and once it exists, a couple of JSON files must be created in it, but I can't figure out how to do it.
The idea is to generate the JSON files from SQL Server, then use Data Factory to read them and load them into the folder once it is created.
I think it can be done with a Copy activity, but I am stuck and don't know what to do.
Any ideas?
There is no need to create the folder separately. You can specify the folder name in the directory field of the sink dataset. If the specified folder does not exist, it will be created and the file will be copied into it.
Please follow the steps below:
Step 1: Add the source to Azure Data Factory.
Step 2: Create a Set variable activity and give it a name and value.
Step 3: Go to the sink, add the storage account, and select the JSON format.
Step 4: Open the JSON sink dataset -> add a parameter (give it a name, type string) -> in the Connection tab, add dynamic content for the directory using that parameter.
Step 5: Go to the sink of the Copy activity; you will find the dataset parameter there, and set it with the dynamic content of the Set variable.
This is the output in the storage account.
Note: Edit the value of the Set variable activity to give the required dynamic name for your output folder (I used 'output' as the value for demonstration).
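For reference, a minimal sketch of what the parameterised JSON sink dataset can look like when viewed as JSON (the dataset, linked service, container, and parameter names here are placeholders, not the actual names used above):

    {
        "name": "JsonSinkDataset",
        "properties": {
            "type": "Json",
            "linkedServiceName": {
                "referenceName": "AzureBlobStorage1",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "folderName": { "type": "string" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobStorageLocation",
                    "container": "root",
                    "folderPath": {
                        "value": "@dataset().folderName",
                        "type": "Expression"
                    }
                }
            }
        }
    }

In the Copy activity's sink tab, the folderName parameter is then set from the Set variable value, for example @variables('foldername') if the variable is named foldername; the folder is created automatically if it does not already exist.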
How to rename folders dynamically when transferring from one container to another container in Azure Data Factory
I want to modify the folder name dynamically.
I tried dynamically renaming the folder in the ADF pipeline while copying. Below is the process.
A Lookup activity is added, with a dataset containing the old_name and new_name of each folder.
Inside a For-each activity, a Copy data activity is added. In the source dataset, the folder name is given dynamically.
Wildcard file path: container2/@{item().old_name}/*
In the sink dataset, the new folder name is given dynamically in another container.
File path: container1/@{item().new_name}/
Files are copied to the new container with the new folder names.
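One detail the steps above gloss over: the For-each activity's Items field has to point at the Lookup output, roughly like this (assuming the Lookup activity is named Lookup1 and its "First row only" option is unchecked so all rows are returned):

    Items: @activity('Lookup1').output.value

Each iteration then exposes the current row as item(), which is what the @{item().old_name} and @{item().new_name} expressions above refer to.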
I have an on-prem File System dataset with the path "Z:\ProjectX".
It contains files like Z:\ProjectX\myfile1.json.
I would like to delete all JSON files in "Z:\ProjectX".
I wonder how to do this? What value should be set for Folder?
In the Delete activity's source settings, select the file path type as Wildcard file path and provide the wildcard file name '*.json' to delete all files of type JSON.
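If you prefer to see it in the pipeline JSON, a Delete activity configured this way would look roughly like the sketch below (the activity and dataset names are placeholders; FileServerReadSettings is the store-settings type used by the File System connector):

    {
        "name": "Delete json files",
        "type": "Delete",
        "typeProperties": {
            "dataset": {
                "referenceName": "FileSystemDataset",
                "type": "DatasetReference"
            },
            "storeSettings": {
                "type": "FileServerReadSettings",
                "recursive": false,
                "wildcardFileName": "*.json"
            },
            "enableLogging": false
        }
    }

The dataset itself keeps pointing at Z:\ProjectX, so the folder field can stay as it is; the wildcard file name is what restricts the delete to JSON files.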
I would like to copy every file in a Blob container to another Blob container. No transformation is needed. How do I do it?
However, I get a validation error:
Copy data1: Dataset yellow_tripdata_2020_1 location is a folder, the wildcard file name is required for Copy data1
As the error states: the wildcard file name is required for Copy data1.
On your data source, in the file field, you should enter a pattern that matches the files you want to copy: *.* if you want to copy all the files, or something like *.csv if you only want to copy CSV files.
I have a "copy data" activity in Azure Data Factory. I want to copy .csv files from blob container X to Blob container Y. I don't need to change the content of the files in any way, but I want to add a timestamp to the name, e.g. rename it. However, I get the following error "Binary copy does not support copying from folder to file". Both the source and the sink are set up as binary.
If you want to copy the files and rename them, your pipeline should look like this:
Create a Get Metadata activity to get the file list (dataset Binary1).
Create a ForEach activity to copy each file; set its Items to @activity('Get Metadata1').output.childItems.
Inside the ForEach, create a Copy activity whose source dataset Binary2 (same location as Binary1) has a dataset parameter to specify the source file.
In the Copy activity's sink settings, create the sink dataset Binary3, also with a parameter, to rename the files:
@concat(split(item().name,'.')[0],utcnow(),'.',split(item().name,'.')[1])
Run the pipeline and check the output:
Note: the example I made just copies the files to the same container, but with a new name.
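One piece the steps above don't show explicitly is what the dataset parameters receive inside the ForEach; roughly (the parameter names are illustrative):

    Source dataset Binary2 parameter (file name): @item().name
    Sink dataset Binary3 parameter (file name):   @concat(split(item().name,'.')[0],utcnow(),'.',split(item().name,'.')[1])

So a file named data.csv, for example, is written back as data<UTC timestamp>.csv.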
My release pipeline uploads to blob storage using the Azure File Copy task.
I want to delete the existing files in the blob before copying over the new files.
The help shows that
cleanTargetBeforeCopy: false
only applies to a VM
(since it is a release pipeline I can't edit the YAML anyway)
The tool tip for Optional Arguments shows
Optional AzCopy.exe arguments that will be applied when uploading to
blob like, /NC:10. If no optional arguments are specified here, the
following optional arguments will be added by default. /Y,
/SetContentType, /Z, /V, /S (only if container name is not $root),
/BlobType:page (only if specified storage account is a premium
account). If source path is a file, /Pattern will always be added
irrespective of whether or not you have specified optional arguments.
I want to delete the existing files in the blob before copying over the new files.
If you just want to overwrite the blobs when running the copy file task, we don't need to add any other optional argument at all.
As you mentioned, if we don't add optional arguments, the /Y parameter is added by default.
The blobs will be replaced by the new files by default when running the Azure File Copy task.
If you want to clean the container, you could use Azure PowerShell commands to delete the container and recreate it before running the Azure File Copy task.
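A minimal sketch of that approach, assuming the Az.Storage module and placeholder account/container names; it could run in an Azure PowerShell task placed before the Azure File Copy task:

    # Build a storage context for the target account (here via the signed-in Azure context).
    $ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -UseConnectedAccount

    # Delete the existing container, which removes all blobs inside it.
    Remove-AzStorageContainer -Name "mycontainer" -Context $ctx -Force

    # Recreate the empty container so the Azure File Copy task has a target.
    New-AzStorageContainer -Name "mycontainer" -Context $ctx

Note that Azure Storage can take a short while to finish deleting a container, so recreating it immediately may fail with a "container being deleted" error; a brief wait/retry, or deleting the individual blobs instead of the whole container, avoids that.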