Configuring sink dataset in Azure Data Factory - copy

I am trying to copy multiple folders with their files (.dat and .csv) from FTP to an Azure storage account, so I am using Get Metadata, ForEach, and Copy activities. My problem is that when setting the file path in the output dataset, I am not sure how to set the filename so it picks up all files in my folder.
I added a filename parameter in the dataset and in the Copy data sink I set it as @item().name, but it's not working: instead of copying the files, it copies the folder. On my second try I didn't set the filename at all, and it does copy the files, but it adds the extension .txt to the files instead of keeping their original format.
Thank you for your help,

You need to add a third parameter for the sink dataset for filename.
Here you can pass a parameter just as you have for container and folder.
Filename can be set from the Get Metadata activity output.
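As a minimal sketch, the sink dataset location could reference all three parameters like this (assuming a Blob Storage binary dataset with parameters named container, directory, and filename; adjust the names to match yours):

    "typeProperties": {
        "location": {
            "type": "AzureBlobStorageLocation",
            "container": { "value": "@dataset().container", "type": "Expression" },
            "folderPath": { "value": "@dataset().directory", "type": "Expression" },
            "fileName": { "value": "@dataset().filename", "type": "Expression" }
        }
    }

In the Copy activity sink you would then supply @item().name for filename while iterating over the Get Metadata childItems, so each file keeps its original name and extension.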

Related

How to copy file based on date in Azure Data Factory

I have a list of files in a adls container which contain date in the name as given below:
TestFile-Name-20221120.csv
TestFile-Name-20221119.csv
TestFile-Name-20221118.csv
and I want to copy only the files that contain today's date, like TestFile-Name-20221120.csv today, and so on.
I've used a Get Metadata activity to get the list of files, then a ForEach to iterate over each file, and then used Set Variable to extract the date part from the file name, like 20221120, but I am not sure how to proceed further.
We have something similar running. We check an SFTP folder for the existence of files, using the Get Metadata activity. In our case, there can be folders or files. We only want to process files, and very specific ones at that (i.e. we have one pipeline per filename we can process, as the different filenames contain different columns/datatypes etc.).
Our pipeline looks like this:
Within our Get Metadata component, we basically just filter for the name of the object we want, and we only want files ending in .zip, meaning we added a Filename filter:
In your case, the first part would be 'TestFile-Name-', and the second part would be '*.csv'.
We then have a For Each loop set up, to process anything (the child items) we retrieved in the Get Metadata step. Within the For Each we defined an If Condition to only process files, and not folders.
In our cases, we use the following expression:
@equals(item().type, 'File')
In your case, you could use something like:
@endsWith(item().name, concat(<variable containing your date>, '.csv'))
Assuming all the file names start with TestFile-Name-,
and you want to copy the data of the file with today's date,
use a Get Metadata activity to check whether the file exists; the file name can be dynamic, like
@concat('TestFile-Name-', utcnow(), '.csv')
Note: you need to format utcnow() to the needed format.
If the file exists, then proceed with the copy; otherwise ignore it.
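For example, for names like TestFile-Name-20221120.csv, a sketch of the formatted expression using ADF's formatDateTime function would be:

    @concat('TestFile-Name-', formatDateTime(utcnow(), 'yyyyMMdd'), '.csv')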

How to copy the files from SFTP to target folder created dynamically based on source filename (Blob storage)?

I am new to ADF and need help with 2 scenarios.
1. I have to copy files from SFTP to Blob Storage (Azure Gen2) using ADF. In the source SFTP folder, there are 3-5 different files. For example:
S09353.DB2K.AFC00R46.F201130.txt
S09353.DB2K.XYZ00R46.F201130.txt
S09353.DB2K.GLY00R46.F201130.txt
On copying, these files should be placed under corresponding folders which are created dynamically based on the file type.
For example: S09353.DB2K.AFC00R46.F201130.txt copy to AFC00R46 folder
S09353.DB2K.XYZ00R46.F201130.txt copy to XYZ00R46 folder.
2. Another requirement is to copy csv files from Blob Storage to SFTP. On copying, the files need to be copied to a target folder created dynamically based on the file name:
for example: cust-fin.csv----->copy to--------->Finance folder
Please help me with this.
The basic solution to your problem is to use Parameters in DataSets. This example is for a Blob Storage connection, but the approach is the same for SFTP as well. Another hint: if you are just moving files, use Binary DataSets.
Create Parameter(s) in DataSet
Reference Parameter(s) in DataSet
Supply Parameter(s) in the Pipeline
In this example, I am passing Pipeline parameters to a GetMetadata Activity, but the principles are the same for all DataSet types. The values could also be hard coded, expressions, or variables.
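As a sketch, supplying the parameter values in an activity's dataset reference might look like this in the pipeline JSON (the dataset and parameter names here are illustrative):

    "dataset": {
        "referenceName": "BinaryBlobDataset",
        "type": "DatasetReference",
        "parameters": {
            "container": { "value": "@pipeline().parameters.container", "type": "Expression" },
            "directory": { "value": "@pipeline().parameters.directory", "type": "Expression" }
        }
    }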
Your Process
If you need this to be dynamic for each file name, then you'll probably want to break this into parts:
Use a GetMetadata Activity to list the files from SFTP.
ForEach over the returned list and process each file individually.
Inside the ForEach -> parse each file name to extract the folder name into a variable (see the expression sketch after this list).
Inside the ForEach -> use the variable in a Copy Activity to populate the folder name in the DataSet.
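For file names like S09353.DB2K.AFC00R46.F201130.txt, a sketch of the parsing expression, assuming the folder name is always the third dot-separated segment, could be:

    @split(item().name, '.')[2]

which returns AFC00R46 for the first example file.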

PowerShell only: File name change according to mapping file and then copy to another directory

I have a mapping file, import.CSV, as follows:
Name|Business
Jack|Carpenter
Rose|Secretary
William|Clerk
Now, I have a directory which contains files like
90986883#Jack#Sal#1000.dat
76889992#Rose#Sal#2900.dat
67899279#William#Sal#1900.dat
12793298#Harry#Sal#2500.dat
Please note that #Sal will always be there after the name. I need to pick up these files and put them into another directory; the end result in the second directory should look like:
90986883#Carpenter#Sal#1000.dat
76889992#Secretary#Sal#2900.dat
67899279#Clerk#Sal#1900.dat
Basically, files need to be renamed based on the CSV file, and if the name is not in the mapping file, then no action is required. Please note that the source file should not be changed.
I will appreciate any kind of help.
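A minimal sketch of one way to do this, assuming import.CSV uses | as the delimiter as shown, and with C:\source and C:\target as placeholder directories:

    # Load the Name|Business mapping
    $map = Import-Csv -Path '.\import.CSV' -Delimiter '|'

    Get-ChildItem -Path 'C:\source' -Filter '*.dat' | ForEach-Object {
        $parts = $_.Name -split '#'                          # e.g. 90986883, Jack, Sal, 1000.dat
        $row = $map | Where-Object { $_.Name -eq $parts[1] }
        if ($row) {
            # Swap the name segment for the business and copy; the source file is untouched
            $parts[1] = $row.Business
            Copy-Item -Path $_.FullName -Destination (Join-Path 'C:\target' ($parts -join '#'))
        }
    }

Files whose name segment is not in the mapping (like Harry above) are simply skipped.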

How to parameterise Dataset definition filename in Azure Data Factory

I have a blob storage container folder (source) that gets several csv files. My task is to pick the csv files starting with "file". See an example filename below:
file12345.csv
The numeric part varies every time.
I have set the "fixed" Container and Directory names in the image below, but it seems the File parameter does not accept the wildcard "File*.csv".
How can I pass a wildcard to the Dataset definition?
Thanks
You can't do that operation in the Source dataset.
Just choose the container or folder in the dataset, like below:
Choose the Wildcard file path in the Source settings:
This will help you filter with the filename wildcard "File*.csv".
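For reference, the relevant Copy activity source JSON would look roughly like this (a sketch assuming a delimited text dataset on Blob Storage; the folder name and pattern are placeholders):

    "source": {
        "type": "DelimitedTextSource",
        "storeSettings": {
            "type": "AzureBlobStorageReadSettings",
            "wildcardFolderPath": "sourcefolder",
            "wildcardFileName": "file*.csv"
        }
    }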
Ref: Copy activity properties:
Hope this helps.

access folders and get files

I am starting with MATLAB and would like to know how I could access a folder and get its contents, so I can access the files and read them.
I have a variable in the workspace, tmpfolder, that is equal to 'path to folder', but I can't find how to do dir(tmpfolder), get the files, and browse a file's content to get a string value...
I would start with dir() and fopen().
More generally, try starting at the beginning: Working with Files and Folders.
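For instance, a minimal sketch using dir() and fopen(), assuming tmpfolder holds the folder path from the question:

    % List everything in tmpfolder and read each file as text
    listing = dir(tmpfolder);
    for k = 1:numel(listing)
        if ~listing(k).isdir
            fid = fopen(fullfile(tmpfolder, listing(k).name), 'r');
            txt = fscanf(fid, '%c');   % whole file as one char array
            fclose(fid);
        end
    end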
If you have a JPEG image named myimage.jpg and a text file named mytext.txt in another folder, use:
prefix_image = 'myimage';
prefix_data = 'mytext';
fileformat = '.jpg';
dataformat = '.txt';
folder = 'C:\Users\khaled\Documents\MATLAB\';
% Build each full path from folder + name + extension
% (img avoids shadowing MATLAB's built-in image function; the original
% code also passed fileformat to textread, reading .jpg instead of .txt)
img = imread(strcat(folder, prefix_image, fileformat));
data = textread(strcat(folder, prefix_data, dataformat), '%f');