Load ForEach SQL into single CSV in Azure Data Factory

I have an ADF pipeline where I am executing a stored procedure in a ForEach and using a Copy Data activity to load the output into a CSV file.
On each iteration of the ForEach, the CSV is cleared down and loaded with just that iteration's data.
I need it to preserve the data that is already loaded and append the output of the current iteration, so that the CSV ends up with the full dataset from all iterations.
How can I achieve this? I tried the "Merge Files" option in the sink Copy Behavior, but it doesn't work for SQL-to-CSV copies.

As All About BI mentioned in the other answer, the append behavior you are looking for is not currently supported.
You can raise a feature request from the ADF portal.
Alternatively, you can use the process below to append data to a CSV.
In my repro, I am generating the loop items with a Set Variable activity and passing them to the ForEach activity.
Inside the ForEach activity, a Copy Data activity executes the stored procedure in the source and copies its output to a CSV file.
In the Copy Data activity sink, generate the file name from the current item of the ForEach loop so that each iteration writes to a different file. Also add a constant prefix to the file name so the files can be identified and deleted after they are merged.
File name: @concat('1_sp_sql_data_', string(item()), '.csv')
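Under the hood this usually means the sink dataset exposes a file name parameter that the Copy Data activity fills in per iteration. A rough sketch of the relevant JSON fragments, assuming a DelimitedText sink dataset called SinkCsvDataset on Blob storage with a fileName parameter (all names here are illustrative, not from the original post):

    Sink dataset (DelimitedText), parameterized file name:

        "parameters": { "fileName": { "type": "String" } },
        "typeProperties": {
            "location": {
                "type": "AzureBlobStorageLocation",
                "container": "staging",
                "fileName": { "value": "@dataset().fileName", "type": "Expression" }
            }
        }

    Copy Data activity inside the ForEach, passing the per-iteration name:

        "outputs": [
            {
                "referenceName": "SinkCsvDataset",
                "type": "DatasetReference",
                "parameters": {
                    "fileName": "@concat('1_sp_sql_data_', string(item()), '.csv')"
                }
            }
        ]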
Add another Copy Data activity after the ForEach activity to combine the files from all iterations into a single file. Here I am using a wildcard path (*) to pick up every file in the folder.
In the sink, set the destination file name and set the copy behavior to "Merge files" so all the source data is copied into a single sink file.
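A minimal sketch of this merge Copy activity's typeProperties, assuming Blob storage and a DelimitedText format (folder name and store-settings types are illustrative and may differ for your storage type):

    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "wildcardFolderPath": "staging",
                "wildcardFileName": "*"
            }
        },
        "sink": {
            "type": "DelimitedTextSink",
            "storeSettings": {
                "type": "AzureBlobStorageWriteSettings",
                "copyBehavior": "MergeFiles"
            }
        }
    }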
After the merge, the data is in a single file, but the per-iteration files are not deleted, so the next time the pipeline runs there is a chance the old files will be merged in with the new ones.
• To avoid this, add a Delete activity to remove the files generated inside the ForEach activity.
• Because the files were created with a constant prefix, they are easy to delete by name (delete every file that starts with "1_").
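A sketch of that Delete activity, assuming a dataset pointing at the same staging folder (names are illustrative and exact property names may vary slightly by connector):

    {
        "name": "DeleteIterationFiles",
        "type": "Delete",
        "typeProperties": {
            "dataset": { "referenceName": "StagingCsvDataset", "type": "DatasetReference" },
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "recursive": false,
                "wildcardFileName": "1_*.csv"
            },
            "enableLogging": false
        }
    }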

You could try this:
Load each iteration's data to a separate CSV file.
Later, union or merge them all.
Right now we don't have the ability to append rows to an existing CSV.

Related

Move Entire Azure Data Lake Folders using Data Factory?

I'm currently using Azure Data Factory to load flat file data from our Gen 2 data lake into Synapse database tables. Unfortunately, we receive (many) thousands of files into timestamped folders for each feed. I'm currently using Synapse external tables to copy this data into standard heap tables.
Since each folder contains so many files, I'd like to move (or Copy/Delete) the entire folder (after processing) somewhere else in the lake. Is there some practical way to do that with Azure Data Factory?
Yes, you can use a Copy activity with a wildcard. I reproduced this in my environment and got the results below.
First, add the source dataset and use a wildcard path with the folder name; in my scenario the folder is named pool.
Then select the sink dataset with the destination file path.
The pipeline run succeeds and transfers the files from one location to the other with the required names.
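A rough sketch of the Copy activity typeProperties for this, assuming an ADLS Gen2 store and a Binary dataset so the files are copied unchanged (the folder name pool comes from the answer above; everything else is illustrative). To turn the copy into a move, a Delete activity on the source folder can follow the Copy activity:

    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "AzureBlobFSReadSettings",
                "recursive": true,
                "wildcardFolderPath": "pool",
                "wildcardFileName": "*"
            }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": {
                "type": "AzureBlobFSWriteSettings",
                "copyBehavior": "PreserveHierarchy"
            }
        }
    }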

ADF Data Flow creates empty file

I have set up a very basic data transformation using a Data Flow. I'm taking in a CSV file, modifying one of the columns, and writing to a new CSV file in an "output" directory. I noticed that after the pipeline runs, not only does it create the output folder, it also creates an empty file with the same name as the output folder.
Did I set something up wrong, or is this empty file normal?
[Screenshots: sink settings (Sink, Settings, Mapping, Optimize tabs) and the resulting files in storage]
Thank you,
Scott

How do I capture and write Copy activity failures to a table or file?

I have a Copy activity that copies data from one source to another. Some rows in the source don't fit into the sink because of structure or data types, so they correctly fail.
How do I capture these failed rows and write them to a table or file?
The Copy activity in ADF has a logging option (the 'enableCopyActivityLog' property) that handles this by writing the skipped rows to log files (see the sketch after the link below).
https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-log
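For reference, a minimal sketch of how this might look inside the Copy activity's typeProperties, assuming a Blob/ADLS linked service for the log destination (names and path are illustrative; see the doc above for the authoritative schema). Enabling skipping of incompatible rows together with warning-level logging is what captures the rejected rows in the log files:

    "typeProperties": {
        "enableSkipIncompatibleRow": true,
        "logSettings": {
            "enableCopyActivityLog": true,
            "copyActivityLogSettings": {
                "logLevel": "Warning",
                "enableReliableLogging": false
            },
            "logLocationSettings": {
                "linkedServiceName": {
                    "referenceName": "LogStorageLinkedService",
                    "type": "LinkedServiceReference"
                },
                "path": "copyactivitylogs/"
            }
        }
    }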

Invalid data stored in Delimited Text after running ADF Data Flow

I'm trying to store data from an input into a CSV file in blob storage via an ADF data flow. The pipeline ran successfully; however, on checking the CSV file, I see some invalid data included. Here are the settings of the Delimited Text dataset and the sink. Please let me know what I am missing.
I tested and reproduced the error.
The error is caused by the CSV files in the csv/test folder having different schemas.
Even if the pipeline runs without error, the data in the single output file will still be wrong.
In Data Factory, when we merge multiple files into one, or copy data from multiple files into a single file, the files in the folder must have the same schema.
Note that you should use wildcard paths to filter just the intended CSV files (see the sketch below).
For example, I have two CSV files with the same schema in the container.
Source dataset preview: only if the preview of the source data is correct will the output file also be correct.
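In a mapping data flow, the wildcard goes in the source's wildcard paths option. A minimal sketch of the resulting data flow script for the source, assuming the files live under csv/test (the path and stream name are illustrative):

    source(allowSchemaDrift: true,
        validateSchema: false,
        wildcardPaths:['csv/test/*.csv']) ~> sourceCsv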

Parse multiple json files in one activity

I would like to use data factory to regularly download 500000 json files from a web API and store them in a blob storage container. Then I need to parse the json files to extract some values from each file and store these values together with an ID (part of filename) in a database. I can do this using a ForEach activity and run a custom activity for each file, but this is very slow, so I would prefer some batch activity which could run the same parsing code on each file. Is there some way to do this?
If your source JSON files have the same schema, you can leverage the Copy activity, which can parse all of those files in a single run (a rough sketch is below). If possible, though, I would suggest splitting the files into different subfolders (e.g. 1,000 files per folder) so that each copy run takes less time and is easier to manage.
Refer to this doc for more details: https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview
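A rough sketch of what the Copy activity's typeProperties could look like for one such batch, assuming the JSON files sit in a blob/ADLS folder and the sink is an Azure SQL table; the additional-columns option with the reserved $$FILEPATH value is one way to carry the file path through so the ID can later be derived from it (all names, paths, and columns here are illustrative):

    "typeProperties": {
        "source": {
            "type": "JsonSource",
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "wildcardFolderPath": "json/batch_0001",
                "wildcardFileName": "*.json"
            },
            "additionalColumns": [
                { "name": "SourceFilePath", "value": "$$FILEPATH" }
            ]
        },
        "sink": { "type": "AzureSqlSink" },
        "translator": {
            "type": "TabularTranslator",
            "mappings": [
                { "source": { "path": "$.someValue" }, "sink": { "name": "SomeValue" } },
                { "source": { "name": "SourceFilePath" }, "sink": { "name": "SourceFilePath" } }
            ]
        }
    }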