How to delete specific files from the source folder using the Delete Files task in Azure DevOps

I am trying to add a task that deletes files of a specific type from the source folder and all of its subfolders, using the Delete Files task in an Azure DevOps pipeline.
The Delete Files task documentation:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/delete-files?view=azure-devops doesn't seem to provide any information on the patterns.
I have tried the following combinations, but none of them worked:
(.xyz)
(.xyz)
*.xyz
*.xyz\
My expectation is to delete the files of type .xyz from all the subfolders.

Try setting:
**/*.xyz
as the value of the Contents input.
The full range of pattern filters is described in the documentation here.
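If it helps to see what that pattern covers, here is a rough local PowerShell equivalent of the task's behaviour, assuming the source folder is the checked-out repository at $(Build.SourcesDirectory); this is only an illustration of the pattern's scope, the Delete Files task itself does the deletion:

# Rough local equivalent of the Delete Files task with Contents set to **/*.xyz:
# remove every .xyz file in the source folder and all of its subfolders.
Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Filter '*.xyz' -Recurse -File |
    Remove-Item -Force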

Related

How to delete sub folder using azure data factory delete activity?

I am able to delete files, or delete entire folders by selecting a file path, but I am unable to delete a sub-folder/directory.
dataset image
pipeline image
Deleting files is working for me, but I am unable to delete the empty folders (test1, test2).
Could anyone help me?
You need to check Delete file recursively in the Delete activity's Source settings.
Delete file recursively - Delete all files in the input folder and its subfolders recursively, or just the ones in the selected folder. This setting is disabled when a single file is selected.
For more information refer to this link.
Edit -
Known limitation
The Delete activity does not support deleting a list of folders described by a wildcard.
When using the file attribute filters modifiedDatetimeStart and modifiedDatetimeEnd in the Delete activity to select the files to be deleted, make sure to set "wildcardFileName": "*" in the Delete activity as well.

Azure data factory File Content Replace in the Azure blob

Good morning,
We have Azure Data Factory (ADF). We have two files that we want to merge into one. The files are currently in Azure Blob storage. Below are the contents of the files. We are trying to take the contents of File2.txt and use them to replace the '***' in File1.txt. When finished, it should look like File3.txt.
File1.txt
OP01PAMTXXXX01997
***
CL9900161313
File2.txt
ZCBP04178 2017052520220525
NENTA2340 2015033020220330
NFF232174 2015052720220527
File3.txt
OP01PAMTXXXX01997
ZCBP04178 2017052520220525
NENTA2340 2015033020220330
NFF232174 2015052720220527
CL9900161313
Does anyone know how we can do this? I have been working with this for 2 days and it would seem that this should not be a difficult thing to do.
All the best,
George
You can merge two or more files using ADF, but I can't see a way to merge with a condition or to control how the files are merged, so what I recommend is to use an Azure Function and do the merge programmatically.
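To give an idea of what the programmatic merge could look like (for instance in a PowerShell Azure Function or script step), here is a minimal sketch; the file names and the '***' placeholder come from the question, local paths are assumed, and downloading from / uploading to blob storage is left out:

# Minimal sketch: replace the '***' placeholder line in File1.txt with the
# full contents of File2.txt and write the result to File3.txt.
$file1 = Get-Content -Path 'File1.txt'
$file2 = Get-Content -Path 'File2.txt'
$merged = $file1 | ForEach-Object {
    if ($_.Trim() -eq '***') { $file2 } else { $_ }
}
Set-Content -Path 'File3.txt' -Value $merged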
If you want to know how to merge files without preserving line order, use my approach:
create a pipeline
add a "Copy activity"
in the Copy activity use these basic settings:
in Source -> choose Wildcard file path (select the folder the files are located in), and make sure to write "*" as the file name in the wildcard path; this guarantees that all files under the same folder are chosen.
This will merge all the files under the same folder.
in Sink -> make sure to select Merge Files mode under Copy behavior.
Output:

AzCopy ignore if source file is older

Is there an option to handle the following situation:
I have a pipeline with a Copy Files task implemented in it; it is used to upload a static html file from git to blob storage. Everything works perfectly. But sometimes I need this file to be changed in the blob storage (using hosted application tools). So, the question is: can I "detect" whether my git file is older than the target blob file and have the copy task ignore this file so it is left untouched? My initial idea was to use Azure File Copy and its "Optional Arguments" textbox. However, I couldn't find the required option in the documentation. Does it allow such things? Or should this case be handled some other way?
I think you're looking for the ifSourceNewer value for the --overwrite option.
--overwrite string Overwrite the conflicting files and blobs at the destination if this flag is set to true. (default true) Possible values include true, false, prompt, and ifSourceNewer.
More info: azcopy copy - Options
Agree with ickvdbosch. The ifSourceNewer value for the --overwrite option could meet your requirements.
error: couldn't parse "ifSourceNewer" into a "OverwriteOption"
Based on my test, I could reproduce this issue in the Azure File Copy task.
It seems that the ifSourceNewer value can't be set for the --overwrite option in the Azure File Copy task.
Workaround: you could use a PowerShell task to run an azcopy script that uploads the files with --overwrite=ifSourceNewer.
For example:
azcopy copy "filepath" "BlobURLwithSASToken" --overwrite=ifSourceNewer --recursive
For more detailed info, you could refer to this doc.
For the issue with the Azure File Copy task, I suggest you submit a feedback ticket at the following link: Report task issues.

How do I clean a blob before copying to it (Azure release pipeline)

My release pipeline uploads to blob storage using
Azure File Copy
I want to delete the existing files in the blob before copying over the new files.
The help shows that
cleanTargetBeforeCopy: false
only applies to a VM
(since it is a release pipeline I can't edit the YAML anyway)
The tool tip for Optional Arguments shows
Optional AzCopy.exe arguments that will be applied when uploading to
blob like, /NC:10. If no optional arguments are specified here, the
following optional arguments will be added by default. /Y,
/SetContentType, /Z, /V, /S (only if container name is not $root),
/BlobType:page (only if specified storage account is a premium
account). If source path is a file, /Pattern will always be added
irrespective of whether or not you have specified optional arguments.
I want to delete the existing files in the blob before copying over the new files.
If you just want to overwrite the blobs when the copy task runs, you don't need to add any extra optional argument at all.
As you mentioned, if we don't add optional arguments, the /Y parameter is added by default.
So the blobs are replaced by the new files by default when the Azure File Copy task runs.
If you want to clean the container, you could use Azure PowerShell commands to delete the container and recreate it before running the Azure File Copy task.
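A minimal sketch of such an Azure PowerShell step, assuming placeholder resource group, storage account, and container names, could look like this:

# Sketch: drop and recreate the target container before running the copy task.
$ctx = (Get-AzStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageacct').Context
Remove-AzStorageContainer -Name 'mycontainer' -Context $ctx -Force
# Container deletion is asynchronous, so recreating it with the same name can
# fail for a short while; a wait/retry may be needed here.
New-AzStorageContainer -Name 'mycontainer' -Context $ctx -Permission Off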

Tagging Azure Resources from .csv

Is there an easy way to read a .csv in a VSTS pipeline from a PowerShell script?
I have a script that can tag Azure resources, and it gets the key-value pairs from a .csv file. It works like a charm when running it locally with:
$csv = Import-Csv "d:\tagging\tags.csv"
But I'm struggling to find a way to reference the .csv in VSTS (Devops Services). I've put the .csv with the script in the same repo/folder, and I've created an Azure PowerShell script task.
I need to know what the Import-Csv should look like if it's in VSTS. Do I need to add additional steps so that the agent downloads the .csv when running the script?
This is the current error:
The hosted agent can't find the file and reports "Could not find file 'D:\a_tasks\AzurePowerShell_72s1a1931b-effb-4d2e-8fd8-f8472a07cb62\3.1.6\tags.csv'."
Let's say you put the file in your repo in the location /AwesomeCSV/MyCSV.csv. Your CSV's location, from a build perspective, would be $(Build.SourcesDirectory)/AwesomeCSV/MyCSV.csv.
So basically, pass in $(Build.SourcesDirectory)/AwesomeCSV/MyCSV.csv to the script as an argument, or reference it as an environment variable in your script as $env:BUILD_SOURCESDIRECTORY.
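For example, a minimal sketch of the script side, assuming the /AwesomeCSV/MyCSV.csv location above and hypothetical Key/Value column names:

# Sketch: resolve the CSV relative to the checked-out sources on the build agent.
$csvPath = Join-Path $env:BUILD_SOURCESDIRECTORY 'AwesomeCSV/MyCSV.csv'
$csv = Import-Csv -Path $csvPath
foreach ($row in $csv) {
    # 'Key' and 'Value' are hypothetical column names - use whatever headers your CSV has.
    Write-Output "$($row.Key) = $($row.Value)"
}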