AzCopy ignore if source file is older - azure-devops

Is there an option to handle the following situation:
I have a pipeline with a Copy Files task in it; it is used to upload a static html file from git to blob storage. Everything works perfectly. But sometimes I need to change this file directly in the blob storage (using hosted application tools). So, the question is: can I detect that my file in git is older than the target blob and have the copy task skip it, leaving the blob untouched? My initial idea was to use the Azure File Copy task and its "Optional Arguments" textbox. However, I couldn't find the required option in the documentation. Does it allow such things? Or should this case be handled some other way?

I think you're looking for the ifSourceNewer value for the --overwrite option.
--overwrite string Overwrite the conflicting files and blobs at the destination if this flag is set to true. (default true) Possible values include true, false, prompt, and ifSourceNewer.
More info: azcopy copy - Options

Agree with ickvdbosch. The ifSourceNewer value for the --overwrite option could meet your requirements.
error: couldn't parse "ifSourceNewer" into a "OverwriteOption"
Based on my test, I could reproduce this issue in the Azure File Copy task.
It seems that the ifSourceNewer value can't be passed to the Overwrite option in the Azure File Copy task.
Workaround: you could use a PowerShell task to run an azcopy command that uploads the files with --overwrite=ifSourceNewer.
For example:
azcopy copy "filepath" "BlobURLwithSASToken" --overwrite=ifSourceNewer --recursive
For more detailed info, you could refer to this doc.
For the issue about the Azure File copy task, I suggest that you could submit a feedback ticket in the following link: Report task issues.
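For reference, a minimal sketch of what that PowerShell task could run; the local path, storage account, container and SAS token below are placeholders, not values from the question:
# Sketch of an inline PowerShell script for the workaround (placeholder names/paths, replace with your own)
$source = "$env:BUILD_SOURCESDIRECTORY\static"
$dest   = "https://mystorageaccount.blob.core.windows.net/mycontainer?<SAS-token>"
# Uploads only the files that are newer locally than the matching blobs
azcopy copy $source $dest --overwrite=ifSourceNewer --recursive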

Related

Download of an email attachment on Jenkins

I have been testing the Jenkins "Poll Mail Trigger" plugin to trigger a project via email. To develop the project I need to download the attached file that is sent.
In the project settings, I enable the only download option that exists: "Download to a timestamped Directory".
Also, the only information regarding the download is this: "The Attachments field allows you to determine what to do with email attachments. If attachments are downloaded, the pmt_attachmentsDirectory variable will be set to the download directory."
Then, after doing several tests without configuring any directory for the download of the attached file (assuming that the download should have been automatic), I proceeded to change the environment variables in the Jenkins configuration and put the following:
Name: pmt_attachmentsDirectory
Value: D:\
So, what I'm trying to do is save the file to my drive D:\
The file is never downloaded to the specified disk, and just for verification, I am trying to move the file with the following Windows command:
move "% pmt_attachmentsDirectory%" "D: \ Users \ wrodriguez \ Downloads \ Test_Programs"
And finally it throws the following output: "The system cannot find the specified file."
So, I am really lost here, can anybody help me with this issue? Is there another plugin on Jenkins that I can use to download an attachment?
Thanks for your help.
I haven't worked on that plugin (or Jenkins) in 5 years, so forgive my rustiness.
Looking at the code, the plugin creates a timestamped folder, and saves any attachments there.
It then injects an environment variable named pmt_attachmentsDirectory into the Jenkins job instance's environment variables - the value is the absolute path of the folder (it's not configurable).
Troubleshooting:
have you enabled saving Attachments in the Plugin options?
check the job instance's environment variables - does it have pmt_attachmentsDirectory listed?
print the value of pmt_attachmentsDirectory - e.g. echo "%pmt_attachmentsDirectory%" - what does it say?
does the directory exist?
does the directory contain any files?
does the "View Polling Log" have any errors?
do the Jenkins logs have any errors?
Other notes:
"% pmt_attachmentsDirectory%" - shouldn't have a space e.g. should be "%pmt_attachmentsDirectory%"

How to delete specific files from the source folder using the Delete task in Azure DevOps

I am trying to add a task to delete files of a specific type from the source folder and all its subfolders using the Delete task in an Azure DevOps pipeline.
Delete task:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/delete-files?view=azure-devops doesn't seem to provide any information on the patterns.
I have tried the following combinations, but none of them worked.
(.xyz)
*.xyz
*.xyz\
My expectation is to delete files of type .xyz from all the subfolders.
Try setting:
**/*.xyz
as the value of the Contents field.
The full range of pattern filters is described in the documentation here.
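If the pattern still doesn't behave as expected, a script-based fallback (a sketch for a PowerShell task, not the Delete task itself; the source folder comes from the agent's predefined variable) would be:
# Recursively delete all .xyz files under the sources directory (fallback sketch, not the Delete task)
Get-ChildItem -Path $env:BUILD_SOURCESDIRECTORY -Filter '*.xyz' -Recurse -File | Remove-Item -Force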

How do I clean a blob before copying to it (Azure release pipeline)

My release pipeline uploads to blob storage using
Azure File Copy
I want to delete the existing files in the blob before copying over the new files.
The help shows that
cleanTargetBeforeCopy: false
only applies to a VM
(since it is a release pipeline I can't edit the YAML anyway)
The tool tip for Optional Arguments shows
Optional AzCopy.exe arguments that will be applied when uploading to
blob like, /NC:10. If no optional arguments are specified here, the
following optional arguments will be added by default. /Y,
/SetContentType, /Z, /V, /S (only if container name is not $root),
/BlobType:page (only if specified storage account is a premium
account). If source path is a file, /Pattern will always be added
irrespective of whether or not you have specified optional arguments.
I want to delete the existing files in the blob before copying over the new files.
If you just want the blobs to be overwritten when the copy files task runs, there is no need to add any extra optional argument.
As you mentioned, if we don't add optional arguments, the /Y parameter is added by default.
The blobs will be replaced by the new files by default when the Azure File Copy task runs.
If you want to clean the container, you could use Azure PowerShell commands to delete the container and recreate it before running the Azure File Copy task.
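A rough sketch of what that Azure PowerShell step could look like (the account name, key and container name are placeholders):
# Sketch: clear out the target container before the Azure File Copy task runs (placeholder names)
$storageKey = "<storage-account-key>"
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $storageKey
# Option 1: delete every blob but keep the container
Get-AzStorageBlob -Container "mycontainer" -Context $ctx | Remove-AzStorageBlob
# Option 2: drop and recreate the container itself (deletion can take a moment to complete)
# Remove-AzStorageContainer -Name "mycontainer" -Context $ctx -Force
# New-AzStorageContainer -Name "mycontainer" -Context $ctx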

AEM: Issue using Command Line DAM Workflow

I would like to execute a command line program as a DAM workflow. I tried to implement the ImageMagick example from here: Best Practices for Configuring ImageMagick:
I added a new Workflow Model,
added "command line" from the "DAM Workflow" list.
In the Arguments tab I set the MIME type to "image/jpeg" (I even tried without a MIME type)
and in Commands: "C:\Program Files\ImageMagick-7.0.7-Q16\magick.exe" convert ${file} -flip ${file}-flipped.jpg (instead of magick convert ..., because in another discussion using an absolute path instead of the global name helped people: Re: CommmandLineProcess : ImageMagick)
I then added a launcher and uploaded an image to the DAM.
In the Workflow > Instances overview, I can see that the workflow was started, it's running, and the command line step is set to active.
Unfortunately this state never changes and no new asset is generated via ImageMagick.
I even tried replacing the command with something simple like "ren C:\test\foo.txt bar.txt", which renames a local file. The change never happened either.
My question is: what am I doing wrong, and how can I debug this / find the command output? In \crx-quickstart\logs I couldn't find any logs regarding CommandLineProcess.
Thanks

Azure Storage: use AzCopy.exe to copy a folder from blob storage to another storage account

Using AzCopy.exe, I am able to copy over an entire container successfully. However, I cannot figure out how to copy over a blob where the name includes a folder structure. I have tried the following:
.\AzCopy.exe /Source:https://sourceaccount.blob.core.windows.net/container /Dest:https://destaccount.blob.core.windows.net/container /SourceKey:sourceKey== /DestKey:destKey== /S /Pattern:CorruptZips/2013/6
While also changing the /Pattern: to things like:
/Pattern:CorruptZips/2013/6/*
/Pattern:CorruptZips/2013/6/.
/Pattern:CorruptZips/2013/6/
And everything just says that there are zero records copied. Can this be done or is it just for container/file copying? Thank you.
@naspinski, there is another tool, Azure Data Factory, which can help copy a folder from one blob storage account to another. Please refer to the article Move data to and from Azure Blob using Azure Data Factory and follow the steps below.
Create a Data Factory in the Azure portal.
Click the Copy Data button to open the Copy Data tool, and follow the prompts to copy the folder step by step.
Took me a few tries to get this. Here is the key:
If the specified source is a blob container or virtual directory, then
wildcards are not applied.
In other words, you can't wildcard copy files nested in a folder structure in a container. You have two options:
Use /S WITHOUT a pattern to recursively copy everything
Use /S and specify the full file path in your pattern without a wildcard
Example:
C:\Users\myuser>azcopy /Source:https://source.blob.core.windows.net/system /Dest:https://dest.blob.core.windows.net/system /SourceKey:abc /DestKey:xyz /S /V /Pattern:"Microsoft.Compute/Images/vmimage/myimage.vhd"
EDIT: Oops, my answer was worded incorrectly!
Please specify the command without /S:
AzCopy /Source:https://myaccount.blob.core.windows.net/mycontainer1 /Dest:https://myaccount.blob.core.windows.net/mycontainer2 /SourceKey:key /DestKey:key /Pattern:abc.txt
You can find this information under "Copy single blob within Storage account" at http://aka.ms/azcopy .