cloudformation package uploading hash instead of zip - aws-cloudformation

I have a serverless API that I'm trying to upload to CloudFormation, and I'm having some issues. According to the docs here:
For example, if your AWS Lambda function source code is in the /home/user/code/lambdafunction/ folder, specify CodeUri: /home/user/code/lambdafunction for the AWS::Serverless::Function resource. The command returns a template and replaces the local path with the S3 location: CodeUri: s3://mybucket/lambdafunction.zip.
I'm using a relative path (I've tried an absolute path as well), so I have CodeUri: ./ instead of /user/libs/code/functionDirectory/. When I package the files, it looks like an object named with a hash is being uploaded to S3, but it doesn't seem to be a zip (when I try to download it, my computer doesn't recognize the file type).
Is this expected? I was expecting a .zip file to be uploaded. Am I completely missing something here?
Thanks for any help.
Walker

Yes, it is expected. When you use CodeUri, the files are zipped and stored in S3 under a hash-named key; the object can be extracted with the unzip command or any other archive utility.
> file 009aebc05d33e5dddf9b9570e7ee45af
009aebc05d33e5dddf9b9570e7ee45af: Zip archive data, at least v2.0 to extract
> unzip 009aebc05d33e5dddf9b9570e7ee45af
Archive: 009aebc05d33e5dddf9b9570e7ee45af
replace AWSSDK.SQS.dll? [y]es, [n]o, [A]ll, [N]one, [r]ename:
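For reference, the packaging step and a manual check of the uploaded artifact might look like the sketch below. The template file name and bucket are illustrative, and the hash-named key is the one from the example above; it will differ per build.
# package: zips the CodeUri contents, uploads the zip, and rewrites CodeUri to the S3 location
aws cloudformation package --template-file template.yaml --s3-bucket mybucket --output-template-file packaged.yaml
# pull the artifact back down with a .zip extension, then list its contents
aws s3 cp s3://mybucket/009aebc05d33e5dddf9b9570e7ee45af lambdafunction.zip
unzip -l lambdafunction.zip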

Related

AzCopy ignore if source file is older

Is there an option to handle the following situation?
I have a pipeline with a Copy Files task that uploads a static HTML file from git to blob storage. Everything works perfectly. But sometimes I need to change this file in blob storage (using the hosted application's tools). So the question is: can I detect whether my git file is older than the target blob and have the copy task skip that file, leaving the blob untouched? My initial idea was to use the Azure file copy task and its "Optional Arguments" textbox. However, I couldn't find the required option in the documentation. Does it allow such things, or should this case be handled some other way?
I think you're looking for the ifSourceNewer value for the --overwrite option.
--overwrite string Overwrite the conflicting files and blobs at the destination if this flag is set to true. (default true) Possible values include true, false, prompt, and ifSourceNewer.
More info: azcopy copy - Options
Agree with ickvdbosch. The ifSourceNewer value for the --overwrite option could meet your requirements.
error: couldn't parse "ifSourceNewer" into a "OverwriteOption"
Based on my test, I could reproduce this issue in the Azure file copy task.
It seems that the ifSourceNewer value cannot be set for the Overwrite option in the Azure file copy task.
Workaround: you could use a PowerShell task to run an azcopy script that uploads the files with --overwrite=ifSourceNewer.
For example:
azcopy copy "filepath" "BlobURLwithSASToken" --overwrite=ifSourceNewer --recursive
For more detailed info, you could refer to this doc.
For the issue with the Azure file copy task, I suggest that you submit a feedback ticket via the following link: Report task issues.
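Applied to the scenario in the question (a single static HTML file pushed from the repo to blob storage), the call inside a pipeline script step might look roughly like this; the local path, storage account, container, and SAS placeholder are all assumptions:
# skip the upload when the blob in storage is newer than the file in the repo
azcopy copy "$(Build.SourcesDirectory)/site/index.html" "https://<account>.blob.core.windows.net/<container>/index.html?<SAS>" --overwrite=ifSourceNewer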

AZCopy Copy results in 404 blob does not exist

When running the AzCopy copy command to get 2 pictures from a blob container/folder, it results in a 404 blob not found error. The error does not occur if I specify the filename (that downloads the folder structure with the file in it).
I have tested 3 different versions of the code but cannot get a recursive version to work.
Examples that do not work:
azcopy copy "https://bloburl.blob.core.windows.net/Container/Folder/*?SASKey" "C:\Users\MyFolder\Pictures"
azcopy copy "https://bloburl.blob.core.windows.net/Container/Folder?SASKey" "C:\Users\MyFolder\Pictures"
Example that works:
azcopy copy "https://bloburl.blob.core.windows.net/Container/Folder/UNSC_Infinity.jpg?SASKey" "C:\Users\MyFolder\Pictures"
My goal is to download all files in the blob container/folder and not the structure itself.
SAS URI: https://s00aops01stg01blkbsa.blob.core.windows.net/5740-christianmimms?st=2019-06-03T17%3A09%3A34Z&se=2020-06-04T17%3A09%3A00Z&sp=rwl&sv=2018-03-28&sr=c&sig=3HyCaMQ1JCkb4Yof%2BOWExx8amHtPTmZHpEZbZPX8Iqs%3D
SAS URI w/Folder: https://s00aops01stg01blkbsa.blob.core.windows.net/5740-christianmimms/images/*?st=2019-06-03T17%3A09%3A34Z&se=2020-06-04T17%3A09%3A00Z&sp=rwl&sv=2018-03-28&sr=c&sig=3HyCaMQ1JCkb4Yof%2BOWExx8amHtPTmZHpEZbZPX8Iqs%3D
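For reference, the form that usually works for pulling down everything under a virtual directory is the folder URL (no wildcard) combined with --recursive. The sketch below reuses the question's placeholder names; note that AzCopy recreates a Folder subdirectory beneath the destination rather than dropping the files directly into it.
# download every blob under the virtual directory; the SAS needs list permission (the sp=rwl token above includes it)
azcopy copy "https://bloburl.blob.core.windows.net/Container/Folder?SASKey" "C:\Users\MyFolder\Pictures" --recursive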

AWS S3, Deleting files from local directory after upload

I have backup files in different directories on one drive. Files in those directories can be quite big, up to 800 GB or so. So I have a batch file with a set of scripts that upload/sync the files to S3.
See example below:
aws s3 sync R:\DB_Backups3\System s3://usa-daily/System/ --exclude "*" --include "*/*/Diff/*"
The upload time can vary but so far so good.
My question is: how do I edit the script, or create a new one, so that it checks in the S3 bucket that the files have been uploaded, and ONLY if they have been uploaded deletes them from the local drive; if not, it leaves them on the drive?
(Ideally it would check each file.)
I'm not aware of an aws s3 or aws cli command that can do that. Please let me know if I made myself clear or if you need more details.
Any help will be very much appreciated.
Best would be to use mv with the --recursive parameter for multiple files.
When passed with the parameter --recursive, the following mv command recursively moves all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg:
aws s3 mv myDir s3://mybucket/ --recursive --exclude "*.jpg"
Output:
move: myDir/test1.txt to s3://mybucket/test1.txt
Hope this helps.
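Adapting that to the sync command from the question would presumably look like the line below (untested); aws s3 mv deletes each local file only after its copy to S3 has completed, which covers the "delete only if uploaded" requirement:
rem same filters as the original sync command, but each file is removed locally once its upload succeeds
aws s3 mv R:\DB_Backups3\System s3://usa-daily/System/ --recursive --exclude "*" --include "*/*/Diff/*"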
As the answer by @ketan shows, the Amazon aws client cannot do a batch move.
You can use the WinSCP put -delete command instead:
winscp.com /log=S3.log /ini=nul /command ^
"open s3://S3KEY:S3SECRET#s3.amazonaws.com/" ^
"put -delete C:\local\path\* /bucket/" ^
"exit"
You need to URL-encode special characters in the credentials. The WinSCP GUI can generate an S3 script template, like the one above, for you.
Alternatively, since WinSCP 5.19, you can use -username and -password switches, which do not need any encoding:
"open s3://s3.amazonaws.com/ -username=S3KEY -password=S3SECRET" ^
(I'm the author of WinSCP)
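Putting the 5.19+ switches into the full script from above would then look like this (same placeholder paths and credentials):
rem upload everything under C:\local\path and delete the local copies that uploaded successfully
winscp.com /log=S3.log /ini=nul /command ^
"open s3://s3.amazonaws.com/ -username=S3KEY -password=S3SECRET" ^
"put -delete C:\local\path\* /bucket/" ^
"exit"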

How to extract a .gz file that contains a folder with a .txt extension?

I'm currently stuck on a problem where my .gz file is "some_name.txt.gz" (the .gz is not visible, but it can be recognized with File::Type functions),
and inside the .gz file there is a FOLDER named "some_name.txt", which contains other files and folders.
However, when calling the extract function from Archive::Extract, I am not able to extract the archive the way a manual extraction would (the folder named "some_name.txt" extracted along with its contents); it just extracts "some_name.txt" as a single .txt file.
I've been searching the web for answers, but none are correct solutions. Is there a way around this?
From the Archive::Extract official docs:
"Since .gz files never hold a directory, but only a single file;"
I would recommend using tar on the folder and then gzipping it.
That way you can use Archive::Tar to easily extract a specific file.
Example from the official docs:
$tar->extract_file( $file, [$extract_path] )
Write an entry, whose name is equivalent to the file name provided to disk. Optionally takes a second parameter, which is the full native path (including filename) the entry will be written to.
For example:
$tar->extract_file( 'name/in/archive', 'name/i/want/to/give/it' );
$tar->extract_file( $at_file_object, 'name/i/want/to/give/it' );
Returns true on success, false on failure.
Hope this helps.
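Creating that kind of archive from the shell might look like the sketch below; the folder name is just illustrative.
# bundle the folder into a single tar archive and gzip it in one step
tar -czf some_name.tar.gz some_name/
# list the archive contents without extracting, to confirm the layout
tar -tzf some_name.tar.gz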
Maybe you can identify these files with File::Type, rename them with a .gz extension instead of .txt, and then try Archive::Extract on them?
A gzip file can only contain a single file. If you have an archive that contains a folder plus multiple other files and folders, then you may have a gzip file that wraps a tar file. Alternatively, you may have a zip file.
Can you give more details on how the archive file was created and a listing of its contents?
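A quick way to check what the file actually is, assuming a shell is available (file names are the question's examples):
# report the real format regardless of the extension
file some_name.txt.gz
# if it is a gzip-compressed tar, this lists the entries; a plain zip would need unzip -l instead
tar -tzf some_name.txt.gz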

How to partially extract a folder from a 7z file using powershell

I'm trying to automate the install of my platform. I've made a script that compresses the build deployables into a 7-Zip file.
Now I need to partially extract some folders to a specific destination.
Package
-app1
--folder11
---folder111
--folder12
-app2
--folder21
--folder22
...
I need to create a PowerShell script to extract the content of 'app1' to a destination folder.
I've been trying to use the following command, but the result is not what I expected.
I've been receiving the full path and not the content of folder11, recursively.
Set-Alias zip $ZipCommand
zip x $FilePath app1\folder11 -oc:DeployableFolder -r
Any ideas? Suggestions?
Thanks.
I tried and had no issue.
set-alias zip "c:\Program Files\outils\7-Zip\7z.exe"
zip x program.7z python-core-2.6.1\lib -oc:\data
I eventually got a c:\data\python-core-2.6.1 folder that only contains the lib folder with all its subfolders and files.
The only difference I see is the backslash \ in the output path.
HTH
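If the backslash is indeed the difference, the original command would presumably just need the output path written with it (untested):
# same extraction, with an absolute output path after -o
Set-Alias zip $ZipCommand
zip x $FilePath app1\folder11 -oC:\DeployableFolder -r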