Can I update a Zip file inside AWS S3 with PowerShell?

I recently learned how to modify the contents of a Zip file with PowerShell in order to update it without having to recreate it completely, but now I have to do the same on AWS S3 with the AWS.Tools.S3 PowerShell module, and I'm afraid it won't be possible.
My goal is to have two scripts. The first archives files on a file server and creates monthly and yearly Zips; this script is done and working.
The second script copies those monthly and yearly Zips to Amazon S3.
However, there is a small chance that a previously archived folder will be modified while it is still accessible, so I have to update any Zips that have already been created for it.
These folders and Zips are very large, and recreating them from scratch is not an option, so I have set up an optimized update process for the Zips on the file server. I would like to know whether the same is possible with the AWS.Tools.S3 PowerShell module.
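For context, this is roughly what such an in-place update looks like with the .NET System.IO.Compression classes (a sketch only; the actual script is not shown here, so the paths and entry names are hypothetical):

# Sketch: refresh a single entry inside an existing Zip without
# rebuilding the whole archive. Paths and entry names are hypothetical.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$zipPath    = 'D:\Archives\2022-07.zip'        # monthly archive
$sourceFile = '\\fileserver\data\report.docx'  # file that changed
$entryName  = 'data/report.docx'               # its path inside the Zip

# 'Update' mode lets entries be added and deleted in the open archive.
$zip = [System.IO.Compression.ZipFile]::Open($zipPath, 'Update')
try {
    $stale = $zip.GetEntry($entryName)
    if ($stale) { $stale.Delete() }            # drop the outdated entry
    [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile(
        $zip, $sourceFile, $entryName) | Out-Null
}
finally {
    $zip.Dispose()                             # flushes the changes to disk
}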
I can't find anything in the Amazon documentation about whether it is possible to modify the contents of a Zip file stored in S3 without unzipping it.
So I'm asking for help from the PowerShell and AWS professionals who see this question.
Thank you in advance for your help.
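For what it's worth, S3 objects can only be written as a whole, so if modifying the archive in place on the server turns out not to be possible, the fallback is a round trip. A minimal sketch with the AWS Tools for PowerShell cmdlets (bucket name, key, and paths are hypothetical):

# Round-trip pattern: download the Zip, update it locally (as above),
# then overwrite the object with the new version.
Import-Module AWS.Tools.S3

$bucket  = 'my-archive-bucket'       # hypothetical bucket name
$key     = 'archives/2022-07.zip'    # hypothetical object key
$tmpFile = Join-Path $env:TEMP '2022-07.zip'

Read-S3Object  -BucketName $bucket -Key $key -File $tmpFile    # download
# ... update $tmpFile in place here, as in the sketch above ...
Write-S3Object -BucketName $bucket -Key $key -File $tmpFile    # re-upload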

Related

How to automatically and selectively back up critical files on edit?

I have just accidentally deleted a week's worth of source files, and even testdisk cannot restore them. Even the executable JARs are gone... I use Ubuntu. I don't want that to happen ever again. How can I sufficiently and efficiently make automatic backups (clones) of selected critical files to a different location, e.g. home?
I use Java with Eclipse as my IDE, but this could be any file I work with. For example, I select a certain file because I might accidentally delete it, and this lightweight backup tool would automatically update its copy in the backup location whenever I save changes. Then, if the file is lost in the working directory, as in my case, I can just take it from the backup location on the local machine. Please help. I feel devastated...
cwatch might be the kind of solution I am looking for, but it is too complicated.
P.S. I am aware of the question Script to perform a local backup of files stored in Google Drive, but Google services are not OK for me.
The simplest solution would be to use GitHub or Bitbucket and to regularly push the changes you make to the online repository. You will benefit more from using version control software than from a local backup. You can use either of them for free.
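For example, a minimal workflow along those lines (the remote URL is a placeholder):

# One-time setup in the project directory:
git init
git remote add origin https://github.com/youruser/yourrepo.git

# Then at the end of each work session:
git add -A
git commit -m "checkpoint"
git push -u origin master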

Count files in a ZIP file over SFTP using PowerShell

I am connecting to SFTP via host, port, username and password using PowerShell. I want to count the number of files in a particular zip folder without having to download it to my local machine and count them there. Please share the piece of logic that would do this. I looked into this, but it seems a bit tricky when it comes to doing it inside a zip folder.
That's not an easy task to do. There's no API in SFTP to do that completely remotely. There are basically two solutions:
Use SFTP to download only the ZIP central directory (basically the listing that is placed at the very end of the ZIP file) and decode the directory locally. For C#, this is covered in my answer to List files inside ZIP file located on SFTP server in C#. Though, as mentioned there, there's a bug in SSH.NET that requires a workaround involving implementing an interface. While that's probably doable in PowerShell too, I've never done it.
If you have SSH shell access to the server, use a remote zip/unzip command to list the contents of the file (a sketch follows below), or build another API (like a web service).
Btw, note that there's nothing like a ZIP "folder". ZIP is an archive file. It's only Windows that calls ZIP files "folders".
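To illustrate the second option, here is a rough PowerShell sketch that runs the listing remotely over OpenSSH (it assumes shell access and that unzip is installed on the server; the host and path are placeholders):

# Count the entries of a remote Zip by listing it with unzip over SSH.
$remoteZip = '/home/user/archive.zip'
$listing   = ssh user@sftp.example.com "unzip -l $remoteZip"

# The summary line at the end of 'unzip -l' output ends with "... N files".
if ($listing[-1] -match '(\d+)\s+files?\s*$') {
    "Entries in ${remoteZip}: $($Matches[1])"
}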

Copy files from SFTP with CMD

We have a customer with an SFTP site, and I would like to copy files from a specific folder there using some automated process.
One example I found is winscp.net, but I have not figured out how to use it for my purpose.
http://www.itworld.com/article/2928599/windows/how-to-automate-sftp-file-transfers-in-microsoft-windows.html
QUESTION: All I need is for the files to be copied from their directory to my local folder without my having to run anything manually. Is that possible at all?
I found a way of using PSFTP (PuTTY) to connect to the server, but I do not know how to make it automatic.
I think WinSCP is a good solution for your problem.
You have to install WinSCP and script a few files, and it will run automatically; I have used it 3 or 4 times. You also need the key for your SFTP server to connect through it.
Here is a link to the step-by-step guide:
https://winscp.net/eng/docs/guides
Here is a link to the scripting webpage
https://winscp.net/eng/docs/scripting
I recommend creating an INI file that holds all of the characteristics of your FTP connection, and then executing a script over it.
It can be launched like this:
WinSCP.com /ini=[your ini file] /script=[Your script file(what you want to do when it is connect)]
Hope this helps!
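For reference, a rough sketch of what the script file and the launch might look like from PowerShell (host, credentials, host key fingerprint, and paths are all placeholders):

# Write a WinSCP script file, then run it non-interactively.
$scriptBody = @'
open sftp://myuser:mypassword@sftp.example.com/ -hostkey="ssh-rsa 2048 xxxxxxxx..."
get /outgoing/* C:\Downloads\customer\
exit
'@
Set-Content -Path C:\Scripts\sync.txt -Value $scriptBody

& 'C:\Program Files (x86)\WinSCP\WinSCP.com' /ini=C:\Scripts\winscp.ini /script=C:\Scripts\sync.txt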

Google Compute Startup Script PHP Files From Bucket

I'd like to automatically load a folder full of PHP files from a bucket when an instance starts up. My PHP files are normally located at /var/www/html.
How do I write a startup script for this?
I think this would be enormously useful for people such as myself who are trying to deploy with autoscaling but don't want to have to create a new image with their PHP files every time they deploy changes. It would also be useful as a way of keeping a live backup in cloud storage.
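A minimal sketch of what such a startup script could look like (the bucket name is a placeholder; gsutil is preinstalled on standard GCE images), set as the instance's startup-script metadata:

#! /bin/bash
# Sync the PHP files from the bucket into the web root at boot.
gsutil rsync -r gs://my-php-bucket/html /var/www/html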

Synchronizing with live server via FTP - how to FTP to different folder then copy changes

I'm trying to think of a good solution for automating the deployment of my .NET website to the live server via FTP.
The problem with using a simple FTP deployment tool is that FTPing the files takes some time. If I FTP directly into the website application's folder, the website has to be taken down while I wait for all the files to be transferred. What I do instead is manually FTP to a separate folder, then once the transfer is complete, manually copy and paste the files into the real website folder.
To automate this process I am faced with a number of challenges:
I don't want to FTP all the files - I only want to FTP those files that have been modified since the last deployment. So I need a program that can manage this.
The files should be FTPed to a separate directory, then copied into the correct destination once the transfer is complete.
Correct security permissions need to be retained on the directories. If a directory is copied over, I need to be sure that the permissions will be retained (this could probably be solved by rerunning a script that applies the correct permissions).
So basically I think that the tool that I'm looking for would do a FTP sync via a temporary directory.
Are there any tools that can manage these requirements in a reliable way?
I would prefer to use rsync for this purpose. But since you appear to be using Windows here, some more effort is needed: Cygwin or something similar.
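As a sketch of that rsync approach (host and paths are placeholders; the commands are the same under Cygwin):

# Transfer only the changed files to a staging directory on the server,
rsync -avz ./site/ deploy@example.com:/var/www/staging/
# then copy them into the live folder server-side, which is quick and local.
ssh deploy@example.com 'cp -a /var/www/staging/. /var/www/site/'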