Google Compute Startup Script PHP Files From Bucket - google-cloud-storage

I'd like to automatically load a folder full of PHP files from a bucket when an instance starts up. My PHP files are normally located at /var/www/html.
How do I write a startup script for this?
I think this would be enormously useful for people such as myself who are trying to deploy with autoscaling, but don't want to have to create a new image with their PHP files every time they want to deploy changes. It would also be useful as a way of keeping a live backup on Cloud Storage.
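One common approach (a minimal sketch, not an official recipe) is to keep the site's files in a bucket and have each instance pull them down on boot via its startup-script metadata. The bucket name gs://my-php-bucket and the web-server user www-data below are assumptions for illustration, as is the presence of gsutil on the image (it is preinstalled on the standard GCE images).

```bash
#!/bin/bash
# Hypothetical GCE startup script: mirror PHP files from a bucket into the
# web root on every boot. "my-php-bucket" is a placeholder bucket name.
set -e

mkdir -p /var/www/html

# -m parallelizes the transfer; rsync -r -d makes the local folder match the
# bucket exactly (files removed from the bucket are removed locally too).
gsutil -m rsync -r -d gs://my-php-bucket/html /var/www/html

# Assuming Apache/Nginx on Debian/Ubuntu, where the server runs as www-data.
chown -R www-data:www-data /var/www/html
```

Attached as the startup-script metadata key (or referenced via startup-script-url), a script like this means every instance the autoscaler creates pulls the latest files, and deploying a change is just a gsutil rsync of your local folder up to the bucket; the bucket then also doubles as the live backup mentioned above.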

Related

How should logging be done to a file in a specific folder?

Some macOS apps write their logs into a folder like /Library/Logs or ~/Library/Logs. How can this be achieved?
I tried creating a folder in ~/Library/Logs using FileManager.createDirectory, but I think creating a file and writing to it every time using FileManager functions will make the app more complex.

How to automatically pick up re-uploaded, modified files in WildFly 10 without a server restart?

I am using a WildFly 10 server. I provide an option in the UI for the user to upload images or JSP files, and the user can make use of these files in other sections of the application later.
At any point in time I allow only one entry with a particular name. If the user tries to upload a file with a name that already exists, I overwrite the existing one with the new file.
In this scenario I am facing the following problem:
I have uploaded an image with the name image1.png.
Now if I change some other image's name to image1.png and upload it, the new image is not visible until I restart the server.
It looks like the older image has been cached by the server, and it is still referring to the cached location. When I restart the server, it refreshes the cache with the new content of the file.
Is there any way that I can immediately see the changes in the UI whenever I re-upload the modified file?
I am using a custom folder to store the uploaded files on my server.
Is there a way to enable deployment directory scanning for this particular directory only?
You don't have to restart the server; a redeploy of the application should work.
You can define another deployment scanner, or change the directory scanned by the existing scanner: http://wildscribe.github.io/WildFly/16.0/subsystem/deployment-scanner/scanner/index.html
Another solution would be to create deployment overlays: http://wildscribe.github.io/WildFly/16.0/deployment-overlay/index.html
Thirdly, with exploded deployments WildFly already provides the functionality you have developed: https://wildfly.org/news/2017/09/08/Exploded-deployments/ (note that all jboss-cli operations can be called using the HTTP REST API).
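As a rough illustration of the scanner and exploded-deployment options (a sketch only: uploads-scanner, /opt/app/uploads and myapp.war are placeholder names, not from the original setup, and the operations assume a WildFly version that supports them):

```bash
# Hypothetical jboss-cli calls run from a shell.
JBOSS_CLI=$JBOSS_HOME/bin/jboss-cli.sh

# Add a second deployment scanner that watches only the custom upload directory.
$JBOSS_CLI --connect --command="/subsystem=deployment-scanner/scanner=uploads-scanner:add(path=/opt/app/uploads, scan-interval=5000)"

# With an exploded deployment, replace a single file in place so the server
# serves the new content without a restart or full redeploy.
$JBOSS_CLI --connect --command="/deployment=myapp.war:add-content(content=[{input-stream-index=/tmp/image1.png, target-path=images/image1.png}])"
```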

Deploy ClickOnce as a single file?

I am looking to use ClickOnce to deploy an application for internal use. When publishing to the network share, it creates several files and folders (manifest, Application Files, etc.).
Is there a way to bundle this up as a single file? I do not fancy the idea of allowing other users access to the Application Files folder that is created; I would rather just give them the exe and have it take care of everything else.
Does anyone have experience with this, or am I stuck with the application folder, application manifest, and setup file all being in the same directory for installation?
There is no way to package the whole application folder and its files into one file, like an MSI, with ClickOnce.
You could code something on your own: a shell app that uses ClickOnce and whose only file is your app, compressed. The shell would download that compressed file to the client's machine, unzip it, etc.
You could also use InstallShield Limited Edition, which comes with VS 2012/2013 under Other Projects, Setup and Deployment, but that does not give you the ClickOnce ease-of-deployment features. You could use the InstallShield setup as the compressed file in your shell ClickOnce app and then just use Process.Start to launch the InstallShield setup. It should work.

Synchronizing with a live server via FTP - how to FTP to a different folder, then copy changes

I'm trying to think of a good solution for automating the deployment of my .NET website to the live server via FTP.
The problem with using a simple FTP deployment tool is that FTPing the files takes some time. If I FTP directly into the website application's folder, the website has to be taken down whilst I wait for the files to all be transferred. What I do instead is manually FTP to a separate folder, then once the transfer is completed, manually copy and paste the files into the real website folder.
To automate this process I am faced with a number of challenges:
I don't want to FTP all the files - I only want to FTP those files that have been modified since the last deployment. So I need a program that can manage this.
The files should be FTPed to a separate directory, then copy+pasted into the correct destination once complete.
Correct security permissions need to be retained on the directories. If a directory is copied over, I need to be sure that the permissions will be retained (this could probably be solved by rerunning a script that applies the correct permissions).
So basically I think that the tool I'm looking for would do an FTP sync via a temporary directory.
Are there any tools that can manage these requirements in a reliable way?
I would prefer to use rsync for this purpose. But since it seems you are using Windows here, some more effort is needed: Cygwin or something similar.
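A sketch of that idea, with made-up host and path names (deploy@example.com, /var/www/staging, /var/www/site): rsync transfers only the files that changed, and syncing into a staging directory first keeps the window where the live site is inconsistent very short.

```bash
# Hypothetical two-step deployment over SSH (requires rsync on both ends,
# e.g. via Cygwin on a Windows build machine).

# 1. Push only changed files to a staging directory on the server.
rsync -avz --delete ./publish/ deploy@example.com:/var/www/staging/

# 2. Copy staging into the live folder (a fast, local operation on the server),
#    then reapply permissions with your own script.
ssh deploy@example.com '
  rsync -a --delete /var/www/staging/ /var/www/site/ &&
  /usr/local/bin/fix-site-permissions.sh
'
```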

Online space to store files using the command line

I require a small space online (free) where I can upload/download a few files automatically using a script.
The space requirement is around 50 MB.
This should be automatable so I can set it to run without manual interaction, i.e. no GUI.
I have a dynamic IP and have no expertise in setting up a server.
Any help would be appreciated. Thanks.
A number of online storage services provide 1-2 GB of space for free. Several of those have command-line clients. E.g. SpiderOak, which I use, has a client that can run in headless (non-GUI) mode to upload files, and there's even a way to download files from it with wget or curl.
You just set things up in GUI mode, then put files into the configured directory and run SpiderOak with the right options; the files get uploaded. Then you either download ('restore') all or some of the files via another SpiderOak call, or get them via HTTP.
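Roughly, the cycle looks like this; treat the flag and URL below as assumptions, since the exact SpiderOak options vary by client version and the share URL is a made-up placeholder:

```bash
# Hypothetical headless upload/download cycle with SpiderOak.

# Upload: drop files into the directory configured in the GUI, then run a
# one-shot headless sync (--batchmode in older clients; check --help on yours).
cp report.txt ~/backup-folder/
SpiderOak --batchmode

# Download: fetch a file over HTTP from a share/web URL (placeholder URL).
curl -o report.txt "https://spideroak.com/share/EXAMPLE/backup-folder/report.txt"
```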
About the same applies to Dropbox, but I have no experience with that.
www.bshellz.net gives you a free shell running Linux. I think everyone gets 50 MB, so you're in luck!