Upload files to Google Cloud Storage using FTP client

I'm a bit new to Cloud Storage.
We have an application that uploads files to our FTP server.
Now, as our system grows, we would like to move to a cloud storage service such as Google Cloud Storage.
The main issue is that our client software is already distributed to thousands of customers, and it uploads files using FTP commands.
If we change our storage to Google Cloud Storage, is it possible to upload files using FTP commands from our client software to Google Cloud Storage?

You could try running an FTP server on top of a directory mounted with gcsfuse (Cloud Storage FUSE):
https://cloud.google.com/storage/docs/gcs-fuse
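
For example, a rough sketch of that setup (the bucket name, mount point, and vsftpd settings are placeholders, not anything from the question):

# Mount the bucket as a local directory with gcsfuse
$ gcsfuse --implicit-dirs my-bucket /mnt/gcs
# Point an FTP server such as vsftpd at the mount, e.g. in /etc/vsftpd.conf:
#   local_root=/mnt/gcs
#   write_enable=YES
$ sudo systemctl restart vsftpd

Note that a FUSE-mounted bucket is not a fully POSIX-compliant filesystem, so it is worth testing the exact FTP operations your client issues (renames, appends, etc.) before rolling this out.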

Related

Cloudflare CDN for Google Cloud Storage Bucket

I have a custom domain (cdnexample.com) and a Firebase Google Cloud Storage Bucket (examplefiles.appspot.com).
I want to configure cdnexample.com domain in Cloudflare CDN to source from GCS bucket (examplefiles.appspot.com).
For example, given a GCS File: https://storage.googleapis.com/examplefiles.appspot.com/image1.jpg I want to get the Cloudflare CDN File working: https://cdnexample.com/image1.jpg
The problem is that I cannot change the GCS bucket name (examplefiles.appspot.com) to match my Cloudflare domain name (cdnexample.com). All the solutions I came across below require the GCS bucket name to match the Cloudflare domain name and use a CNAME configuration with c.storage.googleapis.com.
I have read through the following relevant articles:
https://cloud.google.com/storage/docs/request-endpoints
https://devopsdirective.com/posts/2020/10/gcs-cloudflare-hosting/
https://community.cloudflare.com/t/using-cloudflare-cdn-https-with-google-cloud-storage/15602
How to cache google cloud storage (GCS) with cloudflare?
Using Cloudflare CDN + HTTPS with Google Cloud Storage
Use CloudFlare to CDN a Google Cloud Storage Bucket
https://medium.com/@pablo.delvalle.cr/cloudflare-and-google-cloud-for-hosting-a-static-site-fd2e1a97aa9b
Does anyone have an idea of how to make the Cloudflare CDN work in this case?
In this case you can set up an HTTP(S) load balancer with a backend bucket, which connects to your storage bucket and is reachable at an IP address; you then point your custom domain at that IP address. You can find information about adding a backend bucket here.
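
For example, a minimal gcloud sketch (the resource names are placeholders, and a production setup would use an HTTPS target proxy with a certificate rather than plain HTTP):

$ gcloud compute backend-buckets create example-backend --gcs-bucket-name=examplefiles.appspot.com --enable-cdn
$ gcloud compute url-maps create example-lb --default-backend-bucket=example-backend
$ gcloud compute target-http-proxies create example-proxy --url-map=example-lb
$ gcloud compute addresses create example-ip --global
$ gcloud compute forwarding-rules create example-rule --global --address=example-ip --target-http-proxy=example-proxy --ports=80

You would then create an A record for cdnexample.com in Cloudflare pointing at the reserved IP instead of the c.storage.googleapis.com CNAME, so the bucket name no longer has to match the domain.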

Download from cloud storage bucket without internet

I have a requirement to download some files stored in a Google Cloud Storage bucket. The challenge is to download them without internet access. Is it possible to interact with a bucket without internet access? Any suggestions?
Thanks,
Prasanth
No, that wouldn't be possible. You need an internet connection to access resources hosted in the cloud.
You would need to store the files locally or on a physical storage device in order to access them without a connection.
The only option that avoids the public internet is Dedicated Interconnect, where you essentially have a direct physical link from your on-premises network to Google's network.
EDIT:
As I understand from the comment you edited, your actual goal is to connect to your GCS bucket from a private VM instance hosted on GCE.
For that you might want to use VPC Service Controls to define a security perimeter around your services and constrain data within a VPC. One of this product's advantages is that VPC Service Controls provide an additional layer of security by denying access from unauthorized networks, even if the data is exposed by misconfigured Cloud IAM policies.
Here you can find the GCP documentation on configuring VPC Service Controls.
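
As a rough sketch, a perimeter that restricts Cloud Storage could be created with gcloud like this (the perimeter name, project number, and policy ID are placeholders):

$ gcloud access-context-manager perimeters create storage_perimeter \
    --title="storage perimeter" \
    --resources=projects/123456789 \
    --restricted-services=storage.googleapis.com \
    --policy=POLICY_ID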

gsutil copying LetsEncrypt domain verification not working

When using the LetsEncrypt certbot to generate an SSL certificate for my domain, I am prompted to make a file available at my domain to verify that I control it:
http://example.com/.well-known/acme-challenge/XXXXXX
However when I try to upload that file to my Google Cloud Storage bucket I get the following error:
$ gsutil rsync -R . gs://example.com
Building synchronization state...
Starting synchronization
Copying file://./.well-known/acme-challenge/XXXXXX [Content-Type=application/octet-stream]...
BadRequestException: 400 ACME HTTP challenges are not supported.
Does Google Cloud Storage expressly forbid URLs with "acme challenge" in the path? Is it possible to setup a LetsEncrypt certificate for a domain hosted at a Google Cloud Storage bucket?
We worked around this by exposing /.well-known/acme-challenge as an endpoint in our application and storing the challenge in a different directory that Cloud Storage does allow. When LE hits that endpoint, we retrieve the generated challenge from its directory and serialize it into the response.
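
A rough sketch of that idea (the acme-challenges prefix and the app routing are assumptions, not the exact setup from the answer):

# Store the token under a prefix that Cloud Storage accepts
$ gsutil cp ./.well-known/acme-challenge/XXXXXX gs://example.com/acme-challenges/XXXXXX
# The app's handler for /.well-known/acme-challenge/XXXXXX then fetches the token
# and returns it as the response body, e.g.:
$ curl -s https://storage.googleapis.com/example.com/acme-challenges/XXXXXX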

Transfer files from Google Drive to Cluster account

The Google API is confusing me. I have no idea how to approach this problem.
I have 62 GB of data in a Google Drive account. I want it transferred to my server cluster account. How can I do this without downloading the data to my local device and then uploading it to the server, i.e., cutting out the middleman?
I know Perl, but the Perl module for the Google Drive API is ambiguous at best.
Google Drive's servers cannot actively upload files to your server. Your server has to be authorized with the Drive API to download from Google Drive. To rephrase: you can't "upload" from Drive to a remote server; you have to "download" from Drive onto the remote server.
To do this, you need to authenticate your web application. Then you can select a file and retrieve the fileId of the file you want to download to the remote server. You then send this fileId to your remote server with, for example, a simple HTTP request, and the server triggers the download of the file from Drive.
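
For example, once the server has an OAuth access token and the fileId, it can pull the file directly from the Drive v3 API (the token variable, file ID, and output name below are placeholders):

$ curl -H "Authorization: Bearer $ACCESS_TOKEN" \
    "https://www.googleapis.com/drive/v3/files/FILE_ID?alt=media" \
    -o data-archive.tar.gz

The files.get endpoint with alt=media also honors Range headers, which helps when resuming a transfer of this size.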

Serving a large file with Zend Framework that's connected to another SFTP server

I'm using phpseclib for secure SSH and FTP access. My site is on a web server that connects to a different backup server and displays its files and folders; imagine a frontend for backups spread across several different servers.
Now I would like users to be able to download a file, but I can't think of a better method than temporarily storing the file on the frontend web server.
I looked at the phpseclib docs and didn't see a good way to transfer a file from one of the backup servers, through the frontend server, to a client in a fast, efficient way without fully copying the file from the backup server to the frontend server before passing it to the client.
Using cURL, you can use one of its callback functions to serve the download to the client while the file is still being downloaded from the backup server by cURL. cURL supports SSH and SFTP. This way, the frontend doesn't have to fully download the file from the backup server before it can start sending it to the client.
I have shown a similar example of this using the FTP protocol in this answer: Streaming a file from FTP and letting user to download it at the same time
Feel free to ask for more help if you have any questions implementing this solution with SSH and your system.
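
If it helps, here is a rough command-line sketch of the same idea (the host, user, key, and paths are made up, and it assumes curl was built with SFTP support); in PHP the equivalent is a CURLOPT_WRITEFUNCTION callback that echoes and flushes each chunk as it arrives:

#!/bin/sh
# CGI-style sketch: emit response headers, then pipe curl's SFTP download
# straight to the client so nothing is staged on the frontend's disk
printf 'Content-Type: application/octet-stream\r\n'
printf 'Content-Disposition: attachment; filename="backup.tar"\r\n\r\n'
curl -s -u backupuser: --key ~/.ssh/id_rsa sftp://backup.example.com/data/backup.tar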