The Google API is confusing me. I have no idea how to approach this problem.
I have 62 GB of data in a Google Drive account. I want it transferred to my server cluster account. How can I do this without downloading the data to my local device and then uploading it to the server, i.e. cutting out the middleman?
I know Perl, but the Perl module for the Google Drive API is ambiguous at best.
The Google Drive server cannot actively upload a file to your server. Your server has to be authorized for the Drive API so that it can download from Google Drive. To rephrase: you can't "upload" from Drive to a remote server; you have to "download" from Drive to the remote server.
To do this, you need to authenticate your web application. Then you can select a file and retrieve the fileId of the file you want to download to the remote server. Next, you send this fileId to your remote server with, for example, a simple HTTP request. Your server then triggers the download of the file from Drive.
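As an illustration, the download step on the remote server could look roughly like the Python sketch below. It is only a sketch: it assumes the server has already received the fileId and a valid OAuth 2.0 access token with a Drive read scope, and the token and fileId values shown are placeholders.

```python
# Download a Drive file by fileId on the remote server using the Drive v3 REST API.
# ACCESS_TOKEN and FILE_ID are placeholders supplied by the authenticated web app.
import requests

ACCESS_TOKEN = "ya29.placeholder"   # OAuth 2.0 token with a Drive read scope
FILE_ID = "placeholder-file-id"     # fileId received from the web application

url = f"https://www.googleapis.com/drive/v3/files/{FILE_ID}"
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# alt=media asks Drive to return the file content instead of its metadata.
with requests.get(url, headers=headers, params={"alt": "media"}, stream=True) as resp:
    resp.raise_for_status()
    with open("downloaded_file", "wb") as out:
        for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
            out.write(chunk)
```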
I have created a subdomain on a DigitalOcean server. Now I need to upload a file to my server and make it available under that subdomain. What should I do?
I need every single detail.
We host a website in our company.
A certificate was issued to www.ourdomainname.com by the company IT department.
Now we want to move the website to Azure and install the certificate there.
I have already exported the certificate from the server, with the private key included in the export.
1.) What will happen when the certificate is installed on Azure while it is also installed on our company server?
2.) What will happen when the website is stopped on our server and the certificate is then imported to the Azure website?
3.) How can I guarantee a smooth transition without any downtime?
The aim is:
The website on the company server is going to be deleted and the website on Azure will be used instead.
What will happen when the certificate is installed on Azure while it is also installed on our company server?
The web site will be available via SSL on Azure too.
What will happen when the website is stopped on our server and the certificate is then imported to the Azure website?
The web site on your server will be inaccessible.
How can I guarantee a smooth transition without any downtime?
It is mostly about DNS management; there is not much work with SSL. You just install the SSL certificate on both the internal and Azure servers, so clients can access both. Test whether the web site on Azure works the same way as on your internal server. Then point all clients (via DNS) to the web site on Azure. When all clients have moved and there are no references to the internal server, you can safely shut it down.
The SSL certificate which was exported from the current server has to be imported into Azure. The format of the certificate has to be PFX.
Now, in DNS management, you need to edit the A record for the URL and point it to the IP address of Azure. This will make sure that any request made will be handled by Azure.
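If the exported material is a PEM key and certificate rather than a single PFX file, it can be repackaged before the Azure import. The following is only a sketch using Python's cryptography package; the file names and password are placeholders, not values from the question.

```python
# Repackage a PEM private key and certificate into a password-protected
# PFX/PKCS#12 file suitable for importing into Azure.
# File names and the password below are examples only.
from cryptography import x509
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.serialization import pkcs12, BestAvailableEncryption

with open("www.ourdomainname.com.key", "rb") as f:
    key = serialization.load_pem_private_key(f.read(), password=None)
with open("www.ourdomainname.com.crt", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

pfx_bytes = pkcs12.serialize_key_and_certificates(
    name=b"www.ourdomainname.com",
    key=key,
    cert=cert,
    cas=None,  # add intermediate certificates here if the chain needs them
    encryption_algorithm=BestAvailableEncryption(b"pfx-password"),
)

with open("www.ourdomainname.com.pfx", "wb") as f:
    f.write(pfx_bytes)
```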
I'm a bit new to cloud storage.
We have an application which uploads files to our FTP server.
Now, as our system grows, we would like to move to a cloud storage service such as Google Cloud Storage.
The main issue is that our client software is already distributed to thousands of customers. This client software uploads files with FTP commands.
If we change our storage to Google Cloud Storage, is it possible to upload files from our client software to Google Cloud Storage using FTP commands?
You could try running an FTP server on top of a directory mounted with gcs-fuse:
https://cloud.google.com/storage/docs/gcs-fuse
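To make that concrete, a minimal FTP server over such a mount could look like the Python sketch below using pyftpdlib. It assumes the bucket has already been mounted with gcsfuse, and the mount path, user name and password are placeholders.

```python
# Minimal FTP server exposing a gcsfuse-mounted bucket directory.
# Assumes the bucket was mounted beforehand, e.g.: gcsfuse my-bucket /mnt/gcs-bucket
# The user name, password and mount path are placeholders.
from pyftpdlib.authorizers import DummyAuthorizer
from pyftpdlib.handlers import FTPHandler
from pyftpdlib.servers import FTPServer

authorizer = DummyAuthorizer()
# Grant one FTP user read/write permissions on the mounted bucket directory.
authorizer.add_user("ftpuser", "ftppassword", "/mnt/gcs-bucket", perm="elradfmw")

handler = FTPHandler
handler.authorizer = authorizer

server = FTPServer(("0.0.0.0", 2121), handler)
server.serve_forever()
```

Existing FTP clients would then point at this server, and everything they upload lands in the mounted bucket.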
I'm using phpseclib for secure SSH and FTP access. My site is on a web server that connects to a different backup server and displays its files and folders; imagine a frontend for backups on several different servers.
Now I would like users to be able to download a file, but I can't think of a better method than temporarily storing the file on the frontend web server.
I looked at the phpseclib docs and didn't see a good way to transfer a file from one of the backup servers through the frontend server to a client in a fast, efficient way, without having to fully copy the file from the backup server to the frontend server before passing it to the client.
Using cURL, you can use one of its callback functions to serve the download to the client while the file is simultaneously being downloaded from the backup server by cURL. cURL supports SSH/SFTP. This way, the frontend does not have to fully download the file from the backup server before it can start sending it to the client.
I have shown a similar example of this using the FTP protocol in this answer: Streaming a file from FTP and letting the user download it at the same time.
Feel free to ask for more help if you have any questions implementing this solution with SSH and your system.
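For illustration, the same streaming idea could be sketched in Python with pycurl as below. This assumes libcurl was built with SFTP support; the host, credentials and path are placeholders, and in a real web application the chunks would be written to the HTTP response instead of stdout.

```python
# Relay a file from the backup server to the client as it arrives, so the
# frontend never stores the whole file. Host, credentials and path are examples.
import sys
import pycurl

def relay_chunk(chunk):
    # Called by cURL for each chunk received from the backup server;
    # write it straight through to the client (stdout here).
    sys.stdout.buffer.write(chunk)

c = pycurl.Curl()
c.setopt(pycurl.URL, "sftp://backup.example.com/backups/archive.tar.gz")
c.setopt(pycurl.USERPWD, "backupuser:secret")
c.setopt(pycurl.WRITEFUNCTION, relay_chunk)
c.perform()
c.close()
```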
Whenever I connect to a customer site with Citrix XenApp, it takes around 15-20 minutes for the Remote Desktop prompt to come up, while for other users it's almost immediate. My connection is faster than theirs (25/4 Mbps). What is Citrix XenApp doing that's taking mine so long? Any guesses?
The IT person at the customer site said it's because my profile is over 1 GB. What is "my profile"? They haven't been able to tell me that. They said I should "clear stuff off my desktop". Whenever I clear my things off the desktop at the customer site, it all comes back the next time I log in, as if it were undeleted. Are they talking about the desktop on their server, or on my local machine?
Thank you for any tips!
The profile being referred to is your Windows roaming profile. The most essential part of the roaming profile is ntuser.dat, which contains the registry information for that user. It can also contain a large amount of data stored in system folders such as Desktop or My Documents if folder redirection is not enabled.
So what happens when you try to open RDP from the XenApp server is that the server requests your roaming profile from another server on the backend and downloads a local copy to the XenApp server. The profile is not sent to your local computer; the beauty of XenApp is that you only see compressed screen updates. So if the XenApp farm is well designed, your roaming profile is on a LAN server on the backend and can be accessed very quickly.