Command line for Dropbox connection in Duplicati backup - command-line

I'm trying to back up folders from a local drive to Dropbox using the Duplicati command-line tool in Command Prompt. (The backup should be incremental.)
C:\Users\Desktop\Office_Works\Duplicati\Duplicati 1.3.4\Duplicati>Duplicati.CommandLine.exe backup a https://www.dropbox.com/
Enter passphrase: **
Confirm passphrase: **
Unable to find backend for: https://www.dropbox.com/
"a" is my folder in local drive. Now I want to know how to make a connection with Dropbox using command lines. Is any particular way to connect Dropbox using duplicati commands?

There is no direct support for Dropbox in Duplicati.
Others have reported using a local folder under the Dropbox folder as the destination, so that the Dropbox client synchronizes the folder for you.
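A minimal sketch of that approach, assuming the Dropbox client is installed and your Dropbox folder lives at C:\Users\you\Dropbox (both paths are assumptions; adjust them to match your machine):

Duplicati.CommandLine.exe backup "C:\Users\Desktop\Office_Works\a" "file://C:\Users\you\Dropbox\Backups"

Duplicati then writes its (incremental) volumes into the Backups subfolder, and the Dropbox client uploads whatever appears there.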

There's a project that can upload files to Dropbox from the command prompt. It's very lightweight, and both installation and usage couldn't be easier:
PneumaticTube
The first time you use it, it will open your browser to ask for permission to access your Dropbox account, but from then on it's smooth sailing.
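A hedged example of what an upload looks like (the -f source-file and -p destination-path flags are taken from the project's README; check PneumaticTube --help for the current syntax, and the paths here are made up):

PneumaticTube.exe -f "C:\Users\Desktop\Office_Works\a\report.docx" -p "/Backups"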

Related

Azure data factory - SftpPermissionDeniedException

Using a Copy Data activity, I want to upload files to an SFTP service, but I receive the following error message:
I can upload files to the target folder with a simple Linux sftp client using the same user, and I am also able to create files and folders within the target folder (but not in its parent folder, which is the root folder).
The "Upload with temp file" option is set to false.
Any ideas?
To confirm which user your build runs as, you can run the whoami command as part of your build process.
Solutions (sketched below):
Store things inside a folder that the user running the build has permission to write to.
Change the ownership of the directory with the chown command before trying to write to it.
Refer to https://support.circleci.com/hc/en-us/articles/360003649774-Permission-Denied-When-Creating-Directory-or-Writing-a-File
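A minimal sketch of both options, assuming a Linux build agent and a hypothetical /data/upload target directory:

# Print the user the build step runs as
whoami
# Option 1: write somewhere that user already owns
mkdir -p "$HOME/upload"
# Option 2: take ownership of the existing target (needs sudo)
sudo chown -R "$(whoami)" /data/upload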

Importing a large .sql file into a database - repeated timeout error myPHPAdmin

I have a .sql file (a database dump) which I am trying to import using phpMyAdmin, and I keep getting a timeout error.
The file is 46.6 MB (zipped).
Please note I am not on XAMPP but am using a GoDaddy phpMyAdmin platform to manage the database.
What I've tried:
Re-downloaded the file as a zip file and tried importing it. It still failed.
In the phpMyAdmin import screen, I tried unselecting the option "Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be a good way to import large files, however it can break transactions.)", and I also tried importing with it selected; both failed. Which should it be?
What else can I do?
Nothing worked except SSH.
What you need:
Database (that you are importing into) username and password
cPanel username and password + IP address (for PuTTY)
I had to upload the .sql file to a folder on the public_html.
Download PuTTY.
In PuTTY you need the IP address (of the hosting server) as well as the cPanel username and password (so have those handy).
Once connected, enter your cPanel password.
Use the "cd" change directory command to change directory to where you have placed your .sql file.
Once there, use the following command:
mysql -p -u user_name database_name < file.sql
(Note: replace 'user_name', 'database_name', and 'file.sql' with the actual names.)
You will be prompted for your database password, and then your database will be imported.
Useful link: https://www.siteground.co.uk/kb/exportimport-mysql-database-via-ssh/
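If the dump is still zipped on the server, you can pipe it straight into mysql without unpacking it to disk first; a hedged one-liner, assuming the archive holds a single .sql file and unzip is available on the host:

unzip -p file.sql.zip | mysql -p -u user_name database_name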
You can try unzipping the file locally and importing the uncompressed .sql file; the overhead of uncompressing the file in memory could be the problem for phpMyAdmin. Generally, though, what Shadow said is correct and you should use some other means of import (like the command-line client). You could also use the phpMyAdmin UploadDir feature to put the file in a special folder that phpMyAdmin can directly access on the server. This can help with a lot of the resource limits the web server imposes.
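For the UploadDir route, a minimal sketch of the relevant directive in phpMyAdmin's config.inc.php (the folder name here is an assumption; it is resolved relative to the phpMyAdmin installation directory):

$cfg['UploadDir'] = 'upload';

Files placed in that folder then show up in a drop-down on phpMyAdmin's Import tab, so they are read from disk instead of going through a web upload.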

Storage Manager in pgAdmin

I am trying to back up one of my databases in PostgreSQL with the pgAdmin tool. I used this tutorial:
backup database with pgAdmin
After finishing that, I want to have the file. The tutorial says that we can use the Storage Manager to download the backup file to the client machine. So, following this link, I tried to access the Storage Manager. It says "You can access Storage Manager from the Tools Menu", but on my system there is no option with that name:
What is the problem, and how can I obtain the backup file?
If you are not running pgAdmin4 in server mode, then there is no storage manager. The storage manager is only relevant when the computer from which you run the pgAdmin4 GUI is different from the computer where the pgAdmin4 app-server is running.
When you took the backup, you told it where to save the file, although not in a very user-friendly way. It asks for a filename, and there are three dots you can click to browse for a directory to put the file in. But if you don't avail yourself of the three dots, you don't know where it is going to put the file; it just uses an apparently OS-dependent default and doesn't tell you what it is. I usually find it in my "Documents" folder. (Admittedly, I usually don't use pgAdmin4 in the first place, as it makes everything harder than just using the command line, but when I do use it...)
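For comparison, the same backup from the command line, as a sketch assuming a local server and a database named mydb (host, user, and names are placeholders):

pg_dump -h localhost -U postgres -F c -f mydb.backup mydb

The -F c flag selects the custom archive format that pg_restore (and pgAdmin's restore dialog) can read, and -f names the output file explicitly, so there is no guessing where it went.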

How to backup postgres db hosted on cloud with pgadmin4?

I'm hosting my database using AWS RDS and I'm trying to back up tables. However, once the backup finishes, where is the downloaded file on my computer?
There doesn't seem to be a path to save the file.
I've checked a couple of answers, and others are having the same issue:
https://stackoverflow.com/a/29246636/11110509
The "Filename" element in that dialog box lets you pick a directory as well as file name. That is where it is. If you just typed in a filename without giving a path, then on Windows it is probably in your user's "Documents" folder.

Powershell script to copy files to remote server

I'd like to have files on a local machine and copy them to a remote machine on the internet. For this reason, I can't use UNC file shares. I'd also like to avoid using MSDeploy or FTP if possible. Does PowerShell have an easy way to copy a bunch of files to a remote server?
Look at the BitsTransfer module; it might help you: http://technet.microsoft.com/en-us/library/dd819420.aspx
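A minimal PowerShell sketch, assuming the remote server exposes an HTTP endpoint with the IIS BITS server extension enabled (the URL and paths are hypothetical):

# Load the module and push a file over BITS
Import-Module BitsTransfer
Start-BitsTransfer -Source 'C:\site\package.zip' -Destination 'https://remote.example.com/uploads/package.zip' -TransferType Upload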
When the remote server is a Linux or UNIX system, I use PSCP.EXE, the Windows SCP client created by the developer of PuTTY. I also use PuTTYgen to create a key pair that can be used instead of interactive password authentication.
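A hedged example of such a copy (host, paths, and key file are made up; -r copies a directory tree recursively and -i points at the private key PuTTYgen produced):

pscp -i C:\keys\deploy.ppk -r C:\site\* user@remote.example.com:/var/www/html/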