I would like to make a daily backup of my MongoDB replica set running on Windows 2012 servers.
The end goal is a daily backup written to a remote or local Windows share.
Can I batch the mongodump command?
Any help would be greatly appreciated!!
Sorry, it's a bit late but the following seems to work OK for me. The script dumps the database and compresses the output using 7-Zip.
1) Create backup script (backup.bat)
@echo off
REM move into the backups directory
CD C:\database_backups
REM Create a file name for the database output which contains the date and time. Replace any characters which might cause an issue.
set filename=database %date% %time%
set filename=%filename:/=-%
set filename=%filename: =__%
set filename=%filename:.=_%
set filename=%filename::=-%
REM Export the database
echo Running backup "%filename%"
C:\mongodb\mongodump --out %filename%
REM ZIP the backup directory
echo Compressing backup "%filename%"
"c:\Program Files\7-Zip\7z.exe" a -tzip "%filename%.zip" "%filename%"
REM Delete the backup directory (leave the ZIP file). The /q switch suppresses confirmation prompts
echo Deleting original backup directory "%filename%"
rmdir "%filename%" /s /q
echo BACKUP COMPLETE
2) Schedule the backup
Open Computer Management
Go to Task Scheduler and select Create Task.
On the General tab, enter a description and select Run whether user is logged on or not if you want the task to run at night.
On the Triggers tab, select when you would like the task to run.
On the Actions tab, create a new action which points at your batch script.
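If you prefer to script the scheduling step as well, a one-line schtasks sketch can create an equivalent task (the task name, the 2 AM start time, and running as SYSTEM are assumptions):
schtasks /Create /TN "MongoDB Daily Backup" /TR "C:\database_backups\backup.bat" /SC DAILY /ST 02:00 /RU SYSTEM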
I'm running on Linux, not Windows 2012, but here is what I do. On one of the servers in the replica set, this script is run every night via a cron job.
#config
BACKUPNAME=[backup file name]
DATAPATH=[path to mongo data folder]
DATESTAMP=$(date +"%Y-%m-%d")
FILENAME=backup.$BACKUPNAME.$DATESTAMP.tar.gz
TARPATH=$DATAPATH/$FILENAME

echo $DATESTAMP

# stop mongod so the data files are quiet, then dump directly from them
/etc/init.d/mongod stop
/usr/bin/mongodump --journal --dbpath $DATAPATH --out $DATAPATH/backup

# compress the dump and remove the uncompressed copy
tar czvf $TARPATH $DATAPATH/backup
rm -rf $DATAPATH/backup

# upload the archive to S3, then delete the local archive
/usr/bin/s3cmd put $TARPATH s3://[backup s3 bucket name]/$FILENAME
rm -f $TARPATH

# restart mongod and prune old backups
/etc/init.d/mongod start
/scripts/prunebackups
I'm using s3cmd to send files to an S3 bucket on Amazon AWS, but you could just as easily copy the file anywhere. prunebackups is a script that deletes old backups from S3 based on how old they are.
On Windows I'd create a batch file that does similar tasks. In essence:
Stop mongod
run mongodump to generate the data
zip up the dumped data and move it somewhere
clean up files
start mongod again
You can then use Task Scheduler to run it periodically.
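A minimal batch sketch of those steps, assuming the service is installed as "MongoDB", the data files live in C:\data\db, mongodump is an older version that still supports --dbpath (as in the Linux script above), 7-Zip is installed, and \\backupserver\backups is the target share:
@echo off
REM Stop the MongoDB service so the data files are quiet
net stop MongoDB
REM Dump directly from the data files into a working folder
set DUMPDIR=C:\database_backups\dump
C:\mongodb\bin\mongodump --journal --dbpath "C:\data\db" --out "%DUMPDIR%"
REM Zip the dump, move the archive to the share, and remove the raw dump
"C:\Program Files\7-Zip\7z.exe" a -tzip "%DUMPDIR%.zip" "%DUMPDIR%"
move /y "%DUMPDIR%.zip" \\backupserver\backups\
rmdir /s /q "%DUMPDIR%"
REM Start the service again
net start MongoDB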
If you have other mongod instances in the replica set, you shouldn't run into any issues with downtime. The backup instance in my setup is never used for reads or writes, but only for backups and in case one of the other instances goes down.
MongoDB has documentation on different backup strategies: http://docs.mongodb.org/manual/administration/backup/
We chose the mongodump approach because for us it's cheaper to store dump backups than snapshots. This is the specific strategy we used: http://docs.mongodb.org/manual/tutorial/backup-databases-with-binary-database-dumps/. Good news is, we actually had to restore data from a backup to production once and it was pretty painless.
Related
I have this little script running on a Windows machine that syncs down files from a Google Storage bucket and sends the new files to a printer hot folder. A file disappears from the printer's hot directory as soon as it is printed. As you can see, the script uses a backup directory to compare each and every file so that only new files are sent to the printer. This solution works well, but it is clearly not very efficient for a large volume of files. Just wondering if rsync has options to copy only the files that are new in the bucket since the last run.
@echo off
SET SOURCE=%1
SET DESTINATION=%2
SET HOTDIR=%3
SET BACKUPDIR=%4
SET GSUTIL_INST_DIR=C:\PRINT
SET PARENT_DIR=C:\PRINT
SET GSUTIL=%GSUTIL_INST_DIR%\google-cloud-sdk-386.0.0-windows-x86_64-bundled-python\google-cloud-sdk\bin\gsutil
SET GSUTIL=%GSUTIL% -m
call %GSUTIL% rsync -d -C %SOURCE%/ %DESTINATION%/
:: Compare each file before sending it to hot directory
for /f %%F in ('dir /b "%DESTINATION%"') do (
if not exist "%BACKUPDIR%\%%F" (
XCOPY /Y /F "%DESTINATION%\%%F" "%HOTDIR%\%%F*"
)
)
robocopy %DESTINATION% %BACKUPDIR% /MIR
SET CONFIRMATION= "%DATE% %TIME% %SOURCE% to %HOTDIR%"
Just wondering if rsync has options to copy only new files from the bucket since the last run.
The solution below may work for your scenario:
Maintain last script run timestamp history in some local file
List the GCS objects (files) sorted by date
gsutil ls -l gs://your-bucket/ | sort -k 2
The above command gives the timestamp at which each object was added, so you can pipe the output and filter the object list against the timestamp stored in the history file.
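A hedged batch sketch of that idea; the bucket name, the C:\PRINT paths, and the lastrun.txt history file are assumptions, and it relies on gsutil ls -l printing the size, an ISO-8601 timestamp, and the object URL on each line, so the timestamps compare correctly as plain strings:
@echo off
setlocal EnableDelayedExpansion
set HISTFILE=C:\PRINT\lastrun.txt
set LASTRUN=1970-01-01T00:00:00Z
if exist "%HISTFILE%" set /p LASTRUN=<"%HISTFILE%"
set NEWEST=%LASTRUN%
REM Copy only the objects whose timestamp is newer than the previous run
for /f "tokens=1,2,3" %%A in ('gsutil ls -l gs://your-bucket/ ^| findstr "gs://"') do (
    if "%%B" GTR "%LASTRUN%" (
        call gsutil cp "%%C" "C:\PRINT\HOTFOLDER"
        if "%%B" GTR "!NEWEST!" set NEWEST=%%B
    )
)
REM Remember the newest object timestamp for the next run
echo !NEWEST!>"%HISTFILE%"
endlocal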
I am running the attached script to back up a PostgreSQL database using Task Scheduler. The script executes successfully, but the backup is not created. When I run the same script in PowerShell it works fine.
I want to schedule a daily backup on Windows Server. Please help me, or suggest an alternative way to automate the backup.
Try this script, it will create a backup file with a name consisting of a timestamp:
First, create a backup.bat file and just run it (set your credentials and database name):
@echo off
echo 'Generate backup file name'
set CUR_YYYY=%date:~10,4%
set CUR_MM=%date:~4,2%
set CUR_DD=%date:~7,2%
set CUR_HH=%time:~0,2%
if %CUR_HH% lss 10 (set CUR_HH=0%time:~1,1%)
set CUR_NN=%time:~3,2%
set CUR_SS=%time:~6,2%
set BACKUP_FILE=%CUR_YYYY%-%CUR_MM%-%CUR_DD%_%CUR_HH%-%CUR_NN%-%CUR_SS%.custom.backup
echo 'Backup path: %BACKUP_FILE%'
echo 'Creating a backup ...'
set PGPASSWORD=strongpas
pg_dump.exe --username="postgres" -d AdventureWorks --format=custom -f "%BACKUP_FILE%"
echo 'Backup successfully created: %BACKUP_FILE%'
As a result, a file with the extension .custom.backup should appear in the directory.
If you get an error that the executable file "pg_dump.exe" was not found, specify the full path to it, something like "C:\Program Files\PostgreSQL\12\bin\pg_dump.exe", or add the PostgreSQL binaries directory to the PATH environment variable (more details here).
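For example, at the top of backup.bat you could prepend the PostgreSQL bin directory to PATH before calling pg_dump.exe (the version 12 install path below is an assumption):
set "PATH=C:\Program Files\PostgreSQL\12\bin;%PATH%"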
To schedule regular execution, you can use the Windows Task Scheduler:
Press win + r and enter taskschd.msc
Select Create Basic Task
Then follow the steps of the wizard, specify the schedule, and in the "Action" section, specify "Start a program" and then the path to the backup.bat file
To check that everything is correct, find the task in the list and select Run
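The same schedule can also be created and then test-run from the command line; a hedged sketch, where the task name, the 01:00 start time, and the backup.bat path are assumptions:
schtasks /Create /TN "PostgresDailyBackup" /TR "C:\Scripts\backup.bat" /SC DAILY /ST 01:00
schtasks /Run /TN "PostgresDailyBackup"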
You can read more about PostgreSQL backups on Windows here.
I am receiving the message:
The process cannot access the file because another process has locked a portion of the file
Cannot open the disk 'C:\Users\t825665\VM's\VPC\Windows 10 x64.vmdk' or one of the snapshot disks it depends on.
Module 'Disk' power on failed.
Failed to start the virtual machine.
So the virtual machine is not starting anymore, how to fix that?
I just found the solution to this issue. I made a backup copy, moved the lock files (*.lck) out of my VM's directory, and then just restarted the virtual machine.
To solve this error, go to the virtual machine's directory and delete everything with an ".lck" extension.
Removing the folders with an .lck extension solved the issue for me.
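A small batch sketch of that cleanup, assuming it is run from inside the VM's folder while the VM is powered off:
@echo off
REM Remove VMware lock files and lock folders left behind in the VM directory
del /q *.lck
for /d %%D in (*.lck) do rd /s /q "%%D"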
I run the batch file below to delete all temporary files, locks, directories, and memory files in the VMware working directory (i.e. Settings/Options/Working Directory). It's got me out of many a jam. You will lose any unsaved work that was in VMware suspended memory, so back up before using it if you're not sure. It will reboot the image as if it had been shut down.
--------------------------Clean.bat ----------------
@echo off
REM - Delete all directories in Working Directory
set dr=%cd%
set ex=\*
set "dr=%dr%%ex%"
for /d %%a in ("%dr%") do rd "%%a" /q /s
REM - Delete files in Working Directory
del *.log
del *.vmem
del *.vmss
del *.nvram
del *.vmx~
pause
Shut down Workstation, delete any *.lck files and folders in the VM folder, then reopen Workstation, load the VM, and power it on.
I am trying to gather files/folders from multiple computers on my network into one centralized folder on the command console (this is the name of the pseudo-server for this set of computers).
Basically, what I need is to collect a certain file from all the computers connected to my network and back it up on the console.
Example:
* data.txt // this is the file that I need to back up; it's located in the same place on every computer
* \\console\users\administrator\desktop\backup\%computername% // I need each computer to create a folder named after itself on the command console's desktop so I can keep track of which files belong to which computer
I was trying to use psexec to do this using the following code:
psexec #cart.txt -u administrator -p <password> cmd /c (^net use \\console /USER:administrator <password> ^& mkdir \\console\users\Administrator\Desktop\backup\%computername% ^& copy c:\data.txt \\console\USERS\Administrator\DESKTOP\backup\%computername%\)
Any other suggestions? I'm having trouble with this command.
Just use the copy command, it's much easier.
Take a look:
for /F %%a in (computerslist.txt) do (
    REM create the destination folder for this computer if it does not exist yet
    if not exist "c:\mycollecteddata\%%a" mkdir "c:\mycollecteddata\%%a"
    copy "\\%%a\c$\users\administrator\desktop\%%a\*.txt" "c:\mycollecteddata\%%a"
)
That will copy all *.txt files from every computer listed in computerslist.txt; the copy runs with the current credentials. Save the code in a *.cmd file and execute it as the right user; you can create a scheduled task that starts with a user that is common to all the computers.
Good work.
I want to know if there is any way for me to deploy a stored procedure (.sql file) that is checked in to VSS. I have a couple of procedures that I want to deploy to SQL Server, and I'm trying to create a batch file to deploy them from VSS to SQL Server.
I want to achieve this because we would like to remove direct access to SQL Server, so that every step we take on the procedures can be monitored.
Thanks!
EDIT:
I have also read that it's possible in PowerShell. If anyone can point me to a good way to do it, that would be appreciated so much! I'm new to VSS, batch files, and PowerShell, so I'm a little bit confused about where to start. Thanks!
This is what I have so far. But it doesn't work.
#echo off
cls
set path=C:\Program Files\Microsoft Visual SourceSafe
set ssdir=\\MySampel_VSS\VSS\SampleDB
set Recursive = Yes
set /p SName=Server Name :
set /p UName=User Name :
set /p Pwd=Password :
set /p DbName=Database Name :
set /p choice=ARE YOU SURE TO EXECUTE SCRIPTS in %DbName% (y/n) ?
if '%choice%'=='y' goto begin
goto end
:begin
if exist C:\Scripts\error.txt del C:\Scripts\error.txt
#echo on
sqlcmd -S %SName% -U %UName% -P %Pwd% -d %DbName% -I -i $/Database/SampleDB/Procedures/MySample.sql >> error.txt 2>&1
#notepad error.txt
:end
You need to use the Visual SourceSafe command line in a batch file to deploy a project that is checked in. Take a look at the Checkout and Deploy commands in particular.
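A hedged sketch of that approach, assuming ss.exe (the VSS command-line client) is available and reusing the VSS paths from the question; the server name, credentials, database, and local working folder are placeholders:
@echo off
REM Make the VSS command-line client available and point it at the VSS database
set PATH=%PATH%;C:\Program Files\Microsoft Visual SourceSafe
set SSDIR=\\MySampel_VSS\VSS\SampleDB
REM Fetch the latest checked-in copy of the script into a local working folder
ss Get "$/Database/SampleDB/Procedures/MySample.sql" -GLC:\Scripts -I-
REM Run the local copy against SQL Server, logging output and errors
sqlcmd -S MyServer -U MyUser -P MyPassword -d MyDatabase -i "C:\Scripts\MySample.sql" >> C:\Scripts\error.txt 2>&1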