SqlCompare command line remote server - redgate

I am trying to use the sqlcompare command-line utility to compare a local database with a remote database, but I am having difficulty.
I would like to specify a connection string, or some other way to connect to the remote server (the destination).
Here is the command I have so far:
sqlcompare /Database1:RootDev /Database2:RootProd /scriptFile:"_build\changes.sql" /f

You'll want something along the lines of the command below, using the /Server1 and /Server2 switches to specify the servers.
sqlcompare /Server1:local\SQL2008 /Database1:RootDev /Server2:remote\SQL2008 /Database2:RootProd /scriptFile:"_build\changes.sql"
Full details are in the docs.
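If you are driving this from a build script and need SQL authentication for the remote server, a small helper can assemble the argument list. This is a minimal Python sketch; /Username2 and /Password2 are SQL Compare's switches for SQL authentication (check your version's docs), and all server, database, and credential names below are examples:

```python
# Build the sqlcompare argument list; SQL-auth switches are optional.
def build_sqlcompare_args(server1, db1, server2, db2, script_file,
                          username2=None, password2=None):
    """Assemble the sqlcompare command as a list of arguments."""
    args = [
        "sqlcompare",
        f"/Server1:{server1}", f"/Database1:{db1}",
        f"/Server2:{server2}", f"/Database2:{db2}",
        f"/scriptFile:{script_file}",
    ]
    if username2 is not None:
        args += [f"/Username2:{username2}", f"/Password2:{password2}"]
    return args

cmd = build_sqlcompare_args(r"local\SQL2008", "RootDev",
                            r"remote\SQL2008", "RootProd",
                            r"_build\changes.sql",
                            username2="deploy", password2="secret")
# then e.g. subprocess.run(cmd, check=True)
```

Passing the arguments as a list (rather than one string) avoids quoting problems when a password or path contains spaces.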

Related

How can I use CLI to start a service on multiple remote PCs?

Due to the constraints of my environment, I am unable to use PowerShell to start this service and need to use the CLI. I need to be able to start a single service on approx. 100 workstations.
I have been able to start the service remotely on a single PC, but I would like the command to read from a txt list of computers.
This is the command that works on a single workstation:
sc \\remotepc start [service]
I have tried this command, but it does not start the service on the remote workstations:
for /f %H in (D:\Server\workstationlist.txt) do net start [service]
I was able to find a few ways to execute these commands:
for /f %i in (computerlist.txt) do sc \\%i start winrm
computerlist.txt is a text file containing the names of the remote computers with one name per line. You can also replace this with a comma-separated list of computer names.
for %i in (Computer1,Computer2,Computer3) do sc \\%i start winrm
Here a plain for (without /f) does the job: cmd treats commas in the set as item separators, so %i is assigned each computer name in turn. (In a batch file, double the percent signs: %%i.)
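If the one-liner gets unwieldy, the same loop over the list file can be sketched in Python, which makes it easier to add logging or error handling later. The list-file path and the winrm service name mirror the thread's examples:

```python
# Build one 'sc \\host start <service>' command per non-empty line of the file.
def start_service_commands(listfile, service="winrm"):
    """Yield an sc argument list for each host named in listfile."""
    with open(listfile) as f:
        for line in f:
            host = line.strip()
            if host:
                yield ["sc", rf"\\{host}", "start", service]

# To actually run them (Windows only):
# import subprocess
# for cmd in start_service_commands(r"D:\Server\workstationlist.txt"):
#     subprocess.run(cmd, check=False)  # check=False: keep going on failures
```

Blank lines in the file are skipped, so a trailing newline will not generate a bogus command.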

Change ODBC in command line (Win. 7)

I have a set of ODBC data sources that have to be changed each time I move a computer from one domain to the other. I have a bunch of them to move, so I am looking for a way to do this from the command line to save time. If you know any faster way of doing this, I am open to that as well.
You can change it from cmd as below:
odbcconf configsysdsn "MySQL ODBC 5.2w Driver" "DSN=test;SERVER=127.0.0.1;PORT=3306;DATABASE=mdb;UID=root"
Reference: Install An ODBC connection from cmd line
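Since the same DSNs have to be rewritten per domain, it may help to build the odbcconf attribute string from a table of settings. A minimal Python sketch, assuming the driver name and attributes from the example above:

```python
# Assemble the odbcconf arguments for creating/updating a system DSN.
def configsysdsn_cmd(driver, attrs):
    """Return odbcconf arguments; attrs is an ordered dict of DSN settings."""
    attr_str = ";".join(f"{k}={v}" for k, v in attrs.items())
    return ["odbcconf", "configsysdsn", driver, attr_str]

cmd = configsysdsn_cmd("MySQL ODBC 5.2w Driver", {
    "DSN": "test", "SERVER": "127.0.0.1", "PORT": "3306",
    "DATABASE": "mdb", "UID": "root",
})
# then e.g. subprocess.run(cmd, check=True) on each machine
```

Keeping the per-domain values in one dict per domain means only the data changes when you move a machine, not the command.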

Using Netsh with PsExec

I'm trying to dump DHCP settings from an older server that's being decommissioned. I ran the command fine while on the actual server, but when I try to run it remotely using psexec, it keeps failing. The command is: psexec \\$source netsh dhcp server \\$source dump>$dhcpSettings
$source = the server being decommissioned
$dhcpSettings = the path to save the dumped settings
I've tried all sorts of combinations of encapsulating quotation marks, but still nothing. The errors I'm getting are "The system cannot find the file specified" and "The system cannot find the path specified".
EDIT: So I got rid of the path to save the dumped settings and now it works. But how should I format the command so that it'll save the settings to the remote computer's C:\USER.SET\LOG directory?
One solution might be to bundle the command you want to run and the stdout redirection into a one-line cmd file, and use PsExec with -c or -f to copy and execute that file on the remote system. As an example:
Create a one-line cmd file named DHCPSettings.cmd with the following in it and save it to C:\temp\:
netsh dhcp server \\localhost dump >c:\user.set\log\dhcpsetting.log
Then use
psexec \\$source -c c:\temp\DHCPSettings.cmd
You did not really provide any code to go by, and I am not sure I understand all the requirements and constraints you have, so consider this an idea, not the exact commands you need to run. Hope it helps.
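The two steps above (write the one-line cmd file, then copy-and-run it with -c) can also be scripted, for example from Python. Paths and names mirror the example and are assumptions, with $source standing in for the server name:

```python
from pathlib import Path

# Step 1: write the one-line batch file that dumps DHCP settings locally,
# redirecting into the remote machine's own C:\USER.SET\LOG directory.
def make_dhcp_dump_cmd(path, log=r"c:\user.set\log\dhcpsetting.log"):
    Path(path).write_text(rf"netsh dhcp server \\localhost dump >{log}" + "\n")

# Step 2: build the PsExec invocation that copies and runs that file remotely.
def psexec_args(source, cmdfile=r"c:\temp\DHCPSettings.cmd"):
    return ["psexec", rf"\\{source}", "-c", cmdfile]
```

Because the redirection lives inside the cmd file, it is interpreted on the remote machine, which is what sidesteps the quoting problem in the original one-liner.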

Need to list the file names in remote server using sftpc

I need to list the file names on a remote server using the sftpc command-line client.
sftpc -profile=remote_server_profile.tlp -cmd=ls location
This command does not produce any output. Does anyone know what might cause this?

Is it possible to have Perl run shell script aliases?

Is it possible to have a Perl script run shell aliases? I am running into a situation where we've got a Perl module I don't have access to modify, and one of the things it does is log into multiple servers via SSH to run some commands remotely. Sadly, some of the systems (which I also don't have access to modify) have a buggy SSH server that disconnects as soon as my system tries to send an SSH public key. I have the SSH agent running because I need it to connect to some other servers.
My initial solution was to set up an alias to set ssh to ssh -o PubkeyAuthentication=no, but Perl runs the ssh binary it finds in the PATH instead of trying to use the alias.
It looks like the only solutions are to disable the SSH agent while I am connecting to the problem servers, or to override the Perl module that does the actual connection.
Perhaps you could put a command called ssh in the PATH ahead of the real ssh, one which runs ssh the way you want it run.
Alter the PATH before you run the Perl script, or use this in your .ssh/config:
Host *
    PubkeyAuthentication no
Why don't you skip the alias and just create a shell script called ssh in a directory somewhere, then change the path to put that directory before the one containing the real ssh?
I had to do this recently with iostat because the new version output a different format that a third-party product couldn't handle (it scanned the output to generate a report).
I just created an iostat shell script which called the real iostat (with hardcoded path, but you could be more sophisticated), passing the output through an awk script to massage it into the original format. Then, I changed the path for the third-party program and it started working fine.
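The wrapper-script idea above can be automated, for instance from a small Python helper that writes the wrapper and returns the PATH to use when launching the Perl script. This is a sketch; the wrapper directory is whatever you choose, and /usr/bin/ssh is an assumed location for the real binary:

```python
import os
from pathlib import Path

def install_ssh_wrapper(bindir, real_ssh="/usr/bin/ssh"):
    """Create bindir/ssh that forces password auth; return the adjusted PATH."""
    wrapper = Path(bindir) / "ssh"
    wrapper.write_text(
        "#!/bin/sh\n"
        # exec replaces the wrapper process; "$@" forwards all arguments intact
        f'exec {real_ssh} -o PubkeyAuthentication=no "$@"\n'
    )
    wrapper.chmod(0o755)  # make it executable
    return f"{bindir}{os.pathsep}{os.environ.get('PATH', '')}"
```

Launching the Perl script with the returned PATH in its environment makes Perl's plain `ssh` lookup find the wrapper first, without touching the Perl module itself.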
You could declare a function with that name in your .bashrc (or .profile or whatever). It could look like this:
function ssh {
    /usr/bin/ssh -o PubkeyAuthentication=no "$@"
}
But using a config file might be the best solution in your case.