Running sdbinst on a .sdb in a network share location - command-line

I want to run the sdbinst command on a .sdb database file as well as open it in the Compatibility Administrator. I have no problem doing this locally when the .sdb is stored on the machine I'm using, but I'd like to be able to open it and run sdbinst on it when the file is stored on a network share.
Is this possible?

Yes, according to the MS Help files within the MS Compatibility Toolkit.
See: "Mitigating Issues by using Compatibility Fixes". There is an example of a network deployment workflow: "Deploying the Contoso.sdb Database to your environment".
The basic pattern is to place the .sdb on a network share, create a one-line deployment script that references a path to that share (sdbinst "\\SomePath\Ex.sdb" -q), and then push or execute the deployment script on each target computer in your environment.
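For example, the deployment script could be a small batch file along these lines; the share path and database name are placeholders for your own environment:
REM deploy-sdb.cmd - installs the compatibility database quietly from the network share
sdbinst "\\FileServer\AppCompat\Contoso.sdb" -q
if %ERRORLEVEL% neq 0 echo sdbinst failed with exit code %ERRORLEVEL%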

Related

How to update files whenever script is scheduled to run in Heroku app

I have a simple python script that is hosted on Heroku and I'm using the Heroku Scheduler to run the script every hour/day. The script will possibly update a simple .txt file (could also be a config var if possible) when it runs. When it does run and conditions are met, I need that value stored and used when the next scheduled script runs. The value changed is simply a date.
However, since the app is containerized based on the most recent code I have on GitHub, it doesn't store those changes anywhere to be used again. Is there any way I can update the file and reuse it every time the script runs? Any simple add-ons or other solutions I can use?
Heroku dynos have an ephemeral local file system that does not survive an application restart or redeployment, so it cannot be used to persist data.
Typically you have two options:
use a database. On Heroku you can use Postgres (there is also a free tier)
save the file on external storage (S3, Dropbox, even GitHub); a minimal sketch of this approach follows below. See Files on Heroku for details and examples
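As one example of the external-storage option, the Heroku Scheduler could run a small wrapper instead of the Python script directly. This is only a sketch: the bucket my-app-state, the file last_run.txt, and script.py are placeholders, and it assumes the awscli package is installed (for example via requirements.txt) with AWS credentials set as config vars.
#!/bin/sh
# run_with_state.sh - hypothetical wrapper invoked by the Heroku Scheduler
aws s3 cp s3://my-app-state/last_run.txt last_run.txt || true   # pull the previous value if it exists
python script.py                                                # the script reads/updates last_run.txt locally
aws s3 cp last_run.txt s3://my-app-state/last_run.txt           # push the updated value for the next run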

Unable to configure TFS backup using Backup wizard

When trying to configure the TFS 2010 backup using the TFS Power Tools, I kept running into the following error message:
Account TFS\tfsadmin failed to create backups using path \\tfs-xxxxxxx.local\TFSBackups
The strange thing is that TFS\TFSAdmin has full permissions on both the share and the file system, and the share path doesn't contain any spaces (thanks to the MSDN forums for pointing that out).
I tried backing up through SQL Server Management Studio, and sure enough, the backups failed there too.
It turns out that while the backup job is started using the account specified in the Create Backup Wizard of the TFS Power Tools, SQL Server will try to write the files to the share using its own service account.
So in addition to whoever needs access to the share, you need to grant the service account running SQL Server access to that share as well. In this case it was running as NETWORK SERVICE, so adding MACHINENAME$ to the share's list of permitted users did wonders.
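For example, on the machine hosting the share you could grant the SQL Server machine account both share and NTFS permissions from an elevated prompt. This is only a sketch: TFS\MACHINENAME$ stands for the SQL Server's computer account, and D:\TFSBackups for the folder behind the \\tfs-xxxxxxx.local\TFSBackups share.
REM grant the machine account full access when (re)creating the share
net share TFSBackups=D:\TFSBackups /GRANT:TFS\MACHINENAME$,FULL
REM grant modify rights on the underlying NTFS folder as well
icacls D:\TFSBackups /grant "TFS\MACHINENAME$":(OI)(CI)M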

Creating Powershell script for VMWare using veeam

I am trying to create a script to automate the Veeam backup using PowerShell.
I know in the free version I only have 2 options (VeeamZIP and Quick Backup).
I have a Drobo on the network with the share setup and accessible.
I have gone into all the VMWare Hypervisors and created an account with the proper permissions to run a backup.
I am down to creating the syntax for running the backup.
I am confused when I look at their documentation. I am not sure whether I am supposed to use a copy job, replication, a backup job, or something else.
If I can get the initial syntax to run a backup of one machine I know I can build the script.
Any information would be greatly appreciated.
This cannot be done with this version of VMware.

Recover cron jobs through file structure

One of the drives on my server recently gave out and corrupted the OS. I was able to restore all the files, but now I have a backup drive with just the file system; it's not bootable. I'm setting up a new server now and need to set up the old cron jobs. Is there a way to look through the file structure to see all the cron jobs that were set up on the old server? The server was CentOS, not sure of the version. Thanks in advance!
Crontabs belonging to individual users should be found in
/var/spool/cron/##USERNAME##
Whereas the server-wide crontab should be in
/etc/crontab
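For example, with the recovered file system mounted at /mnt/backup (an example mount point), you could inspect and restore the crontabs like this:
ls /mnt/backup/var/spool/cron/                            # one crontab file per user
cat /mnt/backup/etc/crontab                               # the system-wide crontab
ls /mnt/backup/etc/cron.d/                                # package-installed drop-in jobs may live here too
crontab -u someuser /mnt/backup/var/spool/cron/someuser   # reinstall a user's crontab on the new server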

Jenkins windows slave service does not interact with desktop

I have followed this guide to install a Jenkins slave on Windows 8 as a service:
https://wiki.jenkins-ci.org/display/JENKINS/Installing+Jenkins+as+a+Windows+service#InstallingJenkinsasaWindowsservice-InstallSlaveasaWindowsservice%28require.NET2.0framework%29
I need to run a job that interacts with the desktop (it runs an application that opens a browser, etc.). So after installing the slave as a service (running the JNLP downloaded from the master), I changed the service "Log On" setting to allow it to interact with the desktop.
For some reason it's only possible to enable this for the "Local System account", even though it's recommended to run the service as a specific user, e.g. jenkins.
But nothing happens when I execute the job; the browser is not opened. If I instead stop the service and just launch the slave through the JNLP file, the job runs fine and the browser is opened.
Anybody had any luck interacting with the desktop when running a jenkins windows slave as a service?
Since Vista, services run in Session 0 while the first logged-on user is in Session 1, so a service can no longer interact with the desktop. This is called Session 0 Isolation.
Microsoft explains this here and here. You have to use a second program that uses IPC to communicate with the service.
I had lots of issues running Jenkins on Windows as a service.
Instead I now disable the service and run it from CMD.
So open CMD.
cd C:\Program Files (x86)\Jenkins
java -Xrs -Xmx256m -Dhudson.lifecycle=hudson.lifecycle.WindowsServiceLifecycle -jar jenkins.war --httpPort=9091
To resolve it, first create Windows auto-logon as I explain here:
https://serverfault.com/questions/269832/windows-server-2008-automatic-user-logon-on-power-on/606130#606130
Then create a startup batch file for the Jenkins agent (place it in the Jenkins directory). This will launch the agent console on the desktop and should allow Jenkins to interact with the Windows GUI:
java -jar slave.jar -jnlpUrl http://{Your Jenkins Server}:8080/computer/{Your Jenkins Node}/slave-agent.jnlp
(slave.jar can be downloaded from http://{Your Jenkins Server}:8080/jnlpJars/slave.jar)
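For reference, a minimal startup batch file could look like the sketch below; C:\Jenkins is a placeholder for the agent's working directory, and the idea is to place the batch file (or a shortcut to it) in the auto-logon user's Startup folder so the agent launches on the interactive desktop after logon:
REM start-jenkins-agent.cmd - launches the agent in a console on the desktop
cd /d "C:\Jenkins"
java -jar slave.jar -jnlpUrl http://{Your Jenkins Server}:8080/computer/{Your Jenkins Node}/slave-agent.jnlp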
EDIT:
If you're getting black screenshots (when using Selenium or Sikuli, for example), create a batch file that disconnects Remote Desktop, instead of closing the RDP session with the regular X button:
%windir%\system32\tscon.exe %SESSIONNAME% /dest:console
Consider running the Java slave server directly at startup and then using something to monitor it and restart it should it go down (e.g., Kiwi Restarter).
Please check the services on the test node and make sure the "Interactive Services Detection" service is started. By default its startup type is set to Manual; you may want to set it to Automatic as well.
After the service has started, when you run your test on the test node you will see an Interactive Services Detection prompt appear.
Click on it and choose to view the message.
You will see the activity happening there. Hope this helps :D
Note: if you log in with another account and cannot see the Interactive Services Detection prompt, restart the service again.
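If you prefer the command line, the service can be started and set to Automatic from an elevated prompt. This assumes the service's short name is UI0Detect, which is what Interactive Services Detection is registered as on these versions of Windows:
REM set the startup type to Automatic (the space after start= is required by sc)
sc config UI0Detect start= auto
REM start the service now
sc start UI0Detect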
My Jenkins service runs as the user "jenkins", and all I did was create Desktop folders in C:\Windows\System32\config\systemprofile\Desktop and, on 64-bit Windows, also in C:\Windows\SysWOW64\config\systemprofile\Desktop. After that it runs perfectly.
Make sure that the Desktop folders are created as such:
%SystemRoot%\System32\config\systemprofile\Desktop
%SystemRoot%\SysWOW64\config\systemprofile\Desktop
The presence of these folders can sometimes be mandatory when running some Java software as a service.
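A quick way to create both folders is from an elevated command prompt, for example:
mkdir "%SystemRoot%\System32\config\systemprofile\Desktop"
mkdir "%SystemRoot%\SysWOW64\config\systemprofile\Desktop"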