I have a requirement to monitor for a particular Java string in a log file whose name changes with the date, e.g. D:\Logs\logfile-DDMMYYYY.log. I want to get an alert if this string is detected in the log. Can someone help me with that?
Regards,
A script is the best way to accomplish that. If it is a Linux machine you can write a Perl or Bash script; if you need to monitor a Windows machine you can use VBScript.
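On Windows, here is a minimal PowerShell sketch as an alternative to VBScript (the date format and path follow the question's example; the search string and the alert action are assumptions):

# Minimal sketch: build today's log file name and alert if the string appears.
$today   = Get-Date -Format "ddMMyyyy"
$logFile = "D:\Logs\logfile-$today.log"
$pattern = "OutOfMemoryError"   # hypothetical Java string to watch for
if (Test-Path $logFile) {
    $hits = Select-String -Path $logFile -Pattern $pattern -SimpleMatch
    if ($hits) {
        # Replace this with a real alert (email, event log entry, etc.)
        Write-Warning "Found '$pattern' in $logFile ($(@($hits).Count) occurrence(s))"
    }
}

Run it on a schedule (Task Scheduler, or a loop with Start-Sleep) to get continuous monitoring.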
If you prefer not to deal with scripts, you can use NiCE's excellent & free Log File Monitoring MP: http://www.nice.de/log-file-monitoring-scom-nice-logfile-mp/ But I'm not sure whether their MP supports wildcard log file names.
SCOM can monitor log files dynamically. You should monitor the log directory D:\Logs with the log file pattern logfile-????????.log.
I need to read and write (update) a file on a remote machine. I am able to find the remote file using WMI (System.Management), but I am not able to read or update it.
Any help would be appreciated.
Thanks
Himanshu
WMI doesn't have any class (or method) to read or write the contents of files. You may only retrieve the metadata (FileName, Date, Size) of files using CIM_DataFile, or do tasks like Copy, Rename, Delete, or Compress files.
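As a hedged PowerShell illustration of that metadata-only access (the machine name and path are hypothetical; note the doubled backslashes required in the WQL filter):

# Query file metadata remotely via CIM_DataFile; WMI exposes properties only,
# not the file contents.
Get-WmiObject -Class CIM_DataFile -ComputerName "server01" `
    -Filter "Name = 'C:\\Temp\\example.txt'" |
    Select-Object FileName, Extension, FileSize, LastModified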
RRUZ is correct: WMI cannot copy or create files over a network. This is because it would require credential "hopping":
http://msdn.microsoft.com/en-us/library/windows/desktop/aa389288%28v=vs.85%29.aspx
However, a workaround was recently created by Stackoverflow.com user Frank White in C#, and the WMI logic ports directly to VBS. Here's his solution:
WMI remote process to copy file
I ported it to a fully working VBScript:
https://stackoverflow.com/a/11948096/1569434
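The core trick, in a hedged PowerShell sketch (the host and paths are hypothetical), is to start a process on the remote machine that performs a purely local copy, so no second credential hop is required:

# Run the copy on the remote host itself via Win32_Process.Create;
# both paths are local to server01, so no credential hop occurs.
Invoke-WmiMethod -ComputerName "server01" -Class Win32_Process -Name Create `
    -ArgumentList 'cmd.exe /c copy C:\Temp\source.txt D:\Backup\source.txt'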
First check your file access permissions and set the user "Everyone" to Full Control, then try again.
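For example, from an elevated command prompt (the path is hypothetical):

# Grant Everyone full control on the file, then retry the operation
icacls "C:\Temp\example.txt" /grant "Everyone:(F)"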
I have two drives, A and B. Using a Python script I am creating some files on drive A, and I am running a PowerShell script which copies all the files from drive A to drive B at an interval of 1 second.
I am getting this error in PowerShell:
2015/03/10 23:55:35 ERROR 32 (0x00000020) Time-Stamping Destination File \\x.x.x.x\share1\source\Dummy_100.txt
The process cannot access the file because it is being used by another process. Waiting 30 seconds...
How will I overcome this error?
This happens because the file is locked by a running process. To fix it, download Process Explorer, then use Find > Find Handle or DLL to find out which process has the file locked. Use 'taskkill' to kill that process from the command line. You will be fine.
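For example, once Process Explorer shows the owning process (the PID below is hypothetical):

# Forcefully end the process holding the file handle
taskkill /PID 1234 /F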
If you want to limit the retries on such files, you can use /R:n, where n is the number of retries, together with /W:n, where n is the wait time in seconds between retries.
For example, /W:3 /R:5 will retry 5 times, waiting 3 seconds between attempts.
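A hedged example invocation (the source and destination paths are hypothetical, modeled on the question):

# Retry locked files 5 times, waiting 3 seconds between attempts
robocopy "A:\source" "\\x.x.x.x\share1\dest" /R:5 /W:3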
How will I overcome this error?
If backup is what you have in mind and you encounter in-use files frequently, look into Volume Shadow Copies (VSS), which allow files to be copied despite being 'in use'. It's not a product, but a Windows technology used by various backup tools.
Sadly, VSS support is not built into robocopy, but the two can be used in conjunction. See
➝ https://superuser.com/a/602833/75914
and especially:
➝ https://github.com/candera/shadowspawn
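Usage is roughly as follows (a sketch based on the ShadowSpawn README; the drive letter and paths are assumptions): ShadowSpawn mounts a shadow copy of the source at a temporary drive letter, runs the given command against it, then cleans up.

# Mount a shadow copy of D:\Logs at Q:, then copy from the snapshot;
# in-use files can be read from the snapshot even while locked
shadowspawn D:\Logs Q: robocopy Q:\ \\x.x.x.x\share1\dest /MIR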
There could be many reasons.
In my case, I was running a CMD script to copy a heap of SQL Server backups and transaction logs from one server to another. I had the same problem because the script was trying to write to a log file that was supposedly opened by another process. It was not.
I ran so many IP checks and process ID checkers that I ran out of ways to find what was hogging the log file. Event Viewer said nothing.
I found out it was not even the log file that was being locked. I was able to delete it by logging into the server as a normal user with no admin privileges!
It was the backup files themselves, locked by the SQL Server Agent. As @Oseack said, another tool may be needed while the backup files are still being used or locked by the SQL Server Agent.
The way I got around it was to force ROBOCOPY to wait.
/W:5
did it.
One of the drives on my server recently gave out and corrupted the OS. I was able to restore all the files, but now I have a backup drive with just the file system, not bootable. I'm setting up a new server now and need to set up the old cron jobs. Is there a way to look through the file structure to see all the cron jobs that were set up on the old server? The server was CentOS, not sure of the version. Thanks in advance!
Crontabs belonging to individual users should be found in
/var/spool/cron/##USERNAME##
Whereas the server-wide crontab should be in
/etc/crontab
Is there a way to programmatically retrieve start-up time/duration for all the Windows Services that have started during boot in Windows XP?
That is, the time the service was initialized to getting to the "started" state.
Thank you in advance!
P.S. I'm not asking for software recommendation.
I would start by looking at the logs in Event Viewer to check whether the information you want is there.
If it is, then use Microsoft's EventLog class to read the log you want.
For example, if you're interested in the System log, use:
using System.Diagnostics; // EventLog lives in System.Diagnostics
EventLog systemLog = new EventLog("System");
systemLog.Entries then contains a collection of all entries in the System log. It should be easy from there.
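If PowerShell (v2 or later) is an option, the same information can be pulled directly from the System log; here is a hedged sketch relying on the fact that the Service Control Manager writes event ID 7036 when a service enters the running state:

# List when each service reached the "running" state according to the
# Service Control Manager (event ID 7036 in the System log)
Get-EventLog -LogName System -Source "Service Control Manager" |
    Where-Object { $_.EventID -eq 7036 -and $_.Message -match "running state" } |
    Sort-Object TimeGenerated |
    Select-Object TimeGenerated, Message

Note that this gives start times, not durations; durations would have to be derived by other means.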
I've developed a PowerShell script to deploy updates to a suite of applications, including SQL Server database updates.
Next I need a way to execute these scripts on 100+ servers without manually connecting to each server. "PowerShell v2 with remoting" is not an option, as it is still in CTP.
PowerShell v1 with WinRM looks the most promising, but I can't get feedback from my scripts. The scripts execute, but I need to know about exceptions. The scripts create a log file; is there a way to send the contents of the log file back to the "client" (the local computer making the remote calls)?
The quick answer is no. The long version: it's possible, but will involve lots of hacks. I developed a very similar deployment script/system using PowerShell 2 last year. The remoting feature is the primary reason we put up with the CTP status. PowerShell 1 with WinRM is flaky at best and, as you said, gives no real feedback apart from OK or failed.
Alternatives I considered included PsExec, which is very much non-standard and may be blocked by firewalls. The other approach involves system management tools such as Microsoft's System Center, but that's a big hammer for a tiny nail. So you have to pick your poison...
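For reference, a hedged PsExec example (the server, credentials, and script path are hypothetical, and the script is assumed to already exist on the remote machine; -Command is used because PowerShell v1 lacks a -File parameter):

# Run the deployment script on the remote host under explicit credentials
psexec \\server01 -u DOMAIN\deploy -p secret powershell.exe -Command "& 'C:\deploy\update.ps1'"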
Just a comment on this: the easiest way to capture PowerShell output is to use the Start-Transcript cmdlet to pipe console output to a file. We have a small snippet at the start of all our scripts that sends a log file with the console output from each script to a central file share, and names the log file with the script name and execution date so that we have an idea of what happened. It's not too hard to pipe all those log files into a database for further processing either. It probably won't solve all your problems, but it would definitely help with the "getting data back" part.
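A minimal sketch of that pattern (the share path is hypothetical, and the try/finally syntax assumes PowerShell v2):

# Name the transcript after the script and a timestamp, write it to a central share
$logName = "{0}-{1}.log" -f $MyInvocation.MyCommand.Name, (Get-Date -Format "yyyyMMdd-HHmmss")
Start-Transcript -Path (Join-Path "\\fileserver\deploylogs" $logName)
try {
    # ... deployment work goes here ...
}
finally {
    Stop-Transcript
}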
best regards,
Trond