iMacros: check if proxy is working

I have a huge iMacros script to run. I added a header that connects through a proxy taken from a list, and I'm trying to do some error handling. I need a way to tell iMacros whether the current proxy is working or not. If it's working, continue and execute the script; if it's not, skip to the next loop iteration so we can start over and try another proxy.
Here's what I have so far:
SET !ERRORIGNORE YES
TAB T=1
CLEAR
SET !FILESTOPWATCH mydata.csv
STOPWATCH ID=total
SET !DATASOURCE C:\proxies.csv
SET !DATASOURCE_COLUMNS 1
SET !LOOP 1
SET !DATASOURCE_LINE {{!LOOP}}
SET !TIMEOUT 60
PROXY ADDRESS={{!COL1}} BYPASS=127.0.0.1
URL GOTO=http://www.mywebsite.com/test.php
The last line connects to mywebsite.com and displays the page test.php, which is just a blank page with the word "working". If we can see the word "working" before the 60-second timeout, then we can continue with the rest of the script. But if we can't get the word "working" before 60 seconds, I need to exit the current loop iteration and switch to the next one, and this is what I can't figure out how to do.

First of all, try this script (for the Firefox iMacros extension!) and catch the idea:
var ret = iimPlayCode("URL GOTO=http://www.mywebsite.com/test.php");
if (ret == 1)
    alert("Page is found!");
else
    alert("Page is not found!");

How to use WebUI.getUrl().contains('atlassian') with timeout value

I have a piece of code that has a 5-second delay followed by getUrl. If I don't delay the execution, getUrl returns false since the site hasn't loaded yet.
WebUI.delay(5)
assert WebUI.getUrl().contains('atlassian')
On the website, there is a div which leads to another window when clicked. This code checks whether the opened page is an Atlassian webpage. However, I don't want to delay for 5 seconds (it may take much longer or shorter). Is there a way to set a timeout, for instance wait up to 1 minute for the page to load and, if it doesn't load, fail the execution?
Try waiting for the page to load:
WebUI.waitForPageLoad(5, FailureHandling.STOP_ON_FAILURE)
assert WebUI.getUrl().contains('atlassian')
This will wait 5 seconds for the page to load and stop execution with the test failed if the page isn't loaded in that time.
Alternatively, you could use WebUI.waitForElementPresent(to, timeout), where to is a test object you are certain is present once the page has loaded.

How to call a controller task at 1 minute intervals

I have created a task in a controller, and it contains a loop that runs 100 times.
Now I want it to run 25 times, pause the loop for 1 minute, then execute the next 25 items, and so on for each batch of 25.
I have tried this with sleep but it's not working.
Can you please advise whether there is a way to do this with a plugin event or any other method?
Thanks
This is actually unrelated to Joomla! Since you're creating a long-running process, you need to start it with something other than a browser. A CRON job is a good idea here if you want to execute this operation multiple times; otherwise it can run via the command line. Make sure the max_execution_time setting of PHP does not cause any trouble.
If you still need this within Joomla please have a look at the CLI documentation.
https://docs.joomla.org/How_to_create_a_stand-alone_application_using_the_Joomla!_Platform

Reload page if 'not available'?

I have a standalone Raspberry Pi which shows a webpage from another server.
It reloads after 30 minutes via JavaScript on the webpage.
In some cases, the server isn't reachable for a very short time and Chromium shows the usual "This webpage is not available" message and stops reloading
(because no JavaScript from the page triggers a reload).
In this case, how can I still reload the webpage after a few seconds?
My idea was to fetch the website contents via AJAX and replace them in the current page once they are available.
Rather than refreshing the webpage every few minutes, what you can do is ping the server using JavaScript (pingjs is a nice library that can do that).
Now, if the ping is successful, reload the page. If it is not successful, wait for 30 more seconds and ping it again. Doing this continuously will basically make you wait until the server is up again (i.e. you can ping it).
I think this is a much simpler method compared to making your own Java browser or writing a browser plugin.
Extra info: you should use an exponential back-off to avoid unnecessary processing overhead, i.e. the first time you find the ping fails, wait 30 seconds; the second time wait 30*(2^1) seconds, the third time 30*(2^2), and so on until you reach a maximum value.
Note - this assumes your server is really unreachable, and not just that the HTML page is unavailable (there's a small but appreciable difference).
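As a rough illustration of that ping-then-reload idea (including the exponential back-off), here is a sketch that probes the page with a plain XMLHttpRequest instead of pingjs; the HEAD request, the timings and the status check are assumptions you can tune.
// Sketch only: probe the server first, and reload only once it answers again.
var reloadInterval = 30 * 60 * 1000;            // normal reload cycle: 30 minutes
var retryDelay     = 30 * 1000;                 // first retry after 30 seconds
var maxRetryDelay  = 8 * 60 * 1000;             // cap the back-off at 8 minutes

function probe() {
    var xhr = new XMLHttpRequest();
    xhr.open('HEAD', window.location.href);
    xhr.timeout = 10 * 1000;                    // give the server 10 seconds to answer
    xhr.onload = function() {
        if (xhr.status === 200)
            window.location.reload();           // server is reachable again: safe to reload
        else
            retryLater();
    };
    xhr.onerror = xhr.ontimeout = retryLater;
    xhr.send();
}

function retryLater() {
    setTimeout(probe, retryDelay);
    retryDelay = Math.min(retryDelay * 2, maxRetryDelay);   // 30s, 60s, 120s, ... up to the cap
}

setTimeout(probe, reloadInterval);              // replaces the plain 30-minute reload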
My favored approach would be to copy the web page locally with a script every 30 minutes and point Chromium to the local copy.
The advantage is that the script can run every 30 seconds and check whether a successful page pull happened in the last 30 minutes. If YES, it does nothing; if NO, it keeps attempting to pull the page. In the meantime the browser is set to refresh the page every 5 seconds, but because it is pulling a local page it does little to no work for each refresh. You can then detect whether what it has pulled back has the required content in it.
This approach assumes that your goal is to avoid refreshing the remote page every few seconds and therefore to reduce load on the remote server.
Use these options to grab the whole page....
# exit if age of last reload is less than 1800 seconds (30 minutes)
AGE_IN_SECS=$(( $( perl -e 'print time();' ) - $(stat -c "%Y" /success/directory/index.html) ))
[[ $AGE_IN_SECS -lt 1800 ]] && exit
# copy whole page to the current directory
cd /temporary/directory
wget -p -k http://www.example.com/
and then you need to test the page in some way to ensure you have what you need, for example (using bash script)....
RESULT=$(grep -ci "REQUIRED_PATTERN_MATCH" expected_file_name)
[[ $RESULT -gt 0 ]] && cp -r /temporary/directory/* /success/directory
rm -rf /temporary/directory/*
NOTE:
This is only the bare bones of what you need, as I don't know the specifics of your setup. But you should also look at trying to...
ensure you have a timeout on the wget, so that you do not have multiple wgets running;
create some form of back-off so that you do not hammer the remote server when it is in trouble;
ideally show some message on the page if it is over 40 minutes old, so that the viewer knows a problem is being experienced;
you could use a Chromium refresh plugin to pull the page from the local copy;
you can use your script to alter the page once you have downloaded it, if you want to add additional/altered formatting (e.g. replace the CSS file?).
I see three solutions:
Load the page in an iframe (if not blocked) and check for content/response.
Create a simple browser in Java (not so hard, even if you don't know the language, using WebView).
Create a plugin for your browser.
Reloading a page via JavaScript is pretty easy:
function refresh() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState !== 4)
            return;                          // wait until the request has finished
        if (xhr.status === 200)
            document.body.innerHTML = xhr.responseXML.body.innerHTML;   // swap in the fresh content
        else
            setTimeout(refresh, 1500);       // server not reachable, retry shortly
    };
    xhr.open('GET', window.location.href);
    xhr.responseType = "document";
    xhr.send();
}
setInterval(refresh, 30 * 60 * 1000);        // refresh every 30 minutes
This should work as you requested.

Repeat task, random message, mIRC script

Currently it reads a random line from text.txt and displays it in a channel:
on *:TEXT:!command:#channel:{
  /msg $chan $read(text.txt)
}
I don't understand how to make it auto-execute at x-minute intervals, without using the !command.
I'm a beginner at this. I want to make it work like a /timer, but reading a random line from the text file every time.
It's been a while since I last worked with mIRC, so I had to look up the documentation on /timer, but you should be able to do something like this:
on *:TEXT:!command:#channel:{
/timer 0 60 /msg $chan $!read(<textfile>)
}
This will execute /msg $chan $!read(<textfile>) an infinite number of times at 60 second intervals once !command has been entered into a channel.
If you need to cancel the timer for some reason, you would need to name the timer, which can be done by appending a name to the command, such as /timerMESSAGE or /timer1, and then including a command to turn the timer off, such as:
on *:TEXT:!timeroff:#channel:{
/timer<name> off
}
replacing <name> with the name of your timer.
EDIT: Thanks to Patrickdev for pointing out the difference of $!read() versus $read() for timer commands.
I suggest you use this instead: with the approach above, if you disconnect from the network for whatever reason
(ping timeout, broken pipe, connection reset by peer, netsplit),
the timer won't stop. The most efficient way is to use an on join event:
on me:*:join:#channel:{
.timerrepeat 0 60 msg $chan $read(text.txt)
}
on me:*:part:#channel:{
.timerrepeat off
}
on *:disconnect:{
.timerrepeat off
}
This script will only trigger when you join #channel.
Replace #channel with the channel you want.

How to set up an email notification alert in IIS 6.0 when a file is uploaded via any FTP client?

I'm trying to set up email notification alerts in IIS 6 when a file is uploaded via any FTP client. Does anyone know how to accomplish this?
I found something similar but don't understand how to implement it:
http://forums.iis.net/t/1196793.aspx/1?How+to+add+email+notification+service+in+IIS+6+0+when+a+file+is+uploaded+via+FTP+
Does anyone have any insight on this?
function countFolders(strPath)
    dim objShell
    dim objFolder
    dim objFolderItems
    dim folderCount

    folderCount = 0
    set objShell = CreateObject("shell.application")
    set objFolder = objShell.NameSpace(strPath)

    if (not objFolder is nothing) then
        set objFolderItems = objFolder.Items
        if (not objFolderItems is nothing) then
            ' count the items (files and folders) in the given path
            folderCount = objFolderItems.Count
        end if
        set objFolderItems = nothing
    end if

    set objFolder = nothing
    set objShell = nothing

    countFolders = folderCount
end function
The post you're citing basically suggests this:
Create a script which checks the number of files in a folder (or folders), as you have.
Create a running total of the number of files. Maybe save this value into a database or another txt file.
If the number of files differs from the last time the check was run, then send the email.
It suggests using scheduled tasks. This means an email is sent not at the exact moment the FTP is updated, but only when your script is executed. The good thing about Windows Tasks is you can run them as often as you like. So, assuming you don't need an immediate notification, you could set your script to run once a minute, once every 10 minutes, or similar.
The problem with the above, though, is that if people are removing files as well, you'll probably get missed notifications. E.g. assuming you don't want to be notified when a file is removed: if my current count of files is 10, then 3 are removed and 1 is added, the next time the script runs I have 8 files, and there is no way to know that files had been removed and re-added. In this case, you want to take note of the file names and paths, save them, and compare the existing paths to the previous paths.
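Purely as an illustration of that compare-by-name idea, here is a small sketch written as a Node.js script you could run from a scheduled task; the folder path, the state file and the notification stub are placeholders, and the same logic can be written in VBScript just as well.
// Sketch only: report files that were not present on the previous run.
var fs = require('fs');

var watchedDir = 'C:/ftp/uploads';              // placeholder: folder the FTP server writes into
var stateFile  = 'C:/ftp/known-files.json';     // placeholder: file names seen on the previous run

function sendNotification(newFiles) {
    // Replace with your real mail code (CDO.Message from VBScript, nodemailer in Node, ...).
    console.log('New files uploaded: ' + newFiles.join(', '));
}

var current = fs.readdirSync(watchedDir);

var previous = [];
if (fs.existsSync(stateFile))
    previous = JSON.parse(fs.readFileSync(stateFile, 'utf8'));

// Compare by name, not by count, so removed files cannot mask new uploads.
var newFiles = current.filter(function(name) {
    return previous.indexOf(name) === -1;
});

if (newFiles.length > 0)
    sendNotification(newFiles);

fs.writeFileSync(stateFile, JSON.stringify(current));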
I have just completed a very similar task, but I had an extra luxury: I wrote the FTP client which had to be installed on all client machines to send files to my FTP. This meant that in my FTP program I had an extra bit of code which did: OnUploadCompleted -> Send Notification Email.
You could create a service which uses the FileSystemWatcher class.
The FileSystemWatcher listens to file system change notifications. The linked documentation includes a good example of how to use the class.
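FileSystemWatcher itself is a .NET class, so a service around it would normally be written in C#. Purely to show the same event-driven idea in JavaScript, here is a minimal Node.js sketch using fs.watch; the watched folder is a placeholder and the notification part is left as a stub.
// Sketch only: the Node.js analogue of a FileSystemWatcher-based service.
var fs = require('fs');

var watchedDir = 'C:/ftp/uploads';   // placeholder: folder the FTP server writes into

fs.watch(watchedDir, function(eventType, filename) {
    // 'rename' fires for both created and deleted entries; the existsSync check
    // keeps only files that now exist, i.e. new uploads.
    if (eventType === 'rename' && filename && fs.existsSync(watchedDir + '/' + filename)) {
        console.log('New file uploaded: ' + filename);   // send your notification e-mail here
    }
});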