I have some pages in my context. I trigger multiple navigation requests like this, without waiting for the pages to finish loading:
//Trigger without await
page1.GotoAsync("https://slowWebsite.com");
page2.GotoAsync("https://fastWebsite.com");
//page3 stays idle for now
Question: How do I find the idle pages? Suppose fastWebsite.com loads after 1 second, and slowWebsite.com loads completely after 6 seconds. At second 2, I need to find the idle/free/completed pages and give them new tasks:
foreach (IPage page in pages)
{
//if the page is free/idle and not waiting for anything to load; here I need to get page2 and page3
}
Currently I check for certain elements on the page, but that doesn't work for page3, or for pages that hit a connection error / failed response and are now idle:
if (await page.Locator("id=HelloStacky").CountAsync() > 0)
{
//This page is completed and free for new jobs
}
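One possible approach (a sketch only, assuming the navigation Tasks are kept when they are started rather than discarded): hold on to the Task returned by GotoAsync and inspect its completion state at second 2:
//Keep the navigation Task per page instead of discarding it
var navigations = new Dictionary<IPage, Task>();
navigations[page1] = page1.GotoAsync("https://slowWebsite.com");
navigations[page2] = page2.GotoAsync("https://fastWebsite.com");
//page3 never navigated, so it has no entry and counts as idle

foreach (IPage page in pages)
{
    //Idle = no navigation was started, or the started navigation has finished
    //(IsCompleted is also true for faulted navigations such as connection errors)
    if (!navigations.TryGetValue(page, out Task nav) || nav.IsCompleted)
    {
        //This page is free for new jobs
    }
}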
Related
I have a piece of code that has a 5 second delay and a getUrl call after it. If I don't delay the execution, the getUrl check returns false since the site hasn't loaded yet.
WebUI.delay(5)
assert WebUI.getUrl().contains('atlassian')
In the website, there is a div which opens another window when clicked. This code checks whether the opened page is an Atlassian webpage. However, I don't want to use a fixed 5 second delay (it may take much longer or shorter). Is there a way to set a timeout, for instance wait up to 1 minute for the page to load and fail the execution if it hasn't loaded by then?
Try waiting for the page load:
WebUI.waitForPageLoad(5, FailureHandling.STOP_ON_FAILURE)
assert WebUI.getUrl().contains('atlassian')
This will wait up to 5 seconds for the page to load and stop the execution with a failed test if the page isn't loaded in that time.
Alternatively, you could use WebUI.waitForElementPresent(to, timeout), where to is a test object you are certain will be present once the page has loaded.
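For example (a minimal sketch; the test object path here is only a placeholder for an element that exists once the target page has loaded):
//Wait up to 60 seconds for an element that only exists on the loaded page
WebUI.waitForElementPresent(findTestObject('Page_Atlassian/div_main'), 60, FailureHandling.STOP_ON_FAILURE)
assert WebUI.getUrl().contains('atlassian')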
I am using the Chrome browser (Version 54.0.2840.98, 64-bit). There are two different operations happening in my application's webpage. The success toasters for the results of the first and second operations do not appear simultaneously; there is a delay between the appearance of the two toasters. I see that the second toaster appears after the first toaster has disappeared (I have set the timeout to 3 seconds). How would I validate, using Protractor, that both toasters have appeared? The messages inside the toasters are different; the IDs of the toasters are the same.
None of the answers I have seen here on Stack Overflow work for this scenario.
First validate the appearance of the first toaster; because the second one appears only when the first disappears, use a wait:
var EC = protractor.ExpectedConditions;
// Validate the first toaster while it is still visible
var toast1 = element(by.css(".toast-message"));
expect(toast1.getText()).toEqual("validateText");
// Wait for the first toaster to disappear, then validate the second one
browser.wait(EC.invisibilityOf(element(by.id("toaster1"))), 30000).then(function () {
    var toast2 = element(by.css(".toast-message2"));
    expect(toast2.getText()).toEqual("validateText2");
});
I have a Protractor test that navigates to another URL, which cannot be found/resolved in my test environment, so I check that the title is not the previous title.
The test is as follows:
it('should navigate to another site in case of click on cancel link', function () {
page.navigate();
page.submit();
protractor.getInstance().ignoreSynchronization = true;
browser.wait(function(){
return element(by.id('submit')).isPresent();
});
page.closePage();
// the title of a 404, dns issue etc is at least different from the previous site:
expect(browser.getTitle()).not.toEqual('MyDummyTitle')
protractor.getInstance().ignoreSynchronization = false;
});
This works in most browsers, but in Internet Explorer I find that it often has not finished navigating to the non-existent page by the time the expect is fired.
Can I somehow wait for the 'submit' element to be gone, similar to what I do before firing the closePage?
What I do in these cases is an active wait for an element to disappear:
I use a custom waitAbsent() helper function that actively waits for an element to disappear, either by becoming invisible or by no longer being present.
The helper waits up to specTimeoutMs, ignoring unhelpful WebDriver errors like StaleElementError.
Usage: add require('./waitAbsent.js'); in your onPrepare block or file.
Example to wait for #submit to be gone:
expect(element(by.id('submit')).waitAbsent()).toBeTruthy();
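The helper itself isn't shown here; a rough sketch of how such a helper could look (the prototype lookup and the 10 second default are my assumptions, not the original gist):
// waitAbsent.js - a rough sketch, not the original helper
// Grab the ElementFinder prototype from a dummy locator so the helper works
// regardless of how the constructor is exported, then poll until the element
// is gone or hidden, treating "not present"/stale errors as "absent".
var elementFinderProto = Object.getPrototypeOf(element(by.css('html')));

elementFinderProto.waitAbsent = function (timeoutMs) {
  var self = this;
  return browser.wait(function () {
    return self.isDisplayed().then(
      function (visible) { return !visible; }, // still in the DOM: absent only if hidden
      function () { return true; }             // not present or stale: treat as gone
    );
  }, timeoutMs || 10000, 'Element still present after timeout');
};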
I have a standalone Raspberry Pi which shows a webpage from another server.
It reloads after 30 minutes via JavaScript on the webpage.
In some cases, the server isn't reachable for a short time and Chromium shows the usual "This webpage is not available" message and stops reloading (because no JavaScript from the page triggers a reload).
In this case, how can I still reload the webpage after a few seconds?
I also had the idea of fetching the website content via AJAX and replacing it in the current page when it becomes available.
Rather than refreshing the webpage every few minutes, what you can do is ping the server using JavaScript (pingjs is a nice library that can do that).
If the ping is successful, reload the page. If it is not, wait 30 more seconds and ping again. Doing this continuously will basically make you wait until the server is reachable again (i.e. you can ping it).
I think this is a much simpler method than building your own Java browser or writing a browser plugin.
Extra info: You should use exponential back-off on the timeout to avoid unnecessary processing overhead, i.e. the first time the ping fails, wait 30 seconds; the second time wait 30*(2^1) seconds; the third time 30*(2^2), and so on until you reach a maximum value.
Note: this assumes your server is really unreachable, and not just that the HTML page is unavailable (there's a small but appreciable difference).
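A rough sketch of that ping-and-back-off idea (using a plain XMLHttpRequest probe here instead of pingjs, whose API isn't shown in this answer):
// Probe the server and reload once it answers again, backing off exponentially
var delay = 30 * 1000;            // start at 30 seconds
var maxDelay = 16 * 60 * 1000;    // cap the back-off

function probe() {
    var xhr = new XMLHttpRequest();
    xhr.open('HEAD', window.location.href, true);
    xhr.onload = function () { window.location.reload(); };   // got a response: reload the page
    xhr.onerror = function () {                                // still unreachable: wait longer
        delay = Math.min(delay * 2, maxDelay);
        setTimeout(probe, delay);
    };
    xhr.send();
}

setTimeout(probe, delay);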
My favored approach would be to copy the web page locally using a script every 30 minutes and point Chromium at the local copy.
The advantage is that the script can run every 30 seconds and check whether a successful page pull happened in the last 30 minutes. If YES, it does nothing. If NO, it keeps attempting to pull the page. In the meantime the browser is set to refresh the page every 5 seconds, but because it is pulling a local page it does little to no work for each refresh. You can then detect whether what it has pulled back contains the required content.
This approach assumes that your goal is to avoid refreshing the page every few seconds and thereby reduce the load on the remote server.
Use these options to grab the whole page....
# exit if age of last reload is less than 1800 seconds (30 minutes)
AGE_IN_SECS=$(( $( perl -e 'print time();' ) - $(stat -c "%Y" /success/directory/index.html) ))
[[ $AGE_IN_SECS -lt 1800 ]] && exit
# copy whole page to the current directory
cd /temporary/directory
wget -p -k http://www.example.com/
and then you need to test the page in some way to ensure you have what you need, for example (using bash script)....
RESULT=$(grep -ci "REQUIRED_PATTERN_MATCH" expected_file_name )
[[ $RESULT -gt 0 ]] && cp -r /temporary/directory/* /success/directory
rm -rf /temporary/directory/*
NOTE:
This is only the bare bones of what you need, as I don't know your exact requirements. But you should also look at trying to...
ensure you have a timeout on the wget, so that you do not end up with multiple wgets running (see the snippet after this list).
create some form of back-off so that you do not hammer the remote server when it is in trouble.
ideally, show a message on the page if it is over 40 minutes old so that the viewer knows a problem is occurring.
you could use a Chromium refresh plugin to pull the page from the local copy.
you can use your script to alter the page once you have downloaded it if you want to add in additional/altered formatting (e.g. replace the css file?)
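For the wget timeout mentioned above, something like this might do (the values are only illustrative):
# limit each attempt and the number of retries so wgets don't pile up
wget --timeout=30 --tries=2 -p -k http://www.example.com/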
I see three solutions:
Load the page in an iframe (if not blocked), and check the content/response.
Create a simple browser in Java (not so hard, even if you don't know the language, using WebView).
Create a plugin for your browser.
Reloading a page via JavaScript is pretty easy:
function refresh() {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function() {
        if (xhr.readyState != 4) return;                            // wait until the request has finished
        if (xhr.status === 200)
            document.body.innerHTML = xhr.response.body.innerHTML;  // replace the page content
        else
            setTimeout(refresh, 1500);                              // fetch failed, retry shortly
    };
    xhr.open('GET', window.location.href);
    xhr.responseType = "document";
    xhr.send();
}
setInterval(refresh, 30*60*1000);
This should work as you requested.
In ASP.NET MVC 2, how do I show a user's status as "online"?
So if the user is actively working on the site, the status would be "online".
If the user stepped away from the site for about 5 minutes, then it would be "5 minutes".
If the user stepped away from the site for about 10 minutes, then it would be "10 minutes".
So on and so forth.
What is the best way to accomplish this? Any code sample would be very helpful to me.
The responses so far suggest that I use Ajax. If so, how would I be able to query online users vs. offline users? All my queries go against the database. Is it possible to query and display results that join the Ajax results with database queries?
I would think the most straightforward way of doing it would be with a session variable. You could add a line to your controllers (either in the action method, the constructor, or possibly an action filter) which stashes the current date/time in the session. You could then use an Ajax call to update the value on the screen at a specific interval. You would probably want to make the interval minutes rather than seconds, otherwise you would be displaying a counter (i.e. "1 second", "2 seconds", etc.).
Some quick code samples:
// Somewhere in the controller
Session["LastSeen"] = DateTime.Now;
// Now, an action that returns the amount of time since the user was last seen
public JsonResult GetLastSeenTime()
{
    return Json(new { TimeAway = DateTime.Now.Subtract((DateTime)Session["LastSeen"]).TotalMinutes });
}
// Then on your page, something like this
$.post("/Controller/GetLastSeenTime", null, function (result) {
    if (result.TimeAway < 5)
        $("#Status").text("Online");
    else
        $("#Status").text(Math.round(result.TimeAway) + " minutes");
}, "json");
Totally not tested, but should be close.
ckramer is right. I suggest extending his solution to make it degrade gracefully when JavaScript is unavailable.
// Now, an action that returns the amount of time since the user was last seen
public ActionResult GetLastSeenTime()
{
    var minutesAway = DateTime.Now.Subtract((DateTime)Session["LastSeen"]).TotalMinutes;
    if (Request.IsAjaxRequest())
    {
        return Json(new { TimeAway = minutesAway });
    }
    ViewData.Add("LastSeen", minutesAway);
    return View("MyView");
}