I'm using InstaPy, which uses Python and Selenium. I start the script via cron, and from time to time it crashes. It's really irregular; sometimes it runs through fine. I've already posted on the GitHub repo as well but didn't get an answer there, so I'm asking here now in case someone has an idea why.
It's a DigitalOcean Ubuntu server and I'm running Chrome in headless mode. The driver versions are visible in the log. Here are the error messages:
ERROR [2018-12-10 09:53:54] [user] Error occurred while deleting cookies from web browser!
b'Message: invalid session id\n (Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)\n'
Traceback (most recent call last):
File "/root/InstaPy/instapy/util.py", line 1410, in smart_run
yield
File "./my_config.py", line 43, in <module>
session.follow_user_followers(['xxxx','xxxx','xxxx','xxxx'], amount=100, randomize=True, interact=True)
File "/root/InstaPy/instapy/instapy.py", line 2907, in follow_user_followers
self.logfolder)
File "/root/InstaPy/instapy/unfollow_util.py", line 883, in get_given_user_followers
channel, jumps, logger, logfolder)
File "/root/InstaPy/instapy/unfollow_util.py", line 722, in get_users_through_dialog
person_list = dialog_username_extractor(buttons)
File "/root/InstaPy/instapy/unfollow_util.py", line 747, in dialog_username_extractor
person_list.append(person.find_element_by_xpath("../../../*")
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webelement.py", line 351, in find_element_by_xpath
return self.find_element(by=By.XPATH, value=xpath)
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webelement.py", line 659, in find_element
{"using": by, "value": value})['value']
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webelement.py", line 633, in _execute
return self._parent.execute(command, params)
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webdriver.py", line 321, in execute
self.error_handler.check_response(response)
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/errorhandler.py", line 242, in check_response
raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.WebDriverException: Message: unknown error: session deleted because of page crash
from unknown error: cannot determine loading status
from tab crashed
(Session info: headless chrome=70.0.3538.110)
(Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/root/InstaPy/instapy/instapy.py", line 3845, in end
self.browser.delete_all_cookies()
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webdriver.py", line 878, in delete_all_cookies
self.execute(Command.DELETE_ALL_COOKIES)
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/webdriver.py", line 321, in execute
self.error_handler.check_response(response)
File "/usr/local/lib/python3.6/dist-packages/selenium/webdriver/remote/errorhandler.py", line 242, in check_response
raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.WebDriverException: Message: chrome not reachable
(Session info: headless chrome=71.0.3578.80)
(Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)
Any idea what the reason could be and how to solve it?
Thanks for the inputs. The guys from http://treestones.ch/ helped me out.
Although you see the error as:
Error occurred while deleting cookies from web browser!
b'Message: invalid session id\n (Driver info: chromedriver=2.44.609551 (5d576e9a44fe4c5b6a07e568f1ebc753f1214634),platform=Linux 4.15.0-42-generic x86_64)\n'
The main exception is:
selenium.common.exceptions.WebDriverException: Message: unknown error: session deleted because of page crash
from unknown error: cannot determine loading status
from tab crashed
Your code trials would have given us some clues about what is going wrong.
Solution
There are diverse solutions to this issue. However, as per UnknownError: session deleted because of page crash from tab crashed, this issue can be solved by either of the following approaches:
Add the following chrome_options:
chrome_options.add_argument('--no-sandbox')
Chrome seems to crash in Docker containers on certain pages because /dev/shm is too small, so you may have to increase the /dev/shm size.
An example:
sudo mount -t tmpfs -o rw,nosuid,nodev,noexec,relatime,size=512M tmpfs /dev/shm
It also works if you use the -v /dev/shm:/dev/shm option to share the host's /dev/shm.
Another way to make it work is to add the chrome_options argument --disable-dev-shm-usage. This forces Chrome to use the /tmp directory instead. It may slow down execution, though, since disk will be used instead of memory.
chrome_options.add_argument('--disable-dev-shm-usage')
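Putting the flags together, a minimal sketch of a hardened headless setup might look like this. The helper name build_chrome_flags and the window size are my own additions for illustration, not part of InstaPy or Selenium:

```python
# Sketch: Chrome flags that commonly prevent "session deleted because of
# page crash" in headless/Docker environments.

def build_chrome_flags():
    return [
        '--headless',                  # no display on the server
        '--no-sandbox',                # often required when running as root
        '--disable-dev-shm-usage',     # use /tmp instead of a small /dev/shm
        '--window-size=1920,1080',     # avoid the tiny default headless viewport
    ]

if __name__ == '__main__':
    # Wiring it into Selenium (requires the selenium package and chromedriver):
    from selenium import webdriver
    from selenium.webdriver.chrome.options import Options

    chrome_options = Options()
    for flag in build_chrome_flags():
        chrome_options.add_argument(flag)
    driver = webdriver.Chrome(options=chrome_options)
```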
from tab crashed
from tab crashed was a work in progress (WIP) with the Chromium team for quite some time; it relates to Linux always attempting to use /dev/shm for non-executable memory. Here are the references:
Linux: Chrome/Chromium SIGBUS/Aw, Snap! on small /dev/shm
Chrome crashes/fails to load when /dev/shm is too small, and location can't be overridden
As per Comment61#Issue 736452, the fix seems to have landed with Chrome v65.0.3299.6.
Reference
You can find a couple of relevant discussions in:
org.openqa.selenium.SessionNotCreatedException: session not created exception from tab crashed error when executing from Jenkins CI server
In case someone is facing this problem with Docker containers:
use the flag --shm-size=2g when creating the container and the error is gone.
This flag makes the container use the host's shared memory.
Example
$ docker run -d --net gridNet2020 --shm-size="2g" -e SE_OPTS="-browser applicationName=zChromeNodePdf30,browserName=chrome,maxInstances=1,version=78.0_debug_pdf" -e HUB_HOST=selenium-hub-3.141.59 -P -p 5700:5555 --name zChromeNodePdf30 -v /var/lib/docker/sharedFolder:/home/seluser/Downloads selenium/node-chrome:3.141.59-xenon
Source: https://github.com/SeleniumHQ/docker-selenium
I was getting the following error on my Ubuntu server:
selenium.common.exceptions.WebDriverException: Message: unknown error:
session deleted because of page crash from tab crashed (Session
info: headless chrome=86.0.4240.111) (Driver info:
chromedriver=2.41.578700
(2f1ed5f9343c13f73144538f15c00b370eda6706),platform=Linux
5.4.0-1029-aws x86_64)
It turned out that the cause of the error was insufficient disk space on the server, and the solution was to extend my disk space. You can check this question for more information.
We need to specify the shm memory separately with --shm-size=2g.
In the case of Docker, use the following config; this works fine for me:
services:
  chrome:
    image: selenium/node-chrome:4.0.0-rc-1-prerelease-20210823
    shm_size: 2gb
Message: unknown error: session deleted because of page crash from unknown error: cannot determine loading status from tab crashed
(Session info: headless chrome=95.0.4638.69)
This error occurred because there was not enough waiting time for the web pages to load.
The answers above solved my issue, but since I needed to run it from a docker-compose.yml, I used this configuration, which calls my regular, unchanged Dockerfile.
docker-compose.yml
version: '1.0'
services:
  my_app:
    build:
      context: .
      # when building
      shm_size: 1gb
    # when running
    shm_size: 1gb
Dockerfile (Selenium on Ubuntu, WSL)
FROM python:3.10
# install google chrome
RUN wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
RUN sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" >> /etc/apt/sources.list.d/google-chrome.list'
RUN apt-get -y update
RUN apt-get install -y google-chrome-stable
# install chromedriver
RUN apt-get install -yqq unzip
RUN wget -O /tmp/chromedriver.zip http://chromedriver.storage.googleapis.com/`curl -sS chromedriver.storage.googleapis.com/LATEST_RELEASE`/chromedriver_linux64.zip
RUN unzip /tmp/chromedriver.zip chromedriver -d /usr/local/bin/
# set display port to avoid crash
ENV DISPLAY=:99
# install selenium
RUN pip install selenium==3.8.0
# install and prepare app
COPY ./requirements.txt ./
# COPY . /app
RUN pip3 install -r requirements.txt
RUN apt-get install -y libnss3
ENV APP_DIR=/app/my_app
RUN mkdir -p ${APP_DIR}
WORKDIR ${APP_DIR}
# COPY . ${APP_DIR} #not needed since we are mapping the volume in docker-compose
CMD [ "my_app.py" ]
ENTRYPOINT [ "python" ]
This happened to me while trying to open a new web page with the same driver in Chromium. It worked fine on my local machine, where I use Chrome.
Did not work:
driver = webdriver.Chrome(options=options)
driver.execute_script("Object.defineProperty(navigator, 'webdriver', {get: () => undefined})")
driver.execute_cdp_cmd('Network.setUserAgentOverride', {
"userAgent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.53 Safari/537.36'})
driver.get('url1')
# Do operations with url1
driver.get('url2')
# Do operations with url2 -> did not work and crashed
Below is the solution I am using, which works for me: re-initializing the driver.
def setup_driver():
    global driver
    driver = webdriver.Chrome(options=options)
    driver.maximize_window()
    driver.execute_script("Object.defineProperty(navigator, 'webdriver', {get: () => undefined})")
    driver.execute_cdp_cmd('Network.setUserAgentOverride', {
        "userAgent": 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/83.0.4103.53 Safari/537.36'})
setup_driver()
driver.get('url1')
# Do operations with url1
driver.close()
setup_driver()
driver.get('url2')
# Do operations with url2
driver.close()
I'm not sure whether this is the only possible cause and solution, but after thorough investigation of this error, which I encountered every now and then, I found the following evidence:
In the log of the Selenium Grid nodes (which you can show by executing the following command on the docker host: sudo docker logs <container-id>) I found many errors reading: [SEVERE]: bind() failed: Cannot assign requested address (99). From what I read, this error usually means that there are no available ports.
When showing the processes running inside a node (sudo docker exec -it <container-id> bash and then ps aux), I found more than 300 instances of chromedriver processes (you can count them using ps aux | grep driver | wc -l).
When running locally, I know that the chromedriver process is normally invoked when you create an instance of ChromeDriver and is terminated when you call driver.Quit() (I work in C#, not Python). Therefore I concluded that some tests don't call driver.Quit().
The conclusion
In my case, I found that even though we had a call to driver.Quit() in the [TearDown] method (we use NUnit), we had some more code before that line that could throw an exception. When one of these preceding lines threw an exception, the line that calls driver.Quit() was not reached, and therefore over time we were "leaking" chromedriver processes on the Selenium Grid nodes. These orphan processes caused a leak of available ports (and probably also memory), which eventually caused the browser's page to crash.
The solution
Given the above conclusion, the solution was pretty straightforward: we had to wrap the code that precedes driver.Quit() in a try/finally and put the call to driver.Quit() in the finally clause, like this:
[TearDown]
public void MyTearDown()
{
    try
    {
        // Perform any tear down code you like, like saving screenshots, page source, etc.
    }
    finally
    {
        _driver?.Quit();
    }
}
I was having the same problem. I checked the log to see at which point in my script the bug happened, added some wait, i.e. time.sleep(2), just before that point, and my problem was fixed.
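A fixed time.sleep works but wastes time on fast loads; Selenium's WebDriverWait with expected_conditions is the usual tool for this. To illustrate the idea without a live driver, here is a minimal pure-Python polling helper; wait_until is a hypothetical name of my own, not a Selenium API:

```python
import time

def wait_until(condition, timeout=10.0, poll=0.5):
    """Poll `condition` until it returns a truthy value or `timeout` elapses.

    Mirrors the idea behind Selenium's WebDriverWait, e.g.
    wait_until(lambda: driver.find_elements_by_css_selector("div.content")).
    """
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result  # condition satisfied: stop waiting early
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within %.1f s" % timeout)
        time.sleep(poll)
```

Unlike a flat sleep, this returns as soon as the condition holds and fails loudly when it never does.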
I'm having difficulty identifying what causes my website to load extremely slowly. I have found something, but Google's archives don't provide the right answer or even an explanation.
In my raw-access logs I found multiple records about different robots accessing my website, here's an example:
202.46.53.40 - - [31/Dec/2016:03:30:51 +0100] "GET /en/home/184-2016-hyperlite-motive-wakeboard.html HTTP/1.1" 302 - "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
202.46.54.27 - - [31/Dec/2016:03:30:52 +0100] "GET /en/home/184-2016-hyperlite-motive-wakeboard.html HTTP/1.1" 301 - "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
202.46.56.210 - - [31/Dec/2016:03:30:53 +0100] "GET /en/home/184-2016-hyperlite-motive-wakeboard.html HTTP/1.1" 302 - "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
202.46.56.114 - - [31/Dec/2016:03:30:54 +0100] "GET /en/wakeboards/184-2016-hyperlite-motive-wakeboard.html HTTP/1.1" 200 140041 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
180.76.15.154 - - [31/Dec/2016:03:31:26 +0100] "GET /en/26-sup HTTP/1.1" 406 73864 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
157.55.39.40 - - [31/Dec/2016:03:31:50 +0100] "GET /en/helmets/57-2015-mystic-mk8-helmet-mint.html HTTP/1.1" 302 - "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
157.55.39.40 - - [31/Dec/2016:03:31:55 +0100] "GET /en/helmets/57-2015-mystic-mk8-helmet-mint.html HTTP/1.1" 301 - "-" "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"
77.75.77.95 - - [31/Dec/2016:03:34:03 +0100] "GET /robots.txt HTTP/1.1" 404 57839 "-" "Mozilla/5.0 (compatible; SeznamBot/3.2; +http://napoveda.seznam.cz/en/seznambot-intro/)"
77.75.77.95 - - [31/Dec/2016:03:34:05 +0100] "GET /en/31-bags HTTP/1.1" 301 - "-" "Mozilla/5.0 (compatible; SeznamBot/3.2; +http://napoveda.seznam.cz/en/seznambot-intro/)"
163.172.66.143 - - [31/Dec/2016:03:43:36 +0100] "GET /en/13-rokavice HTTP/1.1" 302 - "-" "Mozilla/5.0 (compatible; AhrefsBot/5.2; +http://ahrefs.com/robot/)"
202.46.54.134 - - [31/Dec/2016:04:04:20 +0100] "GET /en/accessories/169-plavutke-pro-ii.html HTTP/1.1" 302 - "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
202.46.54.102 - - [31/Dec/2016:04:04:21 +0100] "GET /en/accessories/169-plavutke-pro-ii.html HTTP/1.1" 301 - "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
202.46.48.140 - - [31/Dec/2016:04:04:22 +0100] "GET /en/accessories/169-plavutke-pro-ii.html HTTP/1.1" 200 110602 "-" "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/45.0.2454.93 Safari/537.36"
180.76.15.10 - - [31/Dec/2016:04:04:55 +0100] "GET /en/56-kiteboarding-gear HTTP/1.1" 406 62988 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
66.249.76.47 - - [31/Dec/2016:04:25:33 +0100] "GET /380/komplet-oceanrodeo-razor-fst8-advenced-performance-kite.jpg HTTP/1.1" 200 126044 "-" "Googlebot-Image/1.0"
112.210.233.49 - - [31/Dec/2016:04:29:17 +0100] "POST /modules/sendtoafriend/sendtoafriend_ajax.php?rand=1472104141118 HTTP/1.1" 500 - "https://proadrenalin.si/modules/sendtoafriend/sendtoafriend_ajax.php?rand=1472104141118" "Mozilla/4.0 (compatible; MSIE 9.0; Windows NT 6.1)"
66.249.76.78 - - [31/Dec/2016:04:33:09 +0100] "POST /modules/leocustomajax/leoajax.php?rand=1482019200024 HTTP/1.1" 200 14 "https://www.proadrenalin.si/en/20-wakeboards?p=3" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
Is it possible that these visits are causing the problem with slow page load?
For 31st December I have 1342 requests, for 1st January 1222, 2nd January 2374 requests, 4th January 2391... This goes on every day.
The webshop runs on PrestaShop, and as far as I've been able to inspect, the platform itself is not causing any problems that would result in slow page load. Most modules are disabled or removed, only the needed (enabled) ones are on the server, caching is on, and recompilation happens only when something changes.
Any tips, links to read, possible solutions... would be very useful, because currently I'm living in a nightmare.
You can find the IP patterns of the robots hitting your store and then block those IPs using the .htaccess file.
Visit the following URL for more details on this:
How to Block an IP address range using the .htaccess file
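For illustration, a hedged Apache 2.4 .htaccess fragment blocking the crawler ranges visible in the sampled log lines above might look like the following. The CIDR ranges are guesses inferred from those few lines; verify them against your full logs before blocking anything:

```apacheconf
# Hypothetical example only: deny the ranges seen in the sampled log lines.
# Requires Apache 2.4 (mod_authz_core) syntax.
<RequireAll>
    Require all granted
    Require not ip 202.46.48.0/20
    Require not ip 180.76.15.0/24
</RequireAll>
```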
I have the same problem and am currently trying to block that bot via robots.txt, like this:
User-agent: SeznamBot
Disallow: /
Taken from the official source: https://napoveda.seznam.cz/en/full-text-search/crawling-control/
Situation:
Sending an HTML form (method POST) sporadically results in a broken GET request on the web server (405). The browser displays "Method Not Allowed" (405).
The incorrect string in front of the wrong GET method looks like some form variables, for example checkout=Weiter+%3E%3E, which is the value attribute of the submit button (Weiter >>).
WA Log entries:
"egoryID=vvXAqAFS1FIAAAFA6CQIDsbGGET /is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1268826926?JumpTarget=ViewRequisitionCheckout-ShowLoginPage HTTP/1.1" 405 92 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1268341878?JumpTarget=ViewRequisition-View" "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 1016
"t_State=true&processLogin=WeiterGET /is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1269223568?JumpTarget=ViewRequisitionCheckout-ManageAddresses HTTP/1.1" 405 92 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1268974319?JumpTarget=ViewRequisitionCheckout-ShowLoginPage" "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 1309
"ipTo=true&checkout=Weiter+%3E%3EGET /is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1270218168?JumpTarget=ViewRequisitionCheckout-ManageAddresses HTTP/1.1" 405 92 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1269355351?JumpTarget=ViewRequisitionCheckout-ManageAddresses" "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 1223
"KpsHpPhxqCtk&apply=Weiter+%3E%3EGET /is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1271422613?JumpTarget=ViewRequisitionCheckoutPayment-Edit HTTP/1.1" 405 92 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/1270749634?JumpTarget=ViewRequisitionCheckoutPayment-Edit" "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 1132
"egoryID=KzfAqAFSLHUAAAFARCQIDsbGGET /is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/499191000?JumpTarget=ViewRequisitionCheckout-ShowLoginPage HTTP/1.1" 405 92 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewData-Start/499170102?JumpTarget=ViewRequisition-View" "Mozilla/5.0 (iPad; CPU OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 1072
"rue&checkout=Bestellung+absendenGET /is-bin/static/WFS/XYZ-DE-Site/-/de_DE/jscript/snippets/catalog/LeftPanelCatalog.js HTTP/1.1" 405 92 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewRequisitionCheckoutFinish-Dispatch" "Mozilla/5.0 (iPhone; CPU iPhone OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 1086
"ponent=&addList=In+den+WarenkorbGET /is-bin/static/WFS/XYZ-DE-Site/-/de_DE/images/ajax_loader_bg_white.png HTTP/1.1" 405 84 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewDirectRequisition-List" "Mozilla/5.0 (iPad; CPU OS 7_0_2 like Mac OS X) AppleWebKit/537.51.1 (KHTML, like Gecko) Version/7.0 Mobile/11A501 Safari/9537.53" 813
"4E45&QuantityString=1&Position=1GET /is-bin/static/WFS/XYZ-DE-Site/-/de_DE/images/ajax_loader.gif HTTP/1.1" 405 84 "https://www.XYZ.de/is-bin/WFS/XYZ-DE-Site/de_DE/-/EUR/ViewDirectRequisition-List" "Mozilla/5.0 (iPad; CPU OS 6_1_3 like Mac OS X) AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10B329 Safari/8536.25" 740
[...]
WA Log analysis:
User Agent
mostly mobile Safari (Version 6.0, 5.1.1)
-- iPad and iPhone (Apple iOS; versions 7.0.x, 6.x)
sometimes desktop Safari (Version 5.1.1)
-- Macintosh (Mac OS X; Version 10.6.8)
different pages
apparently only HTTPS (SSL)
Question
What causes this behavior?
I can see that the CodeIgniter email library's useragent is changeable, as the documentation says.
But what really is the useragent, and what does it do for us?
I'm not sure about the email library's case, but in most cases the useragent is the identifier of what the user "used" to make a request, in terms of the browser type/version, OS type/version, etc.
This information is commonly recorded in server logs or contained in cookies.
Here are examples of useragents:
Mozilla/5.0 (iPad; U; CPU OS 3_2 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Version/4.0.4 Mobile/7B334b Safari/531.21.10
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:7.0.1) Gecko/20100101 Firefox/7.0.1
Mozilla/5.0 (Linux; U; Android 2.3.3; en-au; GT-I9100 Build/GINGERBREAD) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1
Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; InfoPath.2; .NET CLR 2.0.50727; .NET CLR 3.0.04506.648; .NET CLR 3.5.21022; .NET CLR 1.1.4322)
Mozilla/5.0 (Windows NT 6.1; rv:5.0) Gecko/20100101 Firefox/5.0
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_7_2) AppleWebKit/535.1 (KHTML, like Gecko) Chrome/14.0.835.202 Safari/535.1
Mozilla/5.0 (BlackBerry; U; BlackBerry 9800; en) AppleWebKit/534.1+ (KHTML, like Gecko) Version/6.0.0.337 Mobile Safari/534.1+
Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)
Mozilla/5.0 (X11; Linux i686) AppleWebKit/534.34 (KHTML, like Gecko) rekonq Safari/534.34
Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/4.0; GTB6; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; OfficeLiveConnector.1.4; OfficeLivePatch.1.3)
BlackBerry8300/4.2.2 Profile/MIDP-2.0 Configuration/CLDC-1.1 VendorID/107 UP.Link/6.2.3.15.0
Mozilla/4.0 (compatible; MSIE 7.0; Windows NT 5.1; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.04506.30)
Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.23) Gecko/20110920 Firefox/3.6.23 SearchToolbar/1.2