ASP.NET: how do you check the status code of a redirect in the browser?

Say I have a web page that does a permanent redirect to another page. The status code sent should be 301. I would like to test this (i.e. check that the status code is indeed 301), but the browser redirects automatically to the new page, so I don't get a chance to check the status code returned.
Any ideas?

Fiddler is your friend here: it can monitor all web traffic, so you will be able to see the 301 being sent back.
You can download it from http://www.fiddler2.com/fiddler2/

You can check your logs in IIS; it keeps track of each request and the response code it sent back.
You can also use a tool like Fiddler, which works with IE and will show you the request/response data.
Other browsers likely have their own tools that will show you this information as well.

I would recommend using a web debugging tool that lets you look at the requests and responses the browser has received. Fiddler is a free, useful tool for inspecting these.
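You can also verify the 301 programmatically: any HTTP client that does not follow redirects will show you the raw status. Here is a minimal Python sketch using only the standard library; the local server and the `/old-page`/`/new-page` paths are made up for the demo, but the same `http.client` check works against a real URL.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    """Stand-in page that permanently redirects /old-page to /new-page."""
    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", "/new-page")
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does NOT follow redirects, so the raw 301 stays visible
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/old-page")
resp = conn.getresponse()
status = resp.status
location = resp.getheader("Location")
resp.read()
server.shutdown()

print(status, location)  # the 301 and the Location header the browser hides
```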

Related

Delphi REST Debugger Returns Error 429 Too Many Requests but Browser Returns JSON as Expected

What is the difference between the way a browser calls the URL and doing it via the REST Debugger or HTTP components?
I have a 3rd-party REST web API that works every time in a browser (i.e. it returns JSON as expected), but when I issue a GET to the same URL in the Delphi REST Debugger it returns error code 429 Too Many Requests.
I am not allowed to post the exact URL here (I'm sorry, the boss has the last say), but it is like this: https://xxxx.yyyy.com.au/search/resources/store/zzzzz/productview/123456.
For additional information, the result is consistently the 429 error when I use the NetHTTPClient and NetHTTPRequest components, as well as the Delphi REST components.
I thought that setting the user agent to match my browser's might help, but alas it didn't. I use Delphi 10.3.3 Rio.
I'm a bit new to REST and haven't found an answer by googling for a couple of days now. Any help will be most appreciated.
Thanks,
John
The answer is cookies. When I rejected all cookies, I could see the behaviour described by @RemyLebeau, where the page is in a continuous loop. The browser sends a cookie in the request header. I'm new to all of this, so I'll try to replicate what the browser is doing and see what happens. If I get really stuck I'll post another question specifically about cookies. Many thanks to all who offered advice. Most appreciated. I put this here because someone deleted it as an answer.
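To illustrate what the browser is doing, here is a Python sketch (standard library only) of the round trip: the first request carries no cookie and is refused, and replaying the Set-Cookie value on the second request succeeds. The local server, the cookie name, and its 429 behaviour are hypothetical stand-ins for the real site, not its actual API.

```python
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class CookieGate(BaseHTTPRequestHandler):
    """Hypothetical server: refuses with 429 unless the session cookie is sent."""
    def do_GET(self):
        if "session=abc123" in self.headers.get("Cookie", ""):
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b'{"ok": true}')
        else:
            self.send_response(429)
            # the first visit hands out the cookie the server expects later
            self.send_header("Set-Cookie", "session=abc123")
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), CookieGate)
threading.Thread(target=server.serve_forever, daemon=True).start()

# First request: no cookie yet, so the server answers 429 Too Many Requests
conn1 = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn1.request("GET", "/productview/123456")
first = conn1.getresponse()
first.read()
cookie = first.getheader("Set-Cookie")

# Second request: replay the cookie, as a browser would, and get 200 + JSON
conn2 = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn2.request("GET", "/productview/123456", headers={"Cookie": cookie})
second = conn2.getresponse()
body = second.read()
server.shutdown()
```

In Delphi you would do the equivalent by copying the Cookie header from the browser's first response into the request headers of NetHTTPRequest or the REST components.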

Capture disappearing API link and payload from browser

I am trying to capture payload information and the endpoint link from a particular web site I am using. Since I don't have any documentation, I have to capture the endpoint manually.
I see the endpoint link appear in the web inspector for 3-4 seconds when I do any POST request, and it disappears before I can capture all the details. Is there any way I can delay the endpoint's visibility in the browser and capture the information?
This may not be a programming question, but this task is mostly performed by programmers, hence I'm looking for an answer on this site.
Thanks
I was able to resolve it.
In Chrome's Network tab, check the option Preserve log.

How to find if the website is reading cookies using mechanize?

I'm trying to automate the website, but the website is reading the cookies and after 5 seconds it redirects to the main page. (I'm just assuming this, because when I disabled cookies and refreshed, the website did not redirect.) I don't know how to set the cookies using WWW::Mechanize.
Here are answers to the questions you've asked, but I don't think they'll help you a lot. You really need to explain what you're trying to do, show your Perl code, and describe the behaviour that needs to be fixed.
Cookies are data that a browser client stores on behalf of a server. They are indexed by the URL they were set for.
Every time the client sends an HTTP message, it checks whether it has cookie data for that URL. If so, the data is included in the header of the message sent.
How to find if the website is reading cookies
The cookie information that a client sends is always read, but there is no way at all to tell whether the server has taken action based on that information or simply discarded it.
the website is reading the cookies and after 5secs its redirect to the main page
I'm unclear how you think cookies might be relevant
Just to be clear:
A website is an accumulation of data files and executables on a server system, and so cannot "read" a cookie.
It is the client, your browser, that redirects to the main page. That is most likely because the last message from the server included an instruction to load the main page after five seconds.
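Note that WWW::Mechanize keeps its own cookie jar by default, storing Set-Cookie headers and replaying them on later requests, so explicit cookie code is rarely needed. Here is a minimal sketch of that same mechanism in Python's standard library (`http.cookiejar`); the local server and the `sid` cookie are invented for the demo.

```python
import threading
import urllib.request
from http.cookiejar import CookieJar
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoCookies(BaseHTTPRequestHandler):
    """Sets a session cookie and echoes back whatever Cookie header it got."""
    def do_GET(self):
        received = self.headers.get("Cookie", "")
        self.send_response(200)
        self.send_header("Set-Cookie", "sid=xyz; Path=/")
        self.end_headers()
        self.wfile.write(received.encode() if received else b"(no cookie sent)")

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), EchoCookies)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# The jar plays the role of Mechanize's cookie jar: it stores Set-Cookie
# from each response and replays matching cookies on later requests.
jar = CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))

first = opener.open(base + "/").read()   # no cookie stored yet
second = opener.open(base + "/").read()  # jar replays sid=xyz automatically
server.shutdown()
```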

Fiddler session disappears after being shown

I am tracking an iOS app through Fiddler as a proxy, and for some requests from the iOS app, Fiddler will display those sessions in a flash and then they are gone. It goes against my intuition that a session that has landed in Fiddler would ever disappear. The only thing I can think of is that the response of the HTTP session is violating the protocol? But the iOS app behaves normally with those HTTP requests and responses.
I did catch some screenshots of the HTTP request while it was being displayed in Fiddler, and here is what it looked like for each Fiddler column:
#:76
Result:-
Protocol:HTTP
Host:184.102.xxx.xx
URL:/someUrl/someHashCode
Body:-1
Caching:
Does anyone know what is going on here?
In both cases, the traffic is being hidden by a filter. Click Help > Troubleshoot Filters and follow the instructions.

I'm unable to de-authorize callback

I want to delete the records of those people who have removed the app from their applications list. To do this, I entered, as the de-authorize callback, the URL of the code that deletes an active user's record from my database. But I'm still unable to de-authorize users in my DB.
Edit: See Facebook Deauthorize Callback over HTTPS for what my original problem really was. Summary: Improper web server configuration on my part.
Original answer was:
One potential problem has to do with HTTPS-based deauthorize callbacks. At least some SSL certificates are not compatible with the Facebook back-end servers that send the ping to the deauthorize callback. I was only able to process the data once I implemented the callback on an HTTP-based handler.
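For the callback itself, Facebook POSTs a signed_request parameter: a base64url-encoded JSON payload plus an HMAC-SHA256 signature made with your app secret. A Python sketch of verifying and decoding it could look like the following; the secret and user id are made up so the demo is self-contained, and the actual database deletion is left as a comment.

```python
import base64
import hashlib
import hmac
import json

def parse_signed_request(signed_request: str, app_secret: str) -> dict:
    """Verify the HMAC-SHA256 signature and decode the JSON payload."""
    encoded_sig, payload = signed_request.split(".", 1)

    def b64url_decode(s: str) -> bytes:
        # base64url arrives without padding; restore it before decoding
        return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

    expected = hmac.new(app_secret.encode(), payload.encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(b64url_decode(encoded_sig), expected):
        raise ValueError("signed_request signature mismatch")
    return json.loads(b64url_decode(payload))

# --- self-contained demo: forge a signed_request with a made-up secret ---
secret = "my_app_secret"  # hypothetical; use your real app secret
raw = json.dumps({"algorithm": "HMAC-SHA256", "user_id": "12345"}).encode()
payload = base64.urlsafe_b64encode(raw).rstrip(b"=").decode()
sig = base64.urlsafe_b64encode(
    hmac.new(secret.encode(), payload.encode(), hashlib.sha256).digest()
).rstrip(b"=").decode()

data = parse_signed_request(sig + "." + payload, secret)
# here you would delete data["user_id"] from your database
```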
Some things to check...
That the URL of your server is visible from Facebook's servers (i.e. not a private 192.168 or 10.0 address, unless you've got proper firewall and DNS config).
Try using an anonymous surfing service and browsing to the URL you gave Facebook: do you see a PHP error?
Increase the log level for PHP and Apache/IIS to maximum and see if you get any more information.
We can't do much more unless you give us your code...