How to generate a full report in OWASP ZAP in any format

When I try to generate a report in HTML, XML, or PDF, I get only alerts in the report. I would like the report to include all the information, including passed attacks as well.
For example, an active scan uses around 500+ URL combinations, but I'm getting only a few of them in the report. I need all 500+ URLs and their results.
Any suggestions?

We don't generate that as a 'standard' report, as no one has asked for it to date. However, we expose pretty much everything via the ZAP API, and if there's anything we don't currently expose, let us know and we'll fix that.
To get started with the API, point your browser at the host:port that ZAP is listening on and follow the link to the API UI, which lets you invoke any of the endpoints. We also have some info on the wiki: https://github.com/zaproxy/zaproxy/wiki/ApiDetails
If you have more detailed questions, the ZAP User Group is a good place to ask: https://groups.google.com/group/zaproxy-users
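As a rough illustration, here is a minimal sketch of pulling everything out of the API yourself. It assumes ZAP is listening on localhost:8080; the core/view/urls and core/view/alerts views, and the apikey parameter, may need adjusting to your version and configuration.

```typescript
// Minimal sketch: list every URL ZAP knows about, with any alerts raised
// against it, so URLs with no findings ("passed") show up too.
// Assumes ZAP on localhost:8080; set APIKEY if the API key is enabled.
const ZAP = "http://localhost:8080";
const APIKEY = ""; // your ZAP API key, if required

async function zapView(view: string, params: Record<string, string> = {}) {
  const query = new URLSearchParams({ ...params, apikey: APIKEY });
  const res = await fetch(`${ZAP}/JSON/${view}/?${query}`);
  if (!res.ok) throw new Error(`ZAP API returned ${res.status}`);
  return res.json();
}

async function fullReport() {
  // All URLs in ZAP's sites tree, including those that raised no alerts.
  const { urls } = await zapView("core/view/urls");
  // All alerts; use start/count paging for very large scans.
  const { alerts } = await zapView("core/view/alerts", { start: "0", count: "5000" });

  for (const url of urls as string[]) {
    const hits = (alerts as { url: string; alert: string }[]).filter(a => a.url === url);
    console.log(url, "->", hits.length ? hits.map(a => a.alert).join(", ") : "no alerts");
  }
}

fullReport().catch(console.error);
```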

Related

Loading & Connecting Facebook Pixel Conversions Data

I am trying to load Facebook Pixel conversion-level data from the Marketing/Insights API, but I am not able to do it at the level I want, or even properly.
I have various pixels created in the form of events (e.g. Leads, Registrations) and need to track them.
After reading the documentation for Ads Pixels and their stats, I was able to load some basic fields for now, but I am still not able to pull the stats.
GET API query: https://graph.facebook.com/v2.9/act_/adspixels?fields=name,id,creation_time,last_fired_time
This gives me all the correct Ads Pixel details, but how do I pull all the stats for it, in the form of events, their occurrences, etc.? Will I be using more query parameters in this URL, or a new URL? I tried multiple iterations but was not able to get anything to work so far.
I also tried this API query as per the documentation: https://graph.facebook.com/v2.9//stats - but it does not work, even with fields added.
Another issue is that I am not able to test my queries with the Graph API Explorer at all; it keeps giving me "Timeout issue" or other errors when I try to use the app there. Do I need to publish and approve the app before querying FB Ads data via the API Explorer?
All suggestions and feedback will be highly appreciated.
I was searching for something related and encountered your thread, so I will report my findings. Maybe you already know this, but here it goes.
As for querying with the Graph API Explorer: it doesn't seem to work with the Marketing API. You need to create your own app and enable the Marketing API in order to get the necessary token.
I am following the instructions on the link you provided (stats).
Second, to get the stats I am using:
graph.facebook.com/v2.11/{pixel-id}/stats?aggregation=pixel_fire
The aggregation parameter is necessary to get results. With it I can see the "Page View" event that I am tracking on a website.
I was able to compare these results with the ones shown on the pixel's Events Manager page.
Hope this helps
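For anyone scripting this, here is a minimal sketch of the same stats call. The pixel ID and access token are placeholders; the token has to come from your own app with the Marketing API enabled, as described above.

```typescript
// Sketch: fetch pixel-fire stats via the Graph API, mirroring the query above.
// PIXEL_ID and ACCESS_TOKEN are placeholders you must supply yourself.
const PIXEL_ID = "your-pixel-id";
const ACCESS_TOKEN = "your-marketing-api-token";

async function pixelStats() {
  const url =
    `https://graph.facebook.com/v2.11/${PIXEL_ID}/stats` +
    `?aggregation=pixel_fire&access_token=${ACCESS_TOKEN}`;
  const res = await fetch(url);
  const body = await res.json();
  if (!res.ok) throw new Error(JSON.stringify(body.error));
  // Each entry aggregates fires per event (e.g. "PageView") per time bucket.
  console.log(body.data);
}

pixelStats().catch(console.error);
```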

Adwords and Form Tracking

I'm not an expert in Google tracking and AdWords, and I have a request from a client who wants to track the people who submit a contact form on a website after coming from an AdWords ad.
So someone lands on a specific page of the client's site via AdWords, then fills out the contact form on that page. The client wants to know how many of the people coming from AdWords ads go on to submit the form.
This seems like a very common need, so I thought there might already be a solution for it.
Conversion tracking already fires when the form is submitted, but they cannot tell whether the person who submitted the form came from AdWords or not.
I've been told to save the GET parameter from the AdWords link into the website's database every time the form is submitted. That doesn't seem like the right way, and there are also some security concerns with it.
Can anyone give some advice on how this could be achieved?
I hope I explained that right.
Thanks in advance.
If you need to save the source into the DB:
The simplest way might be to parse the UTM parameters and populate hidden fields in the form, so you know whether the user came from AdWords or any other source.
If you are using auto-tagging (which I would recommend), you are after the gclid parameter; with manual tagging, it's just utm_source.
I would track all sources (not only AdWords); you might also need to know whether a user came through Facebook, a LinkedIn article, or similar.
More on tracking sources into a database here and here. More on AdWords tagging here.
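A rough sketch of that hidden-field approach, assuming a form with id contact-form and hypothetical field names gclid and traffic_source:

```typescript
// On the landing page: read gclid (auto-tagging) or utm_source (manual
// tagging) from the URL and copy it into hidden inputs, so the value is
// submitted along with the form and can be stored next to the lead.
const params = new URLSearchParams(window.location.search);

function addHidden(form: HTMLFormElement, name: string, value: string) {
  const input = document.createElement("input");
  input.type = "hidden";
  input.name = name;  // hypothetical field name, e.g. "gclid"
  input.value = value;
  form.appendChild(input);
}

const form = document.querySelector<HTMLFormElement>("#contact-form");
if (form) {
  const gclid = params.get("gclid");
  const source = params.get("utm_source");
  if (gclid) addHidden(form, "gclid", gclid);
  if (source) addHidden(form, "traffic_source", source);
}
```

In practice you would also persist the value in a cookie or sessionStorage, since visitors often browse a few pages before reaching the form.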
If you only need to know whether people from AdWords submit forms (i.e. convert):
Chances are you already have this data. In Google Analytics, filter the segment of people who completed the form-submission goal and look at Acquisition -> All Traffic.
More on Google Analytics here.
You should check and make sure you have auto-tagging enabled. The best way to do this is in Google Analytics: go to Admin > Property column > Product Linking section > AdWords Linking.
Once you have AdWords and Analytics linked up, simply go back to the Reporting tab and navigate to Acquisition > AdWords > Campaigns, as shown here.
If you have properly set up your goals in Analytics, you can use the Conversion filter to select the goal you want to view, and this will show you all goal conversions that came from AdWords.

How to detect the creation date of a webpage from its server

I'm trying to find a way to detect the creation date of a webpage from its server. For example, when was the page www.Amazon.com/fghhggg created? Is there a way to find this and automate it? Thanks for any clues.
In general, the answer is no. Occasionally you'll see a web page that returns this information in the headers, but for a site where the pages are generated from information in the database (like Amazon, or most other sites on the internet), asking for the "creation date" doesn't really make sense.
For example, imagine you're looking up product X on Amazon. Amazon's servers retrieve information from the database, put together an HTML document, and return it to you. What would the "creation date" be? The page didn't exist 5 seconds ago - it was just assembled for you - and it doesn't exist now that it's been sent to you. If you're looking for when the product was added to Amazon's database, that information might be available via Amazon's API.
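If you still want to automate the check for the cases where a server does say something, the closest widely available signal is the Last-Modified response header; a quick sketch:

```typescript
// Sketch: ask the server for its Last-Modified header via a HEAD request.
// Static pages sometimes set it meaningfully; dynamically generated pages
// (like Amazon's) usually omit it or set it to the generation time.
async function lastModified(url: string): Promise<string | null> {
  const res = await fetch(url, { method: "HEAD" });
  return res.headers.get("last-modified");
}

lastModified("https://example.com/some-page.html")
  .then(date => console.log(date ?? "no Last-Modified header"));
```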

Autofill an HTML form

What applications exist that can take a series of fields from my DB (or CSV output from my DB), insert them into a web-based form, and then submit that form?
Big Picture Use Case:
I maintain an in-house registration management system for webinars that we produce/present. Currently we use GoToWebinar.com to host our events but they haven't always been (and may not always continue to be) our vendor.
GoToWebinar.com does not provide an API for creating registrations on behalf of third-party individuals. So when someone decides to attend one of our events, they have to fill out two registration forms: mine and GoToWebinar.com's. I'd like to automate the task of filling in GoToWebinar's registration form.
I am looking into the same thing. I found some bits and pieces here and there and was able to decipher the URL to post to GTW:
https://www.gotowebinar.com/en_US/island/webinar/registration.flow?Template=island/webinar/registration.tmpl&Form=webinarRegistrationForm&WebinarKey=XXX_YOUR_WEBINAR_ID_XXX&Name_First=ViewersFirstName&Name_Last=ViewersLastName&Email=ViewersEmailAddress
If you are using cURL, then be sure to use CURLOPT_FOLLOWLOCATION because there are some redirections on the GTW side and cURL needs to follow them.
So far this seems to work for us.
Good luck!
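To make that concrete, here is a minimal sketch of firing the registration request from code. The URL and parameters are exactly the ones deciphered above (with the webinar key as a placeholder); whether GoToWebinar still accepts this undocumented flow is not guaranteed.

```typescript
// Sketch: submit a GoToWebinar registration using the deciphered URL above.
// fetch follows redirects by default (the equivalent of cURL's
// CURLOPT_FOLLOWLOCATION), which the GTW flow relies on.
async function registerAttendee(webinarKey: string, first: string, last: string, email: string): Promise<boolean> {
  const query = new URLSearchParams({
    Template: "island/webinar/registration.tmpl",
    Form: "webinarRegistrationForm",
    WebinarKey: webinarKey, // replace with your webinar ID
    Name_First: first,
    Name_Last: last,
    Email: email,
  });
  const res = await fetch(
    `https://www.gotowebinar.com/en_US/island/webinar/registration.flow?${query}`,
    { redirect: "follow" } // explicit for clarity; "follow" is the default
  );
  return res.ok;
}
```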
I'm late to the party, but let me offer a way to call the CITRIX API via PHP to register a new GotoWebinar attendee, in case somebody else hits this page looking for the answer to your question.

screenshot-grabbing email tool

I have a web site with various graphs embedded in it that are generated externally. Occasionally those graphs will fail to generate and I would like to catch that when it happens. These graphs are embedded in multiple pages and I would rather not check each page manually. Is there any kind of tool or perhaps a browser addon that could periodically take screenshots of different URLs and email them in a single email? It would be sufficient to have scaled-down screenshots of full pages emailed maybe once a day to me, allowing me to take a quick glance and see that all the graphs are there and look okay.
I'm a big fan of automation. Rather than have emails generated that you then have to look at, take a look at 'replacing custom missing images in jquery'. This will run a piece of Javascript for each image that fails. Extending that to make a request to a URL that you control, which may also include the broken URL (or just the filename that is broken) would not be too hard. That URL would then generate an email, and store the broken URL so that it doesn't send 5000 emails if there's a flurry of hits to your page.
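A sketch of that error-handler idea, assuming a hypothetical /report-broken-image endpoint on your server that logs, de-duplicates, and sends the daily digest:

```typescript
// Attach error handlers to the graph images and report any that fail
// to load to a server endpoint you control.
document.querySelectorAll<HTMLImageElement>("img.graph").forEach(img => {
  img.addEventListener("error", () => {
    // sendBeacon is fire-and-forget; a plain fetch POST would also work.
    navigator.sendBeacon(
      "/report-broken-image", // hypothetical endpoint
      JSON.stringify({ page: location.href, image: img.src })
    );
  });
});
```

Note that a handler only catches errors fired after it is attached, so inline onerror attributes on the img tags are the safest placement.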
Another idea, building on the above, is to effectively change the external 404 from the source site to a local one (e.g. /backend/missing-images/); the full path need not exist, since you are just generating a local 404 record in your Apache logs. Logwatch will then email you a list of 404s from the Apache log daily (or more often, if you want).