How to get Goal Funnel Step data such as "entered" and "proceeded" through Query API? - google-analytics-api

When looking at the Goal Funnel report on the Google Analytics website, I can see not only the number of goal starts and completions but also how many visits each step receives.
How can I find the step data through the Google Analytics API?
I am testing with the Query Explorer on a goal with 3 steps, where the 1st step is marked as Required.
I was able to get the starts and completions by using goalXXStarts and goalXXCompletions:
https://www.googleapis.com/analytics/v3/data/ga?ids=ga%3A90593258&start-date=2015-09-12&end-date=2015-10-12&metrics=ga%3Agoal7Starts%2Cga%3Agoal7Completions
However, I can't figure out a way to get the data for the goal's second step.
I tried using ga:users or ga:uniquePageviews with the URL of step 2 as the page and step 1 as ga:previousPagePath, and then adding the ga:users or ga:uniquePageviews of the next stage with step 1 as ga:previousPagePath (since step 1 is required = true) to account for backfill.
I also tried other combinations, but could never reach the right number or even get close to it.

One technique that can be used to perform conversion funnel analysis with the Google Analytics Core Reporting API is to define a segment for each step in the funnel. If the first step of the funnel is a 'required' step, then that step must also be included in segments for each of the subsequent steps.
For example, if your funnel has three steps named A, B, and C, then you will need to define a segment for A, another for B, and another again for C.
If step A is required then:
Segment 1: viewed page A,
Segment 2: viewed page A and viewed page B,
Segment 3: viewed page A and viewed page C.
Otherwise, if step A is NOT required then:
Segment 1: viewed page A,
Segment 2: viewed page B,
Segment 3: viewed page C.
To obtain the count for each step in the funnel, you perform a query against each segment to obtain the number of sessions where that segment matches. Additionally, you can query the previous and next pages, including entrances and exits, for each step (if you need to); in which case, query previousPagePath and pagePath as dimensions along with the metrics uniquePageviews, entrances and exits. Keep in mind the difference between 'hit-level' and 'session-level' data when constructing each query and interpreting its results.
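As a minimal sketch of that approach (Python with google-api-python-client; the credentials file, view ID, dates and page paths below are placeholders rather than values from the question), each funnel step becomes one query whose segment is a session-scoped condition:

# Count sessions per funnel step using condition-based segments (Core Reporting API v3).
# Assumes a service account with read access to the view; all IDs and paths are hypothetical.
from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/analytics.readonly'])
analytics = build('analytics', 'v3', credentials=creds)

segments = {
    'step A': 'sessions::condition::ga:pagePath==/step-a',
    'step B': 'sessions::condition::ga:pagePath==/step-a;ga:pagePath==/step-b',
    'step C': 'sessions::condition::ga:pagePath==/step-a;ga:pagePath==/step-c',
}

for name, segment in segments.items():
    result = analytics.data().ga().get(
        ids='ga:XXXXXXXX',          # your view (profile) ID
        start_date='2015-09-12',
        end_date='2015-10-12',
        metrics='ga:sessions',
        segment=segment).execute()
    print(name, result['totalsForAllResults']['ga:sessions'])

Each query returns the number of sessions matching that segment, which is the count for that funnel step; drop the '/step-a' condition from segments B and C if the first step is not required.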
You can also achieve similar results by using sequential segmentation which will offer you finer control over how the funnel steps are counted, as well as allowing for non-sequential funnel analysis if required.
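For example (page paths hypothetical, and worth double-checking against the current segment syntax reference), a sequential segment requiring that step A is followed at some later point in the session by step B can be passed in the same segment parameter as:
sessions::sequence::ga:pagePath==/step-a;->>ga:pagePath==/step-b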

Related

JMeter to record results on hourly basis

I have a JMeter project with multiple GET and POST requests and assertions for these. I use the Aggregate Results and View Results Tree listeners, but none of these can store results on an hourly basis. I tried the JMeterPlugins-Standard and JMeterPlugins-Extras packages and the jp@gc - Graphs Generator listener, but all of them use aggregated data instead of hourly data. So I would like to get the number of successful and failed requests/assertions per hour; maybe a bar chart would be most suitable for this purpose.
I'm going to suggest a non-conventional, design-level solution: name your samplers dynamically with the hour (or date and hour), so that each hour the name changes and the results appear in a different category, i.e.:
The code for such a name is:
${__time(dd:hh,)} the rest of sampler name
Such a sampler will appear in the following way in the Aggregate Report (here I simulated it with minutes/seconds, but the same will happen with days/hours, just on a larger scale):
Pros and cons of such approach:
Simple: you can aggregate anything by hour, minute, or any other time slice while the test is running, rather than by analysis after execution.
Not listener-dependent; it can be used with pretty much any listener or visualizer.
If you also want overall stats, you will need to sum up every sub-category. So it alters the data, but in a way that can still be added back to the original relatively easily.
Calculating __time before every sampler is not entirely free from a performance perspective, but I don't think it will add visible overhead to the script.
You could get the same data by properly aggregating the JTL or CSV (whichever you use) after execution, so it doesn't provide anything that is impossible to achieve using standard methods.
The script needs altering to make this happen; if you have hundreds of samplers, it's going to take a while. And the same applies if you want to change back...
You might want to use the Filter Results Tool, which has --start-offset and --end-offset parameters; with these you can "cut" your results file into "interesting" pieces and plot them according to your requirements.
You can install the Filter Results Tool using the JMeter Plugins Manager.
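A hedged example of what such an invocation might look like (file names and offsets are hypothetical; check the plugin's documentation for the exact parameters supported by your version), run from JMeter's bin directory to keep only the first hour of samples:
FilterResults.sh --input-file results.jtl --output-file first-hour.jtl --start-offset 0 --end-offset 3600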
Also be aware that, according to JMeter Best Practices, you should:
Use as few Listeners as possible; if using the -l flag as above they can all be deleted or disabled.
Don't use "View Results Tree" or "View Results in Table" listeners during the load test, use them only during scripting phase to debug your scripts.
You can get whatever information you need from the .jtl results file; you can specify the test results location via the -l command-line argument.
To get summarized results per hour, add a Generate Summary Results listener to your test plan:
Generates a summary of the test run so far to the log file and/or standard output
Update the interval in jmeter.properties to your needs; for 1 hour, that is 3600 seconds:
summariser.interval=3600
You will then get a summary of your requests per hour.
You can try the JMeter Backend Listener. It has integrations with Graphite and InfluxDB. After storing the results in one of these time-series databases, you can display them in a Grafana dashboard. Grafana has its own filtering for showing the results on an hourly, daily, monthly basis and so on.
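As a rough sketch of that setup (host name and database are hypothetical): add a Backend Listener to the test plan, select the InfluxDB implementation (org.apache.jmeter.visualizers.backend.influxdb.InfluxdbBackendListenerClient), and point its influxdbUrl parameter at your InfluxDB write endpoint, e.g. http://influxdb-host:8086/write?db=jmeter; then add that database as a data source in Grafana and group the panels by hour, day, or month as needed.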

Tableau performance

I have a problem with a dashboard in Tableau. The dashboard contains many worksheets, and all the columns in the report are calculated fields. The problem is that the dashboard takes a very long time to build. The report contains approximately 2 million rows and takes about 5 minutes to generate.
What are the possible solutions in this case?
Maybe I can somehow adjust it to display page by page rather than all the records at once?
To reduce the calculation time, try to exclude data you don't need with a data source filter in Tableau. You can also hide or delete unused calculated fields. Another thing you can do is remove sheets that are not used.
Here's a link: https://www.tableau.com/about/blog/2016/1/5-tips-make-your-dashboards-more-performant-48574
Steps to follow to reduce calculation time:
Extract the data and keep the connection option as Extract instead of Live; also replace the data source with the extract.
Use a "User Filter" to reduce calculation time so that Tableau displays only a particular user's data.
I hope this helps solve your problems.
I have one more idea for resolving this issue.
1) When the dashboard loads for the first time, use a Dashboard Action Filter so that the initial load excludes the data in your sheet:
Dashboard menu -> Actions -> Add Action -> select the sheet and the Exclude option.
2) Switch the data source from Live to Extract by selecting the Extract radio button.
3) Use a user filter.
I am following the other answers (use an extract, a dashboard action filter...) and I want to add one point:
Drag every field used by any worksheet on the dashboard onto the "Detail" shelf of every worksheet you are using on the dashboard. Now Tableau loads all the needed data while loading the first worksheet and can reuse this data for the other sheets.
E.g. if a dashboard contains three worksheets (A, B, C), you drag every field used by A onto "Detail" of B and C, every field used by B onto "Detail" of A and C, and every field used by C onto "Detail" of A and B.
We are also having a similar issue with 150 million rows, but I want to check whether you are doing the following steps. This may help you; it goes back to the fundamentals of Tableau reporting.
1/ Try to make sure your data set is in star schema format. This will help the report a lot.
2/ Try to design the tables and views in the DB so that they contain only the columns used in Tableau. Any extra columns in the tables add to the performance issue.
3/ Make sure indexing is done properly for all the fields that are joined.
4/ In my experience the dashboard adds extra performance lag, so try to get as much performance tuning done on the sheets as possible before even moving to the dashboard.
5/ If required, try to use materialized views.
Hope this helps.
Try to capture performance metrics using the Performance Recorder option in Tableau.
Check the underlying DB tables and joins present in the data source layer.
Try using optimized sets and parameters as required, and get rid of less relevant filters.
Try using data extracts with a scheduled refresh and a data source filter that limits the data to the business years you need.

Google Analytics Core Reporting API query for exits and entrances metrics - entrance values incorrectly exactly the same as exits

I'm using GA's Core Reporting API to create a report that shows the top exit pages alongside some behavioural metrics for each page. The dimension is ga:exitPagePath, and the metrics I want are:
ga:exits
ga:pageviews
ga:entrances
ga:avgTimeOnPage
ga:bounceRate
ga:exitRate
I'm sorting by -ga:exits. I'm not using any filters or segments.
The query appears to work fine and doesn't return an error; however, the entrance values it returns are incorrect and exactly match the exit values for each page. Other queries for ga:entrances without ga:exits give the correct entrance values.
I may have overlooked it but I can't find anywhere in the documentation indicating that these metrics can't be used together. I also tested creating a custom report within the GA interface with these two metrics and found the same result - no error or indication that I can't create a report with both metrics, but entrances incorrectly reported and exactly matching the exit values. I also get the same result in GA's Query Explorer.
Would love to work this out - it seems perfectly logical to me to want to view entrances alongside exits for exit pages :)
A better-late-than-never response.
It makes sense, because every session that entered your site (an entrance) has also left it (an exit).
The metrics become meaningful when you use them along with a page dimension (ga:pagePath, for example).
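For instance (hypothetical view ID, in the same style as the query in the question), a query such as the following returns distinct entrance and exit counts per page rather than per exit page:
https://www.googleapis.com/analytics/v3/data/ga?ids=ga:XXXXXXXX&start-date=2015-09-12&end-date=2015-10-12&dimensions=ga:pagePath&metrics=ga:entrances,ga:exits,ga:pageviews&sort=-ga:exits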

Dynamic Generate Copies of a Page in Google Forms

I'm looking to bulk-input data through Google Forms. This involves 2 sections:
Initial Conditions
Observations
Everyone who wants to input data will be inputting 1 set of initial conditions, followed by somewhere between 5 and 20 (maybe more?) observations of multiple variables:
Name
Date
Color
Quantity
Type
The problem is that I don't want to have to make people re-enter the initial conditions each time they submit a form.
Ideally they would be able to select a response after adding one observation:
Add another observation
Conclude session
The thing I don't know how to do: choosing "Add another observation" should open a new blank observations page.
Anyone have any ideas about how to make the appropriate form?

SSRS - Have report execute sub-queries?

I've looked all over and cannot find an answer to my question; I can't even determine whether it is possible.
Referring to the attached image, you will notice that this is a statement report with data grouping activated.
1) The report shows all the services invoiced to an account by date.
You can expand the group to see all the transactions that formed part of that service for that day. (You can, for instance, make use of the same service multiple times per day.)
2) This is the detailed layout of the service invoiced. This list is different for each service, but mainly it will show you a summarized transaction list (PK BatchId), which has the "+" symbol next to it to enable drilldown to a detailed report of the batch.
My problem:
When loading the statement report, we are now hitting multiple tables, multiple times to produce the data to be grouped and displayed in #2 (refer to image).
We are trying to avoid this like the plague.
My Question
Is there a way to populate #2 when, and only when, the user clicks a "+" symbol or an "expand" image where the "+" is currently located in #1?
In other words: we dispose of the group function and populate the statement without the detailed information. When the user clicks on #1, we run a stored procedure, populate a dataset and display the data in #2.
Any thoughts on this?
Drillthrough reports look like a good solution here; see the SSRS documentation on drillthrough reports for more information on how they work. Basically you have the report without the detailed information, but when somebody clicks on #1 it opens up a new report with the details behind it.
After testing, I confirmed that subreports are executed even if they are hidden within an element that can be toggled.
So subreports won't answer this problem.
[Edited: previously I thought they could be used. JAT points out that this negative answer may have some value, so I'm leaving it.]