Looking for feedback from anyone who has implemented a server-side FB pixel on SFCC since iOS 14.6
This is a bit lengthy, so I do apologize, but I haven't seen this issue referenced on Stack and I imagine the use case is growing as companies keep migrating to SFCC.
I currently have server-side tracking enabled out-of-the-box on SFCC for the Facebook Pixel. While it fires appropriately, I'm still seeing a handful of errors as noted below. This was a necessary step after iOS 14.6 decimated our existing FB pixel, and it has worked for the most part, but I can't help feeling that something is fundamentally broken about it.
Errors keeping me up at night:
Am I correct in ONLY using the server-side FB pixel and having the third party tracking pixel turned off?
Should I even be using the SFCC out-of-the-box server-side set-up?? Am I better off building the connector out myself via GA4?
"Server External ID Not Matching to Pixel External ID for Purchase Event"
^ this makes sense as I ONLY have the server-side running and no web pixel, but unsure why the issue is still presenting itself on the diagnostics tab.
"Server Sending Invalid Match Key Parameters for AddToCart Event"
^ FB support mentioned that this is because a parameter is missing from AddToCart, which I'm guessing is the $ total, which SFCC isn't sending OOTB (see the payload sketch after this list of errors).
I'm getting a bunch of external URLs in the "Recent Activities" section for the "Purchase" event, but not on the pageview or any other events. Unsure what the impact is here.
The "Aggregated Event Measurement" tool shows "No Recent Activity", and FB support was not very clear on why or how this happens.
Is it possible to run a Google PageSpeed Insights test from my own site? Ideally, to return the report in the very same page with AJAX?
For example, the user enters www.mywebsite.com into the field and the PageSpeed report is returned. If that is not possible, then redirect to the PageSpeed result page.
You have a few options here. Starting from easiest to hardest (and in my opinion "worst" to "best" solution).
Add the PageSpeed Insights (PSI) test page to an iframe on your site. You can then change the URL of that iframe to https://developers.google.com/speed/pagespeed/insights/?url=yourwebsite.com and manipulate the ?url=yourwebsite.com part to be whatever you want.
This may be against Google's terms of service and is also a bad user experience but it is the easiest way to achieve it. I will leave you to investigate that option if you decide to do it.
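A rough sketch of this option, assuming a plain input and button on your page (note the PSI page may refuse to render inside an iframe if Google serves framing restrictions, which is part of why this is the "worst" option):
<input id="site-url" type="url" placeholder="https://yourwebsite.com">
<button id="run-test">Run test</button>
<iframe id="psi-frame" style="width:100%;height:800px;border:0"></iframe>
<script>
  // Point the iframe at the PSI results page for whatever URL the user typed in.
  document.getElementById('run-test').addEventListener('click', function () {
    var url = encodeURIComponent(document.getElementById('site-url').value);
    document.getElementById('psi-frame').src =
      'https://developers.google.com/speed/pagespeed/insights/?url=' + url;
  });
</script>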
Redirect users to a new tab. So just do <a target="_blank" href="https://developers.google.com/speed/pagespeed/insights/?url=yourwebsite.com">view your report</a> or redirect via JS on a button click.
Yet again not a great option as people are leaving your site but at least this won't be against Google's terms of service.
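The JS-on-click variant could be as small as this sketch:
<script>
  // Open the PSI report for the given site in a new tab.
  function openPsiReport(siteUrl) {
    window.open(
      'https://developers.google.com/speed/pagespeed/insights/?url=' + encodeURIComponent(siteUrl),
      '_blank'
    );
  }
</script>
<button onclick="openPsiReport('https://yourwebsite.com')">View your report</button>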
Use the PageSpeed Insights API. https://developers.google.com/speed/docs/insights/v5/get-started.
This is your best option in terms of time vs flexibility. You supply the API with the URL and it returns a JSON response with all of the metrics it gathers and the scoring.
Please note PSI is on version 6 of the API which should be available for general use soon.
Obviously this is a lot more work but well worth the effort as you can style everything as you please.
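A minimal sketch of calling the v5 API from the browser (no API key is needed for light testing, though Google recommends one for regular use; the response fields below are from the v5 JSON format):
// Fetch the PSI v5 API and pull out the Lighthouse performance score.
async function runPsi(siteUrl) {
  const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
  const response = await fetch(endpoint + '?url=' + encodeURIComponent(siteUrl) + '&strategy=mobile');
  const json = await response.json();
  // The category score is 0..1; multiply by 100 for the familiar 0..100 score.
  const score = Math.round(json.lighthouseResult.categories.performance.score * 100);
  console.log('Performance score for ' + siteUrl + ': ' + score);
  return json; // full report: audits, metrics, field data, etc.
}
runPsi('https://yourwebsite.com');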
Install Lighthouse, the engine that drives PSI, on your own server.
You can find the Lighthouse repository here. Please note you need to know how to use Node, it helps to understand Puppeteer, and you need a reasonable amount of server-admin knowledge to get Chromium (used as a headless web browser for running the tests) working and linked correctly.
At this stage you have complete control and can write your own tests, scoring criteria, etc. You can also run as many tests as your server will allow. If you want this level of control and freedom then this is the best option. However, be prepared to sink a lot of hours into this solution!
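A minimal Node sketch of running Lighthouse programmatically, assuming the lighthouse and chrome-launcher npm packages are installed and Chromium is available on the box (newer Lighthouse releases are ESM-only, so you may need import instead of require):
// Run Lighthouse against a URL using your own headless Chromium.
const lighthouse = require('lighthouse');
const chromeLauncher = require('chrome-launcher');

async function runLighthouse(url) {
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const result = await lighthouse(url, {
    port: chrome.port,             // Lighthouse drives Chromium over this debugging port
    output: 'json',
    onlyCategories: ['performance']
  });
  await chrome.kill();
  return result.lhr;               // the Lighthouse report object: scores, audits, timings
}

runLighthouse('https://yourwebsite.com')
  .then(function (lhr) { console.log('Performance:', lhr.categories.performance.score * 100); });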
I am trying to figure out how to run an A/B Test for a change on a Page Step for a Single Page. The idea is we have a payment flow with several page steps each containing a form. We'd like to swap out forms and test how our users react. We are trying to avoid changing the URL.
I looked into tools such as Google Analytics, but that requires a different URL to run the A/B test. The hesitation about creating a new URL is that our users are known to bookmark them, we don't want to keep a backlog of redirects from invalid URLs, and we'd also like to avoid constantly deploying new URLs for our tests.
I cannot seem to find any tool to do this, so I've tried to think of a few solutions but I'm not having a lot of luck.
My best idea is to build both the A and B forms into the page, and when a user accesses the flow, the session randomly (based on a preset percentage) stores a value that dictates whether the user is in test A or B. Then when they step into that form, the server serves the proper form to them. If they abandon their session, we'd track that, and if they complete the action, we'd track that.
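Roughly what I have in mind for the bucketing piece (a plain-JS sketch just to illustrate; the names are made up and the same logic would live in our .NET session handling):
// Assign the visitor to variant A or B once, persist the choice, and reuse it on
// every step of the flow so the same form is always served.
var TEST_NAME = 'payment_form_test';   // made-up test name
var VARIANT_B_SHARE = 0.5;             // preset percentage for variant B

function getVariant() {
  var match = document.cookie.match(new RegExp(TEST_NAME + '=(A|B)'));
  if (match) return match[1];          // already assigned earlier in the session
  var variant = Math.random() < VARIANT_B_SHARE ? 'B' : 'A';
  document.cookie = TEST_NAME + '=' + variant + '; path=/; max-age=' + 60 * 60 * 24 * 30;
  return variant;
}

// The server reads the same value to decide which form to render, and completions
// or abandonments get reported against it.
console.log('Serving form variant', getVariant());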
I feel like there should be a better solution, but I just cannot come up with one.
My results online were mostly blogs showing how to approach it from a high level, and all of them used different URLs; I have found almost no developer resources.
Thanks.
We're using ExtJS 4.2.2, and .NET as our server.
Whenever you need the server to be involved, you need server-side instrumentation. No free tools offer that, but you could consider Optimizely "Full Stack" (which has support for C#) or Variant (which does not yet).
We are implementing SiteCatalyst on flat HTML files. There is a requirement where we need to show campaigns based on the data that we report to Analytics. For example, there is a form with multiple fields. Whether the user has filled the form or not, we track this event and report it to Omniture. Now if he presses the back button without filling the form completely, we need to show him some campaign/offers. The same will happen when he presses the submit button, only the campaign will be different this time. Can this be achieved? Can we integrate SiteCatalyst and campaigning?
I know that the reverse is possible: we can track campaigns and report the campaign IDs. But is there any way to display offers based on the analytics data, and in real time?
Any help would be great !
Thanks in advance.
It sounds like what you are looking for is Adobe Target.
Adobe Target is a tool that allows you to do A/B and multivariate testing, but also to target visitors by set rules and criteria.
Very simple example:
"If user came from foo.com, show <h1>foo</h1>. If user came from bar.com, show <h1>bar</h1>"
There is a level of integration between Adobe Target and Adobe Analytics. However, it is not real-time for data that has already been collected.
For example, if you have logic that pops s.prop10 on the page with "foo", then that can be integrated with Adobe Target and you can set up a rule that says something like "If s.prop10 is 'foo', then show '<h1>foo</h1>'".
But it does not let you make a rule like "if prop10 was 'foo' for this visitor at any point in the past, show '<h1>foo</h1>'". In other words, there is no real-time evaluation of data already collected on Adobe's servers.
But if you simply want to make rules based off the current visit, you can store information in cookies and look at those cookies to make rules in Adobe Target easily enough.
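A tiny sketch of the cookie part, with made-up names (how you then surface the value to Target, for example as a profile or mbox parameter, depends on your implementation):
// Remember something about the current visit in a first-party cookie so a
// Target rule/audience can key off it later in the same visit.
function rememberVisitFlag(name, value) {
  document.cookie = name + '=' + encodeURIComponent(value) + '; path=/';
}
function readVisitFlag(name) {
  var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

// e.g. set when the visitor bails out of the form, read back on the next page view
rememberVisitFlag('formAbandoned', 'true');
console.log(readVisitFlag('formAbandoned')); // "true"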
Also note that there are no built-in tools or hooks or methods etc. for the actions you described. For example, there's no way to natively say in Adobe Target (or Adobe Analytics) "If a visitor clicks the back button or does this other action, track that". You need to write your own code to define those actions and trigger the relevant tracking code at the relevant times. Adobe Analytics (and other tracking tools) can help automate some basic stuff like simple link clicks or form field focusing (in other words, direct 1:1 actions), but baking in complex actions like that is not feasible for a tracking tool, because every site and scenario is unique.
I guess the TL;DR here is that there is no magic wand for this sort of thing, not for Adobe or any other analytics/tracking tool; you're going to have to write your own code (be it server-side, client-side, or mix of both) to meet your business needs.
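As a concrete (hedged) example, wiring up a "left the form without submitting" action yourself could look roughly like this with AppMeasurement; s.tl is the standard custom-link tracking call, but the prop/event numbers and the trigger are made up, and a beforeunload beacon is not guaranteed to go out, so in practice you would pick a more reliable signal for your site:
// Define the "abandoned the payment form" action yourself and fire a custom
// link-tracking call (assumes the AppMeasurement `s` object is already on the page).
window.addEventListener('beforeunload', function () {
  var form = document.getElementById('payment-form');   // hypothetical form id and submitted flag
  if (form && !form.dataset.submitted) {
    s.linkTrackVars = 'prop10,events';
    s.linkTrackEvents = 'event10';
    s.prop10 = 'payment form abandoned';                 // made-up prop/event numbers
    s.events = 'event10';
    s.tl(true, 'o', 'Payment Form Abandoned');           // 'o' = custom link
  }
});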
You can use the Reporting API exposed by Adobe SiteCatalyst.
Through the Reporting API, you're able to access the reports generated for your form events. If you're using SiteCatalyst 15, you'll also be able to generate reports based on segments. Recently the Reporting API was updated and given the ability to perform multi-level breakdowns across reports. For more information on this method, go to the API documentation within the Adobe Developer Connection.
Sample Real time access API:
// Real-Time Report
// Note the inclusion of "source" equals "realtime"
// Make sure you configure Real-Time reports for the report suite
https://api.omniture.com/admin/1.4/rest/?method=Report.Run
{
    "reportDescription": {
        "source": "realtime",
        "reportSuiteID": "rsid",
        "metrics": [
            { "id": "revenue" }
        ]
    }
}
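A rough sketch of posting that reportDescription from Node; the X-WSSE authentication header is only stubbed here (it is built from your API username and shared secret, as described in the 1.4 API docs):
// POST the real-time reportDescription to the 1.4 Reporting API.
const reportDescription = {
  reportDescription: {
    source: 'realtime',
    reportSuiteID: 'rsid',
    metrics: [{ id: 'revenue' }]
  }
};

fetch('https://api.omniture.com/admin/1.4/rest/?method=Report.Run', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'X-WSSE': '<generated WSSE token>'   // placeholder; see the API docs for how to build it
  },
  body: JSON.stringify(reportDescription)
})
  .then(function (res) { return res.json(); })
  .then(function (report) { console.log(report); });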
I have been playing around with GWT and the GWT Visualization wrapper API. One thing I learned recently is that the GWT Visualization API does not work without an internet connection (I was working offline the other day and it took me a good half hour to figure out why my charts were not loading).
After doing a lot of reading online about privacy, data, and GWT, it seems that many people, including me, have a concern about sending data to Google when trying to display graphs. I already searched through many sources, including stackoverflow, and I would like to 100% confirm that my assumptions are correct.
The reason for people's concern about sending data to Google was the old way of getting an image of a chart: it required data to be sent to Google, which processed it and returned an image to be embedded in your website. From what I have read, that feature has been deprecated from Google Charts (and for good reason). The way it works now, to my understanding, is that every time you want to display a chart, you download the most up-to-date library on the client side and perform all the calculations on the client. This means Google doesn't actually get any of the information you display on the charts.
Thus, I can continue using the Visualization API as long as I keep using interactive charts and keep checking the Google Charts documentation page for the particular chart, e.g. the Line Chart:
https://developers.google.com/chart/interactive/docs/gallery/linechart
for the statement at the bottom of the page, "All code and data are processed and rendered in the browser. No data is sent to any server." In that case, I do not have to worry about anyone getting my data because all information is processed client-side.
Please correct any incorrect assumptions that I may have. Thank you.
The charts on this page, https://developers.google.com/chart/interactive/docs/gallery, all include a "Data Policy" section which details whether the chart is rendered on the client and what data will leave the client. Currently, only GeoChart communicates with Google (in order to do the Geocoding); obviously, this could change in the future.
The charts on this other page, https://developers.google.com/chart/interactive/docs/more_charts, include some that were written by Google, and some that were written by third parties. These also include a Data Policy section. For those written by Google, you can rely on this policy. For those written by third parties, Google has not validated the claims and cannot guarantee them.
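If it helps to see the client-side behaviour concretely, a plain (non-GWT) interactive LineChart looks roughly like this; only the loader library comes from Google's CDN, while the data passed to arrayToDataTable never leaves the browser:
<script src="https://www.gstatic.com/charts/loader.js"></script>
<div id="chart"></div>
<script>
  // The chart is rendered entirely in the browser; only the library is downloaded.
  google.charts.load('current', { packages: ['corechart'] });
  google.charts.setOnLoadCallback(function () {
    var data = google.visualization.arrayToDataTable([
      ['Day', 'Visitors'],
      ['Mon', 120],
      ['Tue', 98],
      ['Wed', 143]
    ]);
    var chart = new google.visualization.LineChart(document.getElementById('chart'));
    chart.draw(data, { title: 'Visitors per day' });
  });
</script>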
I have been looking into using Facebook Connect for a new web site I am building, however the Facebook API seems to be a little flaky.
The code I have been using is basic, however the example application 'therunaround' suffers from similar issues.
For example, the friends list does not always load, and the logged-in Facebook user is not always detected.
Are these issues purely because I am just starting out?
Yes, the problems you describe are likely just due to not using the API correctly.
Facebook Connect is still very new and does suffer from occasional problems, but in general it is pretty stable and is being implemented by many major sites.
This developer post may be relevant to your problem: (From: http://www.facebook.com/developers/message.php)
Please make sure features are loaded before using Facebook Connect JS APIs
Feb 26, 2009 1:35pm
Almost all of Facebook Connect's JS APIs are loaded asynchronously after calling FB.init(). Because of this asynchronous nature, the JS APIs are not necessarily available immediately after making these calls. You should use FB.ensureInit or FB.Bootstrap.requireFeatures to ensure the JS APIs are loaded before using them.
We have wiki documentation that describes this in more detail.
However, we just found out that some Connect apps were calling Connect JS APIs such as FB.Connect.* and FB.Facebook.* immediately after calling FB.init. This approach can cause intermittent failures because the functions may not be loaded yet.
Prior to last night's push, we fixed a bug in our FeatureLoader.js.php where it was always automatically starting the asynchronous loading of Connect features. After the bug fix, code that calls the JS APIs before making sure they are loaded is more likely to fail. If you notice that your Connect apps are getting "function not defined" errors for Connect JS functions, please check your code to make sure it waits for the features to load before using them.
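In code, that advice translates to something like the sketch below; the feature loader URL and FB.Connect.requireSession come from the old Connect v0.4 library, so treat the specifics as illustrative rather than exact:
<script src="http://static.ak.connect.facebook.com/js/api_lib/v0.4/FeatureLoader.js.php"></script>
<script>
  // Initialise with your API key and cross-domain receiver file.
  FB.init('YOUR_API_KEY', '/xd_receiver.htm');

  // Risky: calling Connect functions straight after FB.init can fail intermittently,
  // because the Connect feature may not have finished loading yet.
  // FB.Connect.requireSession();

  // Safer: wait until the Connect APIs are actually available.
  FB.ensureInit(function () {
    FB.Connect.requireSession(function () {
      // The logged-in Facebook user is now reliably detected.
    });
  });

  // Or load only the features you need explicitly:
  // FB.Bootstrap.requireFeatures(['Connect'], function () {
  //   FB.Connect.requireSession();
  // });
</script>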