Unable to play Deezer content from javascript-samples: VALID_TOKEN_REQUIRED (Deezer)

I'm exploring the capabilities of the Deezer JavaScript SDK.
Everything looks fine and works on the developer.deezer.com site, but when I try to replicate it on my localhost HTTP server it fails to play any song. What I did:
- Created a Deezer app with the domain set to my local IP.
- Downloaded https://github.com/deezer/javascript-samples/tree/master/basic-custom-player
- Changed index.html to match the app ID and the URL to channel.html (again, using my server's local IP).
- Served the files using Node.js and serve-static (a minimal sketch of the server is shown after this list).
- Launched Chrome (Windows, latest) and pointed it at /index.html.
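For reference, the server is essentially the stock serve-static setup; this is just a sketch, and the finalhandler wiring, port, and directory name are illustrative rather than exactly what I ran:

// Minimal static server for the downloaded sample (directory name is illustrative).
const http = require('http');
const finalhandler = require('finalhandler');
const serveStatic = require('serve-static');

// Serve the sample directory with index.html as the default document.
const serve = serveStatic('basic-custom-player', { index: ['index.html'] });

http.createServer((req, res) => {
  serve(req, res, finalhandler(req, res));
}).listen(8080);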
The login button works, but when I hit the play button nothing plays. The Chrome network inspector shows a lot of requests for pageAlbum, and all of those requests fail with VALID_TOKEN_REQUIRED.
Request URL (the actual api_token changes with each request and is different from the one in the login request):
http://www.deezer.com/ajax/gw-light.php?api_version=1.0&api_token=fd120a7ce34fa1e18e4cb75237785b9a&input=3&cid=00568e39151fd6bf1
Request body
[{"method":"deezer.pageAlbum","params":{"alb_id":"2962681","lang":"ro","header":true,"tab":12}}]:
Response
[{"error":{"VALID_TOKEN_REQUIRED":"1"},"results":{}}]
I have also tried:
- getting a free domain and trying from there; same result
- using the Deezer widget from local HTML files (file:///) and from the local server (http://); same result
What am I missing?

It seems it has something to do with cookies.
I changed the browser policy so that third-party cookies are not blocked, and the Deezer widget started working. I then allowed third-party cookies only from [*.]deezer.com, and it continued to work. Once I chose to block all third-party cookies again, the widget stopped working.

Related

I'm getting an error message related to Access-Control-Allow-Origin

I'm working with a landing page that uses Plyr from a CDN:
<script src="https://cdn.plyr.io/3.3.10/plyr.js"></script>
<script>const player = new Plyr('#player');</script>
I moved a video from local files to a server and changed the src to the new address on the server, but the video stopped working and I'm getting this error:
page.html:1 Failed to load https://www.video.mp4: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://111.0.0.0:12121' is therefore not allowed access.
I tried different things, and even added other videos from other servers and those worked; everything except my video. The only thing that works is to add crossOrigin="anonymous" to the video tag and install a Chrome extension, but this won't work for other users, so I need something permanent.
I have also looked into many answers:
How does Access-Control-Allow-Origin header work?
Videos not playing due to no Access Control Allow Origin
HTML5 video doesn't play with crossOrigin=“anonymous”
Please any ideas how to make this work?
This problem occurs when you send a request to a server that is different from the one your page was served from. As was indicated in the comments, only the server you have uploaded your video to can control that header. But if it is your own server, you can easily configure it to allow requests from other origins.
See the W3C "CORS Enabled" page for a reference on how to enable it on your server.
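For example, if the video happened to be served by a small Node.js server you control, adding the header could look like the sketch below. The file path, port, and the wildcard origin are placeholders; in practice you would allow only your page's origin rather than '*':

// Sketch: serve the video with an Access-Control-Allow-Origin header.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  if (req.url === '/video.mp4') {
    res.writeHead(200, {
      'Content-Type': 'video/mp4',
      // Allow the landing page's origin (or '*') to read this resource.
      'Access-Control-Allow-Origin': '*',
    });
    fs.createReadStream('video.mp4').pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);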

HTTP 302 redirect caching on Opera Mini

I'm developing a site where I am using the common Post/Redirect/Get pattern (https://en.wikipedia.org/wiki/Post/Redirect/Get) when submitting forms. In my particular case the items are for a todo list, so I POST to, say, https://example.com/group, process that request server side by adding the new item to a database, and then return a 302 response (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.3.3) to indicate the browser should redirect (GET) to https://example.com/group, which then displays a list of all the todos submitted to the db. This partly works as expected on Opera Mini, but on both Opera Mini on Android (v12) and on the microemulator on Mac OS X (I haven't tested other versions), the resulting page shows the list of todos without the new item until I refresh the page manually, at which point the returned list does include the new item.
I'm assuming what's happening here is that the page shown after the redirect is the version that was cached on Opera's proxy server before the POST request. If that is the case, is there a way I can indicate to the proxy server that it should fetch a fresh version of the page from my server rather than serving the cached one? I have also tried the more correct 303 status code (http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html#sec10.3.4), but the same thing happens, although the spec says:
The 303 response MUST NOT be cached, but the response to the second (redirected) request might be cacheable.
I have found no references, however, to how this redirected request can be marked as non-cacheable. Incidentally, all other browsers seem not to cache this redirected request at all.
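For what it's worth, this is roughly the kind of thing I'm experimenting with on the redirect and on the GET response (a plain-Node sketch, not my real code; the in-memory list just stands in for the database). Whether Opera Mini's proxy actually honours these Cache-Control headers is exactly what I'm unsure about:

const http = require('http');

const todos = []; // stands in for the database; purely illustrative

http.createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/group') {
    let body = '';
    req.on('data', chunk => { body += chunk; });
    req.on('end', () => {
      todos.push(body); // real code would parse the form body properly
      // Redirect after POST; mark the redirect itself as non-cacheable.
      res.writeHead(303, {
        'Location': '/group',
        'Cache-Control': 'no-store, no-cache, must-revalidate',
      });
      res.end();
    });
  } else if (req.method === 'GET' && req.url === '/group') {
    // Ask intermediaries (e.g. Opera's proxy) not to serve a stale copy.
    res.writeHead(200, {
      'Content-Type': 'text/html',
      'Cache-Control': 'no-cache, must-revalidate',
      'Expires': '0',
    });
    res.end('<ul>' + todos.map(t => '<li>' + t + '</li>').join('') + '</ul>');
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);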
Thanks very much for your help in advance.
Chris.

Facebook desktop app (on browser) - login without hosting on a webserver

I am trying to develop a Facebook desktop app that runs in a browser (but is not hosted on a web server). Strictly speaking, I am running a standalone web page that is not on any domain, and I need to find a suitable login solution for it.
Currently, Facebook authentication has to redirect to another URI; the problem for me is that I am unable to get the access token from that redirected page (dialog/popup) because of cross-domain access issues. Is there a way around this?
Also, since I am running the page from a file path (c:/wamp/www/facebook.html) rather than on a web server, the "auth.login" events are not fired after authentication is done in the dialog. Is this expected behaviour as well?
Any help would be appreciated. Thanks!
Read up on FB Auth: http://developers.facebook.com/docs/authentication/
If your app is able to read iframe URLs (such as an AIR based app) then you can rely on using the Desktop App Auth Flow and reading the credentials from the response URL that Facebook hosts: https://www.facebook.com/connect/login_success.html#access_token=...
Use a local web server and serve it from http://localhost... There are plenty of lightweight ones for Windows (Abyss) as well as Unix (thttpd).
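If Node.js happens to be installed already, a minimal sketch like the following would do just as well; it serves the facebook.html mentioned in the question, and any static server works, so treat the port and file name as illustrative:

// Sketch: serve facebook.html at http://localhost:8000/ so the page
// runs from an http:// origin instead of file://.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
  fs.readFile('facebook.html', (err, data) => {
    if (err) { res.writeHead(404); res.end(); return; }
    res.writeHead(200, { 'Content-Type': 'text/html' });
    res.end(data);
  });
}).listen(8000);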
EDIT: It seems it is not possible using file:// urls. I tried this page:
<html>
  <head>
    <script>
      function fbLogin() {
        // If we came back from the OAuth dialog, the access token is in the URL fragment.
        if (window.location.hash) {
          alert("Access token is: " + window.location.hash);
        } else {
          // Otherwise start the OAuth dialog, asking it to redirect back to this file.
          window.location.href = "https://www.facebook.com/dialog/oauth?"
            + "client_id=54715426813&redirect_uri=file:///D:/Herby/Desktop/page.html&response_type=token";
        }
      }
    </script>
  </head>
  <body onload="fbLogin();">
  </body>
</html>
and the OAuth dialog said
API Error Code: 191
API Error Description: The specified URL is not owned by the application
Error Message: Invalid redirect_uri: Given URL is not permitted by the application configuration.
which means you must fill in the file:/// type URL in your app config. I tried that, but it told me the protocol must be http or https. So, bye-bye to Facebook on file://.
The only possibility for true desktop apps is to include a web control that uses true http URLs and is somehow (through a tiny embedded server, or by some kind of hook and mocking) able to use such a URL (or, as was pointed out in another answer, you can use no redirect URL and get redirected to Facebook's default result page). But that is not something you can do in a browser alone.

Not getting $_REQUEST['signed_request']

I'm trying to pass some variables to my Facebook app from the URL, i.e. using the GET variable app_data as Facebook wants.
At some point I stopped getting the ['signed_request'] part of $_REQUEST. When I print_r($_REQUEST) I get ['doc'], ['user'], ['__utmz'], ['__utma'] and ['session'] values, but no signed request :(
Any ideas of why this might be happening?
Check that the tab/canvas URL is EXACTLY the same as required. If there is a redirect to another page, then signed_request and the other values will not be sent. You can check this with a browser sniffer: if a call to the page responds with a 300 (301/302, etc.) redirect, you need to change the URL to whatever it redirects to (a quick script for checking this follows the examples below).
Examples:
https://example.com/ may need to be https://www.example.com/ (add www., or remove www., depending on how the server is set up)
www.example.com/ may need to be www.example.com/index.php (add index.php, or the right page).
Check you are using http:// and https:// correctly in the URLs, and that https:// returns a valid page.
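For example, a quick Node.js check along these lines will show whether a URL answers with a redirect; the URL is a placeholder, so substitute your canvas/tab URL:

// Check whether the canvas/tab URL responds with a redirect.
// Facebook POSTs to the original URL, and a redirect drops signed_request.
const https = require('https');

https.get('https://www.example.com/', (res) => {
  console.log('Status:', res.statusCode);
  if (res.statusCode >= 300 && res.statusCode < 400) {
    console.log('Redirects to:', res.headers.location);
  }
  res.resume(); // discard the response body
});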
I've only been able to get the signed request over https://; I get no request at all over http.
There is currently a bug filed with FB, but no word on fixing it yet: http://developers.facebook.com/bugs/264505123580318?browse=search_4eb3ef23eb18d6649415729
EDIT:
http://SITE.com was redirecting to http://www.SITE.com, so I was losing the request variables.
Had a similar issue; for me it was as simple as a mismatch between the app ID and app secret! However, in the Facebook developers backend I have noticed that the URLs all need to have a trailing slash!
Some browsers redirect your request to https automatically if you have previously visited the site over https, so if you are using Facebook in http mode the following happens:
Facebook requests the http version of your app, the browser redirects that request to https, and the POST data, and thus the signed_request, are lost in the process...
I see this problem in Chrome 23; if you delete browsing data (particularly "Deauthorize content licenses"), the app should run over http again.

Force the browser to send some HTTP request header

I need to embed an application that is secured with BASIC authentication.
When I open the application URL in the browser, the browser asks me to enter my credentials...
What I know is that:
1. The browser asks the server to GET some URL (the URL of the app).
2. The server checks the request for the Authorization header and doesn't find it.
3. The server sends a 401 back to the browser.
4. The browser turns this response code into a dialog asking me to enter the username/password, which it sends back to the server in the Authorization request header.
So far, so good. I can write a page (in JSP) that adds the required Authorization header to the request that calls the application, so I'll access the application through my page.
The problem is that this application (in fact a GWT application) contains references to JavaScript and CSS files that come from the server hosting it. The application page that I import looks like:
<html>
  <link rel="stylesheet" href="http://application_host/cssfile.css" />
  <script src="http://application_host/javascriptfile.js"></script>
  .....
</html>
So, again, I found the application asks me for the authentication credentials for the CSS and JS files!
I am thinking of several solutions but don't know the applicability of each.
One solution is to ask the browser (via JavaScript) to send the Authorization request header when it asks the server for the JS and CSS files; a rough sketch of what I mean is below.
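Something along these lines is what I have in mind, just as a sketch; the credentials and URLs are placeholders, and I realise that shipping credentials in client-side code is probably a bad idea in itself:

// Sketch: fetch the protected assets with an explicit Authorization header
// and inject them into the page. Credentials and URLs are placeholders.
// Note: this also assumes the app server allows cross-origin requests.
const credentials = 'Basic ' + btoa('username:password');

fetch('http://application_host/cssfile.css', {
  headers: { 'Authorization': credentials }
})
  .then(res => res.text())
  .then(css => {
    const style = document.createElement('style');
    style.textContent = css;
    document.head.appendChild(style);
  });

fetch('http://application_host/javascriptfile.js', {
  headers: { 'Authorization': credentials }
})
  .then(res => res.text())
  .then(js => {
    const script = document.createElement('script');
    script.textContent = js;
    document.head.appendChild(script);
  });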
Please give me your opinions about that... any other suggestions will be very welcome.
Thanks.
I think you're running into some weirdness with how your server is configured. Authentication happens in the context of an authentication realm. Your assets should either be in the same authentication realm as your page, or (more likely) should not require authentication at all. The browser should cache credentials for the given realm and not prompt for them again.
See the protocol example at http://en.wikipedia.org/wiki/Basic_access_authentication
Judging from your story, something tells me your problem is with the authentication method itself, not with how to implement it. Why do you want to bother with the request header so much?
As far as I know, you can configure your container (e.g. Tomcat) to force HTTP authentication for certain URLs. Your container will make sure that authentication has taken place; there is no need to set HTTP headers yourself whatsoever.
Perhaps you can explain a bit better what you are trying to achieve, instead of giving implementation details?
Why are the CSS & JS files kept in a protected area of the server? You need to place these files in a public area of your server. If you don't have a public area, you need to provide one; how to do that depends on the server-side software architecture & configuration.