I'm using the Dropbox Drop-ins API. When I try to save a file I get an error message like: Received non-200 response status 503 from server for url: http://www.example.com/test.txt
In my HTML page I call the Dropbox.save(optionsUploader); function from an input button, and my JS script is:
optionsUploader = {
    files: [
        {'url': 'http://www.example.com/test.txt', 'filename': 'test.txt'}
    ],
    success: function() {},
    progress: function(progress) { console.log(progress); },
    cancel: function() {},
    error: function(errmsg) { console.log(errmsg); }
}
Is that because I'm working on localhost? (The www.example.com address resolves to 127.0.0.1.)
The Dropbox Saver works by having the Dropbox servers download the file at the supplied URL. In this case, it sounds like the supplied URL is actually a localhost (127.0.0.1) URL, which won't be accessible to the Dropbox servers. (That is, you may be serving the file on your local computer, but it isn't openly available on the Internet.)
The error message is telling you this: the Dropbox servers tried to access the file at the supplied URL but got HTTP error code 503 ("Service Unavailable").
So, this being the case, to use the Saver you'll need to supply a URL that's actually accessible on the Internet. This may mean hosting the files on your own server, on a CDN, or even behind a Dropbox link.
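A hostname check can't catch this exact case (a public-looking domain that resolves to 127.0.0.1), but as a quick sanity check before calling Dropbox.save, something like this sketch can at least rule out obviously non-public URLs (the function name and pattern list are my own, not part of the Drop-ins API):

```javascript
// Rough pre-flight check (a sketch, not exhaustive): Dropbox's servers
// cannot fetch URLs that only resolve on your own machine or LAN.
function isLikelyPublicUrl(url) {
  var host = new URL(url).hostname;
  var privatePatterns = [
    /^localhost$/i,
    /^127\./,                     // loopback
    /^10\./,                      // RFC 1918 private range
    /^192\.168\./,                // RFC 1918 private range
    /^172\.(1[6-9]|2\d|3[01])\./  // RFC 1918 private range
  ];
  return !privatePatterns.some(function (re) { return re.test(host); });
}

console.log(isLikelyPublicUrl("http://localhost/test.txt"));    // false
console.log(isLikelyPublicUrl("https://example.com/test.txt")); // true
```

It won't detect a public hostname whose DNS record points at a private address, so the definitive test remains fetching the URL from a machine outside your network.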
I've built a wee program that works fine when I run it locally. I've deployed the backend to Heroku, and I can access that either by going straight to the URL (http://gymbud-tracker.herokuapp.com/users) or when running the frontend locally. So far so good.
However, when I run npm run-script build and deploy it to Netlify, something goes wrong, and any attempt to access the server gives me the following error in the console:
auth.js:37 Error: Network Error
at e.exports (createError.js:16)
at XMLHttpRequest.p.onerror (xhr.js:99)
The action that is pushing that error is the following, if it is relevant:
export const signin = (formData, history) => async (dispatch) => {
  try {
    const { data } = await api.signIn(formData);
    dispatch({ type: AUTH, data });
    history.push("../signedin");
  } catch (error) {
    console.log(error);
  }
};
I've been tearing my hair out trying to work out what is changing when I build and deploy, but cannot work it out.
As I say, if I run the front end locally then it accesses the Heroku backend no problem - no errors, and working exactly as I'd expect. The API call is correct, I believe: const API = axios.create({baseURL: 'http://gymbud-tracker.herokuapp.com/' });
I wondered if it was an issue with network access to the MongoDB database that Heroku is linked to, but it's open to "0.0.0.0/0" (I've not taken any security precautions yet, don't kill me!). The MDB database is actually in the same collection as other projects I've used, that haven't had this problem at all.
Any ideas what I need to do?
The front end is live here: https://gym-bud.netlify.app/
And the whole thing is deployed to GitHub here: https://github.com/gordonmaloney/gymbud
Your issue is CORS (Cross-Origin Resource Sharing). When I visit your site and inspect the page, I see the following error in the JavaScript console which is how I know this:
This error essentially means that your public-facing application (running live on Netlify) is trying to make an HTTP request from your JavaScript front-end to your Heroku backend deployed on a different domain.
CORS dictates which frontend origins are allowed to make requests to your backend API.
What you need to do to fix this is to modify your Heroku application and have it return the appropriate Access-Control-Allow-Origin header. This article on MDN explains the header and how you can use it.
Here's a simple example of the header you could set on your Heroku backend to allow this to work:
Access-Control-Allow-Origin: *
Please be sure to read the MDN documentation, however, as this example will allow any front-end application to make requests to your Heroku backend when in reality, you'll likely want to restrict it to just the front-end domains you build.
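For instance, instead of *, the backend could echo the origin back only when it appears on an explicit allowlist. This is a sketch, not code from the project; the allowlist entries below are assumptions (the Netlify URL is taken from the question, the localhost origin is a guess at a dev setup):

```javascript
// Sketch: choose the Access-Control-Allow-Origin value for a request,
// returning the origin only if it is on an explicit allowlist.
var allowedOrigins = [
  "https://gym-bud.netlify.app", // deployed frontend (from the question)
  "http://localhost:3000"        // assumed local development origin
];

function corsOriginFor(requestOrigin) {
  return allowedOrigins.indexOf(requestOrigin) !== -1 ? requestOrigin : null;
}

// In an Express handler this would become something like:
//   var origin = corsOriginFor(req.headers.origin);
//   if (origin) res.setHeader("Access-Control-Allow-Origin", origin);

console.log(corsOriginFor("https://gym-bud.netlify.app")); // echoed back
console.log(corsOriginFor("https://evil.example"));        // null
```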
God I feel so daft, but at least I've worked it out.
I looked at the console in a different browser (Edge), and it said it was blocking the request as mixed content. I realised I had just missed out the s in the https in my API call - so it wasn't actually a CORS issue (I don't think?), just a typo on my part!
So I changed:
const API = axios.create({baseURL: 'http://gymbud-tracker.herokuapp.com' });
To this:
const API = axios.create({baseURL: 'https://gymbud-tracker.herokuapp.com' });
And now it is working perfectly ☺️
Thanks for your help! Even if it wasn't the issue here, I've definitely learned a lot more about CORS along the way, so that's good.
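One way to catch this class of typo early (a sketch of my own, not code from the repo) is to assert the API base URL's scheme once at startup, since a page served over https will have http requests blocked as mixed content:

```javascript
// Sketch: fail fast if the app is configured with an insecure API base
// URL, which browsers block as mixed content on an https page.
function assertHttpsBaseURL(baseURL) {
  if (new URL(baseURL).protocol !== "https:") {
    throw new Error("Insecure API baseURL (mixed content risk): " + baseURL);
  }
  return baseURL;
}

var baseURL = assertHttpsBaseURL("https://gymbud-tracker.herokuapp.com/");
// const API = axios.create({ baseURL });  // as in the question
```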
I have a web server with an IP address and port.
I would like to browse all the files on this web server. How can I view and upload new files there?
For example, I have already tried via a web browser:
192.168.101.190:2870
But there is an error: 404 Not Found
192.168.101.190:2870/name1.xml/r/n
403 Forbidden
I do have access to the files directly, for example *.xml:
192.168.101.190:2870/name1.xml
192.168.101.190:2870/name2.xml
Server: NFLC/3.0 UPnP/1.0 DLNADOC/1.50
The feature you're looking for is Directory Listing/Directory Browsing. If your web server is:
Apache refer to: https://wiki.apache.org/httpd/DirectoryListings#Directory_Listings
IIS refer to: https://blogs.iis.net/bills/how-to-enable-directory-browsing-with-iis7-web-config
Nginx refer to: https://nginxlibrary.com/enable-directory-listing/
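For example, on Nginx, directory listing is switched on with the autoindex directive. A minimal sketch (the location path is an assumption):

```nginx
# Minimal sketch: enable a browsable index for /files/
location /files/ {
    autoindex on;              # render a directory listing
    autoindex_exact_size off;  # show human-readable file sizes
}
```

Note that directory listing only covers browsing; uploading would need something separate (e.g. WebDAV or an application endpoint). Also, the Server header in your question (NFLC/3.0 UPnP/1.0 DLNADOC/1.50) suggests a DLNA/UPnP media server rather than a general-purpose web server, which may not support directory listing at all.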
I'm trying to write a Firefox add-on with the Add-on SDK to redirect some websites based on their URL. I have created an HTML page and put it in the data directory. I get the path with:
var data = require("sdk/self").data;
var myWebsite = data.url("myWebsite.html");
I'm using PageMod to start a script given an array of URLs:
pageMod.PageMod({
    include: ArrayOfUrls,
    contentScriptFile: "./myScript.js",
    contentScriptOptions: {"myWebsite": myWebsite}
});
In myScript.js I'm checking if some requirements are fulfilled and if so I try to redirect to my local website with:
window.location.replace(self.options.myWebsite);
But I always get the following error message in the console:
Object
- _errorType = Error
- message = Access to 'resource://myAddon/data/myWebsite.html' from script denied
If I enter the path to the local website (resource://myAddon/...) manually in the address bar of the browser, it works. If I redirect to another website (e.g. http://example.com/) it works as well.
So I guess there's a security setting or something I need to change to make the local redirect possible, but I can't find anything in the documentation or on the web. I hope somebody here can tell me what I'm doing wrong.
In package.json I had to add the following line to make it work:
"permissions": {"cross-domain-content": ["resource://myAddon/data/"]}
Further documentation can be read in the link Noitidart provided in his comment.
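In context, the relevant part of package.json might look like this (other fields elided; the add-on id and path follow the question's naming):

```json
{
  "name": "myAddon",
  "permissions": {
    "cross-domain-content": ["resource://myAddon/data/"]
  }
}
```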
I'm saving remote audio files on Google Cloud Storage.
I want to play these files in FreeSWITCH.
When I use:
mediaLink = "http://storage.googleapis.com/myBucket/file.wav";
session:streamFile(mediaLink);
it works great.
But when I use signed URLs:
mediaLink = "http://storage.googleapis.com/myBucket/file.wav?GoogleAccessId=xxx-xxx#developer.gserviceaccount.com&Expires=1408903962590&Signature=xxb%2Fx%2FDfGJlrUuz0%2F6kA6ormmReW6oN%2F0xxy3%2BwWxXc%3D";
session:streamFile(mediaLink);
I get this error:
2014-08-24 20:42:48.770818 [ERR] mod_httapi.c:2696 File at url [http://storage.googleapis.com/myBucket/file.wav?GoogleAccessId=xxx-xxx#developer.gserviceaccount.com&Expires=1408903962590&Signature=xxb%2Fx%2FDfGJlrUuz0%2F6kA6ormmReW6oN%2F0xxy3%2BwWxXc%3D] is unreachable!
Thanks,
Snabel
Was able to get this to work with playback from mod_dptools and mod_shout (for mp3 support) on FreeSWITCH 1.10. The Google Storage signing algorithm is also at V4 at the moment.
Testing workflow:
$ gsutil signurl -r us service-account.json gs://the-bucket/a-song.mp3
URL
gs://the-bucket/a-song.mp3
HTTP Method
GET
Expiration
2019-08-24 19:03:02
Signed URL
https://storage.googleapis.com/the-bucket/a-song.mp3?x-goog-signature=a4d1dc28eeebfcdfd09f00cbc1bbe605590df3773fd473a9bcc961928bd0a64bc9ea6367a7c2de98cb5f397529bd82f455651b1d04fc1650c97587fb3c298ab49ee6dfc9092068a09612b30b9707595af4705d904e8a63b8fd3e4fcf3eb6767635d2ffb77f036b58e2cae39c7ac571bd520ef5e49435599b6f9871d1bbd43d4e1329a1af1274c15ff63731f058c61cdc693e4c5d85cbeca8b3c5886a9f1d86bfa196ea89300bccd103c66c4dec0000caa80cee5f5cbd748312dc1288c33800750313e9534bfbd8ecbd23bec31fa7c97dd0fcc1581c3353fa38e09c1e888fd8c07766059e63e14579fa44d25a57231ab4504265217c8a8225bdd68983bf6570dd&x-goog-algorithm=GOOG4-RSA-SHA256&x-goog-credential=name-of-the-service-account%2F20190824%2Fus%2Fstorage%2Fgoog4_request&x-goog-date=20190824T183009Z&x-goog-expires=3600&x-goog-signedheaders=host
Using playback with the FreeSWITCH Lua API's IVRMenu (examples) (here's the full application code):
-- signed URL above stripped of the URL scheme
local signed_url_without_url_scheme = "storage.googleapis..."

i.unregistered_main:bindAction(
    "menu-exec-app",
    "playback shout://" .. signed_url_without_url_scheme,
    "3"
)
I had no luck with streamFile - not because of the signed URLs, but because I couldn't get it to work at all. The documentation is very spotty, and certain sections even contradict each other (compare mod_shout and session:streamFile, for example).
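When FreeSWITCH reports a signed URL as unreachable, it's worth checking whether the URL has simply expired before blaming the module. Here's a small sketch: the x-goog-date and x-goog-expires parameter names come from the V4 query-string format shown above, but the function and the example URL are my own:

```javascript
// Sketch: parse a V4 signed URL and report when it expires, so an
// "unreachable" error can be separated from a merely expired URL.
function signedUrlExpiry(signedUrl) {
  var params = new URL(signedUrl).searchParams;
  var date = params.get("x-goog-date");                 // e.g. 20190824T180000Z
  var ttl = parseInt(params.get("x-goog-expires"), 10); // lifetime in seconds
  if (!date || isNaN(ttl)) return null;
  // Rewrite 20190824T180000Z into ISO-8601 so Date can parse it.
  var iso = date.replace(
    /^(\d{4})(\d{2})(\d{2})T(\d{2})(\d{2})(\d{2})Z$/,
    "$1-$2-$3T$4:$5:$6Z"
  );
  return new Date(new Date(iso).getTime() + ttl * 1000);
}

var expiry = signedUrlExpiry(
  "https://storage.googleapis.com/the-bucket/a-song.mp3" +
  "?x-goog-date=20190824T180000Z&x-goog-expires=3600"
);
console.log(expiry.toISOString()); // "2019-08-24T19:00:00.000Z"
```

Of course, a quick curl against the full signed URL from the FreeSWITCH host gives the definitive answer, since it also catches network and signature problems.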
I'm trying to generate a Perl library to connect to a WebService. This webservice is in an HTTPS server and my user has access to it.
I've executed wsdl2perl.pl several times, with different options, and it always fails with the message: Unauthorized at /usr/lib/perl5/site_perl/5.8.8/SOAP/WSDL/Expat/Base.pm line 73.
The thing is, when I don't give my user/pass as arguments, it doesn't even ask for them.
I've read SOAP::WSDL::Manual::Cookbook (http://search.cpan.org/~mkutter/SOAP-WSDL-2.00.10/lib/SOAP/WSDL/Manual/Cookbook.pod) and done what it says about HTTPS: Crypt::SSLeay is installed, and both SOAP::WSDL::Transport::HTTP and SOAP::Transport::HTTP are modified.
Can you give any hint about what may be going wrong?
Can you freely access the WSDL file from your web browser?
Can someone else in your network access it without any problems?
Maybe the web server hosting the WSDL file requires Basic or some other kind of Authentication...
Unless it's necessary, I don't recommend using Perl as a web service client. As you know, Perl is an open-source language, and although it does support the SOAP protocol, its support doesn't seem very standards-compliant: the documentation is unclear, the feature support is sometimes limited, and bugs crop up here and there.
So, if you have to use wsdl2perl, you can use Komodo to step into the code and find out what is happening; this is what I used to do when using Perl as a web service client. Behind HTTPS is SSL, so if your SSL setup is certificate-based, you have to set up your certificate path and the list of trusted server certificates. You'd be best off testing with Linux-based Firefox first: as far as I know, you can configure Firefox's certificate path and trusted-certificate list. If Firefox can communicate with your web service server successfully, then it's time to debug your Perl client.
To debug situations with Perl and SOAP, interpose a web proxy so you can see exactly what data is being passed and what response comes back from the server. You were getting a 401 Unauthorized, I expect, but there may be more detail in the server response.
Both Fiddler (http://docs.telerik.com/fiddler) and Charles Proxy (https://www.charlesproxy.com/) can do this.
The error message you quote seems to be from this line :
die $response->message() if $response->code() ne '200';
and in the HTTP world, Unauthorized is clearly error code 401, which means your website asks for a username and password (most probably; some websites may "hijack" this error code to cater for other conditions, like a filter on the source IP).
Do you have them?
If so, you can
after wsdl2perl has run, find in the generated files where set_proxy() is called and change the URL there to include the username and password, like this: ...->set_proxy('http://USERNAME:PASSWORD@www.example.com/...')
or, in your code, after instantiating the SOAP::WSDL object, call service(SERVICENAME) on it (for each service you have defined in your WSDL file), which gives you a new object; on that object, call transport() to access the underlying transport object, on which you can call proxy() with the URL formatted as above (yes, it is proxy() here and set_proxy() above). Alternatively, call credentials() instead of proxy() and pass it 4 strings:
'HOSTNAME:PORT'
the realm, as given by the web server (but I think you can put anything)
the username
the password