Adobe CQ 5 Accepts Arbitrary POST Request - aem

I know that Adobe CQ 5 was built on top of Apache Sling (which uses Jackrabbit). I'm a bit baffled as to why the website accepts arbitrary POST requests from outside (unauthenticated users) to the Publish instance through the Dispatcher and then replies with HTTP 200 "Content Updated". Shouldn't content updates only be allowed from the Author instance in this case? Shouldn't such requests be met with an HTTP 403 response instead? Why does someone who isn't even logged in get an HTTP 200 response?
The response looked like this:
{
    "changes": [],
    "referer": "http://www.example.com/content/somesite/en.html",
    "path": "/content/somesite/en",
    "location": "/content/somesite/en",
    "parentLocation": "/content/somesite",
    "status.code": 200,
    "status.message": "OK",
    "title": "Content modified /content/somesite/en"
}
I've set the POST Referrer Filter for now to prevent arbitrary POST requests from 'outside' the website from being accepted; however, I can still get this response by typing a jQuery AJAX request into the browser console while the website is open.
I do wonder if this is bad or something; I'm really new to Adobe CQ.
The jQuery script for testing it is actually just this:
$.ajax({
    url: 'http://www.example.com/content/fasfas',
    type: 'post',
    data: {},
    headers: {
        Accept: 'application/json'
    },
    dataType: 'json',
    success: function (data) {
        console.info(data);
    }
});
Thanks in advance!

This is an issue of not taking the necessary steps to secure the AEM servers. Adobe provides a security checklist to ensure that an AEM installation is secure when deployed. A similar security checklist exists for the Dispatcher.
As for your case, a few issues are evident:
- The filter configuration within the Dispatcher doesn't deny POST requests, thereby allowing them to pass through the Dispatcher and reach the AEM instance.
- The anonymous user on the AEM publisher seems to have more than just READ privileges on the repository, allowing it to make changes to the repo using POST requests.
- The Referrer Filter configuration was allowing requests from external systems as well (which you have now blocked).

Your Dispatcher should block all POST operations on the publisher. This is recommended in Adobe's official documentation for configuring the Dispatcher.
The publisher should also have write permission disabled for the anonymous user and the everyone group on paths that the community is not allowed to modify. Unless you are using CUGs, write access should be disabled for anonymous across the publisher instance.
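For illustration, here is a minimal sketch of what the /filter section of dispatcher.any could look like with such a deny-by-default setup (Dispatcher 4.1.5+ rule syntax; the paths and rule numbers are placeholders to adapt to your site):

/filter
  {
  # placeholder rules -- adapt paths and numbering to your site
  # deny everything by default
  /0001 { /type "deny" /glob "*" }
  # allow only GET requests to published site content
  /0002 { /type "allow" /method "GET" /url "/content/*" }
  # keep POST explicitly blocked on the publisher
  /0003 { /type "deny" /method "POST" /url "*" }
  }

With a deny-by-default filter like this, the test POST to /content/fasfas from the question would be rejected at the Dispatcher and never reach the publish instance.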

Related

Accessing objects in google cloud storage bucket via API key

I am trying to access content in a Google Cloud Storage bucket via JavaScript running in a web browser. So far I have created signed URLs on the server using the service account credentials and passed them to the client during REST calls. The only reason I am doing this is because I attempted to solve this problem before, gave up, and opted for signed URLs. Now I need to get this to work.
So far I have tried creating an access token on the server using the service account credentials like so:
credential = GoogleCredential.fromStream(this.getClass().getResourceAsStream("/serviceaccount.json"));
LinkedList<String> list = new LinkedList<String>();
list.add("https://www.googleapis.com/auth/devstorage.read_only");
credential = credential.createScoped(list);
credential.refreshToken();
Then I passed the "access_token" returned from credential.getAccessToken() to the client and used it in an XmlHttpRequest like so:
var xhr = new XMLHttpRequest();
xhr.open('GET', "https://storage.googleapis.com/....." true);
xhr.responseType = 'arraybuffer';
xhr.setRequestHeader('Authorization', 'Bearer ' + access_token);
This causes Chrome to produce the following error.
"Response to preflight request doesn't pass access control check: No 'Access-Control-Allow-Origin' header is present on the requested resource."
The CORS configuration on the bucket is:
[{"maxAgeSeconds": 3600, "method": ["GET", "HEAD", "DELETE"], "origin": ["http://www.voxxlr.com"], "responseHeader": ["O
rigin", "Content-Type", "Content-Length"]},{"maxAgeSeconds": 3600, "method": ["GET", "HEAD", "DELETE"], "origin": ["http
://voxxlr.com"], "responseHeader": ["Origin", "Content-Type", "Content-Length"]}]
Next I tried to use an API key as follows:
var xhr = new XMLHttpRequest();
xhr.open('GET', "https://storage.googleapis.com/....?KEY=...." true);
xhr.responseType = 'arraybuffer';
That produced the following error:
AccessDenied
Anonymous users does not have storage.objects.get access to voxxlr/1511465797269/n.bin.
Shouldn't the API key provide access just like that? I am not really looking for a solution that includes the Google JavaScript clients, since the only operation required is to read the bucket contents. No admin or delete functions are necessary. I am basically just looking for a solution where all HTML/JavaScript from my domain can have read access to the buckets.
Any help would be appreciated... This has been eating up a lot of time, but it seems there should be an easy solution.
API keys are not an authentication mechanism. They provide a mechanism for indicating which project your request is associated with (which is used for a variety of purposes, notably quota and billing for unauthenticated requests), but successfully using one does not associate your request with any specific account or permissions.
Your CORS issue could be a variety of things. I notice there's a space in your specified O rigin header and in http ://voxx. Is that an artifact of copying to SO or is it how the real policy is? Also, origins need to be really specific; are you using HTTPS? If so, you'll need to include it. Also it might be a good idea to include the preflight request itself (OPTIONS) as one of the allowed methods.
Finally, you very likely do not want to be passing an access token to the client. An access token is only good for a few minutes, but it can be used to do anything the creator of the token can do (within its declared scope). That's probably something you really don't want. Signed URLs are a much safer idea.
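As a rough sketch of that approach: if the server side were Node.js (the question uses Java, so this only illustrates the shape of the flow), a short-lived read-only signed URL could be generated with the @google-cloud/storage client roughly like this; the bucket and object names are placeholders taken from the error message above:

const { Storage } = require('@google-cloud/storage');

// keyFilename points at the same service account JSON used on the server
const storage = new Storage({ keyFilename: 'serviceaccount.json' });

async function getReadUrl() {
    const [url] = await storage
        .bucket('voxxlr')                         // placeholder bucket name
        .file('1511465797269/n.bin')              // placeholder object name
        .getSignedUrl({
            action: 'read',
            expires: Date.now() + 15 * 60 * 1000  // valid for ~15 minutes
        });
    return url;                                   // hand this URL to the browser
}

The browser can then GET that URL directly without an Authorization header, which avoids the preflight entirely; the bucket's CORS policy still has to allow the page's origin for the response to be readable from script.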

Force POST form submission to send cookies

I'm working on a feature for a Chrome extension which requires making a same-origin POST request to an endpoint. For brevity, I'll leave out the particular details of the website. This request creates a resource of a given kind. I've succeeded in being able to create many kinds of these resources, but there's one type in particular that always fails.
When you use the website's own UI to create this kind of resource, I noticed that the resulting POST request is sent with the cookie header, along with some other stuff that looks unfamiliar to me. Here's an excerpt of the request headers:
:authority:www.example.com
:method:POST
:path:/path/to/endpoint
:scheme:https
[...]
cookie: [...]
The cookies are not sent for any other resource type, just this one.
Now, since this passes along cookies, the website's own javascript can't be using ajax. In fact, the site is posting to an <iframe> by linking a <form> to an <iframe> of a particular name.
So, I modified my Chrome extension code to use forms to post to iframes instead of making an ajax request, just like it's done natively on the website. However, the resulting POST requests still do not pass cookies. I have found nothing unique about the parts of the website's UI which create these special resources which might cause the requests to pass cookies.
How does Chrome decide when to pass cookies in a web request? How can I force it to do this for a <form> submission?
EDIT: Here are some more details, as requested.
To create a resource, just POST multipart data to /resource-endpoint. In jQuery you might do something like
var data = new FormData();
data.append('property', 'value'); // Add payload values
$.ajax({
    url: '/resource-endpoint',
    type: 'POST',
    cache: false,
    contentType: false,
    processData: false,
    data: data
});
Doing it this way will create most resources, except for the "special" resource. Since AJAX requests cannot pass along cookies, and the request to create the "special" resource must include cookies, I have to mimic the website's UI more closely.
var id = 'some-id';
var iframe = $('<iframe name="' + id + '"></iframe>');
$(document.body).append(iframe);
var form = $('<form></form>');
form.attr({
    target: id,
    action: '/resource-endpoint',
    method: 'POST',
    enctype: 'multipart/form-data'
});
// Add payload values
form.append('<input name="property" value="value" />');
$(document.body).append(form);
form.submit();
This still sends the requests, but there appears to be something missing, because requests to create the "special" resource do not include cookies. I'm not sure how the native website JavaScript is doing this, as I can't find any difference between the forms that create regular resources and the form that creates "special" resources.
EDIT: Nevermind, I saw a native "special resource" POST request from the UI which doesn't pass along these cookies, so the secret must not be the cookies.

How to send POST and GET requests from Sails app to outside Sails API

I am beginning to use SailsJS and I find it wonderful and powerful.
Can anybody please explain to me how to send POST and GET requests to an API outside Sails, and where I actually write these requests?
Thanks a lot and Happy 2016 everyone!!!
Edit:
Hello #arcseldon, thank you for trying to help me.
I'll try to explain myself better, and show you my code.
I have an API, written in PHP (which I think is not relevant), which accepts the POST, GET, PUT, and DELETE methods. I use Postman for testing and everything looks OK.
I am trying to make an app in Sails which sends GET and POST requests to my API, but I don't know the best place to put the GET and POST code.
In the model I already have the following, which asks for a token to perform the other requests, and it works:
gettoken: function (requestnewtoken, tokenresult) {
    if (!requestnewtoken) {
        tokenresult(global.tokeng);
    } else {
        request({
            headers: {
                'User-agent': 'develop',
                'Content-Type': 'application/x-www-form-urlencoded;charset=UTF-8',
                'Content-Length': '29',
                'Authorization': 'Basic ' + global.idsecret
            },
            uri: "https://myapi/oauth2/token",
            method: "POST",
            form: {
                grant_type: "client_credentials"
            }
        }, function (error, response, body) {
            var tokenjson = JSON.parse(body);
            var token = tokenjson['access_token'];
            global.tokeng = token;
            tokenresult(token);
        });
    }
}
Then I perform a GET request to another endpoint, which works:
listpublicroutes: function (requestnewtoken, cb) {
    Model.gettoken(requestnewtoken, function (token) {
        request({
            headers: {
                'Authorization': 'Bearer ' + token
            },
            uri: "https://myapi/folder/file.json",
            method: "GET",
            timeout: 10000,
            followRedirect: true,
            maxRedirects: 10
        }, function (error, response, body) {
            if (error || (response.statusCode != 200)) {
                Model.listpublicroutes(true, cb);
            } else {
                cb(null, JSON.parse(body));
            }
        });
    });
}
My doubts are whether this is the best way to write POST and GET requests or whether they could be simpler, and whether the requests should be made in the controller (or somewhere else) instead of the model.
Can you give me an example of a POST and GET request?
Thanks a lot to everyone who's trying to understand me.
Your question isn't clear about exactly what you are asking... Here are a few suggestions depending on what you wish to do.
- If you are trying to call out and make an HTTP request from within server-side Sails code, then I would recommend you take a look at the NPM module request.
- If you are talking about making GET/POST requests to test your API, then use a web browser plugin/tool such as Postman (also a Chrome plugin of the same name).
- If you are talking about calling a completely different domain URL using AJAX from within your web application client (via a web browser), then you can just use any AJAX approach (jQuery/Angular/whatever client library you are using to make AJAX calls), but be aware that the domain you are calling would have to have been set up with cross-origin resource sharing (CORS).
You have control over your own CORS settings (allowing apps originating from other domains to call into your Sails API from the browser) by updating config/cors.js settings.
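For illustration only, here is a minimal sketch of what config/cors.js could look like in a Sails 0.11/0.12-era app; the origin value is a placeholder for the domain your front end is served from:

// config/cors.js -- minimal sketch; the origin below is a placeholder
module.exports.cors = {
    allRoutes: true,                            // apply CORS handling to every route
    origin: 'http://my-frontend.example.com',   // browser origin allowed to call this API
    credentials: false,
    methods: 'GET, POST, PUT, DELETE, OPTIONS, HEAD'
};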
Does this answer your question? If you need further help leave me a message below, and I'll try to assist.
Update Based On Question Update:
#Michi - OK, you wish to call your PHP API from within SailsJS. The three contenders in terms of location to do this are inside a Controller, a custom Model method, or within a custom Service.
My "recommendation" is that most of the time this logic sits within a Controller, if it is logic that doesn't really need to be shared elsewhere. You could conceivably call into a Model method, but "usually" you want to think of the models as your domain data (though not always: if the responsibility for what you are doing truly belongs to a single Model which owns all the state interaction, then the model might be the right place to locate that responsibility).
However, in your case you are getting a token, and without seeing your model code (you have just provided the gettoken function, which doesn't really look like it is tied to a particular model?), I would say opt to invoke it within your controller if it is not needed elsewhere. Otherwise, refactor it out into a service. To quote the SailsJS documentation:
Services can be thought of as libraries which contain functions that
you might want to use in many places of your application. For example,
you might have an EmailService which wraps some default email message
boilerplate code that you would want to use in many parts of your
application. The main benefit of using services in Sails is that they
are globalized--you don't have to use require() to access them.
I have frequently put custom logic in Controllers, Models, and Services, but for networking-related logic my preference is:
- Controller, if it is one-off logic.
- Service, if reusability is required, or if encapsulating the code as a service improves the readability/maintenance of the app (see the sketch below).
- Model, only if you strongly believe the logic and responsibility are truly tied to that model; err on the side of caution here and use sparingly.
But I acknowledge my recommendations may be construed as subjective; I am basing them on what I believe to be good OOP practices in general terms.
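As a rough illustration of the service option, here is a minimal sketch that pulls the GET call from the question into a custom service; the file name, service name, and URL are placeholders based on the code above:

// api/services/MyApiService.js -- minimal sketch; names and URL are placeholders
var request = require('request');

module.exports = {
    // Fetches a protected resource from the external API using a bearer token.
    listPublicRoutes: function (token, cb) {
        request({
            uri: 'https://myapi/folder/file.json',
            method: 'GET',
            headers: { 'Authorization': 'Bearer ' + token },
            timeout: 10000
        }, function (error, response, body) {
            if (error || response.statusCode !== 200) {
                return cb(error || new Error('Unexpected status code ' + response.statusCode));
            }
            cb(null, JSON.parse(body));
        });
    }
};

Because services are globalized in Sails, a controller action could then simply call MyApiService.listPublicRoutes(token, function (err, data) { ... }) without a require().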

How to demo a REST API without a REST client

I need to build a way to demo a REST API that takes three or four inputs, makes a REST call to an external server, then displays the response. This demo needs to be performed for the business by a rather non-technical audience, so REST clients are out.
It seemed like a simple HTML page that does an AJAX call would be fine for this, except I ran into the "No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'null' is therefore not allowed access" fun messages, because my localhost domain does not match the target domain. I don't have access to the target REST web service, so I can't make the necessary changes for the CORS headers.
Any ideas?
Build a very small and simple web application that serves the same HTML but performs the REST call with its own REST client and shows the results.
Then run that on a local server.
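A minimal sketch of that idea, assuming Node.js with the express and request packages (Express 4.16+ for express.json()); the proxy route, port, and external URL are placeholders:

// server.js -- tiny local proxy so the demo page never hits the browser's CORS check
var express = require('express');
var request = require('request');
var app = express();

app.use(express.static('public'));   // serves the demo HTML/JS from ./public
app.use(express.json());             // parses the JSON body posted by the demo page

app.post('/api/demo', function (req, res) {
    // Forward the three or four inputs to the external REST API server-side,
    // so the browser only ever talks to this local origin.
    request({
        uri: 'https://external-api.example.com/endpoint',   // placeholder target URL
        method: 'POST',
        json: req.body
    }, function (err, response, body) {
        if (err) { return res.status(502).send(err.message); }
        res.status(response.statusCode).json(body);
    });
});

app.listen(3000);

The demo page itself can then POST to /api/demo with a plain jQuery $.ajax call and display whatever comes back, with no cross-origin request involved.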
Can you use a product that lets you host your REST API? (There are a number of API hosting products available.) If so, you can try WSO2 API Manager. It is free and open source. You do not need to develop any HTML page. You can try the available Swagger client or REST tool to test your APIs. You can configure CORS settings there too.
You can also use ExploREST, a project created with this goal in mind (production demo here).
With this tool, you can make GET/POST/PUT/DELETE requests, but you can also create special links in the text documenting your API, so that each time someone clicks on one, it makes the request you defined.
Example:
## My API is very good, I am documenting it. Try
%{
    "text": "to post",
    "post": {
        "address": "/character",
        "data": {
            "name": "Dark Vador",
            "type": "sith"
        }
    }
}%
This will result in a link that makes a POST when the user clicks on it.
The project is open source, so do not hesitate to contribute!

Is it normal for IBM Connections opensocial gadgets to make 2 HTTP requests on gadgets.io.makeRequest?

Within an IBM Connections sharebox/share dialog gadget my-sharebox.xml, I make the following request:
gadgets.io.makeRequest(url, function (response) { ... });
Using tcpflow on the IBM Connections server to capture the outgoing request & response, I see 2 HTTP requests.
The first one is to the URL specified above, and the second is a request for the gadget XML file, my-sharebox.xml.
Is this second request expected behaviour?
Is it possible to somehow suppress the second request?
In a production environment it should be caching the gadget XML and fetching it only once. That will usually happen when the gadget is rendered. Do you have all debug parameters related to OpenSocial disabled?