How to skip and take within Babaganoush using JavaScript - babaganoush-sitefinity

How would you skip and take records using the web services of Babaganoush?
I tried this:
$.get('http://localhost/egz/api/mediafiles?take=10&skip=100').done(function(response) {
console.log(response);
});
That didn't work. Only the 'take' parameter seems to work.
I assumed that since I can use this in C# code:
BabaManagers.News.GetAll(
filter: x => x.Visible,
take: 25,
skip: 75
);
... it would also work client side.
I have to say that I created my own API controller for a DynamicType, but I think that shouldn't matter?
Also, is there any reason why take and skip are reversed, since in most cases it is skip and then take?

This has been fixed with the latest version of Babaganoush:
http://localhost/api/news?take=5&skip=10
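For reference, the client-side call from the question can then be sketched like this; `buildPagedUrl` is a hypothetical helper added here for illustration, and the endpoint path and parameter names are taken from the URLs above:

```javascript
// Hypothetical helper that builds the paged request URL used above.
function buildPagedUrl(base, take, skip) {
  return base + '?take=' + take + '&skip=' + skip;
}

var url = buildPagedUrl('http://localhost/api/news', 5, 10);
console.log(url); // → 'http://localhost/api/news?take=5&skip=10'

// With jQuery the paged request would then look like:
// $.get(url).done(function (response) { console.log(response); });
```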

Related

Getting just the id of a set of MongoDB documents

I have the following problem. I have already done some research on the net and here, but found no solution so far; I would be glad to receive suggestions.
Consider the code:
myModel.create(array).then((justCreated) => {
  // justCreated is the array of documents just created; I can print them out and see!
});
My problem is: I just need their ids, in an array if possible. I could take them one by one, but is there a better way?
I have tried setting the second parameter to "_id", as we do with find, or using select as we do with populate, but with no success. Any suggestions?
PS. I am using mongoose.
According to this post, forEach is much slower than the map method, so I think a better solution is the following:
myModel.create(array)
  .then(justCreated => {
    const idsArray = justCreated.map(el => el._id);
  })
  .catch(err => console.log(err));
After giving it some extra thought, I have come up with a solution, though I am afraid about its performance since my dataset is pretty big. (That is also the reason I cannot use subdocuments; I tried, but I easily exceed the 16MB document size limit.)
const ids = [];
myModel.create(array).then((justCreated) => {
  justCreated.forEach((doc) => { ids.push(doc._id); });
});
I am open to suggestions for better ways to handle this, but it seems to solve the problem, at least at first glance.
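The map-based approach can be sketched with plain objects standing in for the Mongoose documents returned by create (the documents and ids here are made-up stand-ins, not real Mongoose objects):

```javascript
// Stand-in for the array of documents resolved by myModel.create();
// real Mongoose documents would carry ObjectId values in _id.
const justCreated = [
  { _id: 'a1', name: 'first' },
  { _id: 'b2', name: 'second' },
  { _id: 'c3', name: 'third' },
];

// A single map pass yields the ids without mutating any outer state.
const idsArray = justCreated.map((doc) => doc._id);

console.log(idsArray); // → ['a1', 'b2', 'c3']
```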

Gatling - Check multiple values with JsonPath

I am using Gatling 2.1.7 and I can't find a way to check multiple values with JsonPath.
Let's pretend I have this JSON:
{
  "messageTyp": "wsopen",
  "userId": "1"
}
I do need to check both values.
It is easy to check one of the values:
.check(wsAwait.within(2).expect(1).jsonPath("$..messageTyp").is("wsopen"))
But because these tests run in a "feeder" loop, there are a lot of "wsopen" messages coming back asynchronously. I need to check that every user receives exactly one "wsopen" message.
I need something like
.check(wsAwait.within(2).expect(1).jsonPath("$..messageType").is("wsopen").and.jsonPath("$..userId").is("${userid}")) // won't compile
Anybody a hint?
You can do it with JSONPath directly:
jsonPath("$[?(@.messageTyp=='wsopen' && @.userId=='1')]").exists
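The per-user requirement described above (every user gets exactly one "wsopen" message) can also be sketched outside Gatling in plain JavaScript; the message shape comes from the JSON in the question, while the function and sample data are purely illustrative:

```javascript
// Count "wsopen" messages per user and return the ids of users whose
// count differs from one (among users who received at least one).
function usersWithWrongWsopenCount(messages) {
  const counts = {};
  for (const msg of messages) {
    if (msg.messageTyp === 'wsopen') {
      counts[msg.userId] = (counts[msg.userId] || 0) + 1;
    }
  }
  return Object.keys(counts).filter((userId) => counts[userId] !== 1);
}

const messages = [
  { messageTyp: 'wsopen', userId: '1' },
  { messageTyp: 'wsopen', userId: '2' },
  { messageTyp: 'wsopen', userId: '2' }, // duplicate for user 2
];
console.log(usersWithWrongWsopenCount(messages)); // → ['2']
```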

Filter every nth event

Is there any way of creating a filter that filters every nth event, where n is different for different checks? I.e., I would like to specify a field in each check so that I can control the filter frequency per check.
I have some checks that run once a day, some that run once an hour, and some that run every minute. Using the same filter, where I filter every nth occurrence, would not work across these different checks.
Is there any way of avoiding creating 10 different filters with different frequencies?
Edit: I also have to create 10 different handlers, each using a different filter. Not a very clean solution, and a lot of duplicated code.
The fine folks at Sensu have implemented what I asked for. Replacement tokens now work in the filter definition as well:
{
  "filters": {
    "occurrences": {
      "negate": true,
      "attributes": {
        "occurrences": "eval: value > :::check.occurrences|60:::"
      }
    }
  }
}
Have you tried Mutators? I haven't tried them myself, but according to the description they can manipulate the output of a check before it reaches the handler.
In theory you could manipulate the output to "normalize" it.
I am afraid it is not possible to achieve what you want with filters alone.
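The eval expression in the config above compares the event's occurrence count against a per-check threshold, falling back to 60 when the check does not define one. A rough sketch of that comparison (field names follow the Sensu config above; the function itself is illustrative and ignores Sensu's negate semantics):

```javascript
// Decide whether an event exceeds its check's occurrence threshold,
// defaulting to 60 when the check does not define one, mirroring
// "eval: value > :::check.occurrences|60:::".
function exceedsThreshold(event) {
  const threshold = (event.check && event.check.occurrences) || 60;
  return event.occurrences > threshold;
}

console.log(exceedsThreshold({ occurrences: 5, check: { occurrences: 3 } })); // → true
console.log(exceedsThreshold({ occurrences: 30, check: {} }));                // → false (default 60)
```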

In Sinatra, how to make filters dependent on the request method?

I've been using filters in Sinatra the way they are declared in the documentation: with no match string, with a match string, or with a match regexp. It has been working fine until now. Now I have a particular use case. Let's say I have this route:
/resources/1
According to REST, and depending on the request method, this can be either a GET, PUT, or DELETE request. First question: how do I write filters that are only called for GET requests? (Currently I let all requests hit the filter and only then test the method. It works, but I don't like it.) Second question, and more important: let's say a PUT request like this is triggered:
/resources/
This is of course wrong, because the PUT request has no resource id associated with it. I would like to know if there is something in Sinatra that enables me to do something like this:
before "/resources/", :method => :put do
error_message
end
But this possibility does not exist (before accepts only one argument). What is the best way to achieve this?
Actually, filters do take conditions. You don't have to use a condition, though; you could use a conditional within the filter:
before "/path/" do
if request.request_method == "PUT"
# do something
end
end
If you want to use a condition, I think you'll need to write one, perhaps something like this:
set(:accepted_verbs) {|*verbs|
condition {
verbs.any?{|v| v == request.request_method }
}
}
before "/path/", :accepted_verbs => ["GET","POST"] do
# do something
end
before "/path/", :accepted_verbs => ["PUT"] do
# do something else
end
See conditions for more.

How do I Benchmark RESTful Service with Variable Parameters?

I'm currently working on benchmarking a RESTful service I've made, and part of that is making sure it runs in a reasonable amount of time for a large array of parameters. For example, let's say I have a RESTful API of the form some_site.com/item?item_id=y. In that case, to be sure my service is working as fast as I'd like, I'd want to try out many values for y one by one, preferably coming from some text file. I can't figure out any way of doing this in ab or httperf. I'm open to using a different benchmarking program if I have to, but would prefer something simple and light. What I want to do seems like something pretty standard, so I'm guessing there must already be a program that lets me do it, but an hour or so of googling hasn't gotten me an answer. Ideas?
Answer: JMeter (which is apparently awesome). This FAQ explains how to do it. Hopefully this helps someone else, as it took me about a day of searching to figure out.
I have just had some good experience with using JavaScript (via BSF/Rhino) in JMeter.
I have put one thread group in my test plan and stick a 'Simple Controller' with two elements under it - 'HTTP Request' sampler and 'BSF PreProcessor'.
Set BSF language to 'javascript' and either type the code into the text box or point it to a file (use full path or relative to CWD of JMeter process).
/* Since Math.random() gives us a float, we use java.util.Random()
 * see: http://docs.oracle.com/javase/7/docs/api/java/util/Random.html */
var Random = new Packages.java.util.Random();
var min = 2;
var max = 10;
// nextInt requires a positive bound, so keep max > min here
var maxLines = min + Random.nextInt(max - min);
var s = '';
for (var d = 0; d <= maxLines; d++) {
  s += d.toString() + ',' + Random.nextInt(1000).toString() + '\n';
}
// e.g. s => '0,312\n1,104\n2,608\n...'
vars.put('PAYLOAD', s);
Now I can refer to ${PAYLOAD} in the HTTP request!
You can generate JSON, but you will need to upgrade jakarta-jmeter-2.5.1/lib/js-1.6R5.jar to the newest version of Rhino to get JSON.stringify and JSON.parse. That also worked perfectly for me, which is why I thought I'd put a simple example here.
You can use BSF pre-processor for URL params as well, just set another variable with vars.put('X', 'some value') and pass it as ${X} in the request parameter.
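The vars.put / ${X} mechanism can be sketched with a stub in place of JMeter's variable store; the vars object below is a plain stand-in written for illustration, not the real JMeter API, and the item ids are made up:

```javascript
// Plain stand-in for JMeter's vars object, which maps variable names
// to string values that samplers reference as ${name}.
const vars = {
  store: {},
  put(name, value) { this.store[name] = value; },
  get(name) { return this.store[name]; },
};

// Pick an item id for the next request, as the BSF PreProcessor would.
const itemIds = ['101', '102', '103']; // e.g. loaded from a text file
vars.put('ITEM_ID', itemIds[0]);

// The HTTP Request sampler would then use:
//   some_site.com/item?item_id=${ITEM_ID}
console.log(vars.get('ITEM_ID')); // → '101'
```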
This blog post helped quite a bit, by the way.