Problems with security policy for Filepicker Convert? - filepicker.io

I use Filepicker to "read" and then "store" an image from a client's computer. Now I want to resize the image using Filepicker but always get a 403 error:
POST https://www.filepicker.io/api/file/w11b6aScR1WRXKFbcXON/convert?_cacheBust=1380818787693 403 (FORBIDDEN)
I am using the same security policy and signature for the "read", "store", and "convert" calls. Is this wrong? When "read" and "store" are called there is no file handle yet (e.g. the last string part of InkBlob.url), but it seems the "convert" policy/signature must be generated using the file handle returned with the "store" InkBlob? And if this is the case, what is a more convenient way to do this in JavaScript? Because in "convert" I have no access to the Python function that generates security policies unless I write an API call for that.
My code snippet as below (initialFpSecurityObj was pre-generated in Python using an empty handle):
filepicker.store(thumbFile, {
    policy: initialFpSecurityObj.policy,
    signature: initialFpSecurityObj.signature,
    location: "S3",
    path: 'thumbs/' + initialFpSecurityObj.uniqueName + '/',
}, function(InkBlob) {
    console.log("Store successful:", JSON.stringify(InkBlob));
    processThumb(InkBlob);
}, function(FPError) {
    console.error(FPError.toString());
});
var processThumb = function(InkBlob) {
    filepicker.convert(InkBlob, {
        width: 800,
        height: 600,
        format: "jpg",
        policy: initialFpSecurityObj.policy,
        signature: initialFpSecurityObj.signature,
    }, function(InkBlob) {
        console.log("thumbnail converted and stored at:", InkBlob);
    }, function(FPError) {
        console.error(FPError);
    });
};
Thanks a lot for the help.
--- EDIT ---
Below is the snippet for the Python code that generates initialFpSecurityObj
def generateFpSecurityOptions(handle, userId, policyLife=DEFAULT_POLICY_LIFE):
    expiry = int(time() + policyLife)
    json_policy = json.dumps({'handle': handle, 'expiry': expiry})
    policy = base64.urlsafe_b64encode(json_policy)
    secret = 'XXXXXXXXXXXXXX'
    signature = hmac.new(secret, policy, hashlib.sha256).hexdigest()
    uniqueName = hashlib.md5()
    uniqueName.update(signature + repr(time()))
    uniqueName = uniqueName.hexdigest() + str(userId)
    return {'policy': policy, 'signature': signature, 'expiry': expiry, 'uniqueName': uniqueName}

fp_security_options = generateFpSecurityOptions(None, request.user.id)
Then in the Django template fp_security_options is retrieved:
var initialFpSecurityObj = {{fp_security_options|as_json|safe}};
The way fp_security_options is generated looks suspicious to me (it is a former colleague's code) because the handle is None.

My recommendation would be to create two policies: one that is handle-bound and allows storing of the file, and another that is not handle-bound for the convert. In this case, you can set a shorter expiry time to increase the level of security, given that you are not specifying a handle.

Your problem is probably that your policy does not contain any "call" specifications. I suggest:
json_policy = json.dumps({'handle': handle, 'expiry': expiry, 'call':['pick','store','read','convert']})
but as our (very busy ;) brettcvz suggests, for conversion only, this is already enough:
json_policy = json.dumps({'handle': handle, 'expiry': expiry, 'call':'convert'})
You can find this in the security docs: https://developers.inkfilepicker.com/docs/security/
If you still have issues, use a REST call instead; it's free. The following JavaScript method returns a URL to the REST endpoint of Filepicker which can be used to retrieve the converted image. The _options object looks like this:
var myOptions = {
    w: 150,
    h: 150,
    fit: "crop",
    align: "faces",
    format: "jpg",
    quality: 86
};
and it works with all parameters specified in Filepicker's REST API (check out https://developers.inkfilepicker.com/docs/web/#inkblob-images).
function getConvertedURL(_handle, _options, _policy, _signature) {
    // basic url piece
    var url = "https://www.filepicker.io/api/file/" + _handle + "/convert?";
    // appending options
    for (var option in _options) {
        if (_options.hasOwnProperty(option)) {
            url += option + "=" + _options[option] + "&";
        }
    }
    // appending signed policy
    url += "signature=" + _signature + "&policy=" + _policy;
    return url;
}

So I finally figured it out myself, although I saw brettcvz's suggestion afterwards. The key is that for 'convert' to work, I have to specify the exact handle of the uploaded file (i.e. the last segment of the url property of the InkBlob returned from the 'store' or 'pickAndStore' call).
First thing I did was to edit the Python function generating the security policy and signature:
def generateFpSecurityOptions(handle, userId, policyLife=DEFAULT_POLICY_LIFE):
    expiry = int(time() + policyLife)
    json_policy = json.dumps({'handle': handle, 'expiry': expiry})
    policy = base64.urlsafe_b64encode(json_policy)
    secret = 'XXXXXXXXXXXXXX'
    signature = hmac.new(secret, policy, hashlib.sha256).hexdigest()
    if handle is not None:
        uniqueName = handle
    else:
        uniqueName = hashlib.md5()
        uniqueName.update(signature + repr(time()))
        uniqueName = uniqueName.hexdigest() + str(userId)
    return {'policy': policy, 'signature': signature, 'expiry': expiry, 'uniqueName': uniqueName}

fp_security_options = generateFpSecurityOptions(None, request.user.id)
Then I had to establish an API endpoint in our Django framework to fetch this security policy object dynamically via AJAX. Fortunately, my colleague had previously written it, so I just call the API from JavaScript to retrieve the file-specific security policy object:
var initialFpSecurityObj = {{fp_security_options|as_json|safe}};
filepicker.store(thumbFile, {
    policy: initialFpSecurityObj.policy,
    signature: initialFpSecurityObj.signature,
    access: "public"
}, function(InkBlob) {
    processThumb(InkBlob);
}, function(FPError) {
    console.error(FPError.toString());
}, function(progress) {
    console.log("Loading: " + progress + "%");
});
var processThumb = function(InkBlob) {
    var fpHandle = InkBlob.url.split('/').pop();
    $.ajax({
        url: API_BASE + 'file_picker_policy',
        type: 'GET',
        data: {
            'filename': fpHandle
        },
        dataType: 'json',
        success: function(data) {
            var newFpSecurityObj = data.data;
            filepicker.convert(InkBlob, {
                width: 800,
                height: 600,
                format: "jpg",
                policy: newFpSecurityObj.policy,
                signature: newFpSecurityObj.signature,
            }, {
                location: "S3",
                path: THUMB_FOLDER + '/' + newFpSecurityObj.uniqueName + '/',
            }, function(fp) { // onSuccess
                console.log("successfully converted and stored!");
                // do what you want with the converted file
            }, function(FPError) { // onError
                console.error(FPError);
            });
        },
        failure: function() {
            alert("There was an error converting the thumbnail! Please try again.");
        }
    });
};

Related

Firebase hosting file upload via REST with Apps Script

I want to upload a file to Firebase Hosting via REST with Apps Script. I've been trying to find a solution for days to no avail :( and would highly appreciate any recommendations.
I'm following the official documentation here:
https://firebase.google.com/docs/reference/hosting/rest/v1beta1/sites.versions/populateFiles
And I can successfully get the upload URL using this code:
function getUploadURL() {
    const YOUR_PROJECT_ID = 'sites/url-shortener-e42ec/versions/dd393a80797d713d';
    let postUrl = 'https://firebasehosting.googleapis.com/v1beta1/YOUR_PROJECT_ID:populateFiles';
    postUrl = postUrl.replace('YOUR_PROJECT_ID', YOUR_PROJECT_ID);
    const options = {
        method: 'post',
        headers: {
            Authorization: `Bearer ${ScriptApp.getOAuthToken()}`,
        },
        muteHttpExceptions: true
    };
    const response = UrlFetchApp.fetch(postUrl, options);
    Logger.log(response);
}
which returns the following:
{
    "uploadUrl": "https://upload-firebasehosting.googleapis.com/upload/sites/url-shortener-e42ec/versions/dd393a80797d713d/files"
}
And this is where I get kinda lost because I'm not quite sure on what to do next. The documentation says:
map (key: string, value: string)
A set of file paths to the hashes corresponding to assets that should be added to the version.
A file path to an empty hash will remove the path from the version.
Calculate a hash by Gzipping the file then taking the SHA256 hash of the newly compressed file.
But if I add a payload with a file hash to the call like so:
{
    "files": {
        "/teste": "3f0749957a1c4d91ed18b8e9df122709974e4e9c94c57f9245794c21dd76d4bd"
    }
}
...then I get the error:
{
    "error": {
        "code": 400,
        "message": "Precondition check failed.",
        "status": "FAILED_PRECONDITION"
    }
}
PART 2:
The next issue I found is that now that I have the upload URL, I need to actually upload the file, and according to their documentation I should:
Perform a multipart POST of the Gzipped file contents to the URL using a forward slash and the hash of the file appended to the end.
which I tried with the following apps script code:
function convert(hash) {
    return hash.map(byte => ('0' + (byte & 0xFF).toString(16)).slice(-2)).join('');
}

function postFile() {
    var files = DriveApp.getFilesByName('abc.txt');
    let gzip;
    let hash;
    if (files.hasNext()) {
        var file = files.next();
        gzip = Utilities.gzip(file.getBlob());
        hash = Utilities.computeDigest(Utilities.DigestAlgorithm.SHA_256, gzip.getBytes());
    }
    let postUrl = 'https://upload-firebasehosting.googleapis.com/upload/sites/url-shortener-e42ec/versions/dd393a80797d713d/files/' + convert(hash);
    /*
    var textBlob = Utilities.newBlob("abc");
    const gzip = Utilities.gzip(textBlob);
    const hash = Utilities.computeDigest(Utilities.DigestAlgorithm.SHA_256, gzipFile.getBytes());
    */
    const data = {
        "files": {
            "/test.txt": convert(hash)
        }
    };
    const options = {
        method: 'post',
        headers: {
            Authorization: `Bearer ${ScriptApp.getOAuthToken()}`,
            accept: 'application/json',
            contentType: 'application/json'
        },
        muteHttpExceptions: true,
        payload: JSON.stringify(data)
    };
    const response = UrlFetchApp.fetch(postUrl, options);
    Logger.log(response);
}
... and get the following error:
Couldn't process request (status=412): File url-shortener-e42ec/dd393a80797d713d/0b3b82379e00a1994a46452e8cfd8b2c43ee8599f169a9ee4176253f1a8de469 can't be uploaded.
Appreciate all the help I can get. Thanks in advance!

How to run a POST request for multiple users in a k6 performance test?

We run a GET HTTP request with multiple users (vus) and multiple iterations for a particular duration.
It is straightforward, like below.
import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
    vus: '10',
    duration: '20s',
    iterations: '10'
}

export default function () {
    const baseUri = 'http://Myhost:3200'
    let response = http.get(`${baseUri}/api/v1/vm-inventory`)
    let body = JSON.parse(response.body)
    body.items.forEach(element => {
        if (element.datastoreIds.length != 0) {
            console.log(JSON.stringify(element))
            console.log(`Name of the vm is ${element.name}`)
        }
    });
    sleep(1);
}
I also need to run a POST request with multiple users and multiple iterations, but all the examples I am seeing relate to a single POST request. When we run multiple POST requests, our data needs to change for every iteration. How is this possible with k6? Below is my POST request.
import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
    vus: '5',
    duration: '120s',
    iterations: '5'
}

let payload = `{
    "VMId": "70c8b578-32ef-40d2-bcc5-81267423d2c4",
    "name": "test_vm_1",
    "mem": "4",
    "cpu": "2",
    "disk": "50gb"
}`

export default function () {
    const url = 'http://myhost:3200/api/v1/vm-inventory'
    var header = {
        headers: {
            'Content-Type': 'application/json',
        },
    };
    let res = http.post(url, JSON.stringify(payload), header);
    console.log(res.json().json.name);
    sleep(5);
}
I need to change the VM name in the payload for every iteration so that each request is unique.
How do I achieve this in k6? In JMeter, they read different data for each iteration from a CSV file, but I could not find any sample for k6.
We need to create the payload inside the export default function so that we can modify it there before sending it. If the VMId just needs to be a unique UUID, have a look at the jslib utils library, which contains a function for generating a uuidv4:
export function uuidv4() {
    return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
        let r = Math.random() * 16 | 0, v = c === 'x' ? r : (r & 0x3 | 0x8);
        return v.toString(16);
    });
}
Putting it together:
import http from 'k6/http';
import { sleep } from 'k6';

export let options = {
    vus: '5',
    duration: '120s',
    iterations: '5'
}

export default function () {
    let payload = {
        VMId: uuidv4(),
        name: `test_vm_${Date.now()}`,
        mem: 4,
        cpu: 2,
        disk: "50gb"
    };
    const url = 'http://myhost:3200/api/v1/vm-inventory'
    var header = {
        headers: {
            'Content-Type': 'application/json',
        },
    };
    let res = http.post(url, JSON.stringify(payload), header);
    console.log(res.json().json.name);
    sleep(5);
}

export function uuidv4() {
    return 'xxxxxxxx-xxxx-4xxx-yxxx-xxxxxxxxxxxx'.replace(/[xy]/g, function (c) {
        let r = Math.random() * 16 | 0, v = c === 'x' ? r : (r & 0x3 | 0x8);
        return v.toString(16);
    });
}
(note that I've changed the payload to a JS object so that it doesn't get stringified twice).
We can also build the payload like below:
let payload = {
    VMId: uuidv4(),
    name: `test_vm_vu_${__VU}`,
    mem: 4,
    cpu: 2,
    disk: "50gb"
};
__VU gives the virtual user number, so we can create test_vm_vu_1, test_vm_vu_2, test_vm_vu_3. (The VUs run in parallel, so the numbers will not come in sequential order.)
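k6 also exposes __ITER, the per-VU iteration counter, so combining __VU and __ITER guarantees a unique name even when one VU runs several iterations. Here is a sketch of the payload builder as plain JavaScript so it can be tried anywhere; inside a k6 script you would call it as buildPayload(__VU, __ITER):

```javascript
// Build a per-request payload whose name encodes the VU and iteration
// numbers, so no two requests in the test share a VM name.
function buildPayload(vu, iter) {
    return {
        name: `test_vm_vu_${vu}_iter_${iter}`,
        mem: 4,
        cpu: 2,
        disk: '50gb',
    };
}
```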
I got this answer from the k6 Slack community group. I am posting it here for anybody facing the same situation.

how to upload images as a signed request with Cloudinary/Angular(5) and Ionic(3)?

Cloudinary has a basic.js example which I'm trying to implement in my Ionic/Angular project.
Problem is, for some reason the Ionic version of "@cloudinary/angular-5.x" always uses the unsigned upload feature, and I want to be able to transform the image before I upload, same as the Cloudinary example.
Transformation requires signed upload, not unsigned upload.
Since there are many versions out there, and most of the examples don't work, mine are:
Ionic: 3
Angular: 5.2.11
Cloudinary:
"cloudinary": "^1.11.0",
"cloudinary-core": "^2.5.0",
"@cloudinary/angular-5.x": "^1.0.2"
basic.js
My configuration is inside the .env variable with the structure mentioned in cloudinary.config
var dotenv = require('dotenv');
dotenv.load();
var fs = require('fs');
var cloudinary = require('cloudinary').v2;

// set your env variable CLOUDINARY_URL or set the following configuration
/*cloudinary.config({
    cloud_name: '',
    api_key: '',
    api_secret: ''
});*/

var url = "http://res.cloudinary.com/demo/image/upload/couple.jpg"

cloudinary.uploader.upload(url, {"tags": "basic_sample", "width": 500, "height": 500, "crop": "fit", "effect": "saturation:-70"},
    function(err, image) {
        if (err) {
            console.warn(err);
            return;
        }
        console.log("* " + image.public_id);
        console.log("* " + image.url);
        // Transform image
        cloudinary.url(image.public_id, {
            width: 200,
            height: 150,
            crop: "fill",
            gravity: "face",
            radius: 10,
            effect: "sepia",
            format: "jpg"
        });
    });
With the following code I'm able to upload it unsigned:
Ionic unsigned request
ngOnInit(): void {
    const uploaderOptions: FileUploaderOptions = {
        url: 'https://api.cloudinary.com/v1_1/' + this.cloudinary.config().cloud_name + '/upload',
        autoUpload: false,
        isHTML5: true,
        removeAfterUpload: true,
        headers: [{
            name: 'X-Requested-With',
            value: 'XMLHttpRequest'
        }]
    };
    this.uploader = new FileUploader(uploaderOptions);
    // Add custom tag for displaying the uploaded photo in the list
    this.uploader.onBuildItemForm = (fileItem: any, form: FormData): any => {
        form.append('upload_preset', this.cloudinary.config().upload_preset);
        form.append('public_id', 'subfolder/' + this.UUID);
        form.append('file', fileItem);
        fileItem.withCredentials = false;
        return { fileItem, form };
    };
}
Ionic signed request
So in order to transform my images, I need to use a parameter called eager:
form.append('eager', 'c_crop,w_191,h_145,g_face,z_0.7');
But then I get the error below:
Upload completed with status code 400
{
    "message": "Eager parameter is not allowed when using unsigned upload.
    Only upload_preset,callback,public_id,folder,tags,context,face_coordinates,custom_coordinates,source upload parameters are allowed."
}
When I remove the preset to "tell" it that maybe this is a signed request, I get the above error plus Upload preset must be specified when using unsigned upload.
So I'm not sure how I'm supposed to "tell" it: use a signed request, and take my configuration from .env or CloudinaryModule.forRoot({Cloudinary}, cloudinaryConfiguration as CloudinaryConfiguration), etc ...
For a signed upload, you need to create a signature and attach it to the form body of the POST request.
The signature is a SHA-1 hexadecimal string computed from the timestamp (unixtime), the public_id (any text) and your Cloudinary API_SECRET.
Here is my workable sample
private generateSignature() {
    this.public_id = `image_${Date.now()}`; // I like to make it unique.
    this.unixtime = Date.now() / 1000 | 0;
    return CryptoJS.SHA1(`public_id=${this.public_id}&timestamp=${this.unixtime}${this.API_SECRET}`).toString()
}
Here I use CryptoJS for the hashing.
Append this signature to the form body before sending the API request.
For example:
initFileUploader(): void {
    const self = this;
    self.uploader = new FileUploader({
        url: 'https://api.cloudinary.com/v1_1/your_cloud_name/upload',
        allowedMimeType: ['image/png', 'image/jpg', 'image/jpeg', 'image/gif'],
        maxFileSize: 524288, // 512 KB
        autoUpload: true,
        removeAfterUpload: true,
        isHTML5: true,
        headers: [
            {
                name: 'X-Requested-With',
                value: 'XMLHttpRequest'
            }
        ]
    });
    self.uploader.onAfterAddingFile = (file) => {
        file.withCredentials = false;
    };
    self.uploader.onSuccessItem = (item, response, status) => {
        const resp = <any>JSON.parse(response);
        if (resp) {
            this.onSuccess.emit(resp);
        } else {
            this.onError.emit('An error occurred during upload. Please retry');
        }
    };
    self.uploader.setOptions(self._uploaderOptions);
    self.uploader.onBuildItemForm = (fileItem: any, form: FormData): any => {
        let signature = this.generateSignature();
        form.append('timestamp', this.unixtime.toString());
        form.append('public_id', this.public_id);
        form.append('api_key', this.API_KEY); // your cloudinary API_KEY
        form.append('signature', signature);
        return { fileItem, form };
    };
}
I use ng2-file-upload for uploading...
Uploading images via the signed method
Signed uploads require an authentication signature to be generated on your server using either a function method or a string method.
The current Angular SDK is outdated, so we follow these steps to implement our signed upload.
Manually generate Signature via string method in Angular
To manually generate your own POST request, you need to authenticate the request with a signature based on the parameters you use in the request. The signature is a hexadecimal message digest (hash value) created with the SHA-1 or SHA-256 (Secure Hash Algorithm) cryptographic function.
You can manually generate the comparison signature instead of using the Cloudinary SDK's api_sign_request method.
For example, if your API secret is abcd, your API key is 1234, the Unix time now is 1315060510 and you are posting a request to upload a file from 'https://www.example.com/sample.jpg', set its Public ID as sample_image, and eagerly generate 2 images:
Parameters to sign:
timestamp: 1315060510
public_id: sample_image
eager: w_400,h_300,c_pad|w_260,h_200,c_crop
Serialized sorted parameters in a single string:
eager=w_400,h_300,c_pad|w_260,h_200,c_crop&public_id=sample_image&timestamp=1315060510
String including the API secret that is used to create the SHA-1 signature:
eager=w_400,h_300,c_pad|w_260,h_200,c_crop&public_id=sample_image&timestamp=1315060510abcd
Generate Signature in Angular
Using a native JS function for hashing messages with the SHA-1 algorithm.
First install sha1:
npm install sha1
Then import the package into the app:
import sha1 from 'sha1';
Generate UUID for Public ID
Another thing we did, so that each upload has a unique ID, was to use the uuid package to generate a unique Public ID for each upload:
npm install uuid
import * as uuid from 'uuid';
In ngOnInit we generate the UUID using:
this.uuidValue = `${uuid.v4().toLowerCase()}`;
We then use the method sha1(string), which returns the SHA-1 hash of the given message.
The result is a SHA-1 hexadecimal string like:
b4ad47fb4e25c7bf5f92a20089f9db59bc302313
signuploadform() {
    const timestamp = Math.round(new Date().getTime() / 1000);
    const apiSecret = this.environmentService.getValue('CLOUDINARY_API_SECRET');
    const api_key = this.environmentService.getValue('CLOUDINARY_API_KEY');
    const signature = sha1(
        'eager=c_pad,h_300,w_400|c_crop,h_200,w_260&folder=identification/NGA&public_id=' +
        this.uuidValue +
        '&timestamp=' +
        timestamp +
        apiSecret
    );
    return { timestamp, signature, api_key };
}
Post the Upload
Now that the signature has been generated, we post using the parameters shown in the code below:
folder
public_id
file
api_key
timestamp
signature
HTML
<input hidden (change)="onFileChange($event)" #fileInput accept="image/*" type="file" id="file">
TS
onFileChange(event: any) {
    this.uploadFile(event.target.files[0]);
}

uploadFile(file: File) {
    const signData = this.signuploadform();
    const formData = new FormData();
    formData.append('eager', 'c_pad,h_300,w_400|c_crop,h_200,w_260');
    formData.append('folder', 'identification/NGA');
    formData.append('public_id', this.uuidValue);
    formData.append('file', file);
    formData.append('api_key', signData.api_key);
    formData.append('timestamp', signData.timestamp.toString());
    formData.append('signature', signData.signature);
    const url =
        'https://api.cloudinary.com/v1_1/' +
        this.environmentService.getValue('CLOUDINARY_CLOUD_NAME') +
        '/auto/upload';
    this.isLoading = true;
    this.http
        .post(url, formData)
        .pipe(map((x: any) => x.secure_url as string))
        .subscribe({
            next: res => {
                this.identification = res;
                this.uploadTitle = 'ID Uploaded';
                this.uploadStatus = true;
                from(
                    Swal.fire({
                        icon: 'success',
                        title: 'Successfully uploaded',
                        showConfirmButton: true,
                    })
                );
            },
            error: error => {
                this.isLoading = false;
                from(
                    Swal.fire({
                        icon: 'error',
                        title: 'Please check your image again',
                        showConfirmButton: true,
                    })
                );
            },
            complete: () => {
                this.isLoading = false;
            },
        });
}

Use DataFields in Rest URL in ExtJS to access Context.io API

I have two questions regarding the REST API in ExtJS.
How can I use fields to make the REST URL dynamic?
How can I add the authentication key to access Context.io in my Rest.Proxy?
This is my solution, but I am not sure if I have done it properly or not. I am pretty new to ExtJS, so my question may be basic, but I appreciate your help.
Ext.define("EmailFolders", {
    extend: "Ext.data.Model",
    fields: ["id", "label"],
    proxy: {
        type: "rest",
        url: "lite/users/:" + id + "/email_accounts/:" + label + "/folders"
    },
    reader: {
        type: "json"
    },
    headers: {
        CONSUMER_KEY: "KEY FROM CONTEX.IO",
        CONSUMER_SECRET: "SECRET FROM CONTEXT.IO"
    }
});
You could use store.getProxy() to make the REST URL dynamic and to pass the authentication keys in headers. The proxy has these methods:
proxy.setUrl() to set the value of url.
proxy.setHeaders() to set the value of headers.
You can check here with a working fiddle.
CODE SNIPPET
Ext.application({
    name: 'Fiddle',
    launch: function () {
        let url = 'https://jsonplaceholder.typicode.com/users';
        // Set up a model to use in our Store
        Ext.define('User', {
            extend: 'Ext.data.Model',
            proxy: {
                type: 'ajax',
                reader: {
                    type: 'json',
                    rootProperty: ''
                }
            }
        });
        Ext.define('MyStore', {
            extend: 'Ext.data.Store',
            model: 'User',
            listeners: {
                beforeload: function (store) {
                    var proxy = store.getProxy();
                    // if you want, you can also set the url here inside of beforeload
                    //proxy.setUrl(url);
                    /*
                     * You can use {proxy.setHeaders} to set the values from CONTEX.IO
                     * After the ajax request, check your request in the network analysis:
                     * the 2 headers below are passed in the request header
                     */
                    proxy.setHeaders({
                        CONSUMER_KEY: "KEY FROM CONTEX.IO",
                        CONSUMER_SECRET: "SECRET FROM CONTEXT.IO"
                    });
                }
            }
        });
        let store = new MyStore();
        // Set the dynamic url here
        // This {url} will be dynamic, whatever you want to pass
        store.getProxy().setUrl(url);
        store.load(function (data) {
            console.log(data);
            alert('Open console to see response..!')
        });
        /*
         * You can also pass the url inside of the load function
         */
        new MyStore().load({
            url: url + '/' + 1,
            callback: function (data) {
                console.log(data);
            }
        });
    }
});

How one could use server side sorting and paging with Azure Mobile Services

I am using jqGrid (inlineNav) with data from an Azure service and am interested in learning how one could use server-side sorting and paging with Azure Mobile Services.
Please share thoughts around this.
Windows Azure Mobile Services provides a REST API which can be used to get/insert/edit/delete data of the tables which you configured for the corresponding access (see the documentation). The query records operation uses the HTTP GET verb. It supports the Open Data Protocol (OData) URI options $orderby, $skip, $top and $inlinecount, which can be used to fill jqGrid.
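For illustration, the mapping from jqGrid's usual postData fields to those OData options can be sketched as a plain function (this is the same logic the serializeGridData callback in the grid configuration implements):

```javascript
// Map jqGrid paging/sorting state to the OData query options
// ($top, $skip, $orderby, $inlinecount) that Mobile Services understands.
function buildODataQuery(postData) {
    const params = {
        $top: postData.rows,
        $skip: (parseInt(postData.page, 10) - 1) * postData.rows,
        $inlinecount: 'allpages',
    };
    if (postData.sidx) {
        params.$orderby = postData.sidx + ' ' + postData.sord;
    }
    return Object.keys(params)
        .map(key => key + '=' + encodeURIComponent(params[key]))
        .join('&');
}
```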
$("#list4").jqGrid({
    url: 'https://mohit.azure-mobile.net/tables/Schedules',
    datatype: "json",
    height: "auto",
    colModel: [
        { name: "RouteId", width: 50 },
        { name: "Area", width: 130 }
    ],
    cmTemplate: { editable: true, editrules: { required: true } },
    rowList: [10, 20, 30],
    rowNum: 10,
    prmNames: { search: null, nd: null },
    ajaxGridOptions: {
        contentType: "application/json",
        headers: {
            "X-ZUMO-APPLICATION": "myKey"
        }
    },
    serializeGridData: function (postData) {
        if (postData.sidx) {
            return {
                $top: postData.rows,
                $skip: (parseInt(postData.page, 10) - 1) * postData.rows,
                $orderby: postData.sidx + " " + postData.sord,
                $inlinecount: "allpages"
            };
        } else {
            return {
                $top: postData.rows,
                $skip: (parseInt(postData.page, 10) - 1) * postData.rows,
                $inlinecount: "allpages"
            };
        }
    },
    beforeProcessing: function (data, textStatus, jqXHR) {
        var rows = parseInt($(this).jqGrid("getGridParam", "rowNum"), 10);
        data.total = Math.ceil(data.count / rows);
    },
    jsonReader: {
        repeatitems: false,
        root: "results",
        records: "count"
    },
    loadError: function (jqXHR, textStatus, errorThrown) {
        alert('HTTP status code: ' + jqXHR.status + '\n' +
            'textStatus: ' + textStatus + '\n' +
            'errorThrown: ' + errorThrown);
        alert('HTTP message body (jqXHR.responseText): ' + '\n' + jqXHR.responseText);
    },
    pager: "#pager1",
    sortname: "Area",
    viewrecords: true,
    caption: "Schedule Data",
    gridview: true
});
Some comments on the above code:
I removed sortable: false to allow sorting of the grid by clicking on the column header.
With respect to the prmNames option, one can stop sending unneeded parameters to the server or rename them. I used prmNames: { search: null, nd: null } to deny sending of the _search and nd options. One could use sort: "$orderby", rows: "$top" to rename two other parameters, but because we need to calculate $skip and append sord after sidx, we need serializeGridData anyway, so renaming the other parameters is not needed in this case.
Using serializeGridData we construct the list of options which will be sent to the server.
ajaxGridOptions is used to set additional parameters of the jQuery.ajax request which jqGrid makes internally to access the server. The options which I use in the example set Content-Type: application/json and X-ZUMO-APPLICATION: myKey in the HTTP headers.
The response from the server doesn't contain total (the total number of pages), so we use the beforeProcessing callback to fill that property, based on other information, before the response is processed.
Because we use the $inlinecount=allpages option in the URL, the response from the server contains information about the total number of records, and the page of data is wrapped in the results part of the answer. So we use jsonReader: {repeatitems: false, root: "results", records: "count"} to read the response.
We have to remove the loadonce: true option because the server returns only the requested page of data instead of the whole set of data.