Is there a way to get the default bookmarks with capability APIs?
I tried app.getList('BookmarkList') method, but it doesn't return any parameter identifying that it's a default bookmark.
In order to get the default bookmarks you'll have to create a generic object with the following definition:
app.createGenericObject({
qInfo: {
qType: 'sheets'
},
qAppObjectListDef: {
qType: 'sheet',
qData: {
title: '/qMetaDef/title',
labelExpression: '/labelExpression',
description: '/qMetaDef/description',
descriptionExpression: '/descriptionExpression',
thumbnail: '/thumbnail',
cells: '/cells',
actions: '/actions',
rank: '/rank',
columns: '/columns',
rows: '/rows'
}
}
}, sheets => {
console.log(sheets)
})
The key here is the /actions part of the definition, which ensures that the actions metadata is received.
The resulting layout should include any defined actions, which, for bookmarks, will look like this:
{
actionLabel: "A",
actionType: "applyBookmark",
bookmark: "db014c67-ff43-4111-88ff-836b457928e5",
cId: "KzmaWSa",
field: "",
showSystemVariables: false,
softLock: false,
value: "",
variable: ""
}
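If it helps, here is a rough, untested sketch of how you could pull the bookmark ids out of that layout by filtering for applyBookmark actions. The qAppObjectList/qItems path is the usual layout shape produced by a qAppObjectListDef, but verify it against your Qlik version; collectDefaultBookmarks is just a hypothetical helper name:
function collectDefaultBookmarks(layout) {
  // Walk every sheet item returned by the generic object above and keep the
  // bookmark ids referenced by applyBookmark actions.
  const ids = [];
  (layout.qAppObjectList.qItems || []).forEach(item => {
    ((item.qData && item.qData.actions) || []).forEach(action => {
      if (action.actionType === 'applyBookmark' && action.bookmark) {
        ids.push(action.bookmark);
      }
    });
  });
  return ids;
}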
I am using KeystoneJS with PostgreSQL as my backend and Apollo on the frontend for my app.
I have a schema that has a list that is linked to another list.
I want to be able to allow users to change the order of the second list.
This is a simplified version of my schema
keystone.createList(
'forms',
{
fields: {
name: {
type: Text,
isRequired: true,
},
buttons: {
type: Relationship,
ref: 'buttons.attached_forms',
many: true,
},
},
}
);
keystone.createList(
'buttons',
{
fields: {
name: {
type: Text,
isRequired: true,
},
attached_forms: {
type: Relationship,
ref: 'forms.buttons',
many: true,
},
},
}
);
So what I would like to do is allow users to change the order of the buttons, so that when I fetch them in the future from forms:
const QUERY = gql`
query getForms($formId: ID!) {
allforms(where: {
id: $formId,
}) {
id
name
buttons {
id
name
}
}
}
`;
The buttons should come back from the backend in a predefined order.
{
id: 1,
name: 'Form 1',
buttons: [
{
id: 1,
name: 'Button 1',
},
{
id: 3,
name: 'Button 3',
},
{
id: 2,
name: 'Button 2',
}
]
}
Or even just have some data that returns with the query that will allow for sorting according to the user-defined sort order on the frontend.
The catch is that this relationship is many to many.
So it wouldn't be enough to add a column to the buttons schema as the ordering needs to be relationship-specific. In other words, if a user puts a particular button last on a particular form, it shouldn't change the order of that same button on other forms.
In a backend that I was creating myself, I would add something to the joining table, like a sortOrder field or similar and then change those values to change the order, or even order them on the frontend using that information.
Something like this answer here.
The many-to-many join table would have columns like formId, buttonId, sortOrder.
I have been diving into the docs for KeystoneJS and I can't figure out a way to make this work without getting into the weeds of overriding the KnexAdapter that we are using.
I am using:
{
  "@keystonejs/adapter-knex": "^11.0.7",
  "@keystonejs/app-admin-ui": "^7.3.11",
  "@keystonejs/app-graphql": "^6.2.1",
  "@keystonejs/fields": "^20.1.2",
  "@keystonejs/keystone": "^17.1.2",
  "@keystonejs/server-side-graphql-client": "^1.1.2"
}
Any thoughts on how I can achieve this?
One approach would be to have two "button" lists: one with a template for a button (ButtonTemplate below) holding common data such as the name, and another (Button below) which references one ButtonTemplate and one Form. This allows you to assign a formIndex property to each Button, which dictates its position on the corresponding form.
(Untested) example code:
keystone.createList(
'Form',
{
fields: {
name: {
type: Text,
isRequired: true,
},
buttons: {
type: Relationship,
ref: 'Button.form',
many: true,
},
},
}
);
keystone.createList(
'Button',
{
fields: {
buttonTemplate: {
type: Relationship,
ref: 'ButtonTemplate.buttons',
many: false,
},
form: {
type: Relationship,
ref: 'Form.buttons',
many: false,
},
formIndex: {
type: Integer,
isRequired: true,
},
},
}
);
keystone.createList(
'ButtonTemplate',
{
fields: {
name: {
type: Text,
isRequired: true,
},
buttons: {
type: Relationship,
ref: 'Button.buttonTemplate',
many: true,
},
},
}
);
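To then display a form's buttons in order, you could fetch them together with their formIndex and sort on the client. A rough, untested sketch; the Form query and field names assume Keystone's generated schema for the lists above:
const QUERY = gql`
  query getForm($formId: ID!) {
    Form(where: { id: $formId }) {
      id
      name
      buttons {
        id
        formIndex
        buttonTemplate {
          name
        }
      }
    }
  }
`;

// Sort client-side by the per-form index
const orderedButtons = [...data.Form.buttons].sort((a, b) => a.formIndex - b.formIndex);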
I think this is less likely to cause you headaches (which I'm sure you can see coming) down the line than your buttonOrder solution, e.g. users deleting buttons that are referenced by this field.
If you do decide to go with the buttonOrder approach instead, you can guard against such issues with Keystone's hook functionality, e.g. before a button is deleted, go through all the forms and rewrite the buttonOrder field, removing any references to the deleted button.
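A list-level hook for that might look roughly like this; the hook shape follows Keystone 5's list hooks, and removeButtonFromAllButtonOrders is a hypothetical helper you would write yourself (e.g. using @keystonejs/server-side-graphql-client to load each form's buttonOrder, drop the id, and save it back):
keystone.createList('buttons', {
  fields: { /* ... as in your schema ... */ },
  hooks: {
    // Runs before a button is deleted; clean up any form.buttonOrder
    // values that still reference it.
    beforeDelete: async ({ existingItem }) => {
      await removeButtonFromAllButtonOrders(existingItem.id); // hypothetical helper
    },
  },
});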
I had a similar challenge once, so after some research I found this answer and implemented a solution in a project using a PostgreSQL TRIGGER.
So you can add a trigger so that, on an update, it shifts the button order.
Here is the SQL I had on hand; it was test code, and I regex-replaced the terms to fit your question :)
// Assign order
await knex.raw(`
do $$
DECLARE form_id text;
begin
CREATE SEQUENCE buttons_order_seq;
CREATE VIEW buttons_view AS SELECT * FROM "buttons" ORDER BY "createdAt" ASC, "formId";
CREATE RULE buttons_rule AS ON UPDATE TO buttons_view DO INSTEAD UPDATE buttons SET "order" = NEW."order" WHERE id = NEW.id;
FOR form_id IN SELECT id FROM forms LOOP
ALTER SEQUENCE buttons_order_seq RESTART;
UPDATE buttons_view SET "order" = nextval('buttons_order_seq') WHERE "formId" = form_id;
END LOOP;
DROP SEQUENCE buttons_order_seq;
DROP RULE buttons_rule ON buttons_view;
DROP VIEW buttons_view;
END; $$`);
// Create function that shifts orders
await knex.raw(`
CREATE FUNCTION shift_buttons_order()
RETURNS trigger AS
$$
BEGIN
IF NEW."order" < OLD."order" THEN
UPDATE buttons SET "order" = "order" + 1, "shiftOrderFlag" = NOT "shiftOrderFlag"
WHERE "order" >= NEW."order" AND "order" < OLD."order" AND "formId" = OLD."formId";
ELSE
UPDATE buttons SET "order" = "order" - 1, "shiftOrderFlag" = NOT "shiftOrderFlag"
WHERE "order" <= NEW."order" AND "order" > OLD."order" AND "formId" = OLD."formId";
END IF;
RETURN NEW;
END;
$$
LANGUAGE 'plpgsql'`);
// Create trigger to shift orders on update
await knex.raw(`
CREATE TRIGGER shift_buttons_order BEFORE UPDATE OF "order" ON buttons FOR EACH ROW
WHEN (OLD."shiftOrderFlag" = NEW."shiftOrderFlag" AND OLD."order" <> NEW."order")
EXECUTE PROCEDURE shift_buttons_order()`);
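With the trigger in place, moving a button becomes a single update of its order column and the trigger shifts its siblings. For example, via knex (assuming the order and formId columns used above exist on buttons):
// Move one button to position 2 on its form; the trigger renumbers the rest
await knex('buttons')
  .where({ id: buttonId })
  .update({ order: 2 });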
One option that we came up with is to add the order to the form table.
keystone.createList(
'forms',
{
fields: {
name: {
type: Text,
isRequired: true,
},
buttonOrder: {
type: Text,
},
buttons: {
type: Relationship,
ref: 'buttons.attached_forms',
many: true,
},
},
}
);
This new field buttonOrder could contain a string representation of the order of the button ids, such as a JSON-stringified array.
The main issue with this is that it will be difficult to keep this field in-sync with the actual linked buttons.
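Applying the stored order on the frontend is then a small sort; a sketch assuming buttonOrder holds a JSON-stringified array of button ids as described above:
// Sort the fetched buttons by their position in the stored order
const order = JSON.parse(form.buttonOrder || '[]');
const orderedButtons = [...form.buttons].sort(
  (a, b) => order.indexOf(a.id) - order.indexOf(b.id)
);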
I have made the following collection in meteor:
CodesData = new Mongo.Collection('CodesData');
CodesDataSchema = new SimpleSchema({
code: {
label: "Code",
type: Number
},
desc: {
label: "Description",
type: String,
}
});
CodesData.attachSchema(CodesDataSchema);
Now I want to prefill this collection with some data.
For example: code: 1 desc: "hello".
How can I do this manually and easily?
You can use Meteor.startup to run some actions on your collection once the server app has been loaded and is starting:
CodesData = new Mongo.Collection('CodesData');
CodesDataSchema = new SimpleSchema({
  code: { label: "Code", type: Number },
  desc: { label: "Description", type: String }
});
CodesData.attachSchema(CodesDataSchema);
Meteor.startup(()=>{
// Only fill if empty, otherwise
// It would fill on each startup
if (CodesData.find().count() === 0) {
CodesData.insert({ code: 1, desc: 'some description' });
}
});
If you have a lot of data to prefill you can define it in a JSON file and load it on startup.
Consider the following JSON, named pre:
{
  "codesData": [
    { "code": 1, "desc": "foo" },
    { "code": 7, "desc": "bar" }
  ]
}
Meteor.startup(() => {
  // If `pre` is already a parsed object (e.g. imported from a .json file),
  // skip the JSON.parse step.
  const preData = JSON.parse(pre);
  preData.codesData.forEach(entry => {
    CodesData.insert(entry);
  });
});
This allows you to manage your prefill more easily and also lets you version-control the JSON if desired (and no sensitive data is revealed).
Considerations:
The function Meteor.startup runs on each start, so you should consider how to avoid unnecessary inserts that create duplicates. A good way is to check if the collection is empty (see the first example).
You may put the startup code in another js file in order to separate definitions from startup routines.
The current script does not differentiate between server and client. You should consider doing this on the server and creating a publication/subscription around it.
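For example, a minimal server publication plus client subscription could look like this (file locations are just a suggestion):
// server/publications.js
Meteor.publish('codesData', function () {
  return CodesData.find();
});

// client-side, e.g. in a template's onCreated or component lifecycle
Meteor.subscribe('codesData');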
More readings:
https://docs.meteor.com/api/core.html#Meteor-startup
Importing a JSON file in Meteor
https://docs.meteor.com/api/core.html#Meteor-settings
I have this schema and the corresponding resolvers:
const schema = buildSchema(
`
type Query {
posts(id: Int): [Post]
}
type Post {
id: Int!,
title: String,
date: String
}`
);
const resolvers = {
posts(root, { id }, context, info) {
console.log(id); // Undefined
return [
{
id: 0,
date: '21/04/2018',
title: 'Post 1'
},
{
id: 1,
date: '07/10/2018',
title: 'Post 2'
}
];
},
Post(postObj) {
return {
id: postObj.id,
title: postObj.title,
date: postObj.date
}
}
}
The problem is that when I query for posts with a specified id, like this:
query {
posts(id: 0) {
title
}
}
... I get an error that says I haven't defined such argument (id).
I defined the id argument according to the GraphQL Docs. Any suggestions of what may be causing this error and how to solve it?
When you use buildSchema, you effectively prevent yourself from being able to define custom resolvers for a given field in your schema. Instead, the default resolver will always be used. The default resolver simply takes the "parent" or "root" object for a given field, looks up the property on that parent object with the same name as the field and returns its value.
You can "get away" with this for simpler schemas by passing in a root object along with your schema. This root object then becomes the parent object referenced by the default resolver, but only for top level fields (like each field you define for your Query type). So in this case, when GraphQL resolves your query, the default resolver sees a posts property on the parent object and returns that. Because posts is actually a function, it calls the function first and then returns the value, but the arguments it calls it with are not the same arguments that a resolver is called with.
Resolvers receive four parameters: 1) the "root" or "parent" value, 2) the arguments, 3) the context and 4) an "info" object containing additional data about the request. Any function called by the default resolver will only get the last three parameters (so no "root" value).
In other words, you should change your root object to look more like this:
const root = {
posts({ id }, context, info) {
return [
{
id: 0,
date: '21/04/2018',
title: 'Post 1'
},
{
id: 1,
date: '07/10/2018',
title: 'Post 2'
}
];
},
}
However, doing it this way will only let you handle top-level fields like queries. You will not be able to customize resolver behavior for fields on other types, like Post. To do that, you should use graphql-tools' makeExecutableSchema.
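A minimal sketch of the same schema using makeExecutableSchema; the import path is the graphql-tools v4 one (newer versions expose it from @graphql-tools/schema), and here the id argument reaches the resolver as its second parameter as expected:
const { makeExecutableSchema } = require('graphql-tools');

const typeDefs = `
  type Query {
    posts(id: Int): [Post]
  }
  type Post {
    id: Int!
    title: String
    date: String
  }
`;

const resolvers = {
  Query: {
    posts: (root, { id }) => {
      const posts = [
        { id: 0, date: '21/04/2018', title: 'Post 1' },
        { id: 1, date: '07/10/2018', title: 'Post 2' },
      ];
      // Return everything when no id is passed, otherwise filter by it
      return id === undefined || id === null ? posts : posts.filter(post => post.id === id);
    },
  },
};

const schema = makeExecutableSchema({ typeDefs, resolvers });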
I'm creating a tcomb-form from an array of objects, but I don't have a lot of experience with it and honestly I'm struggling a little bit to get the hang of it.
This is the array structure that we are going to use:
export const AUDIT_CONTENT =
[
{type: "label", style: "title", group: "4.1", text: "field text here"},
{type: "label", style: "label", group: "4.1", text: "field text here"},
{type: "multi-checkbox", style: "checkbox", group: "4.1", text: "field text here"},
{type: "multi-checkbox", style: "checkbox", group: "4.1", text: "field text here"},
{type: "multi-checkbox", style: "checkbox", group: "4.1", text: "field text here"},
{type: "label", style: "label", group: "4.1", text: "field text here"},
{type: "multi-checkbox", style: "checkbox", group: "4.1", text: "field text here"}
]
The fields with type: label are objects that will store the fields with type: multi-checkbox, and those are the fields that will be validated. I'm grouping the fields by group, so all fields with group 4.1 are inside one array, the fields of the next group in another, and so on.
I managed to dynamically generate those fields by doing the following:
myFields = () => {
// groupedFields: the AUDIT_CONTENT entries, grouped by their "group" value
for (var c = 0; c < groupedFields.length; c++) {
for (var i = 0; i < groupedFields[c].length; i++ ) {
if (groupedFields[c][i].type === 'multi-checkbox') {
fields[groupedFields[c][i].text] = t.maybe(t.enums({
OPTION_1 : "OPTION 1 Label",
OPTION_2 : "OPTION 2 Label",
OPTION_3 : "OPTION 3 Label",
OPTION_4 : "OPTION 4 Label"
}));
}
}
}
}
var fields = {};
myFields()
var myFormType = t.struct(fields);
Now my problem starts here. I'm only generating the fields that receive a value, in this case the ones with type: multi-checkbox, but I also want to dynamically render in my form the fields with type: label, in the same order as my AUDIT_CONTENT array, with those being objects, so the result will be something like this:
"Field with style title": {
"Field with style label": [
{"Field with style multi-checkbox": "OPTION_1"},
{"Field with style multi-checkbox": "OPTION_3"},
],
"Other field with style label": [
{"Field with style multi-checkbox": "OPTION_4"},
{"Field with style multi-checkbox": "OPTION_2"},
]
}
This result will be stored in Mongo.
Hope someone can help me out with this and thanks in advance.
It would be better if you provided a visual representation of what you want, but I think you want to render and update a nested structure. For this I recommend recursive map methods over the array.
/*
To render a structure like this you can use map and assign types to the objects to decide what to render
But you should render it all.
Maybe you can use something like this:
*/
renderInputs(array){
  return array.map((obj) => {
    /* You can generate even nested forms if you want */
    if (obj.children) {
      return <div>{this.renderInputs(obj.children)}</div>
    } else {
      return this.renderType(obj)
    }
  })
}
renderType(obj){
  switch (obj.type) {
    case 'label':
      return <Element {...obj} />
    case 'multi-checkbox':
      return <Element2 {...obj} />
    /* You can even add other methods for nested objects */
    case 'other-case':
      return <div>{this.OtherSpecialSubRenderMethodIfYoUwANT()}</div>
  }
}
/**You will need an recursive method to update your state also and each object is recomended to have an unique id*/
updateState(array = this.state.array, newValue, id){
  return array.map((obj) => {
    if (obj.children) {
      // Keep the object itself and recurse into its children
      return { ...obj, children: this.updateState(obj.children, newValue, id) }
    }
    if (obj.id == id) {
      return { ...obj, value: newValue }
    }
    return obj;
  })
}
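As a usage sketch, you would kick the recursion off from render with whatever nested array you keep in state (the names here are assumptions):
render() {
  // this.state.array would hold the (possibly grouped/nested) AUDIT_CONTENT
  return <form>{this.renderInputs(this.state.array)}</form>;
}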
I am trying to update my collection, which has an array field (initially blank), and for this I am trying this code:
Industry.update({ _id: industryId }, {
  $push: { categories: { id: categoryId,
                         label: newCategory,
                         value: newCategory } }
});
No error is shown, but in my collection just empty documents({}) are created.
Note: I have both categoryId and newCategory, so no issues with that.
Thanks in advance.
This is the schema:
Industry = new Meteor.Collection("industry");
Industry.attachSchema(new SimpleSchema({
label:{
type:String
},
value:{
type:String
},
categories:{
type: [Object]
}
}));
I am not sure, but maybe the error is occurring because you are not validating 'categories' in your schema. Try adding blackbox: true to 'categories' so that it accepts any type of object.
Industry.attachSchema(new SimpleSchema({
label: {
type: String
},
value: {
type: String
},
categories: {
type: [Object],
blackbox:true // allows all objects
}
}));
Once you've done that try adding values to it like this
var newObject = {
id: categoryId,
label: newCategory,
value: newCategory
}
Industry.update({
_id: industryId
}, {
$push: {
categories: newObject //newObject can be anything
}
});
This would allow you to add any kind of object into the categories field.
But you mentioned in a comment that categories is also another collection.
If you already have a SimpleSchema for categories then you could validate the categories field to only accept objects that match with the SimpleSchema for categories like this
Industry.attachSchema(new SimpleSchema({
label: {
type: String
},
value: {
type: String
},
categories: {
type: [categoriesSchema] // replace categoriesSchema by name of SimpleSchema for categories
}
}));
In this case only objects that match categoriesSchema will be allowed into the categories field. Any other type will be filtered out. Also, you wouldn't get any error on the console for trying to insert other types (which is what I think is happening when you try to insert now, as no validation is specified).
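For reference, such a categoriesSchema could be as simple as this, based on the object pushed in the update above (adjust types and optionality to your actual data):
categoriesSchema = new SimpleSchema({
  id: { type: String },
  label: { type: String },
  value: { type: String }
});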
EDIT : EXPLANATION OF ANSWER
In a SimpleSchema, when you define an array of objects you have to validate it, i.e., you have to tell it what objects it can accept and what it can't.
For example when you define it like
...
categories: {
type: [categoriesSchema] // Correct
}
it means that only objects similar in structure to those defined in another SimpleSchema named categoriesSchema can be inserted into it. According to your example, any object you try to insert should be of this format:
{
id: categoryId,
label: newCategory,
value: newCategory
}
Any object that isn't of this format will be rejected during insert. That's why all the objects you tried to insert were rejected when you tried initially with your schema structured like this:
...
categories: {
type: [Object] // Not correct as there is no SimpleSchema named 'Object' to match with
}
Blackbox:true
Now, let's say you don't want your objects to be filtered and want all objects to be inserted without validation.
That's where setting "blackbox: true" comes in. If you define a field like this:
...
categories: {
type: [Object], // Correct
blackbox:true
}
it means that categories can be any object and need not be validated with respect to some other SimpleSchema. So whatever you try to insert gets accepted.
If you run this update in the mongo shell, it will produce output like matched: 1, updated: 0. Please check what you get; if matched is 0, it means that your selector is not matching any documents.
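For example, running the equivalent update directly in the mongo shell (the ids here are placeholders, and the collection name "industry" comes from the definition above) shows the counts in the WriteResult:
db.industry.update(
  { _id: "yourIndustryId" },
  { $push: { categories: { id: "someCategoryId", label: "New category", value: "New category" } } }
)
// WriteResult({ "nMatched" : 1, "nUpserted" : 0, "nModified" : 1 })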