OpenAPI Stripe payload definition for events

Hi, I am trying to find a way to define this structure in OpenAPI:
{
  "id": "evt_1M42aUGgA02srhGVcjgIsoS2",
  "object": "event",
  "api_version": null,
  "created": 1668432034,
  "data": {
    "object": {
      "id": "price_1M42aUJX9HHJ5bycQphyzwec",
      "object": "plan",
      "active": true,
      "aggregate_usage": null,
      "amount": 2000,
      "amount_decimal": "2000",
      "billing_scheme": "per_unit",
      "created": 1668432034,
      "currency": "pln",
      "interval": "month",
      "interval_count": 1,
      "livemode": false,
      "metadata": {},
      "nickname": null,
      "product": "prod_Mm9YCJ0pVG7qCh",
      "tiers_mode": null,
      "transform_usage": null,
      "trial_period_days": null,
      "usage_type": "licensed"
    }
  },
  "livemode": false,
  "pending_webhooks": 0,
  "request": {
    "id": null,
    "idempotency_key": null
  },
  "type": "plan.created"
}
where data.object is dynamic and depends on the event trigger:
"object": {
"id": "price_1M42aUJX9HHJ5bycQphyzwec",
"object": "plan",
"active": true,
"aggregate_usage": null,
"amount": 2000,
"amount_decimal": "2000",
"billing_scheme": "per_unit",
"created": 1668432034,
"currency": "pln",
"interval": "month",
"interval_count": 1,
"livemode": false,
"metadata": {},
"nickname": null,
"product": "prod_Mm9YCJ0pVG7qCh",
"tiers_mode": null,
"transform_usage": null,
"trial_period_days": null,
"usage_type": "licensed"
}
The current definition I am using in OpenAPI is:
StripeEvent:
  properties:
    api_version:
      type: string
    object:
      type: string
    account:
      type: string
    created:
      type: integer
    data:
      type: object
      $ref: '#/components/schemas/StripeEventObject'
    id:
      type: string
    livemode:
      type: boolean
    pending_webhooks:
      type: integer
    request:
      type: object
      properties:
        id:
          type: string
        idempotency_key:
          type: string
    type:
      type: string
and the nested object:
StripeEventObject:
  properties:
    object:
      type: object
      $ref: '#/components/schemas/StripeEventNestedObject'
StripeEventNestedObject:
  additionalProperties:
    type: object
The problem is that the embedded data.object is empty when I receive the payload, so my question is: is there a way to define this part dynamically?
Regards.

Change the StripeEventObject schema to:
StripeEventObject:
  type: object
  properties:
    object:
      type: object  # Free-form object
type: object alone, without properties or additionalProperties, means a free-form/dynamic object with arbitrary properties.
You don't need additionalProperties: {type: ...} in the nested schema; that construct is used to define maps/dictionaries.
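If you later want the schema to say more than "any object", one option is to list the event payload types you actually handle with oneOf. This is only a sketch; StripePlan and StripeCustomer are hypothetical schemas you would have to define yourself:
StripeEventObject:
  type: object
  properties:
    object:
      oneOf:                                            # only covers payload types you choose to model
        - $ref: '#/components/schemas/StripePlan'       # hypothetical schema
        - $ref: '#/components/schemas/StripeCustomer'   # hypothetical schema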

Related

Error with Postgres migration: why do I have two extra columns?

I have a migration with only four columns: id, name, last_name and email. But when I run a query from Postman it shows me additional columns: SELECT "id", "name", "lastName", "email", "createdAt", "updatedAt" FROM "Users" AS "User". What is wrong?
module.exports = {
  up: (queryInterface, Sequelize) => {
    return queryInterface.createTable('User', {
      id: {
        allowNull: false,
        autoIncrement: true,
        primaryKey: true,
        type: Sequelize.INTEGER
      },
      name: {
        type: Sequelize.STRING,
        allowNull: false,
      },
      last_name: {
        type: Sequelize.STRING,
        allowNull: false,
      },
      email: {
        type: Sequelize.STRING,
        allowNull: false,
      },
    });
  },
  down: (queryInterface) => {
    return queryInterface.dropTable('User');
  }
};
And when I use my service:
static async getAllUsers() {
  try {
    const users = await database.User.findAll();
    console.log('COnsOLE ', users)
    return users
  } catch (error) {
    throw error;
  }
}
I get this error from Postman:
{
  "status": "error",
  "message": {
    "name": "SequelizeDatabaseError",
    "parent": {
      "length": 104,
      "name": "error",
      "severity": "ERROR",
      "code": "42P01",
      "position": "73",
      "file": "parse_relation.c",
      "line": "1180",
      "routine": "parserOpenTable",
      "sql": "SELECT \"id\", \"name\", \"lastName\", \"email\", \"createdAt\", \"updatedAt\" FROM \"Users\" AS \"User\";"
    },
    "original": {
      "length": 104,
      "name": "error",
      "severity": "ERROR",
      "code": "42P01",
      "position": "73",
      "file": "parse_relation.c",
      "line": "1180",
      "routine": "parserOpenTable",
      "sql": "SELECT \"id\", \"name\", \"lastName\", \"email\", \"createdAt\", \"updatedAt\" FROM \"Users\" AS \"User\";"
    },
    "sql": "SELECT \"id\", \"name\", \"lastName\", \"email\", \"createdAt\", \"updatedAt\" FROM \"Users\" AS \"User\";"
  }
}
I have run these commands many times before: sequelize db:migrate and sequelize db:migrate:undo.
This is my Git repository: https://github.com/x-rw/basePostgresExpressjs
Go into the server directory and run npm run dev.
That's because your Sequelize model instance has updatedAt and createdAt fields, so it queries the database for those two columns as well. But there are no updatedAt and createdAt fields in your migration file, so your database table does not have these columns.
You have two options. If you really don't want updatedAt and createdAt, you should specify that while initializing your Sequelize model instance: check the API reference for options.timestamps, which you can set to false.
class YourModel extends Sequelize.Model { }
YourModel.init(
  {
    name: {
      type: Sequelize.DataTypes.STRING(100),
      allowNull: false,
      validate: {
        notNull: true,
        notEmpty: true,
        len: [2, 100]
      }
    },
  },
  {
    sequelize: sequelizeInstance,
    timestamps: false // This is what you need.
  }
);
If you want to keep the timestamps, however, check my answer here on generating correct migrations.
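In that case the migration itself has to create those columns. A minimal sketch of such a migration; the pluralized table name Users and the column definitions follow what sequelize-cli normally generates, so adjust them to your project:
module.exports = {
  up: (queryInterface, Sequelize) => {
    // 'Users' is Sequelize's default pluralized table name, matching the failing query.
    return queryInterface.createTable('Users', {
      id: {
        allowNull: false,
        autoIncrement: true,
        primaryKey: true,
        type: Sequelize.INTEGER
      },
      name: { type: Sequelize.STRING, allowNull: false },
      last_name: { type: Sequelize.STRING, allowNull: false },
      email: { type: Sequelize.STRING, allowNull: false },
      // Timestamp columns the default model configuration expects.
      createdAt: { allowNull: false, type: Sequelize.DATE },
      updatedAt: { allowNull: false, type: Sequelize.DATE }
    });
  },
  down: (queryInterface) => queryInterface.dropTable('Users')
};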

Athena (Presto) view with complex column type

I'm trying to create an Athena view managed with CloudFormation. This view contains a property that is a list of nested records.
Running a SELECT directly in Athena works fine:
SELECT
  item_id AS material_id,
  material_type AS material_type,
  material_group AS material_group,
  material_status AS x_plant_mat_stat,
  products[1].PRODUCT_NO AS product_nr,
  products[1].VERSION AS product_version,
  products[1].SUPPL_CHAIN_OWNERSHIP AS supply_chain_owner,
  products[1].DELETED_DATE AS global_deleted_date,
  transform(
    warehouses,
    plant -> CAST(ROW(
      plant.WAREHOUSE,
      plant.PLANT_SPECIFIC_MAT_STATUS,
      plant.PROCUREMENT_TYPE
    ) AS ROW(plant_id varchar, ps_material_stat varchar, proc_type varchar))
  ) AS plants
FROM raw_item_master LIMIT 5
But when I try the following CloudFormation snippet:
View:
  Type: "AWS::Glue::Table"
  Properties:
    CatalogId: !Ref "AWS::AccountId"
    DatabaseName: !Ref "GlueDatabaseName"
    TableInput:
      TableType: "VIRTUAL_VIEW"
      Name: "item_master"
      Parameters:
        presto_view: true
      StorageDescriptor:
        SerdeInfo: {}
        Columns:
          -
            Name: "material_id"
            Type: "string"
          -
            Name: "material_type"
            Type: "string"
          -
            Name: "material_group"
            Type: "string"
          -
            Name: "x_plant_mat_stat"
            Type: "string"
          -
            Name: "product_nr"
            Type: "string"
          -
            Name: "product_version"
            Type: "string"
          -
            Name: "supply_chain_owner"
            Type: "string"
          -
            Name: "global_deleted_date"
            Type: "string"
          -
            Name: "plants"
            Type: "array<struct<plant_id:string,ps_material_stat:string,proc_type:string>>"
      ViewOriginalText:
        "Fn::Sub":
          - "/* Presto View: ${View} */"
          -
            View:
              "Fn::Base64": !Sub '
                {
                  "catalog": "awsdatacatalog",
                  "schema": "${GlueDatabaseName}",
                  "columns": [
                    {
                      "name": "material_id",
                      "type": "varchar"
                    },
                    {
                      "name": "material_type",
                      "type": "varchar"
                    },
                    {
                      "name": "material_group",
                      "type": "varchar"
                    },
                    {
                      "name": "x_plant_mat_stat",
                      "type": "varchar"
                    },
                    {
                      "name": "product_nr",
                      "type": "varchar"
                    },
                    {
                      "name": "product_version",
                      "type": "varchar"
                    },
                    {
                      "name": "supply_chain_owner",
                      "type": "varchar"
                    },
                    {
                      "name": "global_deleted_date",
                      "type": "varchar"
                    },
                    {
                      "name": "plants",
                      "type": "array(row(plant_id varchar, ps_material_stat varchar, proc_type varchar))"
                    }
                  ],
                  "originalSql": "SELECT
                    item_id AS material_id,
                    material_type AS material_type,
                    material_group AS material_group,
                    material_status AS x_plant_mat_stat,
                    products[1].PRODUCT_NO AS product_nr,
                    products[1].VERSION AS product_version,
                    products[1].SUPPL_CHAIN_OWNERSHIP AS supply_chain_owner,
                    products[1].DELETED_DATE AS global_deleted_date,
                    transform(
                      warehouses,
                      plant -> CAST(ROW(
                        plant.WAREHOUSE,
                        plant.PLANT_SPECIFIC_MAT_STATUS,
                        plant.PROCUREMENT_TYPE
                      ) AS ROW(plant_id varchar, ps_material_stat varchar, proc_type varchar))
                    ) AS plants
                  FROM ${RawTable}"
                }'
I got the following error in Athena:
INVALID_VIEW: Invalid view JSON: # here comes my JSON
However, when I select just one property it works fine (field type as "type": "array(row(plant_id varchar))", transform as CAST(ROW(plant.WAREHOUSE) AS ROW(plant_id varchar))). The view works with any property, but only one; as soon as I add two properties it breaks in Athena.
After creating the view from Athena and extracting it with aws glue get-table, I compared my input with the Athena output, and the only difference was the whitespace in the column definitions.
My input (spaces after commas):
"type": "array(row(plant_id varchar, ps_material_stat varchar, proc_type varchar))"
Athena (no whitespace):
"type": "array(row(plant_id varchar,ps_material_stat varchar,proc_type varchar))"
After removing the whitespace, it worked!
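For reference, this is how the plants entry in the embedded Presto view JSON has to look, with no spaces inside the type string (same column as in the CloudFormation snippet above):
{
  "name": "plants",
  "type": "array(row(plant_id varchar,ps_material_stat varchar,proc_type varchar))"
}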

Swagger API specs Request object design

I have written an API spec following the OpenAPI/Swagger Specification:
{
  "post": {
    "tags": [
      "UserController"
    ],
    "operationId": "getUsers",
    "parameters": [
      {
        "name": "accountID",
        "in": "path",
        "required": true,
        "schema": {
          "type": "number"
        }
      },
      {
        "name": "sortKey",
        "in": "query",
        "required": false,
        "schema": {
          "type": "string"
        }
      },
      {
        "name": "sortOrder",
        "in": "query",
        "required": false,
        "schema": {
          "type": "string"
        }
      }
    ],
    "responses": {
      "200": {
        "description": "default response",
        "content": {
          "*/*": {
            "schema": {
              "$ref": "#/components/schemas/UserResponse"
            }
          }
        }
      }
    }
  }
}
The API request takes accountID, sortKey and sortOrder. Should they be wrapped in a top-level request object (GetUsersRequest)? What is the best practice?
{
  "GetUsersRequest": {
    "accountID": "String",
    "sortKey": "String",
    "sortOrder": "String"
  }
}
vs
{
  "accountID": "String",
  "sortKey": "String",
  "sortOrder": "String"
}
Usually just use the properties. Using a "wrapper" object can be useful if the parameters belong to multiple groups.
For example, if you have an API with paging:
/query?filter=findme&page=5&size=5
I see two groups of parameters:
the filter, which limits the query result and is the main purpose of the API;
the page & size parameters, which are more of a technical aid to limit the number of results.
You can use a (wrapper) object to easily communicate that two of the three parameters belong together and are used for paging.
As YAML:
/query:
  get:
    description: ...
    parameters:
      - name: filter
        description: filters the data by the given value
        in: query
        schema:
          type: string
      - name: paging
        description: page selection
        in: query
        required: false
        schema:
          $ref: '#/components/schemas/Paging'
components:
  schemas:
    Paging:
      type: object
      properties:
        page:
          type: integer
        size:
          type: integer
So in your example you could group sortKey & sortOrder as a sorting group, while accountID stays the main parameter of the API.
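Applied to your getUsers operation, a sketch could look like the following. The path, the Sorting schema name and the deepObject serialization are assumptions on my part, not something taken from your original spec:
/accounts/{accountID}/users:   # placeholder path; the question does not show one
  post:
    operationId: getUsers
    parameters:
      - name: accountID
        in: path
        required: true
        schema:
          type: number
      - name: sorting            # hypothetical wrapper for sortKey & sortOrder
        in: query
        required: false
        style: deepObject        # serializes as sorting[sortKey]=...&sorting[sortOrder]=...
        explode: true
        schema:
          $ref: '#/components/schemas/Sorting'
components:
  schemas:
    Sorting:                     # assumed schema name
      type: object
      properties:
        sortKey:
          type: string
        sortOrder:
          type: string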

MongoDB date insert

This is my schema:
db.createCollection("user_clicks", {
  validator: {
    $jsonSchema: {
      bsonType: "object",
      required: ["session_id", "country", "browser", "url", "date"],
      properties: {
        session_id: {
          bsonType: "string",
          description: "must be a string and is required"
        },
        country: {
          bsonType: "string",
          description: "country name and is required"
        },
        browser: {
          bsonType: "string",
          description: "browser name and is required"
        },
        url: {
          bsonType: "string",
          description: "user click url and is required"
        },
        date: {
          bsonType: "date",
          description: "localdatetime and is required"
        }
      }
    }
  }
})
This is the code I am using to generate data:
mgeneratejs `{
  "session_id": "$oid",
  "country": "$country",
  "browser": {
    "$choose": {
      "from": [
        "Firefox",
        "Chrome",
        "Safari",
        "Explorer"
      ],
      "weights": [
        1,
        2,
        2,
        1
      ]
    }
  },
  "url": {
    "$choose": {
      "from": [
        "google.com/images",
        "facebook.com/profile1538713",
        "soundcloud.com/playlist03",
        "some-url.com/home",
        "sinoptik.ua/kyiv"
      ],
      "weights": [
        1,
        2,
        2,
        1,
        3
      ]
    }
  },
  "date": {
    "$date": {
      "min": "2016-08-01T23:59:59.999Z",
      "max": "2016-10-01T23:59:59.999Z"
    }
  }
}` -n 5 | mongoimport --uri="mongodb://localhost:27017/events" --collection user_clicks --mode=insert
I'm trying to generate random dates using mgeneratejs and mongoimport. The problem is that I can't insert any date like "3/13/2019" or "2019-03-26T23:44:26Z" (which is what I actually need). The error is:
WriteResult({
  "nInserted" : 0,
  "writeError" : {
    "code" : 121,
    "errmsg" : "Document failed validation"
  }
})
If I insert the value as new Date("2019-03-26T23:44:26Z") it works! Please help me automate creating new Date(date) on every insert, or suggest another way to fix this.
I am confused about what your issue is... everything is working as it should. You store dates in Mongo as Date objects (values whose type is Date).
If the date you want to store comes in a format like 3/13/2019, you can do something like this:
// javascript
let dateToInsert = new Date("3/13/2019");
Use the below before inserting the random date and it should work:
new Date(randomDate);
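As a sketch of what that automation could look like with the Node.js MongoDB driver; the connection string, database/collection names and field values are assumptions for illustration:
const { MongoClient } = require("mongodb");

async function insertClick(randomDate) {
  const client = new MongoClient("mongodb://localhost:27017"); // assumed local instance
  await client.connect();
  try {
    const clicks = client.db("events").collection("user_clicks");
    // Wrapping the generated string in new Date(...) stores a BSON date,
    // which is what the "bsonType: date" validator requires.
    await clicks.insertOne({
      session_id: "session-0001", // placeholder values
      country: "UA",
      browser: "Chrome",
      url: "sinoptik.ua/kyiv",
      date: new Date(randomDate)
    });
  } finally {
    await client.close();
  }
}

insertClick("2019-03-26T23:44:26Z");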

Get JSON key value using PowerShell

I have the following JSON output string:
{
  "meta": {
    "limit": 20,
    "next": null,
    "offset": 0,
    "previous": null,
    "total_count": 1
  },
  "objects": [{
    "bcontext": "/api/v2.0/buildercontext/2/",
    "bugs": [],
    "build": {
      "bldtype": "obj",
      "branch": "main",
      "buildstatus": [{
        "build": "/api/v2.0/build/2140634/",
        "failurereason": "_checkfailures (seen: FAIL - /testrun/18647678/ - area[4769] AIM-SANITY)",
        "id": "1294397",
        "lastupdate": "2015-03-31T14:30:18",
        "overridden": false,
        "overridedesc": "",
        "overrideuser": null,
        "recommended": false,
        "resource_uri": "/api/v2.0/buildstatus/1294397/",
        "slatype": {
          "id": "26",
          "name": "VA_Bats",
          "resource_uri": "/api/v2.0/sla/26/"
        }
      }],
      "changeset": "494625",
      "coverage": false,
      "deliverables": ["/api/v2.0/deliverable/4296455/", "/api/v2.0/deliverable/4296956/", "/api/v2.0/deliverable/4296959/", "/api/v2.0/deliverable/4296986/", "/api/v2.0/deliverable/4296992/", "/api/v2.0/deliverable/4296995/", "/api/v2.0/deliverable/4297034/", "/api/v2.0/deliverable/4297058/"],
      "git_host": null,
      "git_repo": null,
      "id": "2140634",
      "p4host": {
        "id": "10",
        "p4port": "perforce-rhino.eng.com:1800",
        "p4weburl": "http://p4web.eng.com:1800",
        "resource_uri": "/api/v2.0/perforceserver/10/"
      },
      "resource_uri": "/api/v2.0/build/2140634/",
      "site": "/api/v2.0/site/25/",
      "site_name": "mbu",
      "slastested": ["/api/v2.0/sla/26/"],
      "submit_time": "2015-03-31T05:40:21",
      "submit_user": "haharonof"
    },
    "builder": "/api/v2.0/builder/1423/",
    "clean": true,
    "componentbuilds": "vcops-vsphere-solution-pak=sb-5242047,vrops=sb-5242013,vscm=sb-5242025,vsutilities=sb-5242029;parentbuilder=1410",
    "deleted": false,
    "endtime": "2015-03-31T06:20:58",
    "helpzillas": [],
    "id": "4296956",
    "location": {
      "httpserver": "sc-prd-cat-services001.eng.com",
      "id": "1",
      "name": "PA",
      "nfsserver": "cat-results.eng.com",
      "pxedir": "/mts/builder-pxe",
      "resource_uri": "/api/v2.0/location/1/",
      "resultspath": "/results"
    },
    "nfsserver": "build-storage60",
    "p4client": "vmktestdevnanny-builder-1423",
    "path": "/storage60/release/sb-5242148",
    "ready": true,
    "resource_uri": "/api/v2.0/deliverable/4296956/",
    "result": "PASS",
    "sbbuildid": 5242148,
    "sbjobid": 5242148,
    "sbuser": "arajamanickam",
    "starttime": "2015-03-31T06:16:50",
    "targetchangeset": "494625",
    "targets": "vcopssuitevm",
    "triagetime": null,
    "vmodl": null
  }]
}
I want to get sbbuildid using PowerShell. How can I get it?
By converting your JSON to an object using the ConvertFrom-Json cmdlet (assuming $jsonString contains the JSON above):
$jsonObj = $jsonString | ConvertFrom-Json
$jsonObj.objects.sbbuildid
Another, more fragile option is plain string slicing: put the whole string in $build_info and extract the value by position:
$sb_build_id = $build_info.Substring($build_info.IndexOf("sbbuildid") + 11, 8).trim()
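Putting the ConvertFrom-Json approach together end to end (the file name response.json is an assumption; use whatever holds your output string):
# Read the raw JSON text and convert it into objects.
$jsonString = Get-Content -Raw -Path .\response.json
$jsonObj = $jsonString | ConvertFrom-Json

# "objects" is an array, so pick the first element explicitly.
$sbBuildId = $jsonObj.objects[0].sbbuildid
$sbBuildId   # 5242148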