How to extract a specific object inside a jsonb column using Postgres - postgresql

My jsonb column looks something like this (table name: jsonb, column name: json_test):
[
{
"run_id": "EXE20170822172151192",
"user_id": "12",
"log_level": "1",
"time_stamp": "2017-08-22T10:03:38.083Z",
"test_case_id": "1073",
"test_suite_id": "null",
"test_case_name": "Gmail Flow",
"test_suite_name": "",
"test_suite_abort": "",
"capture_screenshots": "Y",
"abort_at_step_failure": "Y"
"teststeps": [
{
"UItype": " UI ",
"action": " UI_Open_Browser ",
"param1": "Chrome",
"step_id": " 1",
"skip_to_step": " 0 ",
"skip_from_step": " 0 ",
"step_output_value": "true",
"step_execution_time": " 0:0:12:154 ",
"step_execution_status": "success",
"step_execution_end_time": " 2017-08-22 17:22:35:813 IST+0530 ",
"step_execution_start_time": " 2017-08-22 17:22:23:967 IST+0530 ",
"use_previous_step_output_data": " N ",
"execute_next_step_after_failure": " N ",
"skip_execution_based_on_prv_step_status": " F "
},
I want to extract fields from the JSON such as "test_case_id", "test_case_name", etc.
I tried using "jsonb_array_elements", but since the top level of the jsonb is an array, I am not able to fetch the objects inside it. Can somebody help with this?

If you fix the JSON (there is a missing comma before "teststeps"), it works:
s=# with j as (select '[
{
"run_id": "EXE20170822172151192",
"user_id": "12",
"log_level": "1",
"time_stamp": "2017-08-22T10:03:38.083Z",
"test_case_id": "1073",
"test_suite_id": "null",
"test_case_name": "Gmail Flow",
"test_suite_name": "",
"test_suite_abort": "",
"capture_screenshots": "Y",
"abort_at_step_failure": "Y",
"teststeps": [
{
"UItype": " UI ",
"action": " UI_Open_Browser ",
"param1": "Chrome",
"step_id": " 1",
"skip_to_step": " 0 ",
"skip_from_step": " 0 ",
"step_output_value": "true",
"step_execution_time": " 0:0:12:154 ",
"step_execution_status": "success",
"step_execution_end_time": " 2017-08-22 17:22:35:813 IST+0530 ",
"step_execution_start_time": " 2017-08-22 17:22:23:967 IST+0530 ",
"use_previous_step_output_data": " N ",
"execute_next_step_after_failure": " N ",
"skip_execution_based_on_prv_step_status": " F "
}
]
}
]'::jsonb b)
select jsonb_array_elements(b)->'test_case_id' from j;
?column?
----------
"1073"
(1 row)
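The unnest-then-extract logic is the same outside the database; a minimal Python sketch (stdlib json only, with a made-up second element for illustration) of what jsonb_array_elements followed by -> does:

```python
import json

# A trimmed copy of the question's document; the second element is invented.
doc = json.loads("""
[
  {"test_case_id": "1073", "test_case_name": "Gmail Flow"},
  {"test_case_id": "1074", "test_case_name": "Other Flow"}
]
""")

# jsonb_array_elements(b) unnests the top-level array into one row per
# element; ->'test_case_id' then pulls one key per row. In Python the
# same two steps are a loop over the array plus a dict lookup.
test_case_ids = [element["test_case_id"] for element in doc]
print(test_case_ids)  # ['1073', '1074']
```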

Related

How can I remove the double space to dot mapping in VIM? Or why is it even useful?

In my Vim-VSCode setup, I'm trying to change the behaviour of a double space tap in insert mode so it does NOT add a dot. I'm also not sure why this is considered useful behaviour.
The fixes I have found so far are not working, and I'm not sure why:
"vim.insertModeKeyBindings": [
{
"before": [" ", " "],
"after": [" ", " "],
"commands": [":nohlsearch"]
}
]
"vim.insertModeKeyBindings": [
{
"before": [" ", " "],
"after": [],
"commands": []
}
]

Backward Compatibility issue and uncertainty in Schema Registry

I have a use case where I have a JSON document, and I want to generate a schema and a record from it and publish the record.
I have configured the value serializer, and the schema compatibility setting is BACKWARD.
First JSON
String json = "{\n" +
" \"id\": 1,\n" +
" \"name\": \"Headphones\",\n" +
" \"price\": 1250.0,\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
Version 1 schema registered.
Received message in avro console consumer.
Second JSON.
String json = "{\n" +
" \"id\": 1,\n" +
" \"price\": 1250.0,\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
Schema registered successfully.
Sent message.
Then I tried sending JSON 1 again; it was sent successfully.
Schema 3:
String json = "{\n" +
" \"id\": 1,\n" +
" \"name\": \"Headphones\",\n" +
" \"tags\": [\"home\", \"green\"]\n" +
"}\n"
;
Got error for this case.
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Schema being registered is incompatible with an earlier schema; error code: 409
How is it that the schema generated from the 2nd JSON was registered but the
third one was rejected, even though I didn't have a default for the
deleted field? Is it that the Schema Registry always accepts the 1st
evolution (the 2nd schema over the 1st)?
Schema in schema registry
Version 1 schema
{
"fields": [
{
"doc": "Type inferred from '1'",
"name": "id",
"type": "int"
},
{
"doc": "Type inferred from '\"Headphones\"'",
"name": "name",
"type": "string"
},
{
"doc": "Type inferred from '1250.0'",
"name": "price",
"type": "double"
},
{
"doc": "Type inferred from '[\"home\",\"green\"]'",
"name": "tags",
"type": {
"items": "string",
"type": "array"
}
}
],
"name": "myschema",
"type": "record" }
Version 2:
{
"fields": [
{
"doc": "Type inferred from '1'",
"name": "id",
"type": "int"
},
{
"doc": "Type inferred from '1250.0'",
"name": "price",
"type": "double"
},
{
"doc": "Type inferred from '[\"home\",\"green\"]'",
"name": "tags",
"type": {
"items": "string",
"type": "array"
}
}
],
"name": "myschema",
"type": "record" }
Let's go over the backwards compatibility rules... https://docs.confluent.io/current/schema-registry/avro.html#compatibility-types
First, the default compatibility mode isn't transitive, so version 3 is only checked against version 2.
The backward rule states you can delete fields or add optional fields (those with a default). I assume your schema generator tool doesn't know how to use optionals, so you're only allowed to delete, not add.
Between version 1 and 2, you deleted the name field, which is valid.
Between version 2 and the incoming 3, the registry sees a new schema that removes price (this is okay), but adds a required name field, which is not allowed.
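That rule can be sketched in a few lines of Python, assuming plain dict representations of the two record schemas (this is a simplification, not the Schema Registry's actual checker; it ignores type promotion and only looks at added fields):

```python
def is_backward_compatible(old_schema, new_schema):
    """New (reader) schema must be able to read data written with the
    old (writer) schema: deleting fields is fine, but any field added
    in the new schema must carry a default."""
    old_fields = {f["name"] for f in old_schema["fields"]}
    for field in new_schema["fields"]:
        if field["name"] not in old_fields and "default" not in field:
            return False  # required field added -> incompatible
    return True

# Versions 2 and 3 from the question, trimmed to names and types.
v2 = {"fields": [{"name": "id", "type": "int"},
                 {"name": "price", "type": "double"},
                 {"name": "tags", "type": {"type": "array", "items": "string"}}]}
v3 = {"fields": [{"name": "id", "type": "int"},
                 {"name": "name", "type": "string"},  # added back, no default
                 {"name": "tags", "type": {"type": "array", "items": "string"}}]}

print(is_backward_compatible(v2, v3))  # False: 'name' is new and required
```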

When I use my phpMyAdmin it always shows an error, but I don't know what's wrong with it

When I test my database, it can read and write successfully, but phpMyAdmin reports an error in the meantime. And when I post to my URL through Alamofire in Xcode, the data is wrong.
Like this:
Optional( { URL: http://XXXXXXX/register.php/?ema=123&pas=123 }{ Status Code: 200, Headers {
Connection = (
"Upgrade, close"
);
"Content-Encoding" = (
gzip
);
"Content-Length" = (
194
);
"Content-Type" = (
"text/html"
);
Date = (
"Wed, 06 Mar 2019 09:26:15 GMT"
);
"Proxy-Connection" = (
"keep-alive"
);
Server = (
Apache
);
Upgrade = (
h2
);
Vary = (
"Accept-Encoding"
);
"X-Powered-By" = (
"PHP/5.4.45"
);
} })
Optional(344 bytes)
FAILURE
And when I look through my phpMyAdmin, it tells me there is something wrong:
{
"pma_version": "4.4.15.10",
"browser_name": "CHROME",
"browser_version": "69.0.3497.100",
"user_os": "Win",
"server_software": "Apache",
"user_agent_string": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36",
"locale": "zh_CN",
"configuration_storage": "disabled",
"php_version": "5.4.45",
"exception_type": "js",
"exception": {
"mode": "stack",
"name": "TypeError",
"message": "Cannot read property '0' of null",
"stack": [
{
"line": 235,
"func": "",
"context": [
" lastException = null;",
" notifyHandlers.apply(null, [stack, null].concat(args));",
" }",
" }, (stack.incomplete ? 2000 : 0));",
"",
" throw ex; // re-throw to propagate to the top level (and cause wind//...",
" }",
"",
" report.subscribe = subscribe;",
" report.unsubscribe = unsubscribe;",
" return report;"
],
"column": "",
"filename": "tracekit/tracekit.js"
},
{
"func": "getFieldValidators",
"line": 302,
"column": "40",
"context": [
" * #return array of [function, parameters to be passed to function]",
" */",
"function getFieldValidators(field_id, onKeyUpOnly)",
"{",
" // look for field bound validator",
" var name = field_id.match(/[^-]+$/)[0];",
" if (typeof validators._field[name] != 'undefined') {",
" return [[validators._field[name], null]];",
" }",
"",
" // look for registered validators"
],
"filename": "config.js"
},
{
"func": "validate_field",
"line": 418,
"column": "21",
"context": [
"{",
" var args, result;",
" var $field = $(field);",
" var field_id = $field.attr('id');",
" errors[field_id] = [];",
" var functions = getFieldValidators(field_id, isKeyUp);",
" for (var i = 0; i < functions.length; i++) {",
" if (typeof functions[i][1] !== 'undefined' && functions[i][1] !== n//...",
" args = functions[i][1].slice(0);",
" } else {",
" args = [];"
],
"filename": "config.js"
},
{
"func": "HTMLDocument.<anonymous>",
"line": 511,
"column": "13",
"context": [
" var $check_page_refresh = $('#check_page_refresh');",
" if ($check_page_refresh.length === 0 || $check_page_refresh.val() == '1') {",
" // run all field validators",
" var errors = {};",
" for (var i = 0; i < $elements.length; i++) {",
" validate_field($elements[i], false, errors);",
" }",
" // run all fieldset validators",
" $('fieldset').each(function () {",
" validate_fieldset(this, false, errors);",
" });"
],
"filename": "config.js"
},
{
"func": "HTMLDocument.new_func",
"line": 279,
"column": "33",
"context": [
" */",
" wrap_function: function (func) {",
" if (!func.wrapped) {",
" var new_func = function () {",
" try {",
" return func.apply(this, arguments);",
" } catch (x) {",
" TraceKit.report(x);",
" }",
" };",
" new_func.wrapped = true;"
],
"filename": "error_report.js"
},
{
"func": "HTMLDocument.dispatch",
"line": 3,
"column": "8436",
"context": [
"/*! jQuery v1.11.1 | (c) 2005, 2014 jQuery Foundation, Inc. | jquery.org/li//...",
"!function(a,b){\"object\"==typeof module&&\"object\"==typeof module.exports?mod//...",
"if(k&&j[k]&&(e||j[k].data)||void 0!==d||\"string\"!=typeof b)return k||(k=i?a//...",
"},cur:function(){var a=Zb.propHooks[this.prop];return a&&a.get?a.get(this)://...",
";",
"",
"function sprintf() {",
"/*"
],
"filename": "jquery/jquery-1.11.1.min.js"
},
{
"func": "HTMLDocument.$event.dispatch",
"line": 374,
"column": "19",
"context": [
"$event.dispatch = function( event ){",
"\tif ( $.data( this, \"suppress.\"+ event.type ) - new Date().getTime() > 0 ){",
"\t\t$.removeData( this, \"suppress.\"+ event.type );",
"\t\treturn;",
"\t}",
"\treturn $dispatch.apply( this, arguments );",
"};",
"",
"// event fix hooks for touch events...",
"var touchHooks = ",
"$event.fixHooks.touchstart = "
],
"filename": "jquery/jquery.event.drag-2.2.js"
},
{
"func": "HTMLDocument.r.handle",
"line": 3,
"column": "5139",
"context": [
"/*! jQuery v1.11.1 | (c) 2005, 2014 jQuery Foundation, Inc. | jquery.org/li//...",
"!function(a,b){\"object\"==typeof module&&\"object\"==typeof module.exports?mod//...",
"if(k&&j[k]&&(e||j[k].data)||void 0!==d||\"string\"!=typeof b)return k||(k=i?a//...",
"},cur:function(){var a=Zb.propHooks[this.prop];return a&&a.get?a.get(this)://...",
";",
"",
"function sprintf() {",
"/*"
],
"filename": "jquery/jquery-1.11.1.min.js"
},
{
"func": "Object.trigger",
"line": 3,
"column": "7537",
"context": [
"/*! jQuery v1.11.1 | (c) 2005, 2014 jQuery Foundation, Inc. | jquery.org/li//...",
"!function(a,b){\"object\"==typeof module&&\"object\"==typeof module.exports?mod//...",
"if(k&&j[k]&&(e||j[k].data)||void 0!==d||\"string\"!=typeof b)return k||(k=i?a//...",
"},cur:function(){var a=Zb.propHooks[this.prop];return a&&a.get?a.get(this)://...",
";",
"",
"function sprintf() {",
"/*"
],
"filename": "jquery/jquery-1.11.1.min.js"
},
{
"func": "HTMLDocument.<anonymous>",
"line": 3,
"column": "15404",
"context": [
"/*! jQuery v1.11.1 | (c) 2005, 2014 jQuery Foundation, Inc. | jquery.org/li//...",
"!function(a,b){\"object\"==typeof module&&\"object\"==typeof module.exports?mod//...",
"if(k&&j[k]&&(e||j[k].data)||void 0!==d||\"string\"!=typeof b)return k||(k=i?a//...",
"},cur:function(){var a=Zb.propHooks[this.prop];return a&&a.get?a.get(this)://...",
";",
"",
"function sprintf() {",
"/*"
],
"filename": "jquery/jquery-1.11.1.min.js"
},
{
"func": "Function.each",
"line": 2,
"column": "2973",
"context": [
"/*! jQuery v1.11.1 | (c) 2005, 2014 jQuery Foundation, Inc. | jquery.org/li//...",
"!function(a,b){\"object\"==typeof module&&\"object\"==typeof module.exports?mod//...",
"if(k&&j[k]&&(e||j[k].data)||void 0!==d||\"string\"!=typeof b)return k||(k=i?a//...",
"},cur:function(){var a=Zb.propHooks[this.prop];return a&&a.get?a.get(this)://...",
";",
"",
"function sprintf() {"
],
"filename": "jquery/jquery-1.11.1.min.js"
}
],
"useragent": "Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36",
"incomplete": "false",
"partial": "true",
"uri": "sql.php?target="
},
"script_name": "sql.php",
"microhistory": {
"current_index": "0"
}
}
I have solved this question. The API on my server is PHP, and there were some warnings in my PHP code that I hadn't fixed; a few days ago it went wrong, although my other database kept running fine. Once I fixed those warnings, it worked.
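As an aside, the TypeError in the phpMyAdmin stack ("Cannot read property '0' of null" at `field_id.match(/[^-]+$/)[0]` in config.js) is the classic unguarded-regex-match pattern: JavaScript's match() returns null when nothing matches, and indexing [0] on null throws. A sketch of the same failure mode and its guard in Python (the field ids are hypothetical):

```python
import re

def last_segment(field_id):
    """Mimic config.js's field_id.match(/[^-]+$/)[0]: take the text
    after the final '-'. re.search returns None when nothing matches,
    so check before indexing instead of crashing."""
    m = re.search(r"[^-]+$", field_id)
    return m.group(0) if m else None

print(last_segment("Servers-1-host"))  # host
print(last_segment(""))                # None instead of a TypeError
```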

Cloudformation aws-glue inline command

My goal is to create a Glue job via CloudFormation. The problem I'm dealing with is that the Command property doesn't seem to support inline code (the way the CloudFormation Lambda Code property does).
My question: is there a way to create a fully functional Glue job solely with a CloudFormation template, i.e. a way around uploading the command file in advance and specifying ScriptLocation?
Something like this may work (untested, so you might have to tweak). Basically you create a custom resource to facilitate uploading your glue code to S3 and then reference the custom resource to obtain the location. Below you'll find the CloudFormation template, and below that the Rubycfn code you can use to generate the template dynamically.
{
"AWSTemplateFormatVersion": "2010-09-09",
"Description": "Glue Job Example Stack",
"Resources": {
"GlueCode": {
"Properties": {
"BucketName": "my-amazing-bucket-for-glue-jobs",
"Content": {
"Fn::Join": [
"\n",
[
"# some job code",
"# for glue"
]
]
},
"CreateMode": "plain-literal",
"FileName": "gluecode.sh",
"ServiceToken": {
"Fn::GetAtt": [
"InlineS3UploadFunction",
"Arn"
]
}
},
"Type": "Custom::InlineUpload"
},
"GlueJob": {
"Properties": {
"Command": {
"Name": "myglueetl",
"ScriptLocation": {
"Fn::Join": [
"",
[
"s3://my-amazing-bucket-for-glue-jobs/",
{
"Ref": "GlueCode"
}
]
]
}
},
"DefaultArguments": {
"--job-bookmark-option": "job-bookmark-enable"
},
"ExecutionProperty": {
"MaxConcurrentRuns": 2
},
"MaxRetries": 0,
"Name": "glue-job-1",
"Role": {
"Ref": "SomeRole"
}
},
"Type": "AWS::Glue::Job"
},
"InlineS3UploadFunction": {
"Properties": {
"Code": {
"ZipFile": {
"Fn::Join": [
"\n",
[
"import boto3",
"import cfnresponse",
"import hashlib",
"import json",
"import logging",
"import signal",
"import zipfile",
"",
"from urllib2 import build_opener, HTTPHandler, Request",
"",
"LOGGER = logging.getLogger()",
"LOGGER.setLevel(logging.INFO)",
"",
"def lambda_handler(event, context):",
" # Setup alarm for remaining runtime minus a second",
" try:",
" signal.alarm((context.get_remaining_time_in_millis() / 1000) - 1)",
" LOGGER.info('REQUEST RECEIVED: %s', event)",
" LOGGER.info('REQUEST RECEIVED: %s', context)",
" if event['RequestType'] == 'Create' or event['RequestType'] == 'Update':",
" LOGGER.info('Creating or updating S3 Object')",
" bucket_name = event['ResourceProperties']['BucketName']",
" file_name = event['ResourceProperties']['FileName']",
" content = event['ResourceProperties']['Content']",
" create_zip = True if event['ResourceProperties']['CreateMode'] == 'zip' else False",
" literal = True if event['ResourceProperties']['CreateMode'] == 'plain-literal' else False",
" md5_hash = hashlib.md5(content).hexdigest()",
" with open('/tmp/' + file_name, 'w') as lambda_file:",
" lambda_file.write(content)",
" lambda_file.close()",
" s3 = boto3.resource('s3')",
" if create_zip == True:",
" output_filename = file_name + '_' + md5_hash + '.zip'",
" zf = zipfile.ZipFile('/tmp/' + output_filename, mode='w')",
" try:",
" zf.write('/tmp/' + file_name, file_name)",
" finally:",
" zf.close()",
" data = open('/tmp/' + output_filename, 'rb')",
" s3.Bucket(bucket_name).put_object(Key=output_filename, Body=data)",
" else:",
" if literal == True:",
" data = open('/tmp/' + file_name, 'rb')",
" s3.Bucket(bucket_name).put_object(Key=file_name, Body=content)",
" else:",
" extension = file_name.split(\".\")[-1]",
" output_filename = \".\".join(file_name.split(\".\")[:-1]) + '_' + md5_hash + '.' + extension",
" data = open('/tmp/' + file_name, 'rb')",
" s3.Bucket(bucket_name).put_object(Key=output_filename, Body=content)",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': output_filename } )",
" elif event['RequestType'] == 'Delete':",
" LOGGER.info('DELETE!')",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': 'Resource deletion successful!'} )",
" else:",
" LOGGER.info('FAILED!')",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': 'There is no such success like failure.'} )",
" except Exception as e: #pylint: disable=W0702",
" LOGGER.info(e)",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': 'There is no such success like failure.' } )",
"",
"def timeout_handler(_signal, _frame):",
" '''Handle SIGALRM'''",
" LOGGER.info('Time exceeded')",
" raise Exception('Time exceeded')",
"",
"signal.signal(signal.SIGALRM, timeout_handler)"
]
]
}
},
"Handler": "index.lambda_handler",
"Role": {
"Fn::GetAtt": [
"LambdaExecutionRole",
"Arn"
]
},
"Runtime": "python2.7",
"Timeout": "30"
},
"Type": "AWS::Lambda::Function"
},
"LambdaExecutionRole": {
"Properties": {
"AssumeRolePolicyDocument": {
"Statement": [
{
"Action": [
"sts:AssumeRole"
],
"Effect": "Allow",
"Principal": {
"Service": [
"lambda.amazonaws.com"
]
}
}
],
"Version": "2012-10-17"
},
"Path": "/",
"Policies": [
{
"PolicyDocument": {
"Statement": [
{
"Action": [
"logs:CreateLogGroup",
"logs:CreateLogStream",
"logs:PutLogEvents"
],
"Effect": "Allow",
"Resource": "arn:aws:logs:*:*:*"
},
{
"Action": "s3:*",
"Effect": "Allow",
"Resource": "arn:aws:s3:::*"
}
],
"Version": "2012-10-17"
},
"PolicyName": "root"
}
]
},
"Type": "AWS::IAM::Role"
}
}
}
The Rubycfn code is below. Use the https://rubycfn.com/ CloudFormation compiler or the Rubycfn gem (gem install rubycfn) to compile:
ENV["GLUEBUCKET"] ||= "my-amazing-bucket-for-glue-jobs"
description "Glue Job Example Stack"
resource :glue_code,
type: "Custom::InlineUpload" do |r|
r.property(:service_token) { :inline_s3_upload_function.ref(:arn) }
r.property(:bucket_name) { ENV["GLUEBUCKET"] }
r.property(:file_name) { "gluecode.sh" }
r.property(:create_mode) { "plain-literal" }
r.property(:content) do
[
"# some job code",
"# for glue"
].fnjoin("\n")
end
end
resource :glue_job,
type: "AWS::Glue::Job" do |r|
r.property(:command) do
{
"Name": "myglueetl",
"ScriptLocation": ["s3://#{ENV["GLUEBUCKET"]}/", :glue_code.ref].fnjoin
}
end
r.property(:default_arguments) do
{
"--job-bookmark-option": "job-bookmark-enable"
}
end
r.property(:execution_property) do
{
"MaxConcurrentRuns": 2
}
end
r.property(:max_retries) { 0 }
r.property(:name) { "glue-job-1" }
r.property(:role) { :some_role.ref }
end
resource :inline_s3_upload_function,
type: "AWS::Lambda::Function" do |r|
r.property(:code) do
{
"ZipFile": [
"import boto3",
"import cfnresponse",
"import hashlib",
"import json",
"import logging",
"import signal",
"import zipfile",
"",
"from urllib2 import build_opener, HTTPHandler, Request",
"",
"LOGGER = logging.getLogger()",
"LOGGER.setLevel(logging.INFO)",
"",
"def lambda_handler(event, context):",
" # Setup alarm for remaining runtime minus a second",
" try:",
" signal.alarm((context.get_remaining_time_in_millis() / 1000) - 1)",
" LOGGER.info('REQUEST RECEIVED: %s', event)",
" LOGGER.info('REQUEST RECEIVED: %s', context)",
" if event['RequestType'] == 'Create' or event['RequestType'] == 'Update':",
" LOGGER.info('Creating or updating S3 Object')",
" bucket_name = event['ResourceProperties']['BucketName']",
" file_name = event['ResourceProperties']['FileName']",
" content = event['ResourceProperties']['Content']",
" create_zip = True if event['ResourceProperties']['CreateMode'] == 'zip' else False",
" literal = True if event['ResourceProperties']['CreateMode'] == 'plain-literal' else False",
" md5_hash = hashlib.md5(content).hexdigest()",
" with open('/tmp/' + file_name, 'w') as lambda_file:",
" lambda_file.write(content)",
" lambda_file.close()",
" s3 = boto3.resource('s3')",
" if create_zip == True:",
" output_filename = file_name + '_' + md5_hash + '.zip'",
" zf = zipfile.ZipFile('/tmp/' + output_filename, mode='w')",
" try:",
" zf.write('/tmp/' + file_name, file_name)",
" finally:",
" zf.close()",
" data = open('/tmp/' + output_filename, 'rb')",
" s3.Bucket(bucket_name).put_object(Key=output_filename, Body=data)",
" else:",
" if literal == True:",
" data = open('/tmp/' + file_name, 'rb')",
" s3.Bucket(bucket_name).put_object(Key=file_name, Body=content)",
" else:",
" extension = file_name.split(\".\")[-1]",
" output_filename = \".\".join(file_name.split(\".\")[:-1]) + '_' + md5_hash + '.' + extension",
" data = open('/tmp/' + file_name, 'rb')",
" s3.Bucket(bucket_name).put_object(Key=output_filename, Body=content)",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': output_filename } )",
" elif event['RequestType'] == 'Delete':",
" LOGGER.info('DELETE!')",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': 'Resource deletion successful!'} )",
" else:",
" LOGGER.info('FAILED!')",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': 'There is no such success like failure.'} )",
" except Exception as e: #pylint: disable=W0702",
" LOGGER.info(e)",
" cfnresponse.send(event, context, cfnresponse.SUCCESS, { 'Message': 'There is no such success like failure.' } )",
"",
"def timeout_handler(_signal, _frame):",
" '''Handle SIGALRM'''",
" LOGGER.info('Time exceeded')",
" raise Exception('Time exceeded')",
"",
"signal.signal(signal.SIGALRM, timeout_handler)"
].fnjoin("\n")
}
end
r.property(:handler) { "index.lambda_handler" }
r.property(:role) { :lambda_execution_role.ref(:arn) }
r.property(:runtime) { "python2.7" }
r.property(:timeout) { "30" }
end
resource :lambda_execution_role,
type: "AWS::IAM::Role" do |r|
r.property(:assume_role_policy_document) do
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": [
"lambda.amazonaws.com"
]
},
"Action": [
"sts:AssumeRole"
]
}
]
}
end
r.property(:path) { "/" }
r.property(:policies) do
[
{
"PolicyName": "root",
"PolicyDocument": {
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": %w(logs:CreateLogGroup logs:CreateLogStream logs:PutLogEvents),
"Resource": "arn:aws:logs:*:*:*"
},
{
"Effect": "Allow",
"Action": "s3:*",
"Resource": "arn:aws:s3:::*"
}
]
}
}
]
end
end
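For reference, the filename convention the Lambda produces in its default (non-literal, non-zip) mode, splicing a content hash in before the extension so that changed code yields a new S3 key, can be sketched as:

```python
import hashlib

def versioned_filename(file_name, content):
    """Mirror the Lambda's default naming: insert the MD5 of the file
    content before the extension, so each revision of the script gets
    a fresh S3 key and CloudFormation sees a changed ScriptLocation."""
    md5_hash = hashlib.md5(content.encode()).hexdigest()
    stem, _, extension = file_name.rpartition(".")
    return f"{stem}_{md5_hash}.{extension}"

name = versioned_filename("gluecode.sh", "# some job code\n# for glue")
print(name)  # e.g. gluecode_<32 hex digits>.sh
```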

Firebase, Retrieving Data: Find all Titles that have 'Status' as 'Open' in Swift

The main mission is to find all titles that have 'status' set to 'Open'.
I want to get all the favorTit values into an array to be used with indexPath.row.
func retrieveQuery(){
var i = 0
queryRef.queryOrderedByChild("status").queryEqualToValue("Open")
.observeEventType(.ChildAdded, withBlock:
{ snapshot in
i++
self.antal = snapshot.children.allObjects.count
print(self.antal)
print(snapshot.value.objectForKey("favorTit")!)
print(i)
})
}
Firebase data
https://api.myjson.com/bins/4n5rn
{
"categories": [{
"altText": "T.ex. Städning, Disk, Tvätt, Matlagning",
"title": "Hushåll"
}, {
"altText": "T.ex. ",
"title": " "
}, {
"altText": " ",
"title": " "
}, {
"altText": " ",
"title": " "
}, {
"altText": " ",
"title": " "
}, {
"altText": " ",
"title": " "
}, {
"altText": " ",
"title": " "
}],
"favors": {
"1": {
"catId": 1,
"favorBudget": 212121,
"favorDes": "Gfdgsfdsg",
"favorDueDate": "Today",
"favorGeo": [-7.090910999999999, 107.668887],
"favorLocation": ["West Java, Indonesia"],
"favorTit": "Rätt",
"status": "Open",
"user": "2872e074-e45a-4d7a-a9c7-83fad641aa62",
"workCompletion": "In person"
},
"2": {
"catId": 1,
"favorBudget": 4000,
"favorDes": "Gfdgf",
"favorDueDate": "Today",
"favorGeo": [34.506668, -81.948334],
"favorLocation": ["Laurens County Airport, Laurens, SC 29360, USA"],
"favorTit": "Rätt",
"status": "Open",
"user": "2872e074-e45a-4d7a-a9c7-83fad641aa62",
"workCompletion": "In person"
},
"3": {
"catId": 1,
"favorBudget": 4000,
"favorDes": "Gfdgf",
"favorDueDate": "Today",
"favorGeo": [34.506668, -81.948334],
"favorLocation": ["Laurens County Airport, Laurens, SC 29360, USA"],
"favorTit": "Rätt",
"status": "Ongoing",
"user": "2872e074-e45a-4d7a-a9c7-83fad641aa62",
"workCompletion": "In person"
},
"7fd547be-7836-42e2-a74f-2f2a39baee43": {
"favorTit": "Test",
"favors": {
"furniture assmebly": {
"favorBudget": 60000,
"favorDes": "Assemly My ikea",
"favorDueDate": "Today",
"favorGeo": [39.0311755, -77.5283463],
"favorLocation": ["Polen Farm Blvd, Ashburn, VA 20148, USA"],
"favorTit": "Den ska bli fixad kom snabbt",
"workCompletion": "In person"
},
"handyman": {
"favorBudget": 43434,
"favorDes": "Install TV-Mount",
"favorDueDate": "Today",
"favorGeo": [49.0068901, 8.4036527],
"favorLocation": ["Karlsruhe, Germany"],
"favorTit": "JAllah",
"workCompletion": "In person"
},
"photography": {
"favorBudget": 6000,
"favorDes": "Jag vill ha ett album med bilder på mig och omgivningen💪 Du ska inte säga mycket under bröllopet men det ska vara vackra bilder",
"favorDueDate": "Within a week",
"favorGeo": [55.6178043, 12.98939],
"favorLocation": ["Krankajen 36, 211 12 Malmö, Sverige"],
"favorTit": "take a photo of my wedding",
"workCompletion": "In person"
},
"trädgård": {
"favorBudget": 2000,
"favorDes": "Jag vill ha den klippt med en sax",
"favorDueDate": "Within a week",
"favorGeo": [35.86166, 104.195397],
"favorLocation": ["China"],
"favorTit": "Klipp min gräsmatta",
"workCompletion": "In person"
}
},
"status": "Done"
}
},
"users": {
"2872e074-e45a-4d7a-a9c7-83fad641aa62": {
"email": "fille382#gmail.com",
"name": "lol",
"phone": "123567890",
"provider": "password"
},
"5f0fb39e-620a-4cd1-9e6a-2b7ae9baaf71": {
"email": "lol#gmail.com",
"name": "Gunn Bellander",
"phone": "0735158548",
"provider": "password"
},
"7fd547be-7836-42e2-a74f-2f2a39baee43": {
"about": "Johan my name is Filip Bellander and i am creating this beautiful app right now the about page is here so the user can quickly reach its target by writing about himself and his completions ",
"comments": {
"4354352": {
"comment": "He did everything even cleaned up after himself",
"stars": 3,
"tasktitle": "Do everything"
},
"423489054": {
"comment": "Yes very nice",
"stars": 1,
"tasktitle": "call my phone"
},
"5435486956": {
"comment": "It was very clean",
"stars": 3,
"tasktitle": "Clean room"
},
"54643654654": {
"comment": "He did a great job wiping all whipcream from the luggage",
"stars": 4,
"tasktitle": "Whipe dat"
}
},
"completed": 90,
"email": "test#gmail.com",
"name": "Filip Bellander",
"phone": "0735158548",
"posted": 81,
"provider": "password",
"rating": 5,
"reviews": 1337,
"skills": "App programmer, Visual designer DriversDriverslicenseDriverslicenseDriverslicenseDriverslicenselicense no"
},
"bdc6c3f8-764a-4468-825a-408f53695b24": {
"email": "Elisabet.bellander#gmail.com",
"name": "Elisabet Bellander",
"phone": "0721645504",
"provider": "password"
}
}
}
From your code you should already have the Open Titles array in your snapshot.
if let result = snapshot.children.allObjects as? [FIRDataSnapshot] {
self.items = result
}
Then your TableView will depend on how you are implementing it. There are some examples here and here.
func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
...
let cellDict = items[indexPath.row]
...
return cell
}
I was able to build the arrays by inserting each string at index 0 every time the block runs, using the code below:
queryRef.queryOrderedByChild("status").queryEqualToValue("Open")
.observeEventType(.ChildAdded, withBlock:
{ snapshot in
self.antal = snapshot.children.allObjects.count
print(self.antal)
var titles = snapshot.value.objectForKey("favorTit") as! String
var description = snapshot.value.objectForKey("favorDes") as! String
var budget = snapshot.value.objectForKey("favorBudget") as! Double
self.openTitles.insert(titles, atIndex: 0)
self.openDescription.insert(description, atIndex: 0)
self.openBudget.insert(budget, atIndex: 0)
print(self.openTitles)
print(self.openDescription)
})