Testing Cloud Function - Cannot read property 'data' of undefined - raspberry-pi

I've got a Raspberry Pi and set up a weather (and soil moisture) rig.
I found this guide: https://codelabs.developers.google.com/codelabs/iot-data-pipeline. I followed it and got stuck around Steps 6-7.
From what I understand, when I send data to Pub/Sub, nothing happens. On the Raspberry Pi end it looks like the data is being sent, but it never makes it into BigQuery. I added print statements at various points to try to see where it got stuck.
While trying to find the error, I slowly backtracked to Step 5 (Create a Cloud Function).
Step 5, along with the associated code I copied, can be seen here: https://codelabs.developers.google.com/codelabs/iot-data-pipeline/#4
In GCP, I click into Cloud Functions -> function-weatherPubSubToBQ -> Testing (tab).
Under the heading Trigger event, I filled in the JSON below:
{
  "sensorID": "Raspberry",
  "timecollected": "2020-09-11 06:45:19",
  "zipcode": "00000",
  "latitude": "0.0",
  "longitude": "0.0",
  "temperature": "-273",
  "humidity": "-1",
  "dewpoint": "-273",
  "pressure": "0"
}
When I click Test the function, the output is as below:
**Error: function execution failed. Details:
Cannot read property 'data' of undefined**
(Screen capture of the JSON and the error message)
I am guessing one of these two things is causing the problem: event.data or PubSubMessage.data.
I tried making some changes to the code, but I am just shooting in the dark.
I was wondering if:
1. I did something wrong, which means there might be some other issue somewhere else.
2. The guide is slightly old and there have been updates that make its older code not function as desired. (Not every step/image in the guide matched what I saw online as of Sep 2020.)
If someone knows what is wrong in the code and can let me know how to solve it, that would be much appreciated.
Thanks in advance.

TL;DR: The tutorial is outdated; don't use it unless you want to face multiple problems and learn the hard way.
I went through the tutorial and was able to replicate the issue, and many more. As you already mentioned, the tutorial is outdated and many things have changed; you can infer that from the images, where even the UI is different, so I wouldn't recommend this tutorial to anyone who is new to GCP.
The first issue:
**Error: function execution failed. Details: Cannot read property 'data' of undefined**
can be easily resolved by looking at the structure expected of a Pub/Sub message:
{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
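To see why the flat test payload above fails: the codelab's function (written for the older single-argument signature) starts with something like const PubSubMessage = event.data;, and the JSON you enter in the Testing tab becomes event as-is. A flat payload has no data property, so PubSubMessage is undefined and reading PubSubMessage.data throws. For reference, a minimal sketch (my own, not the codelab's exact code) of a consumer using the current three-argument signature:

exports.subscribe = function (event, context, callback) {
  // Under this signature the trigger event IS the Pub/Sub message itself.
  const PubSubMessage = event;
  // `data` is a base64-encoded string and must be decoded before parsing.
  const incomingData = PubSubMessage.data
    ? Buffer.from(PubSubMessage.data, 'base64').toString()
    : '{}';
  console.log(JSON.parse(incomingData));
  callback();
};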
Easy, right? However, once you mimic the structure of the message with your own variables, as I did:
{
  "data": "",
  "attributes": {
    "sensorID": "Raspberry",
    "timecollected": "2020-09-11 06:45:19",
    "zipcode": "00000",
    "latitude": "0.0",
    "longitude": "0.0",
    "temperature": "-273",
    "humidity": "-1",
    "dewpoint": "-273",
    "pressure": "0"
  },
  "messageId": "id_1",
  "publishTime": "2014-10-02T15:01:23Z",
  "orderingKey": ""
}
You will get an error regarding the JSON:
SyntaxError: Unexpected token ' in JSON at position 1
This error is due to the use of single quotes (') in the construction of the JSON fallback inside the incomingData variable, so you have to change that first variable declaration. I did it by using a template literal:
const incomingData = PubSubMessage.data
  ? Buffer.from(PubSubMessage.data, 'base64').toString()
  : `{"sensorID": "na","timecollected":"01/01/1970 00:00:00","zipcode":"00000","latitude":"0.0","longitude":"0.0","temperature":"-273","humidity":"-1","dewpoint":"-273","pressure":"0"}`;
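As a side note (my suggestion, not part of the codelab): if you want the Testing tab to exercise the real data path instead of this fallback, base64-encode your payload and pass it in the data field. For example, run this locally with Node:

// Build a trigger event whose `data` field carries the real payload.
const payload = {
  sensorID: 'Raspberry',
  timecollected: '2020-09-11 06:45:19',
  zipcode: '00000',
  latitude: '0.0',
  longitude: '0.0',
  temperature: '-273',
  humidity: '-1',
  dewpoint: '-273',
  pressure: '0'
};
console.log(JSON.stringify({
  data: Buffer.from(JSON.stringify(payload)).toString('base64')
}));
// Paste the printed JSON into the Trigger event box.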
But this is not the end of the issues. After running some tests of the BigQuery insert, I got an error about the insertion but no clue about what was really happening, so I isolated the query in an external script and found that the error handling was wrong. The first thing I recommend you change is the BigQuery version in package.json, from:
"#google-cloud/bigquery": "^0.9.6"
Into
"#google-cloud/bigquery": "5.2.0"
which was the latest version at the time of writing this answer. Next, redefine the way you use the BigQuery constructor:
const {BigQuery} = require('@google-cloud/bigquery');

const bigquery = new BigQuery({
  projectId: projectId
});
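For reference, a minimal package.json along those lines (the name field here is just a placeholder) might be:

{
  "name": "function-weatherpubsubtobq",
  "version": "0.0.1",
  "dependencies": {
    "@google-cloud/bigquery": "5.2.0"
  }
}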
Then, after many tests, I found that the catch wasn't doing its job as expected, so I had to rewrite that part into:
bigquery
  .dataset(datasetId)
  .table(tableId)
  .insert(rows)
  .then((foundErrors) => {
    rows.forEach((row) => console.log('Inserted: ', row));
    if (foundErrors && foundErrors.insertErrors != undefined) {
      foundErrors.forEach((err) => {
        console.log('Error: ', err);
      });
    }
  })
  .catch((err) => {
    if (err.name == 'PartialFailureError') {
      if (err && err.response.insertErrors != undefined) {
        err.response.insertErrors.forEach((errors) => {
          console.log(errors);
        });
      }
    } else {
      console.log('GENERIC ERROR:', err);
    }
  });
After this you will finally notice that the error is due to (again) the incomingData variable:
Could not parse '01/01/1970 00:00:00' as a timestamp. Required format is YYYY-MM-DD HH:MM[:SS[.SSSSSS]]
You have to change the date from 01/01/1970 00:00:00 to 1970-01-01 00:00:00.
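If you'd rather build that default timestamp programmatically (my own addition, not in the codelab), a small helper produces the format BigQuery expects:

// Format a Date as 'YYYY-MM-DD HH:MM:SS' (UTC) for a BigQuery TIMESTAMP column.
function toBigQueryTimestamp(date) {
  return date.toISOString().slice(0, 19).replace('T', ' ');
}

console.log(toBigQueryTimestamp(new Date(0))); // '1970-01-01 00:00:00'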
Still with me? Well, there's another error, coming from the usage of callback at the end of the Cloud Function. This is because background Cloud Functions now take three parameters, with callback being the last one, so change the function declaration to:
exports.subscribe = function (event, context, callback)
After all of this you will finally be able to insert data into BigQuery. However, it's the local fallback variable that gets inserted, not the data coming from Pub/Sub; at this point I gave up, since I would need to practically rewrite the full function to make it work using attributes instead of data.
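For anyone who wants to push further, a rough, untested sketch of that rewrite (assuming the sensor fields arrive as Pub/Sub attributes, as in the test event above) could look like:

exports.subscribe = function (event, context, callback) {
  // Prefer the base64 `data` payload; fall back to the message attributes.
  const values = event.data
    ? JSON.parse(Buffer.from(event.data, 'base64').toString())
    : (event.attributes || {});
  const rows = [values];
  // ...insert `rows` into BigQuery as shown above...
  callback();
};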
So, as mentioned before, don't follow this tutorial if you're just beginning in the GCP world.

Related

Protractor: how to store the text value of a component and re-use it in a spec?

I have a string displayed in a page and I want to store this string in a variable so I can use this variable for a check further in the test process.
Here is what it looks like:
let storedValue: string;

element(by.id('myElement')).getText().then((elementValue: string) => {
  storedValue = elementValue;
  console.log('stored value: ' + storedValue);
});

// Some unrelated code will go here in the future

browser.wait(() => storedValue !== null, 5000, 'browser.wait timeout');
expect(element(by.id('myElement')).getText()).toEqual(storedValue);
If the string in the page is Hello, the test fails with the log Expected 'Hello' to equal undefined.
Why is it still undefined?
I thought that Protractor would queue the last two instructions synchronously in the ControlFlow.
Is there another way to store this variable, or to wait for storedValue to be defined?
browser.wait(() => storedValue !== null, 5000, 'browser.wait timeout').then(function () {
  expect(element(by.id('myElement')).getText()).toEqual(storedValue);
});
Setting up the expect inside the wait will solve your problem. However, I don't know enough about the specifics of ControlFlow to explain exactly why yours doesn't work. Hopefully someone else can come along and do that.
You can force expect to run inside control flow by doing a browser.call
browser.call(() => expect(element(by.id('myElement')).getText()).toEqual(storedValue));
But you shouldn't have to do this. Could be a bug.
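A further note (mine, assuming a Protractor version where the control flow can be disabled): the expect captures storedValue at the moment that line executes, before the then callback has assigned it, and since undefined !== null is true, the browser.wait resolves immediately. async/await sidesteps the ordering problem entirely, because each step only runs after the previous one resolves:

it('reuses a stored text value', async () => {
  // Resolve the text first; afterwards storedValue is a plain string.
  const storedValue = await element(by.id('myElement')).getText();
  // ...some unrelated steps...
  expect(await element(by.id('myElement')).getText()).toEqual(storedValue);
});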

how to retrieve response.error and response.success messages in c# (Unity3D)?

I had exactly the same problem as this thread:
https://www.parse.com/questions/how-to-retrieve-responseerror-and-responsesuccess-messages-in-c-unity3d
which was archived without any comments or answers.
On the cloud code
Parse.Cloud.beforeSave("GameScore", function (request, response) {
  response.error("MY_ERROR_CODE");
});
In Unity C# code:
ParseCloud.CallFunctionAsync<string>("Function", paramsDict)
  .ContinueWith(t => {
    if (t.IsFaulted) {
      foreach (Exception e in t.Exception.InnerExceptions) {
        ParseException ex = (ParseException)e;
        Debug.Log(ex.Message);
      }
    }
    Debug.Log("HERE1");
    string result = t.Result
    Debug.Log("HERE2");
  });
The output is always "400 Bad Request" or something similar, never my own error code "MY_ERROR_CODE".
I want my app to distinguish the exact errors so it can respond accordingly.
Questions:
How can I get "MY_ERROR_CODE" in Unity?
Why "HERE1" is printed in the console but not "HERE2"? Every code after "string result = t.Result" is not called. Am I missing some important features of the ContinueWith function?
In your Cloud Code you aren't creating a Cloud Function called Function; you are only creating a beforeSave() handler.
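For illustration, a callable function matching your C# call (keeping the name "Function" from your snippet) would be defined like this:

// A Cloud Function that ParseCloud.CallFunctionAsync can actually invoke.
Parse.Cloud.define("Function", function (request, response) {
  response.error("MY_ERROR_CODE");
});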
In your Unity C# code, string result = t.Result is missing a ; at the end. Beyond that, reading t.Result on a faulted task rethrows the task's exception, which is why "HERE1" prints but "HERE2" never does.
According to Christine Abernathy here: https://www.parse.com/questions/unity-sdk-handling-errors ... it's not possible to get more information about errors from Parse in Unity because of Unity's limited HTTP stack.

Field validations in sugarcrm

I just started using SugarCRM CE for the first time (Version 6.5.15 (Build 1083)). I'm quite impressed with the ease of use when adding new fields or modules, but there's one quite indispensable thing that seems to be missing: Validation of user input.
I would for example like to check a lot of things:
Check if an email address has a valid format, using some regular expression
Check if a postal code exists (maybe do a web service call to validate it)
Do a calculation to see if a citizen service number is valid
etc.
The only thing I seem to be able to do in Studio is make a field required or not; there doesn't seem to be any standard way to run a validation on a field.
All I can find when I Google it are ways to hack into the source code, like this one: http://phpbugs.wordpress.com/2010/01/22/sugarcrm-adding-javascript-validation-on-form-submit/ and even then I don't find any examples that actually perform a validation.
Am I just missing something? Or is editing source code the only way to add this?
I don't think the "standard" validations are available in the CE edition.
What surprises me is that you can't define a validation somewhere and attach it to a field. I kind of expected this, since the rest of the system is very well structured (modules, packages, etc.).
For instance, I created an 11-check, a very specific check for a Dutch bank account number. To get this to work, I did the following (based upon examples I found while googling around):
I added the bank account field to Contacts in Studio and after that edited \custom\modules\Contacts\metadata\editviewdefs.php.
I added the following snippets:
'includes' => array(
  array('file' => 'custom/modules/Contacts/customJavascript.js')
),

array(
  0 =>
    array(
      'customCode' =>
        '<input title="Save [Alt+S]" accessKey="S" onclick="this.form.action.value=\'Save\'; return check_custom_data();" type="submit" name="button" value="'.$GLOBALS['app_strings']['LBL_SAVE_BUTTON_LABEL'].'">',
    ),
  1 =>
    array(
      'customCode' =>
        '<input title="Cancel [Alt+X]" accessKey="X" onclick="this.form.action.value=\'index\'; this.form.module.value=\''.$module_name.'\'; this.form.record.value=\'\';" type="submit" name="button" value="'.$GLOBALS['app_strings']['LBL_CANCEL_BUTTON_LABEL'].'">'
    )
),
And in customJavascript.js I placed this code:
function check_custom_data() {
  if (!eleven_check(document.getElementById('bankaccount_c').value)) {
    alert('Bank account not valid');
    return false;
  } else {
    return check_form('EditView');
  }

  // 11-check for a 9-digit Dutch bank account number
  function eleven_check(bankaccount) {
    bankaccount = bankaccount.replace(/\D/g, ''); // strip all non-digit characters
    var charcount = bankaccount.length;
    var som = 0;
    for (var i = 1; i < 10; i++) {
      var getal = bankaccount.charAt(i - 1);
      som += getal * (10 - i);
    }
    return som % 11 == 0 && charcount == 9;
  }
}
This check now works the way I want, but I'm wondering if this is the best way to add a validation. This approach also doesn't accommodate PHP validations; for instance, if I wanted to validate against some data in the database for one reason or another, I would have to use Ajax calls to get that done.
Email validation is in the Pro edition; I had assumed it was in CE as well, but I'm not 100% sure.
The other two are a lot more specific. Postcode validation depends on your country, so it would be difficult to roll out generically. For these you will need to write your own custom validation.
I know it's late, but maybe someone still needs this.
You can just add your custom JavaScript validation as a callback in your vardefs, like this:
'validation' => array(
  'type' => 'callback',
  'callback' => 'function(formname, nameIndex){ if ($("#" + nameIndex).val() != 999) { add_error_style(formname, nameIndex, "Only 999 is allowed!"); return false; } return true; }',
),
I documented it here as its not well documented elsewhere:
https://gunnicom.wordpress.com/2015/09/21/suitecrm-sugarcrm-6-5-add-custom-javascript-field-validation/
You can add custom validation code to the following file: ./custom/modules/.../clients/base/views/record/record.js
There you can add your validation code. In this example, I validate that the phone number is not empty when an account has the Customer type:
EXAMPLE CODE IN RECORD.JS:
({
  extendsFrom: 'RecordView',

  initialize: function (options) {
    app.view.invokeParent(this, {type: 'view', name: 'record', method: 'initialize', args: [options]});
    // add validation
    this.model.addValidationTask('check_account_type', _.bind(this._doValidateCheckType, this));
  },

  _doValidateCheckType: function (fields, errors, callback) {
    // validate requirements
    if (this.model.get('account_type') == 'Customer' && _.isEmpty(this.model.get('phone_office'))) {
      errors['phone_office'] = errors['phone_office'] || {};
      errors['phone_office'].required = true;
    }
    callback(null, fields, errors);
  }
})
Don't forget to repair and rebuild!
The full documentation can be found here

Incrementally update Kendo UI autocomplete

I have a Kendo UI AutoComplete bound to a remote transport, and I need to tweak how it works, but I'm coming up blank.
Currently, I perform a bunch of searches on the server and integrate the results into a JSON response and then return this to the datasource for the autocomplete. The problem is that this can take a long time and our application is time sensitive.
We have identified which searches are most important and found that one search accounts for 95% of the chosen results. However, I still need to provide the data from the other searches. I was thinking of kicking off separate requests for data on the server and adding the results to the autocomplete as they return. Our main search returns extremely fast and would supply the first items added to the list. Then, as the other searches return, I would like their results to be added dynamically to the list.
Our application uses knockout.js, and I thought about making the datasource part of our view model, but from looking around, Kendo doesn't update based on changes to your observables.
I am currently stumped, and any advice would be welcome.
Edit:
I have been experimenting and have had some success simulating what I want with the following datasource:
var dataSource = new kendo.data.DataSource({
  transport: {
    read: {
      url: window.performLookupUrl,
      data: function () {
        return {
          param1: $("#Input").val()
        };
      }
    },
    parameterMap: function (options) {
      return {
        param1: options.param1
      };
    }
  },
  serverFiltering: true,
  serverPaging: true,
  requestEnd: function (e) {
    if (e.type == "read") {
      window.setTimeout(function () {
        dataSource.add({ Name: "testin1234", Id: "X1234" });
      }, 2000);
    }
  }
});
If the first search returns results, then after 2 seconds, a new item pops into the list. However, if the first search fails, then nothing happens. Is it proper to use (abuse??) the requestEnd like this? My eventual goal is to kick off the rest of the searches from this function.
I contacted Telerik and they gave me the following jsbin that I was able to modify to suit my needs.
http://jsbin.com/ezucuk/5/edit
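The gist of that approach (my paraphrase, not the jsbin's exact code; the /secondaryLookup URL is hypothetical) is to let the fast primary read populate the list, then push the slower results into the same datasource as they arrive:

var dataSource = new kendo.data.DataSource({
  transport: { read: { url: window.performLookupUrl } },
  serverFiltering: true,
  requestEnd: function (e) {
    if (e.type === "read") {
      // Kick off the slower search; assumes it returns a JSON array of items.
      $.getJSON("/secondaryLookup", { term: $("#Input").val() }, function (items) {
        items.forEach(function (item) {
          dataSource.add(item); // each add updates the open suggestion list
        });
      });
    }
  }
});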

Mongoose won't remove embedded documents

I'm scratching my head here, as usual it seems with node projects, and I'm not sure if I'm doing something wrong or if I've run into a bug.
I've got a Server schema that can have any number of embedded docs called services. I'm running into a problem where, even though I've successfully removed an individual service from the server object, when I tell it to save, it doesn't remove it from the database. The save function is working, because it saves any changes I've made and also pushes in new embedded docs; it's just not removing ones that are already there.
Here is a relatively simplified example of my code:
app.put('/server/:id', function (req, res, next) {
  app.Server.findOne({_id: req.params.id}, function (err, server) {
    server.updated = new Date();
    ...
    for (var num = _.size(req.body.server.services) - 1; num >= 0; num--) {
      // Is this a new service or an existing one?
      if (server.services[num]) {
        // Is it marked for deletion? If so, delete it
        if (req.body.server.services[num].delete == "true") {
          server.services[num].remove();
        } else { // else, update it
          server.services[num].type = req.body.server.services[num].type;
          ...
        }
      } else {
        // It's new, add it
        delete req.body.server.services[num]["delete"];
        server.services.push(req.body.server.services[num]);
      }
    }
    server.save(function (err) {
      if (!err) {
        req.flash('success', 'Server updated');
      } else {
        req.flash('error', 'Err, something broke when we tried to save your server. Sorry!');
        console.log(err);
      }
      res.redirect('/');
    });
  });
});
So the remove() is actually removing the service. If I do a server.toObject() before the save, it's not there. Any ideas why it wouldn't be removing it from the database when it saves?
Edit: I suppose the version numbers would be helpful: node 0.4.2, mongoose 1.1.5, express 2.0.0rc.
I could be wrong, since I've not tested your example, but this sounds like Mongoose isn't detecting that the embedded document is modified.
From the schema types documentation page:
Since it is a schema-less type, you can change the value to anything else you like, but Mongoose loses the ability to auto detect/save those changes. To "tell" Mongoose that the value of a Mixed type has changed, call the .markModified(path) method of the document passing the path to the Mixed type you just changed.
person.anything = { x: [3, 4, { y: "changed" }] };
person.markModified('anything');
person.save(); // anything will now get saved
So your answer might be as simple as using the markModified() function.
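Applied to your schema, that might be as simple as the following sketch (assuming services is the embedded array from your example):

// Remove the subdocument, then tell Mongoose the path changed before saving.
server.services[num].remove();
server.markModified('services');
server.save(function (err) {
  if (err) console.log(err);
});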
I found a way to temporarily fix this problem.
What I did was load the embedded documents into an array, splice out the one to be deleted, and replace the array. Something like this:
var oldusers = dl.users;
oldusers.splice(dl.users.indexOf(req.currentUser.id), 1);
dl.users = oldusers;
dl.save(function(err) {...
I know that depending on the size of the document it will