Google Cloud Functions GitHub auto-deployer says: The request has errors - github

I have installed the GitHub auto-deployer for Google Cloud Functions, but when I now push my function to my GitHub repository, I receive an abstract error message, "The request has errors", with the following rather nondescript details. What specifically could be going wrong here?
E githubAutoDeployer [CODE] 2017-12-30 19:19:37.362
Failed to create function projects/[MY_BUCKET]/locations/us-central/functions/[MY_FUNCTION] { Error: The request has errors
at Request._callback (/user_code/node_modules/googleapis/node_modules/google-auth-library/lib/transporters.js:85:15)
at Request.self.callback (/user_code/node_modules/googleapis/node_modules/request/request.js:186:22)
at emitTwo (events.js:106:13)
at Request.emit (events.js:191:7)
at Request.<anonymous> (/user_code/node_modules/googleapis/node_modules/request/request.js:1163:10)
at emitOne (events.js:96:13)
at Request.emit (events.js:188:7)
at IncomingMessage.<anonymous> (/user_code/node_modules/googleapis/node_modules/request/request.js:1085:12)
at IncomingMessage.g (events.js:292:16)
at emitNone (events.js:91:20)
code: 400,
errors:
[ { message: 'The request has errors',
domain: 'global',
reason: 'badRequest' } ] }
E githubAutoDeployer [CODE] 2017-12-30 19:19:37.363 Error: The request has errors
at Request._callback (/user_code/node_modules/googleapis/node_modules/google-auth-library/lib/transporters.js:85:15)
at Request.self.callback (/user_code/node_modules/googleapis/node_modules/request/request.js:186:22)
at emitTwo (events.js:106:13)
at Request.emit (events.js:191:7)
at Request.<anonymous> (/user_code/node_modules/googleapis/node_modules/request/request.js:1163:10)
at emitOne (events.js:96:13)
at Request.emit (events.js:188:7)
at IncomingMessage.<anonymous> (/user_code/node_modules/googleapis/node_modules/request/request.js:1085:12)
at IncomingMessage.g (events.js:292:16)
at emitNone (events.js:91:20)
D githubAutoDeployer [CODE] 2017-12-30 19:19:37.365
Function execution took 2319 ms, finished with status code: 500
UPDATE The mention of google-auth-library in the stack trace made me think that something might be wrong with my credentials. But the output from gcloud auth list looks fine:
Credentialed Accounts
ACTIVE ACCOUNT
* [MY_ID]@gmail.com
UPDATE What is perhaps slightly unconventional is that I have "path": "" in my config.json. But then my index.js resides directly at the top of my repository, so there is no path to specify.
UPDATE This is where the error from Google Cloud Functions is passed on by githubAutoDeployer (unfortunately, the source code for the upstream server is apparently not available):
gcf.projects.locations.functions.create({ resource, location }, (err, operation) => {
  if (err && err.errors && err.errors[0] && err.errors[0].reason === 'alreadyExists') {
    // ...
  } else if (err) {
    console.error(`Failed to create function ${resource.name}`, err);
    reject(err);
  }
  // ...
});

The trouble was that I was specifying "location": "us-central" instead of "us-central1" (a region which supports Google Cloud Functions) in my config.json.
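For reference, the relevant part of config.json then looks like this (only the two fields discussed here are shown; the remaining deployer settings stay as they were):
{
  "location": "us-central1",
  "path": ""
}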
I found out by sending a raw POST request to the Google Cloud Functions API (after obtaining a service account, an access token, etc.). At that level the API returns a clear error indication:
"fieldViolations": [
{
"field": "region",
"description": "region us-central is not supported."
}
]
Apparently and unfortunately, this does not make it into any of the log files when githubAutoDeployer attempts the same call.
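For anyone who wants to reproduce the check, a raw request along these lines triggers the same validation (a sketch, not the deployer's exact call: [MY_PROJECT] is a placeholder, function.json holds the same function resource the deployer sends, the API version may differ from what githubAutoDeployer uses, and gcloud auth print-access-token is shown as a shortcut for obtaining a token):
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d @function.json \
  "https://cloudfunctions.googleapis.com/v1/projects/[MY_PROJECT]/locations/us-central/functions"
Sent with the bad region, this returns the fieldViolations message quoted above; with us-central1 the request is accepted.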

Related

Cannot make the LoopBack 4 todo-list tutorial work with a connection to PostgreSQL

I am doing tutorials in the LoopBack documentation, but I have a problem at the last stage of the todo-list tutorial, when I want to connect the app to a database (PostgreSQL, as in the tutorial).
I have initialized the application by doing lb4 example todo-list and then I have followed instructions given at https://loopback.io/doc/en/lb4/todo-list-tutorial-sqldb.html.
I did not forget to run npm run migrate -- --rebuild; the tables are indeed created in the database, but they are empty.
When I POST /todo-lists (using http://localhost:3000/explorer/#/TodoListController/TodoListController.create) with this body
{ "title": "grocery list" }
I get the answer
{
  "error": {
    "statusCode": 500,
    "message": "Internal Server Error"
  }
}
And I have this log in the console
npm start
...
Server is running at http://[::1]:3000
Request POST /todo-lists failed with status code 500. error: null value in column "id" of relation "todolist" violates not-null constraint
at Parser.parseErrorMessage (C:\Users\azias\Documents\dev\js\loopback4-example-todo-list\node_modules\pg-protocol\dist\parser.js:278:15)
at Parser.handlePacket (C:\Users\azias\Documents\dev\js\loopback4-example-todo-list\node_modules\pg-protocol\dist\parser.js:126:29)
at Parser.parse (C:\Users\azias\Documents\dev\js\loopback4-example-todo-list\node_modules\pg-protocol\dist\parser.js:39:38)
at Socket.stream.on (C:\Users\azias\Documents\dev\js\loopback4-example-todo-list\node_modules\pg-protocol\dist\index.js:10:42)
at Socket.emit (events.js:198:13)
at addChunk (_stream_readable.js:288:12)
at readableAddChunk (_stream_readable.js:269:11)
at Socket.Readable.push (_stream_readable.js:224:10)
at TCP.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
As I understand it, the mentioned id is not automatically generated (I suppose it should be) and so is missing.
I am discovering LoopBack, so I do not know what I have to change to make it work.
I think I have finally found it.
In all models, the id property was defined with the setting generated: false; it has to be changed to generated: true. This causes sequences to be created in the database, so the id is generated automatically.
For example, in todo-list.model.ts, the code
@property({
  type: 'number',
  id: true,
  generated: false,
})
id?: number;
has to be changed to
@property({
  type: 'number',
  id: true,
  generated: true,
})
id?: number;
(same in todo-list-image.model.ts and todo.model.ts)
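After changing generated to true in the three models, the schema has to be pushed to the database again so the sequences actually get created; presumably the same command as before does that (note that --rebuild drops and recreates the tables, so any existing data is lost):
npm run migrate -- --rebuild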

Authentication fail when connecting to Atlas MongoDB

I get an error when connecting to the MongoDB Atlas database after I migrated the data from the old mLab.
I have definitely set up the username and password correctly as in the documentation (obviously I replaced PASSWORD with my correct mLab password):
var mongoURI = 'mongodb+srv://heroku_3kcdl3j9:PASSWORD@cluster-3kcdl3j9.auof1.mongodb.net/heroku_3kcdl3j9?retryWrites=true&w=majority';
I have migrated my database from mLab to Atlas successfully, set the Network Access settings to the 0.0.0.0 IP address, and set up the environment variable in Heroku.
I connect to the Atlas database with this code; do I need some special options? (This code works with the old mLab connection.)
mongoose.connect(mongoURI,
// { config: { autoIndex: true } },
// { options : { ssl: true } },
function (error) {
if (error) console.error(error);
else console.log('mongo connected');
const con = new mongoose.mongo.Admin(mongoose.connection.db)
con.buildInfo( (err, mongoURI) => {
if(err){
throw err
}
// see the db version
// console.log('mongo db.version(): '+ db.version);
})
});
However, I still get the following error and I don't know what I am doing wrong:
{ MongoError: authentication fail
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/topologies/replset.js:1462:15
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/connection/pool.js:868:7
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/connection/pool.js:844:20
at finish (/Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/auth/scram.js:232:16)
at handleEnd (/Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/auth/scram.js:242:7)
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/auth/scram.js:351:15
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/connection/pool.js:531:18
at process._tickCallback (internal/process/next_tick.js:61:11)
name: 'MongoError',
message: 'authentication fail',
errors:
[ { name: 'cluster-3kcdl3j9-shard-00-01.auof1.mongodb.net:27017',
err: [MongoError] },
{ name: 'cluster-3kcdl3j9-shard-00-00.auof1.mongodb.net:27017',
err: [MongoError] },
{ name: 'cluster-3kcdl3j9-shard-00-02.auof1.mongodb.net:27017',
err: [MongoError] } ],
[Symbol(mongoErrorContextSymbol)]: {} }
(node:47015) UnhandledPromiseRejectionWarning: TypeError: Cannot read property 's' of undefined
at Admin.buildInfo (/Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb/lib/admin.js:100:37)
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/index.js:95:13
at $initialConnection.then.err (/Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongoose/lib/connection.js:556:14)
at process._tickCallback (internal/process/next_tick.js:68:7)
(node:47015) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:47015) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
(node:47015) UnhandledPromiseRejectionWarning: MongoError: authentication fail
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/topologies/replset.js:1462:15
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/connection/pool.js:868:7
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/connection/pool.js:844:20
at finish (/Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/auth/scram.js:232:16)
at handleEnd (/Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/auth/scram.js:242:7)
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/auth/scram.js:351:15
at /Users/bensmith/Downloads/DocumentsDirNew/Scraper and API/diveapi/node_modules/mongodb-core/lib/connection/pool.js:531:18
at process._tickCallback (internal/process/next_tick.js:61:11)
(node:47015) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 2)
In the migration documentation from mLab to Atlas I can read:
Atlas servers always run with requireSSL and only accept TLS/SSL encrypted connections.
I can see that your ssl option is commented out.
This is one problem.
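If you do want to set it explicitly, the option goes at the top level of the options object rather than nested under options or config; a minimal sketch, assuming Mongoose 5:
mongoose.connect(mongoURI, { ssl: true, useNewUrlParser: true }, function (error) {
  if (error) console.error(error);
  else console.log('mongo connected');
});
That said, with a mongodb+srv:// URI TLS should already be implied, so this alone may not fix the authentication error.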
The problem turned out to be something to do with the username and password. I believe there was an illegal character in the password, so I created another user with a better password, and the server authenticated and logged in, allowing it to connect.
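If you do need special characters in the password, percent-encoding it before building the URI is the usual alternative to changing the password; a sketch (MONGO_PASSWORD is a hypothetical Heroku config var holding the raw password):
var encodedPassword = encodeURIComponent(process.env.MONGO_PASSWORD); // e.g. '@' becomes '%40', '/' becomes '%2F'
var mongoURI = 'mongodb+srv://heroku_3kcdl3j9:' + encodedPassword + '@cluster-3kcdl3j9.auof1.mongodb.net/heroku_3kcdl3j9?retryWrites=true&w=majority';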

Problem virtualizing a Hyperledger network on multiple hosts

I have followed and finished the tutorial
https://medium.com/@malliksarvepalli/hyperledger-fabric-1-2-on-multiple-hosts-using-docker-swarm-and-compose-11c13635e69e
and I have 3 hosts connected and all services are up. I got the correct result when I ran ./script.sh on the PC2 host.
Now I am following the next tutorial which is:
https://medium.com/@malliksarvepalli/hyperledger-explorer-with-fabric-1-2-running-on-multiple-hosts-89c5af691b7e
Can anyone please enlighten me on this tutorial? I have figured out that I should create a new host with Ubuntu 16.04 and install the following prerequisites:
nodejs 8.11.x
PostgreSQL 9.5 or greater
Jq
Am I right?
And in the exploreconfig.json I updated the Postgres variables:
host: 192.168.1.136 (the 4th VM's IP)
port: 5432
username: postgres
password: psql
database: fabric
Are they correct?
I have also modified the Orderer, Org1 (peer0 & peer1), and Org2 (peer0 & peer1) IP addresses in the config.json file with the IPs of the first three VMs where the network is up and running.
I followed the rest of the instructions and tests, but when I run the command ./start.sh I get the following logs in the console:
false 'ssl-certs' '/home/database/blockchain-explorer/ssl-certs'
postgres://christy:christy@192.168.1.136:5432/fabric
error when connecting to db: { Error: connect ECONNREFUSED 192.168.1.136:5432
at Object._errnoException (util.js:992:11)
at _exceptionWithHostPort (util.js:1014:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1186:14)
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect',
address: '192.168.1.136',
port: 5432 }
******* Initialization started for hyperledger fabric platform ******, {
'network-1':
{ version: '1.0',
clients: { 'client-1': [Object] },
channels: { mychannel: [Object] },
organizations: { Org1MSP: [Object], Org2MSP: [Object], OrdererMSP:
[Object] },
peers:
{ 'peer0.org1.ntua.com': [Object],
'peer1.org1.ntua.com': [Object],
'peer0.org2.ntua.com': [Object],
'peer1.org2.ntua.com': [Object] },
orderers: { 'orderer.ntua.com': [Object] } },
'network-2': {} }
client_configs.name undefined client_configs.profile undefined
FabricUtils.createFabricClient
<<<<<<<<<<<<<<<<<<<<<<<<<< Explorer Error >>>>>>>>>>>>>>>>>>>>>
Error : [ 'Invalid platform configuration, Please check the log' ]
error when connecting to db: TypeError: Cannot read property 'on' of
undefined
at Timeout.handleDisconnect [as _onTimeout] (/home/database/blockchain-
explorer/app/persistence/postgreSQL/PgService.js:68:16)
at ontimeout (timers.js:498:11)
at tryOnTimeout (timers.js:323:5)
at Timer.listOnTimeout (timers.js:290:5)
<<<<<<<<<<<<<<<<<<<<<<<<<< Explorer Error >>>>>>>>>>>>>>>>>>>>>
TypeError: "callback" argument must be a function
at setTimeout (timers.js:450:11)
at Timeout.handleDisconnect [as _onTimeout] (/home/database/blockchain-
explorer/app/persistence/postgreSQL/PgService.js:85:5)
at ontimeout (timers.js:498:11)
at tryOnTimeout (timers.js:323:5)
at Timer.listOnTimeout (timers.js:290:5)
Received kill signal, shutting down gracefully
Closed out connections
false 'ssl-certs' '/home/database/blockchain-explorer/ssl-certs'
postgres://christy:christy@192.168.1.136:5432/fabric
error when connecting to db: { Error: connect ECONNREFUSED 192.168.1.136:5432
at Object._errnoException (util.js:992:11)
at _exceptionWithHostPort (util.js:1014:20)
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1186:14)
code: 'ECONNREFUSED',
errno: 'ECONNREFUSED',
syscall: 'connect',
address: '192.168.1.136',
port: 5432 }
******* Initialization started for hyperledger fabric platform ******,
{
'network-1':
{ version: '1.0',
clients: { 'client-1': [Object] },
channels: { mychannel: [Object] },
organizations: { Org1MSP: [Object], Org2MSP: [Object], OrdererMSP:
[Object] },
peers:
{ 'peer0.org1.ntua.com': [Object],
'peer1.org1.ntua.com': [Object],
'peer0.org2.ntua.com': [Object],
'peer1.org2.ntua.com': [Object] },
orderers: { 'orderer.ntua.com': [Object] } },
'network-2': {} }
client_configs.name undefined client_configs.profile undefined
FabricUtils.createFabricClient
<<<<<<<<<<<<<<<<<<<<<<<<<< Explorer Error >>>>>>>>>>>>>>>>>>>>>
Error : [ 'Invalid platform configuration, Please check the log' ]
error when connecting to db: TypeError: Cannot read property 'on' of
undefined
at Timeout.handleDisconnect [as _onTimeout] (/home/database/blockchain-
explorer/app/persistence/postgreSQL/PgService.js:68:16)
at ontimeout (timers.js:498:11)
at tryOnTimeout (timers.js:323:5)
at Timer.listOnTimeout (timers.js:290:5)
<<<<<<<<<<<<<<<<<<<<<<<<<< Explorer Error >>>>>>>>>>>>>>>>>>>>>
TypeError: "callback" argument must be a function
at setTimeout (timers.js:450:11)
at Timeout.handleDisconnect [as _onTimeout] (/home/database/blockchain-
explorer/app/persistence/postgreSQL/PgService.js:85:5)
at ontimeout (timers.js:498:11)
at tryOnTimeout (timers.js:323:5)
at Timer.listOnTimeout (timers.js:290:5)
Received kill signal, shutting down gracefully
Received kill signal, shutting down gracefully
Closed out connections
If the other logs are also needed, please let me know. Any help would be appreciated. Thanks a lot.
I can give you some aspects to check in your environment.
First, you need to check whether you can access the PostgreSQL DB (on the 4th VM) from the VM where you are trying to launch Hyperledger Explorer. If not, I think you need to change your PostgreSQL configuration (postgresql.conf / pg_hba.conf) to enable external access.
$ sudo -u postgres psql -h 192.168.1.136 -d fabric -c 'table peer'
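If that command cannot connect, the usual changes look something like this (a sketch; file locations, the database name, and the address range depend on your setup):
# postgresql.conf: listen on all interfaces instead of only localhost
listen_addresses = '*'
# pg_hba.conf: allow password (md5) logins to the fabric database from the other VMs' subnet
host    fabric    all    192.168.1.0/24    md5
# then restart PostgreSQL, e.g.
sudo systemctl restart postgresql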
Second, please confirm that you ran ./createdb.sh on the 4th VM. If you changed the database name, you also need to adjust the script accordingly:
$ cd app/persistence/fabric/postgreSQL/db/
$ ./createdb.sh

Difficulty connecting to PostgreSQL@localhost using node-postgres (Error: 28000)

I'm currently running openSUSE on a Raspberry Pi 3 B+ (aarch64) and have hit a wall running a Node.js connection script.
I went through the standard install of PostgreSQL (v10 is what is offered on this version of openSuse) then created a new role with
CREATE ROLE new_role LOGIN PASSWORD 'passwd';
and then a db with
CREATE DATABASE new_db OWNER new_role;
Both \l and \du return the expected output, showing that both the role and the db have been created successfully with the correct owner.
So I then quickly created a node project directory and copied the test script from the docs: https://node-postgres.com/features/connecting
const { Pool, Client } = require('pg')
const connectionString = 'postgresql://new_role:passwd@localhost:5432/new_db'
const pool = new Pool({
connectionString: connectionString,
})
pool.query('SELECT NOW()', (err, res) => {
console.log(err, res)
pool.end()
})
const client = new Client({
connectionString: connectionString,
})
client.connect()
client.query('SELECT NOW()', (err, res) => {
console.log(err, res)
client.end()
})
This returns a few broken-promise errors that I haven't caught (.catch()) correctly yet, and an error code of 28000 that looks like this:
{ error: Ident authentication failed for user "new_role"
at Connection.parseE (/home/eru/postgresDB/node_modules/pg/lib/connection.js:554:11)
at Connection.parseMessage (/home/eru/postgresDB/node_modules/pg/lib/connection.js:379:19)
at Socket.<anonymous> (/home/eru/postgresDB/node_modules/pg/lib/connection.js:119:22)
at Socket.emit (events.js:182:13)
at addChunk (_stream_readable.js:283:12)
at readableAddChunk (_stream_readable.js:264:11)
at Socket.Readable.push (_stream_readable.js:219:10)
at TCP.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
name: 'error',
length: 99,
severity: 'FATAL',
code: '28000',
detail: undefined,
hint: undefined,
position: undefined,
internalPosition: undefined,
internalQuery: undefined,
where: undefined,
schema: undefined,
table: undefined,
column: undefined,
dataType: undefined,
constraint: undefined,
file: 'auth.c',
line: '328',
routine: 'auth_failed' } undefined
So I'm pretty sure the attempt made it to the intended port, otherwise I wouldn't have received the detailed error in the terminal. The error code corresponds to invalid_authorization_specification.
Is there something I need to do on the server, in the psql interface, that will satisfy the authorization specification?
When I look into that specific error I can't seem to find search results relevant to my situation.
I'm fairly new to Postgres, so I'm sure this is a pretty noob mistake I'm missing, but any help or input is very appreciated!
Found an answer after a little more digging here: error: Ident authentication failed for user
Ended up editing my pg_hba.conf to change the authentication method from ident to md5.
This is rather crude because I don't really understand what I changed, aside from telling PostgreSQL to check the md5-hashed password instead of checking whether my username matched a role created on the server.
If anyone has a proper explanation of what changed and why, I'm all ears.
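For what it's worth, the change boils down to one line in pg_hba.conf, roughly like this (a sketch; the actual line in your file may name a different database, user, or address):
# before: ident asks the OS identd service whether your system user name matches the role name
host    all    all    127.0.0.1/32    ident
# after: md5 checks the password stored for the role (the one set with CREATE ROLE ... PASSWORD)
host    all    all    127.0.0.1/32    md5
# reload the configuration afterwards, e.g.
sudo systemctl reload postgresql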

Error with REST server: Hyperledger

I am following the tutorial here:
https://hyperledger.github.io/composer/tutorials/queries.html#step-six-test-the-rest-apis-and-create-some-data
When I try to post the data,
I get the following response:
{
"error": {
"statusCode": 400,
"name": "SyntaxError",
"message": "Unexpected token \n in JSON at position 38",
"body": "{\n \"$class\": \"org.acme.biznet.Trader,\n \"tradeId\": \"TRADER1\",\n \"firstName\": \"Jenny\",\n \"lastName\": \"Jones\"\n}",
"status": 400,
"stack": "SyntaxError: Unexpected token \n in JSON at position 38\n at JSON.parse (<anonymous>)\n at parse (/usr/local/lib/node_modules/composer-rest-server/node_modules/body-parser/lib/types/json.js:88:17)\n at /usr/local/lib/node_modules/composer-rest-server/node_modules/body-parser/lib/read.js:116:18\n at invokeCallback (/usr/local/lib/node_modules/composer-rest-server/node_modules/raw-body/index.js:262:16)\n at done (/usr/local/lib/node_modules/composer-rest-server/node_modules/raw-body/index.js:251:7)\n at IncomingMessage.onEnd (/usr/local/lib/node_modules/composer-rest-server/node_modules/raw-body/index.js:307:7)\n at emitNone (events.js:105:13)\n at IncomingMessage.emit (events.js:207:7)\n at endReadableNT (_stream_readable.js:1059:12)\n at _combinedTickCallback (internal/process/next_tick.js:138:11)\n at process._tickCallback (internal/process/next_tick.js:180:9)"
}
}
I would appreciate any guidance on this, because as far as I can tell I have followed the docs verbatim.
Looks like you're just missing the closing quote after org.acme.biznet.Trader in your input.
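For reference, the corrected body would be:
{
  "$class": "org.acme.biznet.Trader",
  "tradeId": "TRADER1",
  "firstName": "Jenny",
  "lastName": "Jones"
}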