Is it possible to create a new gcloud SQL instance from code?
For an R&D project, I need to write a tool that can spin up and delete Postgres databases hosted on gcloud. I see this can be done for Compute instances using Node. I would preferably like to do it with Node or Python, but I am not tied to any particular language.
Is this possible and do you have any suggestions?
Yes, the Cloud SQL instances.insert API call can be used to create instances. However, there is no dedicated Node.js package like @google-cloud/compute; instead you must use the generic googleapis library. This looks something like:
const {google} = require('googleapis');
const sql = google.sql({version: 'v1beta4'});

async function main () {
  const auth = new google.auth.GoogleAuth({scopes: ['https://www.googleapis.com/auth/sqlservice.admin']});
  const authClient = await auth.getClient();
  const project = "your-project-id-123";
  const dbinstance = {
    // see https://cloud.google.com/sql/docs/postgres/admin-api/rest/v1beta4/instances#DatabaseInstance
    // for parameters
  };
  const res = await sql.instances.insert({project: project, requestBody: dbinstance, auth: authClient});
  // ...
}

main();
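For illustration, the dbinstance body above could look something like this (the instance name, region, and tier are placeholder values I'm assuming; see the DatabaseInstance reference linked above for the full field list):

const dbinstance = {
  name: 'my-postgres-instance',   // placeholder instance ID
  region: 'europe-west1',         // placeholder region
  databaseVersion: 'POSTGRES_11',
  settings: {
    tier: 'db-custom-1-3840',     // 1 vCPU / 3.75 GB; pick a tier that fits your workload
  },
};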
I'm trying to connect to my MongoDB database in the latest version of Next.js. Things have changed so much that I no longer know what to do.
There's an example of how to set up the connection here: https://github.com/vercel/next.js/tree/canary/examples/with-mongodb
They use this file:
//The mongodb.js file from the example
import { MongoClient } from 'mongodb'

const uri = process.env.MONGODB_URI
const options = {}

let client
let clientPromise

if (!process.env.MONGODB_URI) {
  throw new Error('Please add your Mongo URI to .env.local')
}

if (process.env.NODE_ENV === 'development') {
  // In development mode, use a global variable so that the value
  // is preserved across module reloads caused by HMR (Hot Module Replacement).
  if (!global._mongoClientPromise) {
    client = new MongoClient(uri, options)
    global._mongoClientPromise = client.connect()
  }
  clientPromise = global._mongoClientPromise
} else {
  // In production mode, it's best to not use a global variable.
  client = new MongoClient(uri, options)
  clientPromise = client.connect()
}

// Export a module-scoped MongoClient promise. By doing this in a
// separate module, the client can be shared across functions.
export default clientPromise
However, they forgot to add how to actually use it. I can't even begin to figure it out.
//pages/api/user.js
import client from '/lib/mongodb.js'

export default async function handler(req, res) {
  //How do I connect here?
}
And two bonus questions:
I used to do caching on my database connection. Is it not needed anymore?
What happened to the utils folder? It used to be special, in that nothing in it was sent to the client. Now everyone seems to use lib, but I don't think there's anything special about it?
You can do it like this:
const dbClient = await client;
const db = dbClient.db('db-name');
const collection = db.collection('collection-name');

// example of getting a doc from the collection
const doc = await collection.findOne({query:""}, {...options})
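Putting that together with the pages/api/user.js stub from the question, a minimal sketch could look like this (the database, collection, and field names are placeholders, and the import path assumes lib/ sits two levels above pages/api/):

//pages/api/user.js
import client from '../../lib/mongodb'

export default async function handler(req, res) {
  try {
    const dbClient = await client;
    const db = dbClient.db('db-name');                    // placeholder database name
    const users = db.collection('users');                 // placeholder collection name
    const doc = await users.findOne({ name: 'Alice' });   // placeholder query
    res.status(200).json(doc);
  } catch (err) {
    res.status(500).json({ error: 'Unable to fetch user' });
  }
}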
I want to trigger a Google Cloud Composer Airflow DAG using Apps Script. Is there any way to do it via the REST API or another way?
If it is possible, please suggest a solution.
Airflow has an endpoint that allows triggering a DAG through its REST API; however, it's not possible to access it directly, since within the Cloud Composer architecture the Airflow web server runs in an App Engine flexible environment. By default, the web server is integrated with Identity-Aware Proxy (IAP), and authentication is required.
Based on that, I found an example in the Cloud Composer documentation that shows how to trigger a DAG using Cloud Functions; although the code is in JavaScript, I don't think it's possible to execute it from Google Apps Script.
On the other hand, a workaround is to follow the Triggering DAGs guide, changing some settings as follows.
When creating the function, instead of setting the trigger type to Cloud Storage, set it to HTTP, and check "Allow unauthenticated invocations" for testing purposes. A URL will be displayed; the goal is that every time that URL is accessed, the DAG is executed.
Modify the first part of the index.js file (since no data will be passed as parameters) and also the makeIapPostRequest function, so that it returns the response of the API call.
exports.triggerDag = async (req, res) => { // Modification
  // Fill in your Composer environment information here.

  // The project that holds your function
  const PROJECT_ID = 'your-project-id';
  // Navigate to your webserver's login page and get this from the URL
  const CLIENT_ID = 'your-iap-client-id';
  // This should be part of your webserver's URL:
  // {tenant-project-id}.appspot.com
  const WEBSERVER_ID = 'your-tenant-project-id';
  // The name of the DAG you wish to trigger
  const DAG_NAME = 'composer_sample_trigger_response_dag';

  // Other constants
  const WEBSERVER_URL = `https://${WEBSERVER_ID}.appspot.com/api/experimental/dags/${DAG_NAME}/dag_runs`;
  const USER_AGENT = 'gcf-event-trigger';
  const BODY = {conf: ''}; // Modification

  // Make the request
  // (authorizeIap and fetch are defined in the original index.js from the Triggering DAGs guide)
  try {
    const iap = await authorizeIap(CLIENT_ID, PROJECT_ID, USER_AGENT);
    const apiResponse = await makeIapPostRequest(WEBSERVER_URL, BODY, iap.idToken, USER_AGENT); // Modification
    res.status(200).send('DAG_running!'); // Modification
  } catch (err) {
    console.error('Error authorizing IAP:', err.message);
    throw new Error(err);
  }
};
const makeIapPostRequest = async (url, body, idToken, userAgent) => {
  const res = await fetch(url, {
    method: 'POST',
    headers: {
      'User-Agent': userAgent,
      Authorization: `Bearer ${idToken}`,
    },
    body: JSON.stringify(body),
  });

  if (!res.ok) {
    const err = await res.text();
    console.error('Error making IAP post request:', err);
    throw new Error(err);
  }

  return {
    apiRes: res.ok, // Modification
  };
};
At this point, nothing else has to be changed. In your Apps Script file, execute the following instructions in order to trigger the DAG:
function myFunction() {
  var response = UrlFetchApp.fetch("Cloud-function-URL");
  Logger.log(response.getAllHeaders());
}
Finally, verify in the Airflow web interface if the DAG was triggered.
I'm trying to create a Cloud Function that saves some data (documents from Firestore) to Cloud Storage.
I've written some Cloud Functions before, but I'm kind of new to Cloud Storage, buckets, etc.
From what I've read I have to "stream" this data to a bucket.
I'd love to see a short snippet that does just that :)
Achieving that should not be very complicated, so I hope I can help you.
To do this, I will follow the example explained in the article Backup Firestore data to storage bucket on a schedule in GCP - which you can follow completely, in case you are interested - focusing on the upload from Firestore to Cloud Storage. I will explain which parts to use and how to use them to achieve your goal.
Once you have created your Cloud Storage bucket - it should have Multi-Regional and Nearline configured in its settings - you need to use the code below as indicated.
index.js file:
const firestore = require('@google-cloud/firestore');
const client = new firestore.v1.FirestoreAdminClient();

// Replace BUCKET_NAME
const bucket = 'gs://<bucket-id>'

exports.scheduledFirestoreBackup = (event, context) => {
  const databaseName = client.databasePath(
    process.env.GCLOUD_PROJECT,
    '(default)'
  );

  return client
    .exportDocuments({
      name: databaseName,
      outputUriPrefix: bucket,
      // Leave collectionIds empty to export all collections
      // or define a list of collection IDs:
      // collectionIds: ['users', 'posts']
      collectionIds: [],
    })
    .then(responses => {
      const response = responses[0];
      console.log(`Operation Name: ${response['name']}`);
      return response;
    })
    .catch(err => {
      console.error(err);
    });
};
package.json file:
{
  "dependencies": {
    "@google-cloud/firestore": "^1.3.0"
  }
}
These files should be created, and the Cloud Function configured as follows: a unique name; Cloud Pub/Sub as the trigger; a topic name similar or equal to initiateFirestoreBackup; Node.js as the runtime; the two files above as the source; and scheduledFirestoreBackup as the function to execute (a gcloud deployment sketch is shown below).
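If you prefer deploying from the command line instead of the console, a rough gcloud sketch (assuming the Node.js 10 runtime and the topic name above; adjust both to your setup) would be:

gcloud functions deploy scheduledFirestoreBackup \
  --runtime nodejs10 \
  --trigger-topic initiateFirestoreBackup \
  --entry-point scheduledFirestoreBackup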
The code above should be enough to export from Firestore to Cloud Storage: it fetches all your collections - or the specific ones you define - and sends them to the bucket you already created.
Besides that, in case you want more information on uploading files to Cloud Storage using Cloud Functions, you can check here as well: Uploading files from Firebase Cloud Functions to Cloud Storage
Let me know if the information helped you!
Thanks to @gso_gabriel I was able to create a partial backup for my documents.
To anyone who's interested, here's a simplified version of my code:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();

const bucketName = 'myproject.appspot.com';

exports.backup = functions.https.onCall(async (data, context) => {
  const userDoc = await admin.firestore().collection('users').doc('abc123').get();
  await writeFile('temp', 'user.json', JSON.stringify(userDoc.data()));
});

async function writeFile(dirName, fileName, content) {
  const bucket = admin.storage().bucket(bucketName);
  const destFilename = dirName + '/' + fileName;
  const file = bucket.file(destFilename);
  const options = {
    metadata: { contentType: "application/json" }
  };
  await file.save(content, options);
}
I'm using the account linking feature for Actions SDK and following the guide here (https://developers.google.com/assistant/identity/google-sign-in#start_the_authentication_flow)
It shows the initialization like this
const app = actionssdk({
  // REPLACE THE PLACEHOLDER WITH THE CLIENT_ID OF YOUR ACTIONS PROJECT
  clientId: CLIENT_ID,
});
But for my use case, I'll read the clientId from a DB, where it is stored against the project's projectId. I can extract the projectId only after the MAIN intent is triggered.
My question is, how can I set the clientId after initializing actionssdk?
This solution uses the new Actions SDK, but the principle is the same for the legacy SDK as well:
const {
  conversation,
  Canvas,
} = require('@assistant/conversation');
const functions = require('firebase-functions');

const wrapper = async (req, res) => {
  // You can get any data you need here:
  const myAsyncBootstrapData = await getData();
  const app = conversation({debug: true, ...myAsyncBootstrapData});

  app.handle('welcome', (conv) => {
    conv.add('This is a demo.');
  });

  return app(req, res);
};

exports.ActionsOnGoogleFulfillment = functions.https.onRequest(wrapper);
functions.https.onRequest accepts any callable, including ones that return promises. If you need to block while loading configuration data asynchronously, you can do so by wrapping your definition in an async function.
I found a simple solution to this. I am adding it here for future reference.
// handler.js
const {actionssdk} = require('actions-on-google');

async function handleRequest(req, res) {
  const clientId = await getClientId(); // hypothetical helper -- retrieve the clientId using your business logic
  const app = actionssdk({
    clientId: clientId
  });

  return app(req, res);
}

module.exports = handleRequest;
Instead of creating an instance of actionssdk directly at module scope, wrap it inside a function as shown above.
// index.js
const handler = require('./path/to/handler.js');

app.post('/webhook', handler);
Then, when defining the webhook, use the wrapper function to handle the webhook requests.
I'm struggling to find an example of using the current Gremlin JavaScript driver with OrientDB. I can't get it to connect to OrientDB (I'm already using the TinkerPop-enabled version).
My sample code looks like this:
const gremlin = require("gremlin")
const DriverRemoteConnection = gremlin.driver.DriverRemoteConnection
const graph = new gremlin.structure.Graph()
const g = graph.traversal().withRemote(new DriverRemoteConnection('ws://localhost:8182/demodb'))

g.V().toList().then(function(data) {
  console.log(data)
}).catch(function(err) {
  console.log(err)
})
Does someone have any experience using those together? Thanks
If you want to connect to OrientDB through Gremlin, try this:
Local -> gremlin> g = new OrientGraph("plocal:db_path/nomeDB")
In-Memory -> gremlin> g = new OrientGraph("memory:nomeDB")
Remote -> gremlin> g = new OrientGraph("remote:localhost/nomeDB")
Hope it helps
Regards
I dug a little bit. The Gremlin JavaScript driver does not support GraphSON3:
https://github.com/jbmusso/gremlin-javascript/issues/109
Add this to the serializer configuration in your gremlin-server.yaml in order to add support for GraphSON v2:
- { className: org.apache.tinkerpop.gremlin.driver.ser.GraphSONMessageSerializerGremlinV2d0, config: { ioRegistries: [org.apache.tinkerpop.gremlin.orientdb.io.OrientIoRegistry] }}
Then it should work
Try this:
import * as gremlin from 'gremlin';

const DriverRemoteConnection = gremlin.driver.DriverRemoteConnection;
const authenticator = new gremlin.driver.auth.PlainTextSaslAuthenticator(<db username>, <db password>);
const traversal = gremlin.process.AnonymousTraversalSource.traversal;

const g = traversal().withRemote(new DriverRemoteConnection('ws://localhost:8182/gremlin', {
  authenticator: authenticator
}));
Notice the /gremlin in the URL, not /demodb.
To point to demodb or another database, modify the demodb.properties file in OrientDB's config folder.
If you want, you can create another properties file; just remember to point to it in the gremlin-server.yaml file, as in the sketch below.
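For reference, the part of gremlin-server.yaml that maps a graph name to a properties file looks roughly like this (the path below is a placeholder, and the exact keys inside the .properties file depend on the OrientDB distribution you are using):

graphs: {
  graph: config/demodb.properties
}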