How to connect to a Bigtable emulator using the cbt command line

I ran the command gcloud beta emulators bigtable start, but when I ran the command cbt listinstances, I got the error below:
Getting list of instances: rpc error: code = Unimplemented desc = unknown service google.bigtable.admin.v2.BigtableInstanceAdmin
How can I use the cbt command to connect to my local Bigtable emulator?
The emulator command: https://cloud.google.com/sdk/gcloud/reference/beta/emulators/bigtable/start
The cbt command: https://cloud.google.com/bigtable/docs/go/cbt-reference

The Cloud Bigtable emulator doesn't support any instance-level operations (CRUD for instances). You can use any arbitrary instance name when connecting to it and start by creating a table.

Once the emulator is running, you need to do the following:
export BIGTABLE_EMULATOR_HOST=localhost:8086
cbt -project test -instance test-instance ls
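For reference, a complete minimal session might look like this (test and test-instance are arbitrary placeholders, since the emulator accepts any project and instance name; gcloud beta emulators bigtable env-init will print the export line for you):
gcloud beta emulators bigtable start &
export BIGTABLE_EMULATOR_HOST=localhost:8086
cbt -project test -instance test-instance createtable my-table
cbt -project test -instance test-instance ls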

You can also point cbt at the local emulator by prefixing the command with the environment variable:
BIGTABLE_EMULATOR_HOST=localhost:8086 cbt -project <PROJECT NAME> -instance <INSTANCE NAME> ls
Hope this helps!

Related

Invoking pg_repack extension in gcp cloud sql

We have installed the pg_repack extension in Cloud SQL by following the guide:
https://cloud.google.com/sql/docs/postgres/extensions#pg_repack
The installation of the extension works fine and it shows up in the list of extensions when running \dx.
We then want to invoke the extension, but it is unclear from where this should be done. The docs just say to "run the command":
pg_repack -h <hostname> -d testdb -U csuper1 -k -t t1
We can't find anywhere in our project to invoke this command from, though. Do we have to set up a Compute Engine instance for this, or is there some other way?
We only use Cloud Run for running our code at the moment and would like to keep things as small/simple as possible.
Our solution: we built a Docker image that wrapped pg_repack with HTTP, and then deployed it as a Cloud Run service. This enabled us to invoke pg_repack periodically using Cloud Scheduler.
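As a rough sketch of that setup (everything here is illustrative: it assumes PostgreSQL 15, the Debian postgresql-15-repack package, and a wrapper.sh of your own that listens on $PORT and shells out to pg_repack):
FROM debian:bookworm-slim
# pg_repack client; its version must match the extension installed in Cloud SQL
RUN apt-get update && apt-get install -y postgresql-15-repack && rm -rf /var/lib/apt/lists/*
COPY wrapper.sh /wrapper.sh
CMD ["/wrapper.sh"]
The service can then be deployed with gcloud run deploy --source . and triggered on a schedule with gcloud scheduler jobs create http.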

is vscode remote-ssh inception possible?

I am connected to a remote Linux machine and developing an application. To run the application I need to connect to another host for debugging, due to resource availability, using the srun command.
It seems partial inception is possible using the following in launch.json:
"miDebuggerServerAddress": "remoteHostname:9091",
And running the command on remote host as:
user#remoteHostname $ gdbserver :9091 ./myapplication arguments to my program
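For completeness, a fuller launch.json entry for this setup might look like the sketch below, assuming the C/C++ extension's cppdbg debug type (the program path, args, and port are illustrative):
{
    "name": "Attach to remote gdbserver",
    "type": "cppdbg",
    "request": "launch",
    "program": "${workspaceFolder}/myapplication",
    "args": ["arguments", "to", "my", "program"],
    "cwd": "${workspaceFolder}",
    "MIMode": "gdb",
    "miDebuggerServerAddress": "remoteHostname:9091"
}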

gcloud compute scp error: All sources must be local files

I tried to copy a file from my Google Cloud instance to my local machine with the following command:
gcloud compute scp nlp-2:to_test.txt C:\Temp
And got back the following error message:
ERROR: (gcloud.compute.scp) All sources must be local files when destination is remote. Got sources: [nlp-2:to_test.txt], destination:
C:Temp
What exactly is wrong? I am confident that the same command worked just two days ago.
Update: I am connecting to Ubuntu 16.04 (google instance) from Win 7 (local machine)
In order to copy files to the instance, I had to create a path on D: (in your case it can be C:) matching the one represented by ~ on the Ubuntu instance (/home/example_name/), and put the files to copy in that Windows directory:
sudo gcloud beta compute scp --project="projectname" --zone="zonename" ~/Filename.zip instancename:~/
The reason is that gcloud compute scp does not support a colon (:) in the local path; anything before a colon is interpreted as a remote instance name.
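If the colon parsing is indeed the culprit, one workaround (untested, but it follows from the explanation above) is to avoid the drive letter in the destination entirely by running the command from inside the target folder:
cd C:\Temp
gcloud compute scp nlp-2:to_test.txt .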
I have just tried to replicate the issue running the following code on a Google Cloud SDK Shell from a machine with Windows Server 2008 R2:
gcloud compute scp instance-1:/home/username/file C:\Users\username\file2
where instance-1 is a Debian 4.9.51-1 and I have been able to copy the file.
Therefore I think you misspelled something when writing the command (also because you wrote that it was working for you some days ago), or I didn't understand your configuration correctly.
If that is the case, can you provide some more information by editing the question?
EDIT
I also tested SCP between Debian machines with "weird" file names, and I was always able to copy the files, both from a remote location and to a remote location:
gcloud compute scp instance-1:/paolo '/C:\\Temp'
and
gcloud compute scp instance-2:'/C:\\Temp' .
Note that, despite the weird notation, C:\Temp here is a file stored on a Linux instance.
The following worked for me; in my case every file was in the jupyter folder:
gcloud beta compute scp --project "project_name" --zone "zone_name" instance_name:~/jupyter/file_name /home/Downloads

Deployd :: Failed To Start MongoDB

I am a newbie with Deployd and MongoDB. I have installed Deployd (www.deployd.com) 0.6.9 on my Windows XP system. I executed the following command at the prompt, as instructed by the book I am studying:
>dpd create sportsstore
Now, when I run the following command to start the Deployd server
> dpd -p 5500 sportsstore/app.dpd
I get the following error:
starting deployd v0.6.9...
Failed to start MongoDB
It states that MongoDB has failed to start. I went into the directory "C:\Program Files\Deployd\tools" and found a file there called "mongod".
I have never installed MongoDB on this computer before. It is my first time working with Deployd, so I don't know whether the "mongod" file in the tools folder is the same thing as MongoDB or whether I have to install MongoDB separately.
Can someone point me in the right direction?
Thanks.
First of all, you have to install MongoDB on your computer. You can get it from the official MongoDB site; the standard option is to install the Community edition.
After that, you can check whether MongoDB was properly installed and can be used with Deployd:
If you are on macOS or Linux you can try:
sudo dpd
If you are on Windows, try opening a command window with "Run as Administrator" and run:
dpd
For Windows installation:
1) Run npm install deployd -g
2) Install MongoDB separately using the Windows installer available at
https://www.mongodb.com/download-center#community
and configure it to run as a service as described at
https://docs.mongodb.com/manual/tutorial/install-mongodb-on-windows/
The service option makes it convenient to start and stop the database with the net start and net stop commands (see the example session after this list).
3) Create a Deployd module using 'dpd create abcd'.
4) Navigate to the Deployd module you created (using cd abcd) to run the dpd -d command. Before you run dpd -d, you need to ensure that you start the MongoDB service from the same command prompt, running in elevated (admin) mode (use the command 'net start MongoDB').
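Putting steps 2) to 4) together, a typical session in an elevated command prompt might look like this (abcd is just the example module name from above):
net start MongoDB
cd abcd
dpd -d
(and net stop MongoDB when you are done)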
If it is still giving a path error, try the command below:
set path=%PATH%;"C:\Program Files\MongoDB\Server\3.4.1\bin"
This command is an alternative to setting the path in the environment variables (in case the user doesn't have permission to modify them).
Still not working, with the same path issue? Run dpd with the --mongod (-m) option pointing at the mongod executable, as shown below:
dpd -m "C:\Program Files\MongoDB\Server\3.4.1\bin\mongod.exe"
After installing Deployd, run:
dpd -e production
To solve the "Failed to start MongoDB" problem:
1 - I installed MongoDB for Windows from this page: https://docs.mongodb.com/manual/installation/
2 - Then I added the MongoDB bin directory to the PATH environment variable.
3 - I installed Deployd and ran:
dpd -e production

Can't find gcloud utility using MAMP

After attempting to initialize cluster/kube-up via PHP using the following code from my local virtual host:
$old_path = getcwd();
chdir('/Users/username/kubernetes');
$output = shell_exec('cluster/kube-up.sh');
chdir($old_path);
print_r("<pre>$output</pre>");
I received the following error:
Can't find gcloud in PATH. Do you wish to install the Google Cloud SDK? [Y/n]
I have gcloud available in my bash_profile. I am also running MAMP and have included the path variable in /Applications/MAMP/Library/bin/envvars_* and envvars-std, but I am still getting this prompt. Any ideas?
I managed to bypass this by doing the following:
I created a script file in my local kubernetes directory and inserted the following code in it:
export PATH="/Users/username/google-cloud-sdk/bin:$PATH"
cluster/kube-up.sh
This then ran kube-up.sh from PHP, creating a cluster with the values I had set in kubernetes/cluster/gce/config-default.sh.
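For reference, the PHP call then only needs to invoke the wrapper instead of kube-up.sh directly; a minimal sketch, assuming the script above is saved as kube-up-wrapper.sh in the kubernetes directory and made executable (the filename is illustrative):
$old_path = getcwd();
chdir('/Users/username/kubernetes');
// the wrapper exports PATH so kube-up.sh can find gcloud
$output = shell_exec('./kube-up-wrapper.sh 2>&1');
chdir($old_path);
print_r("<pre>$output</pre>");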