For a project, I have created a project in the Coverity server and two streams (Java and C++) in that project.
I'm running Coverity for the project in Jenkins, and the Coverity report is appended to a mail template.
I also want to include a link to the project on the Coverity server, like:
http://192.168.1.20:8081/defects/index.htm?projectId=10068
I found out that the project is listed only after the Coverity run has finished; only then can I see the project and its project ID on the server.
If I can get the project ID, I can build the project link.
I'm running the command below in a script to export the report to a CSV file by passing the project name.
/opt/coverity/cov-sa-linux64-5.5.3/bin/cov-manage-im --mode defects --show --action Undecided --project Jenkins_Week34_Coverity --host 192.168.1.20 --user admin --password admin123 --port 8081 --fields cid,file >/opt/cov/curr.csv
Similarly, is there any way to get the project ID by passing the project name?
Or do we get the project ID while committing the report to the server?
It's possible to craft a link that uses search parameters. Using your values as an example, the link below should work for the email:
http://192.168.1.20/query/defects.htm?project=Jenkins_Week34_Coverity
Note that the project name is case sensitive.
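As a side note, if the link is being built in the same Jenkins script, here is a minimal sketch (assuming jq is available on the node for URL-encoding; the project name is the one from the question):
PROJECT_NAME="Jenkins_Week34_Coverity"
# URL-encode the project name in case it contains special characters
ENCODED=$(jq -rn --arg v "$PROJECT_NAME" '$v|@uri')
echo "http://192.168.1.20/query/defects.htm?project=${ENCODED}"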
Hi, I found the answer:
The API lists all project names along with the projectKey (project ID):
curl -u username:pw https://your-coverity-server/api/v2/projects
Or you can get a specific project's info:
curl -u username:pw https://your-coverity-server/api/v2/projects/project-name
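As a rough sketch (assuming jq is installed, and that the key sits at projects[0].projectKey in the JSON response, which you should verify against what your server actually returns), the ID for the projectId link could then be extracted like this:
# hypothetical field path; check the actual JSON returned by your server
PROJECT_KEY=$(curl -s -u username:pw "https://your-coverity-server/api/v2/projects/Jenkins_Week34_Coverity" | jq -r '.projects[0].projectKey')
echo "http://192.168.1.20:8081/defects/index.htm?projectId=${PROJECT_KEY}"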
We have a requirement to access a file hosted in our private GitHub repo from our Azure Databricks notebook.
Currently we are doing it with a curl command using a user's Personal Access Token (PAT).
curl -H 'Authorization: token INSERTACCESSTOKENHERE' \
     -H 'Accept: application/vnd.github.v3.raw' \
     -O -L https://api.github.com/repos/*owner*/*repo*/contents/*path*
Is there a way we can avoid using a PAT and use deploy keys or something similar?
Since summer 2021, Databricks has offered Git repos integration.
More info here: https://learn.microsoft.com/en-us/azure/databricks/repos
If you add your file (Excel, JSON, etc.) to the repo, you can then use a relative path to access and read it.
e.g. pd.read_excel("./test_data.xlsx")
Be aware that you need a cluster with Databricks Runtime 8.4+ (or 9.1+?).
You can also check your current working directory by executing os.getcwd().
If you have correctly integrated the repo, the result should be something like:
/Workspace/Repos/george@myemail.com/REPO_FOLDER/analysis
otherwise it will be something like /databricks/driver.
Integrate Git and Azure Databricks.
This documentation shows how to integrate Git and Azure Databricks:
Step 1: Get the raw URL of the file.
Step 2: Use wget to access the file:
wget https://raw.githubusercontent.com/githubtraining/hellogitworld/master/resources/labels.properties
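If you want to avoid a PAT entirely, one possible sketch (assuming a read-only deploy key for the repo is made available to the cluster, e.g. via a Databricks secret written to a temporary file; the paths and placeholders below are hypothetical) is to clone over SSH with that key:
# clone the private repo with the deploy key instead of a PAT
chmod 600 /tmp/deploy_key
GIT_SSH_COMMAND="ssh -i /tmp/deploy_key -o StrictHostKeyChecking=no" \
  git clone --depth 1 git@github.com:*owner*/*repo*.git /tmp/repo
# the file is then available locally, e.g. /tmp/repo/*path*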
I want to update the project settings for an existing Rundeck project, for example adding resources/nodes to the project using the CLI. How do I do that?
Solution:
Below is the Rundeck CLI command to add a resource file in .xml format to an existing Rundeck project:
rd projects configure set -p MG-Test-CLI -- \
  --resources.source.1.config.file="/home/rundeck/iidas/resources.xml" \
  --resources.source.1.config.generateFileAutomatically=true \
  --resources.source.1.config.includeServerNode=true \
  --resources.source.1.type=file
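If the resource file doesn't exist yet, here is a minimal sketch of one in Rundeck's resource-xml format (the node attributes are placeholders; the path matches the command above):
# create a minimal resources.xml with a single example node
cat > /home/rundeck/iidas/resources.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<project>
  <node name="node01" hostname="192.168.1.50" username="rundeck" osFamily="unix" tags="dev"/>
</project>
EOF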
Here are more useful Rundeck CLI project commands:
# create project
rd projects create -p project1
# get project configuration
rd projects configure get -p project1
# Configure nodes from remote URL (GitLab)
rd projects configure update -p project1 -- \
--resources.source.1.type=url \
--resources.source.1.config.url='https://git.i.example.com/api/v4/projects/3/repository/files/project1%2Fdev%2Fnodes.json/raw?ref=master&private_token=1234567890' \
--resources.source.1.config.timeout=10 \
--resources.source.1.config.cache=false
See also the Rundeck CLI documentation and the project configuration parameters.
I'm able to log in to Concourse successfully using
fly -t <target> login -c <concourse-url> -n <team-name> -b
When I run
fly -t <target> teams -d
I see users = local:admin.
I'm not able to see my GitHub account.
I'm not sure what config needs to be modified.
Just ensure the team names are entered correctly, as they are case-sensitive.
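For reference, a sketch of how a GitHub user normally gets added to a team (this assumes the GitHub auth connector is configured on the Concourse web node; the team and user names below are placeholders):
# add a GitHub user alongside the local admin, then re-check the team
fly -t <target> set-team -n <team-name> --local-user admin --github-user <your-github-login>
fly -t <target> teams -d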
I want to run Odoo 9.0 directly from the source code on Windows 10. I have done this:
1. I downloaded the source code from GitHub.
2. I installed all system requirements (Python 2.7, Node.js, etc.).
3. I already have a PostgreSQL database on AWS (RDS).
My problem is that when I run odoo.py and open this URL (http://localhost:8069/web/), it shows an empty screen.
Here is my startup command:
python odoo.py --db_host VALID-URL.us-east-1.rds.amazonaws.com -r USERNAME -w PASSWORD -d DATABASENAME --addons-path=addons
What am I doing wrong?
Just add the option --debug to your command to get debug info.
Alternatively, you will find a file named odoo.conf (v9 to v11) or openerp-server.conf (v7 or lower);
search for debug_mode = false and set it to true.
I think I had this problem and needed to install Node.js and the less package via npm.
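For reference, the usual fix for a blank /web/ page on an Odoo 9 source install is installing the less compiler via npm (a sketch, assuming Node.js is already on the PATH):
# Odoo 9 compiles its web assets with less
npm install -g less less-plugin-clean-css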
Is there a way to get console access to Dokku's PostgreSQL plugin? On Heroku I'd do heroku pg:psql. Is this possible in a Dokku environment and if so how?
There is in fact a way to do this directly with the dokku-pg-plugin.
The command postgresql:restore <db> < dump_file.sql connects to the specified database and restores it with the provided dump file. If you simply omit the dump file part (< dump_file.sql), a psql console session opens.
Since postgresql:restore <db> is semantically not the best way to open a console session, I have opened a pull request to add the command postgresql:console <db>.
So, until my PR is merged, the options for opening a psql console for a database are:
- doing it manually with psql -h 172.17.42.1 -p <port> -U root <db>, with the <port> and password taken from the output of dokku postgresql:info <db>,
- using the semantically incorrect command dokku postgresql:restore <db>, or
- using my forked and patched version of the plugin, which adds the command postgresql:console <db>.
Edit:
The owner of the dokku-pg-plugin has merged my pull request. If you're using this plugin and are looking for a way to access your PostgreSQL console with it, you might want to update it to the latest version. Once you have done that, you can use the command postgresql:console <db> to open a psql session to the specified database.
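In short, using the commands described above (<db>, <port> and the IP are the placeholders/values from this answer):
# with the updated plugin
dokku postgresql:console <db>
# or manually, taking the port and password from:
dokku postgresql:info <db>
psql -h 172.17.42.1 -p <port> -U root <db>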
This worked for me for my Rails app that I'm running on Dokku:
dokku run <app-name> rails db
That brought up the console for the PostgreSQL container I created (via dokku postgresql:create <db>). I couldn't figure out another way to get at the PostgreSQL instance in that container, short of connecting directly to the DB using the connection info/credentials listed when you run:
dokku postgresql:info <db>
I haven't tried that, though I suspect it would work.