Exclude folder from Bullseye code coverage - bullseye

I want to remove some folders from my .cov file after coverage. The script that runs coverage is not in the same folder as src.
I have followed a couple of existing Stack Overflow answers, but they do not seem to work. I have already tried
covselect --file "%COVFILE%" --add \/src/bin/qwerty/
covselect --file "%COVFILE%" --add \../../src/bin/qwerty/
covselect --file "%COVFILE%" --remove ../../src/bin/qwerty/
covselect --file "%COVFILE%" --remove /src/bin/qwerty/
but no luck so far. I cannot find any information in the Bullseye documentation. Any idea how to do it?

I finally found it here: https://www.bullseye.com/help/build-exclude.html. The pattern I ended up using is:
covselect --file "%COVFILE%" --add !**/src/bin/qwerty/
Where ** is a wildcard that matches any number of directory levels.
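If several folders need excluding, the pattern-building can be wrapped in a small script. This is only a sketch: build_exclusion is a hypothetical helper (not part of BullseyeCoverage), src/bin/asdf is a made-up second folder for illustration, and the covselect call is commented out so the snippet runs even where Bullseye is not installed:

```shell
#!/usr/bin/env bash
# build_exclusion is a hypothetical helper: it only assembles the
# "!**/<folder>/" pattern string that covselect --add expects.
build_exclusion() {
  printf '!**/%s/\n' "$1"
}

# src/bin/asdf is an invented example folder.
for folder in src/bin/qwerty src/bin/asdf; do
  pattern=$(build_exclusion "$folder")
  echo "$pattern"
  # covselect --file "$COVFILE" --add "$pattern"   # uncomment where covselect is installed
done
```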


How to import all files from a folder to MongoDB?

I have a folder containing CSV files. I want to upload the data from all the CSVs to MongoDB. I have tried the following command:
for i in *.csv; do mongoimport -d mydatabase -c ${i%.*} --type csv --file $i --headerline ; done
I modified it to suit my scenario as follows:
for i in "C:\Users\lenovo-pc\Desktop\Testing sample csv\*.csv"; do mongoimport -d Better -c TESTCSV --type csv --file $i --headerline ; done
But it gives the error: i was unexpected at this time.
I would like to know how I can upload all the CSVs from a folder at once, rather than one by one. Kindly help.
Try this:
for %i in (*.csv) do mongoimport -d Better -c TESTCSV --type csv --file "%i" --headerline
Make sure you run this from the directory containing the *.csv files. (If you put the line in a .bat file, double the percent signs: %%i.)
I've made this script for Windows that imports all CSVs in the same folder and names each collection with the name of the respective CSV.
You need to copy these lines into a .bat file, then edit the variables MONGO_HOME and db as needed:
TITLE MongoDB CSV Importer
SET "MONGO_HOME=C:\Program Files\MongoDB\Server\3.6"
SET db=datasets
for %%v in (*.csv) do "%MONGO_HOME%\bin\mongoimport.exe" -d %db% -c %%~nv --type CSV --file %%v --headerline
TITLE "Import completed!"
PAUSE
and the same idea works as a Linux shell script (.sh):
db="datasets"
for entry in *".csv"
do
coll=$(echo "$entry" | cut -f 1 -d '.')
echo "$coll"
mongoimport -d "$db" -c "$coll" --type CSV --file "$entry" --headerline
done
echo "end of import."
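A slightly more defensive variant of the loop above uses bash parameter expansion instead of cut, so filenames with inner dots or spaces keep their full base name. This is a dry-run sketch: mongoimport is shadowed by a shell function so the loop logic can be checked without a MongoDB installation, and the sample files are made up; delete the stub line to import for real:

```shell
#!/usr/bin/env bash
# Dry-run stub: shadows the real mongoimport so no MongoDB is needed.
mongoimport() { echo "would import: $*"; }

db="datasets"
demo_dir=$(mktemp -d)                 # throwaway folder with sample CSVs
cd "$demo_dir"
touch "users.csv" "order items.v2.csv"

for entry in *.csv; do
  coll=${entry%.csv}                  # strip only the trailing .csv, keeping inner dots
  mongoimport -d "$db" -c "$coll" --type csv --file "$entry" --headerline
done
```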
Hope this helps

mongoDB mongoimport error .. getting file doesn't exist error

I am a newbie with MongoDB. When I try to mongoimport, I get the error below.
The file is located at c:\mongo\data\db\mongo.csv. Can anyone please help?
C:\mongodb\bin>mongoimport.exe -d test -c foo --file c:\mongo\data\db\mongo.csv --type csv
connected to: 127.0.0.1
file doesn't exist: c:\mongo\data\db\mongo.csv
Either give the path as if you were at the root of the C: drive:
C:\mongodb\bin>mongoimport.exe -d test -c foo --file /mongo/data/db/mongo.csv --type csv
or give the path in quotes (" ") as mentioned by Gianpj:
C:\mongodb\bin>mongoimport.exe -d test -c foo --file "c:\mongo\data\db\mongo.csv" --type csv
Are you sure the file exists? Are there spaces in the file path?
Try with double quotes:
--file "c:\mongo\data\db\mongo.csv"
Lastly, where did you get that .csv file? From mongoexport --csv?
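Quoting matters because both cmd.exe and Unix shells split unquoted arguments on spaces, so a path with a space arrives at mongoimport as several arguments. A minimal bash demonstration of the principle (mongoimport is stubbed by a shell function that just counts its arguments, so nothing needs to be installed):

```shell
#!/usr/bin/env bash
# Stub: reports how many arguments the command actually received.
mongoimport() { echo "argument count: $#"; }

path='c:\mongo data\db\mongo.csv'   # a path containing a space
mongoimport --file $path            # unquoted: the path splits apart
mongoimport --file "$path"          # quoted: the path stays one argument
```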

Cannot import mongodb

I tried mongoimport like this:
mongoimport -d test -c foo importfile.json
mongoimport --host localhost --db local --collection lecturer --type json --file temp.json --headerline --upsert
and I got the same error message both times: "Syntax Error: missing ; before statement (shell):1"
What is wrong with my command, and how do I import if my data is stored in C:\Documents and Settings\User\Desktop? Please help; thanks in advance.
mongoimport is intended to run in command prompt and not in the mongo shell. Try exiting out of the shell and running the command.
One solution is:
First, in cmd, change to the MongoDB bin directory (which contains mongoimport.exe and mongoexport.exe), then type your command.
C:\Program Files\MongoDB\Server\3.2\bin> .\mongoexport.exe -d foo -c bar -o output.json
mongoimport is to be run in the terminal, not inside the mongo shell. To run mongoimport in the terminal, you need to have it installed. On Ubuntu, you can do:
apt-get install mongo-tools
Hope this helps :)
I had the same problem and was able to figure it out after a brief struggle and some googling.
1. Navigate to the bin directory in command prompt
(cd c:..\bin)
2. Run the mongoimport command but you have to specify the full path of your json file.
That solves the problem
Using a CSV file works well:
mongoimport -d mydb -c things --type csv --file locations.csv --headerline --upsert
You can convert your data with MS Excel.
Open the "Mongo/Server/3.4/bin" folder of MongoDB in another command window and try again. It will work.
Open a new terminal or command prompt at the location of the file you want to import and it should work. It will not work in the MongoDB shell.

import data into openshift mongoDb

I created a Java application on OpenShift with the MongoDB cartridge.
My application runs fine, both locally on JBoss AS7 and on OpenShift.
So far so good.
Now I would like to import a CSV into the MongoDB instance on the OpenShift cloud.
The command is fairly simple:
mongoimport -d dbName -c collectionName --type csv data.csv --headerline
This works fine locally, and I know how to connect to the OpenShift shell and the remote mongo db. But my question is: how can I use a locally stored file (data.csv) when executing this command in an ssh shell?
I found this on the OpenShift forum, but I don't really know what this tmp directory is or how to use it.
I work on windows, so I use Cygwin as a shell-substitute.
Thanks for any help
The tmp directory is shorthand for /tmp. On Linux, it's a directory that is cleaned out whenever you restart the computer, so it's a good place for temporary files.
So, you could do something like:
$ rsync data.csv openshiftUsername@openshiftHostname:/tmp
$ ssh openshiftUsername@openshiftHostname
$ mongoimport -d dbName -c collectionName --type csv /tmp/data.csv --headerline
This is what I needed in October 2014:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST --port $OPENSHIFT_MONGODB_DB_PORT -u admin -p 123456789 -d dbName -c users /tmp/db.json
Note that I used a json file instead of csv
When using OpenShift you must use the environment variables to ensure your values are always correct. See the OpenShift documentation to read more about environment variables.
SSH into your openshift server then run (remember to change the bold bits in the command to match your values):
mongoimport --headerline --type csv \
--host $OPENSHIFT_NOSQL_DB_HOST \
--port $OPENSHIFT_NOSQL_DB_PORT \
--db **your db name** \
--collection **your collection name** \
--username $OPENSHIFT_NOSQL_DB_USERNAME \
--password $OPENSHIFT_NOSQL_DB_PASSWORD \
--file ~/**your app name**/data/**your csv file name**
NOTE
When importing CSV files with mongoimport, the data is saved as strings and numbers only. It will not save arrays or objects. If you have arrays or objects to save, you must first convert your CSV file into a proper JSON file and then mongoimport the JSON file.
I installed RockMongo on my OpenShift instance to manage MongoDB.
It's a nice user interface, a bit like phpMyAdmin for MySQL.
For users who wish to use mongorestore, the following worked for me:
First copy your dump using scp to the data dir on openshift:
scp yourfile.bson yourhex#yourappname.rhcloud.com:app-root/data
rhc ssh into your app, cd to the app-root/data folder, then run:
mongorestore --host $OPENSHIFT_MONGODB_DB_HOST \
--port $OPENSHIFT_MONGODB_DB_PORT \
--username $OPENSHIFT_MONGODB_DB_USERNAME \
--password $OPENSHIFT_MONGODB_DB_PASSWORD \
-d yourdb \
-c yourcollection \
yourfilename.bson --drop
Similar to Simon's answer, but this is how I imported .json to the database:
mongoimport --host $OPENSHIFT_MONGODB_DB_HOST -u admin -p 123456 --db dbname --collection grades < grades.json

restoring a dump with mongodb fails

I've dumped a mongodb database with the following mongodump command line
mongodump -h www.myhost.com -u myusername -p mypassword -d mydb > dump.bson
And I'm trying to restore the dump on my local server:
mongorestore -h localhost -d mydb dump.bson
Unfortunately it fails with the following error:
assertion: 10264 invalid object size: 1096040772
Does anyone know what could cause this error?
On both servers mongo's version is 1.8.3
Thanks
Because the first line of output captured from mongodump is "db level locking enabled: 0", you need to strip it:
tail -n+2 dump.bson > dump_fix.bson
mongorestore -h localhost -d mydb dump_fix.bson
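The tail -n+2 trick works because the stray log line sits at the very start of the file; tail -n+2 prints everything from the second line onward. A quick demonstration with a fabricated file (the payload here is plain text for clarity; a real dump's remainder is binary BSON, so if this does not fix it, re-dumping with mongodump's --out option instead of shell redirection is the safer route):

```shell
#!/usr/bin/env bash
# Simulate a dump file whose first line is a stray log message.
printf 'db level locking enabled: 0\npayload line 1\npayload line 2\n' > /tmp/dump_demo.bson

tail -n+2 /tmp/dump_demo.bson > /tmp/dump_demo_fix.bson   # drop the first line only

cat /tmp/dump_demo_fix.bson
```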
Excuse my English :P This happened to me when I exported with mongoexport and then tried to import with mongorestore :D My mistake! I had to use mongoimport.
Remember: mongoexport pairs with mongoimport, and mongodump pairs with mongorestore.
I hope this is useful to someone :P
I also encountered this problem, and finally found that it was caused by using the mongodump command in the wrong way.
Well, use mongorestore instead of mongodump for the restore.
This isn't explained very well anywhere I found, but here is a solution that worked.
I downloaded a .tgz file from mongolab, which contained .bson and .json files in it.
I created a ~/dump folder on my mac.
I copied all those .bson and .json files into the ~/dump folder, so I had ~/dump/users.bson for example.
I ran this command in terminal:
mongorestore -h 127.0.0.1 -d <the_db_name_on_server_this_backup_is_from> ~/dump
It imported in seconds. I'm sure there are other ways/options, but this is what worked for me.