I want to mock MongoDB in order to write some unit tests with unittest for Flask. The documentation on this is huge and I don't really understand how to do it.
I want to test a POST method with the following data:
from unittest import TestCase, main as unittest_main, mock
from bson.objectid import ObjectId
from app import app

sample_user = {
    'Id': ObjectId('5d55cffc4a3d4031f42827a3'),
    'Username': 'LeTest',
    'Mail': 'sendme@gmail.com',
    'password': 'test123',
    'Qrcode': 'TODO'
}
Can you explain how I can test whether sample_user was added to my Mongo collection?
Thanks!
I found the answer:
Here is my code for mocking MongoDB data with Flask:
def test_post_food(self):
    # Mock the `food` collection object defined in api/food.py
    with unittest.mock.patch('api.food.food') as MockFood:
        # Force the return value of food.insert_one(json) to sample_food
        MockFood.insert_one.return_value = sample_food
        with self.client.post("/api/addFood", json=sample_food[0]) as res:
            # Check that food.insert_one(json) was called
            MockFood.insert_one.assert_called()
            self.assertEqual(res.status_code, 200)
            self.assertEqual(res.data, b'{"Response":"Food was added"}\n')
sample_food = [{
    '_id': {
        '$oid': '619e8f45ee462d6d876bbdbc'
    },
    'Utilisateur': '999',
    'Nom': 'Danette Vanille',
    'Marque': 'Danone',
    'Quantite': 4,
    'ingredients': [
        'lait entier',
        'lait écrémé reconstitué à base de lait en poudre',
        'sucre',
        'crème',
        'lait écrémé concentré ou en poudre',
        'épaississants (amidon modifié, carraghénanes)',
        'perméat de petit lait (lactosérum) en poudre',
        'amidon',
        'arôme (lait)',
        'colorant (bêta-carotène)'
    ],
    'Date': '20/12/2021',
    'Valeurs': {
        'Energie': '107 kcal',
        'Matières grasses': '3,0g',
        'Glucides': '17,1g',
        'Proteines': '3g',
        'Sel': '0,14g'
    },
    'Poids': '125g',
    'Lieu': 'Frigo',
    'Category': 'Produit laitiers'
}]
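For context, the test method above lives in a unittest.TestCase where self.client is a Flask test client. A minimal skeleton of that class (the class name is hypothetical; it assumes the app object is importable from app, as in the question):

import unittest
from app import app

class FoodApiTest(unittest.TestCase):
    def setUp(self):
        # Enable test mode and create the Flask test client used as self.client
        app.config['TESTING'] = True
        self.client = app.test_client()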
I have been trying to connect to a cluster in MongoDB Atlas using the mongodb:Client. I am not able to find a connection string that is supported by the Ballerina client, and I could not find any sample code that shows how to do this.
Following is the source code:
import ballerinax/mongodb;

configurable string app = ?;
configurable string pwd = ?;

mongodb:Client mongoCli = check new ({connection: {url: string `mongodb+srv://${app}:${pwd}@fina-a-journey.ugfjnsm.mongodb.net/?retryWrites=true&w=majority`}});

public function main() {
    mongodb:Error? insert = mongoCli->insert({name: "Jhon", age: 16}, "users");
}
Please refer to https://lib.ballerina.io/ballerinax/mongodb/4.0.0/records/ConnectionConfig
You may try this:
mongodb:ConnectionConfig mongoConfig = {
    connection: {
        url: "xxxxx"
    },
    databaseName: "MyDb"
};
mongodb:Client mongoClient = check new (mongoConfig);
The password in the connection string that I had passed to the Client had some special characters that need to be percent-escaped. After escaping them it worked. This is specified here: https://www.mongodb.com/docs/atlas/troubleshoot-connection/#special-characters-in-connection-string-password
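To illustrate the escaping step (a minimal Python sketch, since percent-encoding is language-agnostic; the password and cluster host here are made up):

from urllib.parse import quote_plus

# Hypothetical password containing URI-reserved characters
raw_password = 'p@ss/w:rd'
escaped = quote_plus(raw_password)  # -> 'p%40ss%2Fw%3Ard'
uri = f"mongodb+srv://myuser:{escaped}@cluster0.example.mongodb.net/?retryWrites=true&w=majority"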
You may try this:
mongodb:ConnectionConfig mongoConfig = {
    connection: {url: "mongodb+srv://<username>:<password>@xxxxx.xxxx.mongodb.net/?retryWrites=true&w=majority"},
    databaseName: "xxxxx"
};
I have tried this in a service endpoint.
mongodb:Client mongoClient = check new (mongoConfig);
string collection = "test2";
map<json> doc = { "key1": "Value", "key2": "value2" };
check mongoClient->insert(doc, collection);
If I need a field to take a default value in Django rest_framework, how should I write the entries in JSON format?
models.py
class Employee(models.Model):
    id = models.AutoField(primary_key=True)
    name = models.CharField(max_length=200)
    email = models.CharField(max_length=200, unique=True)
    empid = models.IntegerField(blank=True, null=True)
    phone = models.CharField(max_length=15, unique=True)
Thunder Client request:
{
    "id": 1,
    "name": "abhishek kannu",
    "email": "Abhishek17rd@gmail.com",
    "phone": 7353557213
}
Error:
{
    "empid": [
        "This field is required."
    ]
}
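The model already allows empid to be blank, but the serializer is still rejecting requests that omit it. A minimal sketch of how the serializer could declare a default (the serializer itself is not shown in the question, so its name and the default value of 0 are assumptions):

from rest_framework import serializers
from .models import Employee

class EmployeeSerializer(serializers.ModelSerializer):
    # Assumed: declaring a default makes the field optional in the input JSON
    empid = serializers.IntegerField(default=0)

    class Meta:
        model = Employee
        fields = ['id', 'name', 'email', 'empid', 'phone']

With this, the Thunder Client body can omit "empid" entirely and the default is stored instead.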
pandas supports DataFrame-to-JSON conversion, so a DataFrame can be converted to JSON data as shown below. (1) and (2) are just for reference and have nothing to do with SAPUI5.
1) For example:
import pandas as pd

df = pd.DataFrame([['madrid', 10], ['venice', 20], ['milan', 40], ['las vegas', 35]],
                  columns=['city', 'temp'])
df.to_json(orient="records")
gives:
[{"city":"madrid","temp":10},{"city":"venice","temp":20},{"city":"milan","temp":40},{"city":"las vegas","temp":35}]
and
df.to_json(orient="split")
gives:
{"columns":["city","temp"],"index":[0,1,2,3],"data":[["madrid",10],["venice",20],["milan",40],["las vegas",35]]}
Since we have JSON data, it can be used as input for the plot properties.
2) For the same JSON data I have created an API (running on localhost):
http://127.0.0.1:****/graph
The API in Flask (just for reference):
from flask import Flask
import pandas as pd

app = Flask(__name__)

@app.route('/graph')
def plot():
    df = pd.DataFrame([['madrid', 10], ['venice', 20], ['milan', 40], ['las vegas', 35]],
                      columns=['city', 'temp'])
    jsondata = df.to_json(orient='records')
    return jsondata

if __name__ == '__main__':
    app.run()
Postman result:
[
    {
        "city": "madrid",
        "temp": 10
    },
    {
        "city": "venice",
        "temp": 20
    },
    {
        "city": "milan",
        "temp": 40
    },
    {
        "city": "las vegas",
        "temp": 35
    }
]
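One practical note for consuming this API from a browser-based UI5 app: if the SAPUI5 application is served from a different origin than the Flask server, the browser will block the request unless CORS is enabled. A sketch using the flask-cors package (an addition of mine, not part of the original code):

from flask import Flask
from flask_cors import CORS  # pip install flask-cors

app = Flask(__name__)
CORS(app)  # allow cross-origin requests, e.g. from the UI5 dev server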
3) How can I make use of this sample API to fetch the data and then plot a sample graph of city vs. temp using SAPUI5?
I am looking for an example of how to do this, or any help on how to consume APIs in SAPUI5.
I'm unable to get a Spring Cloud-based AWS Lambda function with an SQS message trigger to work. I'm using the Spring Cloud Function AWS adapter version 2.0.1.RELEASE and attempting to deploy to the AWS eu-west-2 region.
My SpringBootRequestHandler is defined as follows:
import org.springframework.cloud.function.adapter.aws.SpringBootRequestHandler;
import com.amazonaws.services.lambda.runtime.events.SQSEvent;
public class ReplicationHandler extends SpringBootRequestHandler<SQSEvent, String>{
}
My @Bean function looks as follows:
@Bean
public Function<SQSEvent, String> handleEvent() {
    return value -> processEvent((SQSEvent) value);
}
I feed this with the following test event:
{
    "Records": [
        {
            "messageId": "02a4e04b-a1d2-417a-b073-56123be35ac6",
            "receiptHandle": "AQEB0fsSc76vU9Y6vQEz",
            "body": "hello world",
            "attributes": {
                "ApproximateReceiveCount": "1",
                "SentTimestamp": "1553860061037",
                "SenderId": "AIDAIVEA3AGEU7NF6DRAG",
                "ApproximateFirstReceiveTimestamp": "1553860061042"
            },
            "messageAttributes": {},
            "md5OfBody": "a4d19d8b1019e01bb875eea6232bf2f1",
            "eventSource": "aws:sqs",
            "eventSourceARN": "arn:aws:sqs:eu-west-2:XXXXX:YYYYY",
            "awsRegion": "eu-west-2"
        }
    ]
}
When I run this, I get the following error:
{
    "errorMessage": "reactor.core.publisher.FluxJust cannot be cast to com.amazonaws.services.lambda.runtime.events.SQSEvent",
    "errorType": "java.lang.ClassCastException",
    "stackTrace": [
        "org.springframework.cloud.function.adapter.aws.SpringFunctionInitializer.apply(SpringFunctionInitializer.java:132)",
        "org.springframework.cloud.function.adapter.aws.SpringBootRequestHandler.handleRequest(SpringBootRequestHandler.java:48)"
    ]
}
Anyone have any suggestions around what's going wrong here? Alternatively, if there are any working samples on the web for my exact scenario, that would be good as well.
I have been trying to send data from Sensu to InfluxDB.
I created a DB for Sensu, and also updated it to listen on port 8090 in my case.
User login looks fine on InfluxDB.
I configured almost everything similar to this link:
https://libraries.io/github/nohtyp/sensu-influxdb
I am not getting any success, and I am not seeing any data in the database.
Has anyone tried this?
You can also use a custom script in case the default configuration is not working. It gives you the option to write only the data you want to save. Before running the script, install the InfluxDB client library (sudo apt-get install python-influxdb):
from influxdb import InfluxDBClient
import fileinput
import json
import datetime

# Read the Sensu event JSON from stdin (the handler pipes it in)
json_body = ""
for line in fileinput.input():
    json_body = json_body + line.replace('\n', ' ')
json_body = json.loads(json_body)

# Extract the fields to store
alert_in_ip = str(json_body["client"]["name"])
alert_in_ip = 'ip-' + alert_in_ip.replace('.', '-')
alert_type = json_body["check"]["name"]
status = str(json_body["check"]["status"])
time_stamp = datetime.datetime.fromtimestamp(int(json_body["timestamp"])).strftime('%Y-%m-%d %H:%M:%S')

# Build the InfluxDB point and write it
json_body = [{
    "measurement": alert_type,
    "tags": {
        "host": alert_in_ip
    },
    "time": time_stamp,
    "fields": {
        "value": int(status)
    }
}]

client = InfluxDBClient('localhost', 8086, 'root', 'root', 'sensu')
client.write_points(json_body)
And call the above script from your handler.
For example:
"sendinflux": {
    "type": "pipe",
    "command": "echo $(cat) | /usr/bin/python /home/ubuntu/save_to_influx.py",
    "severities": ["critical", "unknown"]
}
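Before wiring it into Sensu, you can exercise the script by piping in a sample event containing the fields it reads (all values here are made up):

echo '{"client": {"name": "10.0.0.1"}, "check": {"name": "check-cpu", "status": 2}, "timestamp": 1640000000}' | /usr/bin/python /home/ubuntu/save_to_influx.py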
Hope it helps!!