Well, I'm kind of new to all the back-end stuff, so pardon me if I ask silly questions or my code makes no sense ;).
What I'm trying to do is transfer data between my API and the database, focusing right now on the get-all and post methods. The data I'm transferring is .als files (Ableton Live Sets).
I have a main.py file where the routes sit, a database.py containing the engine and the functions, and a file_model.py holding the Pydantic model.
I'm able to POST a file and I can indeed see it in my DB: it gets the default Mongo _id (an ObjectId) along with the file data.
When I try the get_all_files_db() function the errors start to show. One problem was the _id, which Python can't serialize properly, so I tried to fix it with some web searching.
It's hard to define my problem because I probably have many, so quoting a single error message wouldn't be very useful.
So this is where you guys come in: your answers could be code recommendations, a link to some docs that may help me understand better, or even different libraries/dependencies, because I've seen that FastAPI plus MongoDB gets annoying when dealing with bigger files rather than regular JSON.
These are my two routes in main.py (I used UploadFile because of the file type):
@app.post("/api/files", response_model=Als)
async def post_file(title: str = Form(...), file: UploadFile = File(...)):
    response = await upload_file(title, file.file)
    if response:
        return response
    raise HTTPException(400, "Something went wrong")

@app.get("/api/files", response_model=List[Als])  # a GET-all returns a list, so List[Als] rather than Als
async def get_all_files():
    response = await get_all_files_db()
    return response
Here are my functions in database.py, along with the database connection:
client = motor.motor_asyncio.AsyncIOMotorClient('localhost', 27017)
db = client['Files']
collection = db['ALS_Files']

async def upload_file(title, file):
    # store the title and the raw bytes under named keys, not as {title: bytes}
    document = {"title": title, "data": file.read()}
    result = await collection.insert_one(document)
    document["_id"] = result.inserted_id
    return document

async def get_all_files_db():
    files = []
    cursor = collection.find({})
    async for document in cursor:
        document["_id"] = str(document["_id"])  # ObjectId is not JSON-serializable
        files.append(document)
    return files  # return the list itself; `return files()` tried to call it
And here is my file_model.py:
class PyObjectId(ObjectId):
    """Custom type for reading MongoDB IDs."""

    @classmethod
    def __get_validators__(cls):
        yield cls.validate

    @classmethod
    def validate(cls, v):
        if not ObjectId.is_valid(v):
            raise ValueError("Invalid object_id")
        return ObjectId(v)

    @classmethod
    def __modify_schema__(cls, field_schema):
        field_schema.update(type="string")


class Als(BaseModel):
    id: PyObjectId = Field(default_factory=PyObjectId, alias="_id")
    title: str

    class Config:
        allow_population_by_field_name = True
        arbitrary_types_allowed = True
        json_encoders = {ObjectId: str}
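The _id headache above boils down to ObjectId not being JSON-serializable. Here is a minimal stdlib-only sketch of the scrubbing pattern (the function name and the json-probe approach are my own, not from any library):

```python
import json

def serialize_ids(doc: dict) -> dict:
    """Replace any value json can't encode (e.g. a Mongo ObjectId) with str(value)."""
    cleaned = {}
    for key, value in doc.items():
        try:
            json.dumps(value)          # probe: is this value already JSON-safe?
            cleaned[key] = value
        except TypeError:
            cleaned[key] = str(value)  # fall back to the string form
    return cleaned
```

With real documents you would call this on each item coming out of the cursor before appending it to the result list.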
I did put the password of that user in the field, and I have tried every possible combination for dbname. I don't know which dbname it is referring to. I have searched many places and didn't get any answers. Can someone please help me configure this?
app.config['DEBUG'] = True
app.config['MONGOALCHEMY_CONNECTION_STRING'] = \
    'mongodb+srv://user:<password>@test.usvae.mongodb.net/<dbname>?retryWrites=true&w=majority'
db = MongoAlchemy(app)
This is my configuration, and this is the error I am getting:
raise ImproperlyConfiguredError("You should provide a database name "
flask_mongoalchemy.ImproperlyConfiguredError: You should provide a database name (the MONGOALCHEMY_DATABASE setting)
Thanks in advance
import pymongo

## DB Connection ##
client = pymongo.MongoClient("mongodb+srv://mongouser:password@<cluster>/<db>?retryWrites=true&w=majority")

## DB Creation ##
db = client["<db>"]

## Collection Creation ##
col1 = db.Users

if client:
    print("connected")
else:
    print("not connected")

## Single Value Insert ##
Users = {"ID": "481292", "Name": "DS"}
x1 = col1.insert_one(Users)
You can try the code above to connect MongoDB Atlas from Flask.
Below is code for using MongoDB Atlas with flask_mongoalchemy:
from flask import Flask
from flask_mongoalchemy import MongoAlchemy

app = Flask(__name__)
DB_URI = 'mongodb+srv://<user>:<password>@<cluster>/<db>?retryWrites=true&w=majority'
app.config['MONGOALCHEMY_DATABASE'] = 'test'
app.config['MONGOALCHEMY_CONNECTION_STRING'] = DB_URI
db = MongoAlchemy(app)

class Users(db.Document):
    name = db.StringField()
    age = db.IntField()
You need to insert a value as well, to avoid an error.
I need some tips for building a REST API over about 35,000 static (non-changing) JSON records.
It's my first time building a REST API seriously, so I need some design-decision advice.
First, I was planning to use Flask to build the API, since I am familiar with it, and MongoDB to store the data. But I've heard that MongoDB is not a good choice for data that does not change.
What I would like to know are:
Which DB is suitable for this kind of data?
Is Flask a good choice if I am expecting many users using the API at the same time?
What are the brief steps for doing this? What I have in my mind right now is something like below:
Steps:
1) Upload my data to DB
2) Create a REST API that helps the user fetch the data
3) Upload the REST API to some server
4) Test with Postman to see if it works
Is my overall thought correct?
Any advice would be great. Thanks in advance.
If you are unsure about what DB to use, I would just go with PostgreSQL. It's scalable, so if you ever need to build on your dataset it will work just fine. In terms of performance, it depends on how many requests it gets, but I'd bet it can handle whatever you throw at it.
Regarding the API: if you are set on Flask, then I recommend the Flask-Restful package. Outline your DB using an ORM in a file called models.py. In a folder called resources, create files that serve as your API resources; an example would be blogposts.py, which would have a GET for all posts or a single post, plus POST, PUT, and DELETE for single posts. Here is something I have for a really lightweight blog, using peewee as the ORM and another package called Flask-HTTPAuth for authentication.
# blogposts.py
import json

from flask import jsonify, Blueprint, abort, make_response
from flask_restful import (Resource, Api, reqparse, inputs, fields,
                           url_for, marshal, marshal_with)

from auth import auth
import models

blogpost_fields = {
    'id': fields.Integer,
    'title': fields.String,
    'content': fields.String,
    'created': fields.DateTime
}

def blogpost_or_404(id):
    try:
        blogpost = models.BlogPost.get(models.BlogPost.id == id)
    except models.BlogPost.DoesNotExist:
        abort(404)
    else:
        return blogpost

class BlogPostList(Resource):
    def __init__(self):
        self.reqparse = reqparse.RequestParser()
        self.reqparse.add_argument(
            'title',
            required=True,
            help='No title provided',
            location=['form', 'json']
        )
        self.reqparse.add_argument(
            'content',
            required=False,
            nullable=True,
            location=['form', 'json'],
            default=''
        )
        super().__init__()

    def get(self):
        blogpost = [marshal(blogpost, blogpost_fields)
                    for blogpost in models.BlogPost.select()]
        return {'BlogPosts': blogpost}

    @marshal_with(blogpost_fields)
    @auth.login_required
    def post(self):
        args = self.reqparse.parse_args()
        blogpost = models.BlogPost.create(**args)
        return (blogpost, 201, {
            'Location': url_for('resources.blogposts.blogpost', id=blogpost.id)
        })

class BlogPost(Resource):
    def __init__(self):
        self.reqparse = reqparse.RequestParser()
        self.reqparse.add_argument(
            'title',
            required=False,
            help='No title provided',
            location=['form', 'json']
        )
        self.reqparse.add_argument(
            'content',
            required=False,
            nullable=True,
            location=['form', 'json'],
            default=''
        )
        super().__init__()

    @marshal_with(blogpost_fields)
    def get(self, id):
        return blogpost_or_404(id)

    @marshal_with(blogpost_fields)
    @auth.login_required
    def put(self, id):
        args = self.reqparse.parse_args()
        try:
            blogpost = models.BlogPost.select().where(
                models.BlogPost.id == id).get()
        except models.BlogPost.DoesNotExist:
            return make_response(json.dumps(
                {'error': 'That blogpost does not exist or is not editable'}
            ), 403)
        else:
            query = blogpost.update(**args).where(models.BlogPost.id == id)
            query.execute()
            blogpost = blogpost_or_404(id)
            return (blogpost, 200, {
                'Location': url_for('resources.blogposts.blogpost', id=id)
            })

    @auth.login_required
    def delete(self, id):
        try:
            blogpost = models.BlogPost.select().where(
                models.BlogPost.id == id).get()
        except models.BlogPost.DoesNotExist:
            return make_response(json.dumps(
                {'error': 'That blogpost does not exist or is not editable'}
            ), 403)
        else:
            query = blogpost.delete().where(models.BlogPost.id == id)
            query.execute()
            return '', 204, {'Location': url_for('resources.blogposts.blogposts')}

blogposts_api = Blueprint('resources.blogposts', __name__)
api = Api(blogposts_api)
api.add_resource(
    BlogPostList,
    '/blogposts',
    endpoint='blogposts'
)
api.add_resource(
    BlogPost,
    '/blogposts/<int:id>',
    endpoint='blogpost'
)
Resource classes have methods named after the HTTP methods, and this is what determines which methods are allowed. For instance, if I sent a DELETE to /blogposts without an ID, it would respond with "method not allowed", since delete is only defined for a single post. Marshaling determines what information is in the response; you define it with blogpost_fields at the top. In the __init__ of each class, we define the request parser, which determines the information the API requires. In this example we only need a title and the post content. In a users resource you would add things like email, username, password, password confirmation, admin status, and so on.
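To make the marshaling idea concrete, here is a tiny stdlib-only sketch of what a whitelist like blogpost_fields achieves (the helper name and the type-based spec are my own simplification, not flask_restful's actual implementation):

```python
# A stripped-down sketch of marshaling: only declared fields survive,
# each coerced to its declared type; everything else is dropped.
blogpost_fields_sketch = {'id': int, 'title': str, 'content': str}

def marshal_sketch(obj, spec):
    return {name: cast(obj[name]) for name, cast in spec.items() if name in obj}

row = {'id': 7, 'title': 'Hello', 'content': 'world', 'internal_flag': True}
print(marshal_sketch(row, blogpost_fields_sketch))
# → {'id': 7, 'title': 'Hello', 'content': 'world'}
```

Note that internal_flag never reaches the client, which is exactly why the response shape is declared once at the top of the resource file.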
# models.py
import datetime

import jwt
from argon2 import PasswordHasher
from peewee import *

import config

DATABASE = PostgresqlDatabase('blogdb', user=config.DB['USER'],
                              password=config.DB['PW'], host=config.DB['HOST'])
HASHER = PasswordHasher()

class User(Model):
    username = CharField(unique=True)
    email = CharField(unique=True)
    password = CharField()

    class Meta:
        database = DATABASE

    @classmethod
    def create_user(cls, username, email, password, **kwargs):
        email = email.lower()
        try:
            cls.select().where(
                (cls.email == email) | (cls.username ** username)
            ).get()
        except cls.DoesNotExist:
            user = cls(username=username, email=email)
            user.password = user.set_password(password)
            user.save()
            return user
        else:
            raise Exception("User with that email or username already exists")

    @staticmethod
    def verify_auth_token(token):
        try:
            payload = jwt.decode(token, config.SECRET_KEY)
            return payload['sub']
        except jwt.ExpiredSignatureError:
            return 'Signature expired. Please log in again.'
        except jwt.InvalidTokenError:
            return 'Invalid token. Please log in again.'

    @staticmethod
    def set_password(password):
        return HASHER.hash(password)

    def verify_password(self, password):
        return HASHER.verify(self.password, password)

    def generate_auth_token(self, id):
        try:
            payload = {
                'exp': datetime.datetime.utcnow() + datetime.timedelta(days=0, seconds=5),
                'iat': datetime.datetime.utcnow(),
                'sub': id
            }
            return jwt.encode(
                payload,
                config.SECRET_KEY,
                algorithm='HS256'
            )
        except Exception as e:
            return e

class BlogPost(Model):
    title = CharField(default='', unique=True)
    content = TextField(default='')
    created = DateTimeField(default=datetime.datetime.now)

    class Meta:
        database = DATABASE

def initialize():
    DATABASE.connect()
    DATABASE.create_tables([User, BlogPost], safe=True)
    DATABASE.close()
# auth.py
from flask import g
from flask_httpauth import HTTPTokenAuth

import models

auth = HTTPTokenAuth(scheme='Bearer')

@auth.verify_token
def verify_token(token):
    user = models.User.verify_auth_token(token)
    if user is not None:
        g.user = user
        return True
    return False
Models is pretty self-explanatory if you've ever worked with an ORM like SQLAlchemy; I would recommend that package, since your dataset is far larger than the one in this example. HTTPAuth lets you decorate your API resource methods with a required authentication method. In my example, logging in generates a JWT, which must be sent with each request as a Bearer token.
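To illustrate the token round trip without a running server, here is a stdlib-only sketch of sign-then-verify (the real code above uses PyJWT and argon2; the secret, field names, and helper functions here are my own):

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"dev-secret"  # assumption: any server-side secret works for the sketch

def sign_token(user_id, ttl=3600):
    # Serialize the claims, then sign them with an HMAC (what JWT does, minus headers).
    payload = json.dumps({"sub": user_id, "exp": int(time.time()) + ttl}).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return base64.urlsafe_b64encode(payload).decode() + "." + sig

def verify_token(token):
    # Split off the signature, recompute it, and reject on any mismatch or expiry.
    body, sig = token.rsplit(".", 1)
    payload = base64.urlsafe_b64decode(body.encode())
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered
    claims = json.loads(payload)
    return claims["sub"] if claims["exp"] > time.time() else None
```

The client would send the returned string as `Authorization: Bearer <token>`, and the verify step is what the @auth.verify_token callback performs on every protected request.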
Once all of that is set up you register your API blueprints in app.py
# app.py
from flask import Flask
# the users_api, blogposts_api and login_api blueprints come from the
# resources package described above

app = Flask(__name__)
app.register_blueprint(users_api, url_prefix='/api/v1')
app.register_blueprint(blogposts_api, url_prefix='/api/v1')
app.register_blueprint(login_api)
That's it!
I am using AVA for testing. I have 2 files. In file1.spec.js, I am creating a user, and once the user is created a userId is generated and returned. I need this userId in file2.spec.js to test some other API calls specific to this user. How can I successfully export the userId created in file1.spec.js and import it into file2.spec.js? Thanks in advance!
I have tried the following:
file1.spec.js:
method: 'POST',
url: '/api/users',
data: setupFixture.postUsersAtLocation1
}).catch(err => { console.log(err.response.data); return err.response; });
if (result.status === 200) {
_int.userId = result.data.userId;
SCENARIO 1:
module.exports = {userId, userId1};
SCENARIO 2:
export {userId1};
export let userId = _int.userId;
file2.spec.js:
import test from 'ava';
import setup from './setup.spec.js';
const {userId, userId1} = setup;
var userIdA = userId;
var userId1A = userId1;
When I run this, it complains that file2.spec.js has an unexpected identifier (test) in import test from 'ava'. If I remove "import setup from './setup.spec.js';", and all after it, it no longer complains about test, but I never get the variables imported, either way.
Each test file is executed in a new worker process. Test files should not depend on another file having been executed first. Instead try and use a different database / table / IDs in each test file, then share setup code (if necessary) through helpers.
It looks like in Vapor 2 you could do something like:
let query = <some fluent query object>
logger?.debug(query)
and it would print out the full SQL statement, but I'm not seeing any documentation of how to do that now in Vapor 3.
How can I see what SQL is being generated by my QueryBuilder?
Thanks to Nick in the comments, who pointed me to the right set of docs. This can be accomplished by using the enableLogging method. So now my configure.swift includes this code:
let dbConfig: PostgreSQLDatabaseConfig
if let url = Environment.get("DATABASE_URL"), let psqlConfig = PostgreSQLDatabaseConfig(url: url, transport: .unverifiedTLS) {
dbConfig = psqlConfig
} else {
dbConfig = ...something for the local db...
}
let postgresql = PostgreSQLDatabase(config: dbConfig)
/// Register the configured SQLite database to the database config.
var databases = DatabasesConfig()
databases.enableLogging(on: .psql)
databases.add(database: postgresql, as: .psql)
services.register(databases)
The important line is the third from the bottom. For a while I was trying to enable debugging on PostgreSQLDatabaseConfig, so to anyone reading this in the future: take note that you enable it on the DatabasesConfig object instead.