According to the pymongo documentation,
PyMongo is thread-safe and even provides built-in connection pooling for threaded applications.
I normally initialize my MongoDB connection like this:
import pymongo
db = pymongo.Connection()['mydb']
and then I can just use it like db.users.find({'name':..})...
Does this mean that I can actually place those two lines in lib/app_globals.py, like this:
class Globals(object):
    def __init__(self, config):
        self.cache = CacheManager(**parse_cache_config_options(config))
        import pymongo
        self.db_conn = pymongo.Connection()
        self.db = self.db_conn['simplesite']
and then in my base controller:
class BaseController(WSGIController):
    def __call__(self, environ, start_response):
        """Invoke the Controller"""
        # WSGIController.__call__ dispatches to the Controller method
        # the request is routed to. This routing information is
        # available in environ['pylons.routes_dict']
        ret = WSGIController.__call__(self, environ, start_response)
        # Don't forget to release the thread for mongodb
        app_globals.db_conn.end_request()
        return ret
And then start using app_globals' db attribute throughout my controllers?
I hope it is really that easy.
Ben Bangert, one of the authors of Pylons, has written his blog engine with MongoDB.
You can browse its source code online.
I can use MongoDB with FastAPI either:
- with a global motor.motor_asyncio.AsyncIOMotorClient object, or
- by creating one during the startup event, per this SO answer which refers to this "Real World Example".
However, I also want to use fastapi-users, since it works nicely with MongoDB out of the box. The downside is that it seems to work only with the first method of handling my DB client connection (i.e., the global one). The reason is that in order to configure fastapi-users, I have to have an active MongoDB client connection just so I can make the db object as shown below, and I need that db to then make the MongoDBUserDatabase object required by fastapi-users:
# main.py
app = FastAPI()
# Create global MongoDB connection
DATABASE_URL = "mongodb://user:password@localhost/auth_db"
client = motor.motor_asyncio.AsyncIOMotorClient(DATABASE_URL, uuidRepresentation="standard")
db = client["my_db"]
# Set up fastapi_users
user_db = MongoDBUserDatabase(UserDB, db["users"])
cookie_authentication = CookieAuthentication(secret='lame secret', lifetime_seconds=3600, name='cookiemonster')
fastapi_users = FastAPIUsers(
    user_db,
    [cookie_authentication],
    User,
    UserCreate,
    UserUpdate,
    UserDB,
)
After that point in the code, I can import the fastapi_users Routers. However, if I want to break up my project into FastAPI Routers of my own, I'm hosed because:
If I move the client creation to another module to be imported into both my app and my routers, then I have different clients in different event loops and get errors like RuntimeError: Task <Task pending name='Task-4' coro=<RequestResponseCycle.run_asgi() running at /usr/local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py:389> cb=[set.discard()]> got Future <Future pending cb=[_chain_future.<locals>._call_check_cancel() at /usr/local/lib/python3.8/asyncio/futures.py:360]> attached to a different loop (touched on in this SO question)
If I use the solution from the "Real World Example", then I get stuck on where to build my fastapi_users object: I can't do it in main.py because there's no db object yet.
I considered creating the MongoDBUserDatabase object as part of the startup event code (i.e., within async def connect_to_mongo() from the Real World Example), but I haven't been able to get that to work either.
How can I either:
- make a global MongoDB client and FastAPI-Users object in a way that can be shared among my main app and several routers without "attached to a different loop" errors, or
- create fancy wrapper classes and functions to set up FastAPI Users with the startup trigger?
I have run into the exact same dilemma; it almost seems like a design flaw. I don't think my solution is complete or correct, but I figured I'd post it in case it inspires any ideas, because I'm stumped.
I followed this full MongoDB example and named the file main.py.
At this point my app does not work. The server starts up, but any attempt to query the DB results in the aforementioned "attached to a different loop" error.
Looking for guidance, I stumbled upon the same "real world" example.
In main.py I added the startup and shutdown event handlers:
# Event handlers
app.add_event_handler("startup", create_start_app_handler(app=app))
app.add_event_handler("shutdown", create_stop_app_handler(app=app))
In dlw_api.db.events.py I have this:
import logging

from dlw_api.user import UserDB
from fastapi import FastAPI
from fastapi_users.db.mongodb import MongoDBUserDatabase
from motor.motor_asyncio import AsyncIOMotorClient

LOG = logging.getLogger(__name__)

DB_NAME = "dlwLocal"
USERS_COLLECTION = "users"
DATABASE_URI = "mongodb://dlw-mongodb:27017"  # protocol://container_name:port

_client: AsyncIOMotorClient = None
_users_db: MongoDBUserDatabase = None


def get_users_db() -> MongoDBUserDatabase:
    return _users_db


async def connect_to_db(app: FastAPI) -> None:
    # keep the client module-global so close_db_connection() can close it
    global _client, _users_db
    # logger.info("Connecting to {0}", repr(DATABASE_URL))
    _client = AsyncIOMotorClient(DATABASE_URI)
    db = _client[DB_NAME]
    collection = db[USERS_COLLECTION]
    _users_db = MongoDBUserDatabase(UserDB, collection)
    LOG.info(f"Connected to {DATABASE_URI}")


async def close_db_connection(app: FastAPI) -> None:
    _client.close()
    LOG.info("Connection closed")
And dlw_api.events.py:
from typing import Callable

from fastapi import FastAPI

from dlw_api.db.events import close_db_connection, connect_to_db, get_users_db
from dlw_api.user import configure_user_auth_routes
from fastapi_users.authentication import CookieAuthentication

COOKIE_SECRET = "THIS_NEEDS_TO_BE_SET_CORRECTLY"  # TODO: <--|
COOKIE_LIFETIME_SECONDS: int = 3_600
COOKIE_NAME = "c-is-for-cookie"

# Auth stuff:
_cookie_authentication = CookieAuthentication(
    secret=COOKIE_SECRET,
    lifetime_seconds=COOKIE_LIFETIME_SECONDS,
    name=COOKIE_NAME,
)
auth_backends = [
    _cookie_authentication,
]


def create_start_app_handler(app: FastAPI) -> Callable:
    async def start_app() -> None:
        await connect_to_db(app)
        configure_user_auth_routes(
            app=app,
            auth_backends=auth_backends,
            user_db=get_users_db(),
            secret=COOKIE_SECRET,
        )

    return start_app


def create_stop_app_handler(app: FastAPI) -> Callable:
    async def stop_app() -> None:
        await close_db_connection(app)

    return stop_app
This doesn't feel correct to me. Does this mean that all routes that use Depends for user auth have to be registered in the server startup event handler?
The author (frankie567) of fastapi-users created a repl.it showing a solution of sorts. My discussion about this solution may provide more context, but the key parts of the solution are:
1. Don't bother using the FastAPI startup trigger along with Depends for your MongoDB connectivity management. Instead, create a separate file (e.g. db.py) to create your DB connection and client object. Import this db object wherever it is needed, such as in your Routers, and use it as a global.
2. Also create a separate users.py to do two things:
   - Create the globally used fastapi_users = FastAPIUsers(...) object for use with other Routers to handle authorization.
   - Create a fastapi.APIRouter() object and attach all the fastapi-users routers to it (router.include_router(...)).
3. In all your other Routers, import both db and fastapi_users from the above as needed.
4. Key: split your main code up into
   - a main.py which only imports uvicorn and serves app:app, and
   - an app.py which has your main FastAPI object (i.e. app) and which then attaches all your Routers, including the one from users.py with all the fastapi-users routers attached to it.
By splitting up the code as in point 4, you avoid the "attached to a different loop" error. A rough sketch of this layout follows below.
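Here is a rough sketch of that layout, reusing the same older fastapi-users calls from the question; the module names (db.py, users.py, app.py, main.py), the models import, the secret, and the router prefixes are assumptions for illustration, not the actual repl.it code:
# db.py -- one module-level Motor client that everything imports
import motor.motor_asyncio

DATABASE_URL = "mongodb://user:password@localhost/auth_db"
client = motor.motor_asyncio.AsyncIOMotorClient(
    DATABASE_URL, uuidRepresentation="standard"
)
db = client["my_db"]


# users.py -- fastapi-users objects plus one APIRouter bundling its routes
from fastapi import APIRouter
from fastapi_users import FastAPIUsers
from fastapi_users.authentication import CookieAuthentication
from fastapi_users.db.mongodb import MongoDBUserDatabase

from db import db
from models import User, UserCreate, UserUpdate, UserDB  # hypothetical models module

user_db = MongoDBUserDatabase(UserDB, db["users"])
cookie_authentication = CookieAuthentication(
    secret="change-me", lifetime_seconds=3600, name="cookiemonster"
)
fastapi_users = FastAPIUsers(
    user_db, [cookie_authentication], User, UserCreate, UserUpdate, UserDB
)

router = APIRouter()
router.include_router(fastapi_users.get_auth_router(cookie_authentication), prefix="/auth")
router.include_router(fastapi_users.get_register_router(), prefix="/auth")


# app.py -- the FastAPI object; attaches your own routers and the users router
from fastapi import FastAPI

import users

app = FastAPI()
app.include_router(users.router)


# main.py -- only serves app:app
import uvicorn

if __name__ == "__main__":
    uvicorn.run("app:app", host="0.0.0.0", port=8000)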
I faced a similar issue, and all I had to do to get motor and FastAPI running in the same loop was this:
client = AsyncIOMotorClient()
client.get_io_loop = asyncio.get_event_loop
I did not set on_startup or anything of the sort.
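For context, here is a minimal sketch of how that patch can sit next to a module-level client; the connection URL, database name, and route are placeholders:
import asyncio

from fastapi import FastAPI
from motor.motor_asyncio import AsyncIOMotorClient

# Patch get_io_loop so motor resolves the event loop when it is actually
# used (inside a request) rather than pinning the import-time loop.
client = AsyncIOMotorClient("mongodb://localhost:27017")
client.get_io_loop = asyncio.get_event_loop

db = client["my_db"]
app = FastAPI()


@app.get("/users/count")
async def count_users():
    # Runs on the loop uvicorn started; no "attached to a different loop" error.
    return {"count": await db["users"].count_documents({})}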
Is there a way to register databases in tortoise-orm from my Sanic app other than calling Tortoise.init?
from tortoise import Tortoise
await Tortoise.init(
db_url='sqlite://db.sqlite3',
modules={'models': ['app.models']}
)
# Generate the schema
await Tortoise.generate_schemas()
Sanic maintainer here.
Another answer offers the suggestion of using tortoise.contrib.sanic.register_tortoise that uses before_server_start and after_server_stop listeners.
I want to add a caveat to that. If you are using Sanic in ASGI mode, then you really should be using the other listeners: after_server_start and before_server_stop.
This is because there is not really a "before" the server starts or an "after" the server stops when the server itself runs outside of Sanic. Therefore, if you implement the suggested solution as adopted by Tortoise while in ASGI mode, you will receive warnings in your logs every time you spin up a server. It is still supported, but it might be an annoyance.
In such case:
@app.listener('after_server_start')
async def setup_db(app, loop):
    ...


@app.listener('before_server_stop')
async def close_db(app, loop):
    ...
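For the Tortoise case specifically, a rough sketch of what those listener bodies might contain when wiring things up manually; the app name, db_url, and models module are assumptions carried over from the question:
from sanic import Sanic
from tortoise import Tortoise

app = Sanic("my_app")  # hypothetical app name


@app.listener("after_server_start")
async def setup_db(app, loop):
    # Fires once the loop is running, so it also behaves in ASGI mode.
    await Tortoise.init(
        db_url="sqlite://db.sqlite3",          # from the question
        modules={"models": ["app.models"]},    # from the question
    )


@app.listener("before_server_stop")
async def close_db(app, loop):
    await Tortoise.close_connections()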
Yes, you can use register_tortoise, available from tortoise.contrib.sanic.
It registers before_server_start and after_server_stop hooks to set up and tear down Tortoise ORM inside a Sanic web server. Check out this Sanic integration example from Tortoise ORM.
You can use it like this:
from sanic import Sanic, response
from models import Users
from tortoise.contrib.sanic import register_tortoise

app = Sanic(__name__)


@app.route("/")
async def list_all(request):
    users = await Users.all()
    return response.json({"users": [str(user) for user in users]})


register_tortoise(
    app, db_url="sqlite://:memory:", modules={"models": ["models"]}, generate_schemas=True
)

if __name__ == "__main__":
    app.run(port=5000)
models.py
from tortoise import Model, fields


class Users(Model):
    id = fields.IntField(pk=True)
    name = fields.CharField(50)

    def __str__(self):
        return f"User {self.id}: {self.name}"
I've started a new project and I want to make Celery save results to several MongoDB collections instead of one. Is there a way to do that through configs or do I need to extend Celery and Kombu to achieve that?
You don't need to modify Celery; you can extend it. That's exactly what I did for one internal project. I didn't want to touch the standard results backend (Redis in my case), but I also wanted to store the tasks' state and results in MongoDB for good, while enhancing the state/results at the same time.
I ended up creating a little library with a class called TaskTracker that uses Celery's signals machinery to achieve the goal. The key parts of the implementation look like this:
import datetime

from celery import signals, states
from celery.exceptions import ImproperlyConfigured
from pymongo import MongoClient, ReturnDocument


class TaskTracker(object):
    """Track task processing and store the state in MongoDB."""

    def __init__(self, app):
        self.config = app.conf.get('task_tracker')
        if not self.config:
            raise ImproperlyConfigured('Task tracker configuration missing')

        self.tasks = set()
        self._mongo = None

        self._connect_signals()

    @property
    def mongo(self):
        # create client on first use to avoid 'MongoClient opened before fork.'
        # warning
        if not self._mongo:
            self._mongo = self._connect_to_mongodb()
        return self._mongo

    def _connect_to_mongodb(self):
        client = MongoClient(self.config['mongodb']['uri'])
        # check connection / error handling
        # ...
        return client

    def _connect_signals(self):
        signals.task_received.connect(self._on_task_received)
        signals.task_prerun.connect(self._on_task_prerun)
        signals.task_retry.connect(self._on_task_retry)
        signals.task_revoked.connect(self._on_task_revoked)
        signals.task_success.connect(self._on_task_success)
        signals.task_failure.connect(self._on_task_failure)

    def _on_task_received(self, sender, request, **other_kwargs):
        if request.name not in self.tasks:
            return

        collection = self.mongo \
            .get_database(self.config['mongodb']['database']) \
            .get_collection(self.config['mongodb']['collection'])
        collection.find_one_and_update(
            {'_id': request.id},
            {
                '$setOnInsert': {
                    'name': request.name,
                    'args': request.args,
                    'kwargs': request.kwargs,
                    'date_received': datetime.datetime.utcnow(),
                    'job_id': request.message.headers.get('job_id')
                },
                '$set': {
                    'status': states.RECEIVED,
                    'root_id': request.root_id,
                    'parent_id': request.parent_id
                },
                '$push': {
                    'status_history': {
                        'date': datetime.datetime.utcnow(),
                        'status': states.RECEIVED
                    }
                }
            },
            upsert=True,
            return_document=ReturnDocument.AFTER)

    # similarly for other signals...
    def _on_task_prerun(self, sender, task_id, task, args, kwargs,
                        **other_kwargs):
        ...

    def _on_task_retry(self, sender, request, reason, einfo, **other_kwargs):
        ...

    # ...

    def track(self, task):
        """Set up tracking for given task."""
        # accept either task name or task instance (for use as a decorator)
        if isinstance(task, str):
            self.tasks.add(task)
        else:
            self.tasks.add(task.name)
        return task
Then you need to provide the configuration for MongoDB. I use a YAML configuration file for Celery, so it looks like this:
# standard Celery settings...
# ...
task_tracker:
  # MongoDB database for storing task state and results
  mongodb:
    uri: "\
      mongodb://myuser:mypassword@\
      mymongo.mydomain.com:27017/?\
      replicaSet=myreplica&tls=true&connectTimeoutMS=5000&\
      w=1&wtimeoutMS=3000&readPreference=primaryPreferred&maxStalenessSeconds=-1&\
      authSource=mydatabase&authMechanism=SCRAM-SHA-1"
    database: 'mydatabase'
    collection: 'tasks'
In your tasks module, you just create the class instance providing your Celery app and decorate your tasks:
import os

from celery import Celery
import yaml

from celery_common.tracking import TaskTracker  # my custom utils library

config_file = os.environ.get('CONFIG_FILE', default='/srv/celery/config.yaml')
with open(config_file) as f:
    config = yaml.safe_load(f) or {}

app = Celery(__name__)
app.conf.update(config)

tracker = TaskTracker(app)


@tracker.track
@app.task(name='mytask')
def mytask(myparam1, myparam2, *args, **kwargs):
    pass
Now your tasks' state and results are going to be tracked in MongoDB, separate from the standard results backend. If you need to store them in multiple databases or collections, you can adjust it a bit: create multiple TaskTracker instances and stack multiple decorators on your tasks, as sketched below.
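For example, a rough sketch of how two trackers pointed at different collections might be stacked; note that the config_key parameter and the task_tracker_audit config block are hypothetical adjustments, since the TaskTracker.__init__ shown above reads only the 'task_tracker' key:
# Sketch only: assumes TaskTracker.__init__ was adjusted to accept the name
# of the Celery config key to read, so each instance can target its own
# MongoDB database/collection.
results_tracker = TaskTracker(app, config_key='task_tracker')
audit_tracker = TaskTracker(app, config_key='task_tracker_audit')  # hypothetical second config block


@results_tracker.track
@audit_tracker.track
@app.task(name='mytask')
def mytask(myparam1, myparam2, *args, **kwargs):
    pass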
Celery is licensed under the BSD License. The source code is at https://github.com/celery/celery
Is it possible to extend Celery?
Yes, of course. This is part of the freedom granted by open-source licenses.
I was wondering whether I can make Celery save results to several MongoDB collections instead of one?
So you download the source code and take the necessary time and effort to study it and modify it.
Read about forking in software development. Consider proposing your code improvements upstream (on GitHub, with a pull request).
Could you please tell me how you are fetching specific details from the tracker store? I am elaborating on my doubt below.
In my run_app.py (socketIO class) I have used the Mongo tracker store like this:
db = MongoTrackerStore(domain="d.yml", host="host ip", db="xyz", username="x", password="x", collection="x", event_broker=None)
agent = Agent.load('models/dialogue', interpreter='models/current/nlu', action_endpoint=action_endpoint, tracker_store=db)
Now I want to fetch some data like db.sender_id or db.event. The reason for doing this is to store it column-wise in my MongoDB. Please help me solve this problem.
This information should already be stored in your MongoDB, so there should be no extra need for you to store it.
Maybe see the documentation at https://rasa.com/docs/core/tracker_stores/ and make sure your endpoints.yml file includes the correct information:
tracker_store:
  store_type: mongod
  url: <url to your mongo instance, e.g. mongodb://localhost:27017>
  db: <name of the db within your mongo instance, e.g. rasa>
  username: <username used for authentication>
  password: <password used for authentication>
  auth_source: <database name associated with the user's credentials>
For information on how to fetch specific details from your MongoDB, have a look at the MongoDB docs: https://docs.mongodb.com/manual/reference/method/db.collection.find/.
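As a rough illustration of the latter with plain PyMongo, assuming the db name 'xyz' from your snippet, Rasa's default 'conversations' collection, and that the stored documents carry the serialized tracker's sender_id and events fields:
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
collection = client["xyz"]["conversations"]  # 'conversations' is the MongoTrackerStore default

# Project only the fields you care about; one document per conversation.
for doc in collection.find({}, {"sender_id": 1, "events": 1, "_id": 0}):
    print(doc["sender_id"], len(doc.get("events", [])))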
Look at this example. I am using PyMongo to connect to MongoDB; try to understand my code:
from typing import Any, Text, Dict, List
from pymongo.database import Database
from pymongo import MongoClient
from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher
import pymongo

# url="http://localhost:3000/api"
client = pymongo.MongoClient("localhost", 27017)
db = client.sample


class mercdesCarAction(Action):
    def name(self):
        return "mercdesCarAction"

    def run(self, dispatcher, tracker, domain):
        res = db.datas.find({'action': 'mercdesCarAction'})
        for i in res:
            dispatcher.utter_button_message(i['text'], i['buttons'])
        return []
I am running a play framework website that uses squeryl and mysql database.
I need Squeryl to send all read queries to the slave and all write queries to the master.
How can I achieve this, either via Squeryl or via the JDBC connector itself?
Many thanks,
I don't tend to use MySQL myself, but here's an idea:
Based on the documentation here, the MySQL JDBC driver will round-robin amongst the slaves if the readOnly attribute is properly set on the Connection. In order to retrieve and change the current Connection, you'll want to use code like:
transaction {
  val conn = Session.currentSession.connection
  conn.setReadOnly(true)
  // Your code here
}
Even better, you can create your own readOnlyTransaction method:
def readOnlyTransaction(f: => Unit) = {
  transaction {
    val conn = Session.currentSession.connection
    val orig = conn.isReadOnly()
    conn.setReadOnly(true)
    f
    conn.setReadOnly(orig)
  }
}
Then use it like:
readOnlyTransaction {
  // Your code here
}
You'll probably want to clean that up a bit so the default readOnly state is reset if an exception occurs, but you get the general idea.