Local variable referenced before assignment in a class, with Python and discord.py - python-3.7

I'm having some trouble making a cog with the discord.py rewrite branch in Python.
I'm trying to make a command that starts a connection to a database using mysql-connector and creates a simple table. The problem is that when I define a cursor variable as stated in the official MySQL docs, I get an error:
"local variable 'cnx' referenced before assignment"
This is the code:
import discord
from discord.ext import commands
import json
import asyncio
import mysql.connector
from mysql.connector import errorcode

with open("config.json") as configfile:
    config = json.load(configfile)

class testcog:
    def __init__(self, client):
        self.client = client

    @commands.command()
    async def dbconnect(self, ctx):
        await ctx.message.author.send('I\'m connecting to the database, please be patient.')
        try:
            cnx = mysql.connector.connect(user=config['sqlconfig']['user'],
                                          password=config['sqlconfig']['password'],
                                          host=config['sqlconfig']['host'],
                                          database=config['sqlconfig']['database'])
        except mysql.connector.Error as err:
            if err.errno == errorcode.ER_ACCESS_DENIED_ERROR:
                print("Something is wrong with your user name or password")
            elif err.errno == errorcode.ER_BAD_DB_ERROR:
                print("Database does not exist")
            else:
                print(err)
        else:
            cnx.close()

        cursor = cnx.cursor()

        TABLES = {}
        TABLES['employee'] = (
            "CREATE TABLE `employee` ("
            "  `emp_no` int(11) NOT NULL AUTO_INCREMENT,"
            "  `birth_date` date NOT NULL,"
            "  `first_name` varchar(14) NOT NULL,"
            "  `last_name` varchar(16) NOT NULL,"
            "  `gender` enum('M','F') NOT NULL,"
            "  `hire_date` date NOT NULL,"
            "  PRIMARY KEY (`emp_no`)"
            ") ENGINE=InnoDB")

        for table_name in TABLES:
            table_description = TABLES[table_name]
            try:
                print("Creating table {}: ".format(table_name), end='')
                cursor.execute(table_description)
            except mysql.connector.Error as err:
                if err.errno == errorcode.ER_TABLE_EXISTS_ERROR:
                    print("already exists.")
                else:
                    print(err.msg)
            else:
                print("OK")

        cursor.close()
        cnx.close()

def setup(client):
    client.add_cog(testcog(client))
The table and the code to create it were copied directly from the official docs.
The piece of code that gives me the error is cursor = cnx.cursor(), just before the TABLES dictionary is created.
I don't understand what I'm doing wrong; help is much appreciated.
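The mechanism behind the error can be reproduced with a few lines of plain Python (hypothetical names, but the same control flow as the question's `try`/`except`/`else`): `cnx` is only bound when the `try` body succeeds, so code placed after the whole `try` statement touches an unbound name whenever the `except` path ran.

```python
def connect_and_query(fail):
    try:
        if fail:
            raise OSError("connection refused")
        cnx = "pretend-connection"   # only bound if no exception occurred
    except OSError as err:
        print("handled:", err)
    # This line runs regardless of success, but on failure `cnx` was
    # never assigned, so Python raises the "referenced before
    # assignment" error here:
    return cnx.upper()

try:
    connect_and_query(fail=True)
except UnboundLocalError as e:
    print(type(e).__name__)   # UnboundLocalError
```

The fix is to make sure the code that uses the connection only runs on the success path (for example, inside the `else` branch) or to bail out of the function after handling the exception.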

I think I can provide some help for you!
When working in a cog file, you need to inherit commands.Cog in your main class. In addition, you should open and close your JSON file asynchronously.
We use async with discord.py so that if multiple people use your commands, the bot won't get backed up (it lets the bot do multiple things at one time). There is an async library for MySQL, and there are async libraries for opening JSON files, so let's look into using them.
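To make the concurrency point concrete, here is a small stdlib-only illustration (not discord.py-specific; the names are made up): two slow operations awaited together overlap instead of queuing, which is exactly why a blocking DB call would stall every other command.

```python
import asyncio
import time

async def command(name, delay):
    await asyncio.sleep(delay)  # stands in for a slow DB query or HTTP call
    return name

async def main():
    start = time.perf_counter()
    # Both "commands" wait concurrently, like two users invoking
    # the bot at the same time.
    results = await asyncio.gather(command("user1", 0.2),
                                   command("user2", 0.2))
    elapsed = time.perf_counter() - start
    return results, elapsed

results, elapsed = asyncio.run(main())
print(results)                         # ['user1', 'user2']
print(f"total wait: {elapsed:.2f}s")   # ~0.2s, not 0.4s: the waits overlapped
```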
You can check out the aiomysql documentation here: https://aiomysql.readthedocs.io/en/latest/
Let's work on setting up your problem. First we need to make sure our bot is set up for our db. We set up something called a "pool": a set of reusable database connections that is created once and handed out as commands need them.
I'm going to show the file structure I use in this example:
main.py
/cogs
    testcog.py
# When creating our bot, we want to set up our db (database) connection, so we can reference it later
from discord.ext import commands
import discord
import aiomysql
import asyncio
import aiofiles, json

loop = asyncio.get_event_loop()
bot = commands.Bot(command_prefix="!", intents=discord.Intents.all())

@bot.event
async def on_ready():
    config = json.loads(await (await aiofiles.open("/home/pi/Desktop/Experimental/prestagingapi.json")).read())
    bot.pool = await aiomysql.create_pool(host=config['sqlconfig']['host'], port=0000,
                                          user=config['sqlconfig']['user'],
                                          password=config['sqlconfig']['password'],
                                          db=config['sqlconfig']['database'], loop=loop)
    print("Bot is online!")

# We need to load our cogs and set up our db loop to reference it later
initial_extension = (
    "cogs.testcog",
)

for extension in initial_extension:
    bot.load_extension(extension)

bot.run("YOUR_TOKEN", reconnect=True)
Now we can work inside of our cog to set everything up. I named the file of this cog testcog.py, inside of the folder cogs.
import discord
from discord.ext import commands

class testCog(commands.Cog):  # I defined that our class inherits the cog from discord
    def __init__(self, bot):
        self.bot = bot

    @commands.command()
    async def create_table(self, ctx):
        await ctx.author.send('I\'m connecting to the database, please be patient.')  # ctx.message.author is ctx.author
        # now you can create your db connection here:
        # looking at the aiomysql documentation, we can create a connection and execute what we need
        async with self.bot.pool.acquire() as conn:
            async with conn.cursor() as cur:
                # in order to execute something (creating a table, for example), we can do this:
                await cur.execute("CREATE TABLE ...")  # pass your SQL statement here

def setup(bot):  # every cog needs a setup function
    bot.add_cog(testCog(bot))

Related

Pymongo not finding recently created element in pytest

I am writing a unit test where I check if an object can be found after being inserted into a MongoDB; my unit test looks like this:
class TestReviewCRUD:
    app = FastAPI()
    config = dotenv_values("../.env")
    app.include_router(review_router, tags=["reviews"], prefix="/review")

    def setup_method(self):
        self.app.db_client = MongoClient(f'mongodb://{self.config["DB_USER"]}:{self.config["DB_PASSWORD"]}@localhost:27017/')
        self.app.db = self.app.db_client[self.config['TEST_DB_NAME']]

    def teardown_method(self):
        self.app.db_client.close()

    def test_get_review(self):
        with TestClient(self.app) as client:
            response = self.given_a_new_review(client)
            assert response.status_code == 201  # <- this works
            new_review = client.get(f'/review/{response.json().get("_id")}')
            assert new_review.status_code == 200  # <- this doesn't work
The element seems to be added to the database (per the 201 HTTP code), and if I go into the Docker container I can see it in the Mongo database, but that GET keeps failing. I'm not that versed in Python, so maybe I am missing something? My GET method is structured as:
@router.get("/{id}", response_description="Get a single review by id", response_model=Review)
def find_review(id: str, request: Request):
    review = request.app.db["my_db"].find_one({"_id": ObjectId(id)})
    if review is not None:
        return review
    raise HTTPException(status_code=status.HTTP_404_NOT_FOUND, detail=f"Review with ID {id} not found")
If I look for an existing ID it works; it fails when I insert a new object and immediately look for it.
Could someone shed some light, please?

Asyncio & asyncpg & aiohttp: one event loop for the pool connection

I've been struggling to find a solution to my problem; I hope I've come to the right place.
I have a Django REST Framework API which connects to a PostgreSQL db, and I run bots against my own API in order to do stuff. Here is my code:
def get_or_create_eventloop():
    """Get the event loop only one time (create it if it does not exist)"""
    try:
        return asyncio.get_event_loop()
    except RuntimeError:
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        return asyncio.get_event_loop()
My DB class, which uses asyncpg to connect / create a pool:
class DB:
    def __init__(self, loop):
        self.pool = loop.run_until_complete(self.connect_to_db())

    async def connect_to_db(self):
        return await asyncpg.create_pool(host="host",
                                         database="database",
                                         user="username",
                                         password="pwd",
                                         port=5432)
My API class :
Class Api(APIView):
#create a loop event since its not the main thread
loop = get_or_create_eventloop()
nest_asyncio.apply() #to avoid the <loop already running> problem
#init my DB pool directly so I wont have to connect each time
db_object = DB(loop)
def post(self,request):
... #I want to be able to call "do_something()"
async def do_something(self):
...
I have my bots running and sending post/get request to my django api via aiohttp.
The problem I'm facing is :
How do I implement my post function in my API so it can handle multiple requests, knowing that each request runs on a new thread and therefore a new event loop is created, while the asyncpg pool is LINKED to the event loop it was created on? I.e. I can't create a new event loop; I need to keep working with the one created at the beginning so I can access my db later (via pool.acquire etc.).
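The loop-binding constraint described here can be reproduced with nothing but asyncio (hypothetical names; a pending Future stands in for a loop-bound resource such as an asyncpg pool): an awaitable created on one event loop cannot be awaited from another.

```python
import asyncio

async def make_pending_future():
    # Stands in for a resource (like a connection pool) that is bound
    # to the loop it was created on.
    return asyncio.get_running_loop().create_future()

loop_a = asyncio.new_event_loop()
fut = loop_a.run_until_complete(make_pending_future())

async def use_it():
    return await fut  # fut belongs to loop_a

loop_b = asyncio.new_event_loop()
error = None
try:
    loop_b.run_until_complete(use_it())   # awaiting from the wrong loop
except RuntimeError as e:
    error = e  # "... attached to a different loop"
print(type(error).__name__)  # RuntimeError
loop_a.close()
loop_b.close()
```

This is the same failure mode as acquiring an asyncpg pool from a per-request thread's fresh event loop.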
This is what I tried so far without success :
def post(self, request):
    self.loop.run_until_complete(self.do_something())
This creates:
RuntimeError: Non-thread-safe operation invoked on an event loop other than the current one
which I understand: we are trying to call the event loop from another thread, possibly.
I also tried to use async_to_sync from Django:
@async_to_sync
async def post(..):
    resp = await self.do_something()
The problem here is that async_to_sync CREATES a new event loop for the thread, therefore I won't be able to access my DB POOL.
Edit: cf. https://github.com/MagicStack/asyncpg/issues/293 (I would love to implement something like that but can't find a way).
Here is a quick example of one of my bots (basic stuff):
import asyncio
from aiohttp import ClientSession

async def send_req(url, session):
    async with session.post(url=url) as resp:
        return await resp.text()

async def run(r):
    url = "http://localhost:8080/"
    tasks = []
    async with ClientSession() as session:
        for i in range(r):
            task = asyncio.create_task(send_req(url, session))
            tasks.append(task)
        responses = await asyncio.gather(*tasks)
        print(responses)

if __name__ == '__main__':
    asyncio.run(run(10))  # e.g. 10 concurrent requests
Thank you in advance
After days of looking for an answer, I found the solution to my problem: I just used the package psycopg3 instead of asyncpg (now I can apply @async_to_sync to my post function and it works).
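A synchronous driver sidesteps the issue because a thread-safe pool needs no event loop at all: any request thread can check a connection out and return it. A toy illustration of that idea (this is a concept sketch using only the stdlib, not the psycopg API):

```python
import queue
import threading

class ToyPool:
    """Toy thread-safe pool: no event loop involved, so any request
    thread can acquire/release (unlike an asyncpg pool, which is
    usable only from the loop that created it)."""
    def __init__(self, factory, size=4):
        self._q = queue.Queue()
        for _ in range(size):
            self._q.put(factory())

    def acquire(self):
        return self._q.get()   # blocks until a connection is free

    def release(self, conn):
        self._q.put(conn)

pool = ToyPool(lambda: object(), size=2)  # object() stands in for a connection
results = []

def handle_request():
    conn = pool.acquire()
    results.append(conn is not None)  # "use" the connection
    pool.release(conn)

threads = [threading.Thread(target=handle_request) for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results), all(results))  # 8 True
```

Eight worker threads share two "connections" safely with no loop affinity, which is why the Django per-request threading model stops being a problem.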

How can I run my own method for every pymodbus server transaction that is processed

I would like to run my own method whenever a message is processed in a pymodbus server. Is that possible?
Thanks
While running through the examples, I came across one that subclasses pymodbus.datastore.ModbusSparseDataBlock: https://pymodbus.readthedocs.io/en/latest/source/example/callback_server.html
The example probably implements more than you need; at a minimum you should just override:
__init__: by passing a dict of values, it provides the server with the legal address range a client can request
setValues: this is where the magic happens: here you can add your own callbacks on any incoming values for a given address
My minimal example looks like this:
import logging

from pymodbus.datastore import (
    ModbusServerContext,
    ModbusSlaveContext,
    ModbusSparseDataBlock,
)
from pymodbus.server.sync import StartSerialServer
from pymodbus.transaction import ModbusRtuFramer

logger = logging.getLogger(__name__)

class CallbackDataBlock(ModbusSparseDataBlock):
    """callbacks on operation"""

    def __init__(self):
        super().__init__({k: k for k in range(60)})

    def setValues(self, address, value):
        logger.info(f"Got {value} for {address}")
        super().setValues(address, value)

def run_server():
    block = CallbackDataBlock()
    store = ModbusSlaveContext(di=block, co=block, hr=block, ir=block)
    context = ModbusServerContext(slaves=store, single=True)
    StartSerialServer(
        context,
        framer=ModbusRtuFramer,
        port="/dev/ttyNS0",
        timeout=0.005,
        baudrate=19200,
    )

if __name__ == "__main__":
    run_server()
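The pattern the answer relies on — subclass the data block, intercept every write in setValues, then delegate to the parent so normal processing still happens — can be shown in isolation with plain classes (hypothetical stand-ins, not the pymodbus API):

```python
class DataBlock:
    """Stand-in for pymodbus's base data block (hypothetical)."""
    def __init__(self):
        self.store = {}

    def setValues(self, address, values):
        self.store[address] = values

class CallbackDataBlock(DataBlock):
    """Override setValues to run a callback, then delegate upward."""
    def __init__(self, callback):
        super().__init__()
        self.callback = callback

    def setValues(self, address, values):
        self.callback(address, values)       # your hook sees every write
        super().setValues(address, values)   # normal processing continues

seen = []
block = CallbackDataBlock(lambda addr, vals: seen.append((addr, vals)))
block.setValues(10, [1, 2, 3])
print(seen)         # [(10, [1, 2, 3])]
print(block.store)  # {10: [1, 2, 3]}
```

Because the override calls super().setValues, the server's storage behaviour is unchanged; the callback is purely additive.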

How to copy a file from Hadoop (HDFS) to a remote SFTP server in Scala/Spark?

In the Hadoop file system I have an Excel file.
My task is to copy that file from Hadoop to a remote SFTP server in my Scala/Spark application.
I have formed the opinion that this will not work directly. If my fears are correct, I need to take the following steps:
1) Copy the Excel file from Hadoop to a local directory. For example, I can do it with the Scala DSL:
import scala.sys.process._
s"hdfs dfs -copyToLocal /hadoop_path/file_name.xlsx /local_path/" !
2) Send the file from the local directory to the remote SFTP server. What kind of library can you recommend for this task?
Is my reasoning correct? What is the best way to solve my problem?
As mentioned in the comment, spark-sftp is a good choice.
If not, you can try the sample code below using the Apache Commons Net FTP library, which lists all remote files; similarly you can delete files as well. It is untested, please try it.
Option 1:
import java.io.IOException
import org.apache.commons.net.ftp.FTPClient
//remove if not needed
import scala.collection.JavaConversions._

object MyFTPClass {

  def main(args: Array[String]): Unit = {
    // Create an instance of FTPClient
    val ftp: FTPClient = new FTPClient()
    try {
      // Establish a connection with the FTP URL
      ftp.connect("ftp.test.com")
      // Enter user details: user name and password
      val isSuccess: Boolean = ftp.login("user", "password")
      if (isSuccess) {
        // Fetch the list of names of the files. In case of no files an
        // empty array is returned
        val filesFTP: Array[String] = ftp.listNames()
        var count: Int = 1
        // Iterate on the returned list to obtain the name of each file
        for (file <- filesFTP) {
          println("File " + count + " : " + file)
          count += 1
        }
      }
      ftp.logout()
    } catch {
      case e: IOException => e.printStackTrace()
    } finally try ftp.disconnect()
    catch {
      case e: IOException => e.printStackTrace()
    }
  }
}
Option 2:
There is a library called jsch; you can look at this question and an example snippet from SO.
Well, finally I found a way to solve the task. I decided to use the jsch library.
build.sbt:
libraryDependencies += "com.jcraft" % "jsch" % "0.1.55"
.scala:
import scala.sys.process._
import com.jcraft.jsch._
// Copy Excel file from Hadoop file system to local directory with Scala DSL.
s"hdfs dfs -copyToLocal /hadoop_path/excel.xlsx /local_path/" !
val jsch = new JSch()
val session = jsch.getSession("XXX", "XXX.XXX.XXX.XXX") // Set your username and host
session.setPassword("XXX") // Set your password
val config = new java.util.Properties()
config.put("StrictHostKeyChecking", "no")
session.setConfig(config)
session.connect()
val channelSftp = session.openChannel("sftp").asInstanceOf[ChannelSftp]
channelSftp.connect()
channelSftp.put("excel.xlsx", "sftp_path/") // set your path in remote sftp server
channelSftp.disconnect()
session.disconnect()

How to use MongoDB with r2d2 and actix in Rust

I am trying to make a basic web application in Rust, using the actix framework and r2d2 with MongoDB as the database. I could not find any complete and working documentation on how to achieve this. Maybe someone can help me out here.
The problem is that I can't seem to get a MongoDB connection from the r2d2 connection pool. Sadly this part isn't covered in any documentation I found.
Some links i found:
Using r2d2 with actix: https://github.com/actix/examples/blob/master/r2d2/src/main.rs
Using mongodb with r2d2: https://docs.rs/r2d2-mongodb/0.2.2/r2d2_mongodb/
This part creates the connection pool and hands it to actix.
fn main() {
    std::env::set_var("RUST_LOG", "actix_web=info");
    env_logger::init();

    let manager = MongodbConnectionManager::new(
        ConnectionOptions::builder()
            .with_host("localhost", 27017)
            .with_db("mydatabase")
            .build()
    );

    let pool = Pool::builder()
        .max_size(16)
        .build(manager)
        .unwrap();

    HttpServer::new(move || {
        App::new()
            // enable logger
            .wrap(middleware::Logger::default())
            // store db pool in app state
            .data(pool.clone())
            // register simple handler, handle all methods
            .route("/view/{id}", web::get().to(view))
    })
    .bind("127.0.0.1:8080")
    .expect("Can not bind to port 8080")
    .run()
    .unwrap();
}
This is the handler function trying to access the connection pool:
fn view(req: HttpRequest,
        pool: web::Data<Pool<MongodbConnectionManager>>) -> impl Responder {
    let id = req.match_info().get("id").unwrap_or("unknown");
    let conn = pool.get().unwrap();
    let result = conn.collections("content").findOne(None, None).unwrap();
    // HERE BE CODE ...
    format!("Requested id: {}", &id)
}
This is the error showing my problem; the conn variable doesn't seem to be a proper MongoDB connection:
error[E0599]: no method named `collections` found for type `std::result::Result<r2d2::PooledConnection<r2d2_mongodb::MongodbConnectionManager>, r2d2::Error>` in the current scope
  --> src\main.rs:29:23
   |
29 |     let result = conn.collections("content").findOne(None, None).unwrap();
   |                       ^^^^^^^^^^^
...
10 |     let coll = conn.collection("simulations");
   |                     ^^^^^^^^^^
   |
   = help: items from traits can only be used if the trait is in scope
   = note: the following trait is implemented but not in scope, perhaps add a `use` for it:
           `use crate::mongodb::db::ThreadedDatabase;`
My compiler told me to add mongodb::db::ThreadedDatabase into scope.