Unit test telegram dispatcher handler - pytest

Here's a simple handler initialization in the Telegram dispatcher:
from telegram.ext import Updater, CallbackQueryHandler

updater = Updater(tg_token, use_context=True)
updater.dispatcher.add_handler(CallbackQueryHandler(
    item_details,
    pattern=r"^item_(\d+)$",
))
How can I unit test this?
I want to send some data somewhere, e.g. item_232, and check whether the function item_details was called properly, but I can't find where this data should be passed. Please help.
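One way to see where the data flows, sketched below with a simplified stand-in for what CallbackQueryHandler does (this is not python-telegram-bot itself; the dispatch function is a hypothetical model, and mocks are used so it runs without a bot token):

```python
import re
from unittest.mock import MagicMock

PATTERN = re.compile(r"^item_(\d+)$")

def dispatch(callback_data, handler):
    """Simplified stand-in for CallbackQueryHandler's job: match the
    callback query's data against the pattern and, on a match, call
    the handler with an (update, context) pair."""
    match = PATTERN.match(callback_data)
    if match is None:
        return False
    update, context = MagicMock(), MagicMock()
    update.callback_query.data = callback_data
    context.match = match  # the handler can read the regex match here
    handler(update, context)
    return True

# Mocking the handler lets us assert it was called with the right data.
item_details = MagicMock()
assert dispatch("item_232", item_details)
item_details.assert_called_once()
assert item_details.call_args[0][0].callback_query.data == "item_232"
assert not dispatch("not_an_item", item_details)  # pattern rejects this
```

In a real test against python-telegram-bot you would instead build a telegram.Update whose CallbackQuery has data "item_232" and feed it to updater.dispatcher.process_update(update); the sketch above only illustrates where the data goes.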

Why does on_command_error fire when I have already ignored a message that is a bad command?

I have the following two listeners in the sole cog for my bot:
@commands.Cog.listener()
async def on_command_error(self, ctx, error):
    if isinstance(error, CommandNotFound):
        await ctx.send_help()

@commands.Cog.listener()
async def on_message(self, message):
    if message.author.bot:
        return
    if message.content.startswith(">--"):
        return
The command prefix is >-, and I was under the impression that the return for messages starting with >-- would mean that the message goes no further than on_message. Yet when I give the command >--halp, on_command_error fires with the error:
CommandNotFound('Command "-halp" is not found')
Is my understanding of how to ignore messages at fault, does on_command_error fire even for "dead" messages, or what am I doing wrong?
Registering a listener inside of a cog does not replace the default one used by the bot (if a default exists).
In your case, every time a message is sent to a channel the bot can see, both the custom on_message event in the cog as well as the default on_message event will trigger.
This is why the CommandNotFound error is raised, since the default on_message event still tries to process the message and check if a command was called.
If you override the default on_message event in your main file (where your bot client is defined), then both custom on_message events will still trigger (the main and the cog).
This can be verified with the code below: when the bot sees a message, both bot.py and cog.py will be printed, indicating that both on_message events are triggered.
bot.py
from discord.ext import commands

client = commands.Bot(command_prefix='>-')
client.load_extension('cog')

@client.event
async def on_message(message):
    print('bot.py')
    await client.process_commands(message)

client.run('TOKEN')
cog.py
from discord.ext import commands

class TestCog(commands.Cog):
    def __init__(self, bot):
        self.bot = bot

    @commands.Cog.listener()
    async def on_message(self, message):
        print('cog.py')

def setup(bot):
    bot.add_cog(TestCog(bot))
If you want the bot to completely ignore certain messages, then you need to do so in a custom on_message event in the file where the bot client is defined.
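The core claim, that registering a cog listener adds to rather than replaces the existing listeners, can be sketched with a minimal event bus. This is an illustrative model with hypothetical names, not discord.py's actual internals:

```python
# Minimal model of listener dispatch: each event keeps a *list* of
# listeners, so registering a cog listener never replaces one that is
# already there; every registered listener runs.
class MiniBus:
    def __init__(self):
        self.listeners = {}

    def listener(self, event):
        def decorator(fn):
            self.listeners.setdefault(event, []).append(fn)
            return fn
        return decorator

    def dispatch(self, event, *args):
        return [fn(*args) for fn in self.listeners.get(event, [])]

bus = MiniBus()

@bus.listener("on_message")
def default_on_message(msg):   # stands in for the bot's default handler
    return f"default saw {msg!r}"

@bus.listener("on_message")
def cog_on_message(msg):       # stands in for the cog's listener
    return f"cog saw {msg!r}"

results = bus.dispatch("on_message", ">--halp")
assert len(results) == 2  # both listeners ran, neither replaced the other
```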

Using `delay` in Sidekiq with dynamic class/method names

Let's say I have some internal business logic that determines what mailer needs to be sent.
The output of this business logic is a Hash in the following format -
{ mailer_class: SomeMailer, mailer_method: "foo_email" }
{ mailer_class: OtherMailer, mailer_method: "bar_email" }
# etc...
I need to call the appropriate mailer based on the info above, so I try something like this with Sidekiq's built-in delay -
data = { mailer_class: ..., mailer_method: "..." }
data[:mailer_class].delay.send(data[:mailer_method])
This results in Sidekiq queueing up the send method, which will eventually be called on my mailer.
Functionally it might work, because that class will, after all, receive the appropriate method. But it feels a bit dirty, and it trips up other processes that watch the Sidekiq queue, because they expect to see a mailer method name but find :send instead.
Is there a good way around this or am I stuck modifying the rest of my application logic to work with this?
Thanks!
Why not pass that Hash to a Sidekiq worker which knows how to send emails with that class/method combo? Something like the following (worker name illustrative); note that the class should be queued by name (e.g. SomeMailer.name), since Sidekiq job arguments must be JSON-serializable:
class MailerWorker
  include Sidekiq::Worker

  def perform(hash)
    # constantize turns the stored class name back into the class;
    # send then invokes the stored method name dynamically
    hash['mailer_class'].constantize.send(hash['mailer_method'])
  end
end
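For illustration only (the answer's worker is Ruby), the same store-names, resolve-at-run-time idea in Python; all names here are hypothetical:

```python
# Store the class/method as strings (serializable, queue-friendly) and
# resolve them when the job runs: the Python analogue of the
# constantize + send combination in the Ruby worker.
REGISTRY = {}

def register(cls):
    REGISTRY[cls.__name__] = cls
    return cls

@register
class SomeMailer:
    @staticmethod
    def foo_email():
        return "foo_email delivered"

def perform(job):
    mailer = REGISTRY[job["mailer_class"]]          # like constantize
    return getattr(mailer, job["mailer_method"])()  # like send

assert perform({"mailer_class": "SomeMailer",
                "mailer_method": "foo_email"}) == "foo_email delivered"
```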

How to make an internal synchronous POST request in Play Framework with Scala?

I'm new to Play and Scala. I'm building an application using Play and Scala, and I need to make a POST call internally to get data from my server. This call should be synchronous: after getting the data from the POST request, I need to send that data to the front end. All the resources I've found are asynchronous. Please help me.
I'm fetching data from a DB and should then return the data as the response.
The DB is on a remote server, not on the hosting server.
I think you should not block anyway.
def action = Action.async {
  WS.url("some url")
    .post(Json.toJson(Map("query" -> query)))
    .map { response =>
      val jsonResponse = response.json
      // at this point you have the response from your call;
      // now just do whatever you need to do with it; in this
      // example it is returned as an `Ok` result
      Ok(jsonResponse)
    }
}
Just map the result of your call and modify it, staying in the context of the Future, and use Action.async, which takes a Future.
If you really want to block, use Await.result(future, 5 seconds), importing:
import scala.concurrent.duration._
import scala.concurrent.Await
See the docs for scala.concurrent.Await.
All requests are asynchronous, but nothing prevents you from waiting for the response with await in your code.
val response = await(yourFutureRequest).body
The line written above will block until the future has finished.
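For readers unfamiliar with Scala Futures, the block-vs-map distinction in the answers above can be illustrated with Python's concurrent.futures (an analogy only; function names are hypothetical):

```python
# Analogy for Await.result vs .map: submit work as a future, then either
# block on it (synchronous) or attach a continuation (asynchronous).
from concurrent.futures import ThreadPoolExecutor

def fetch_from_db():
    # stand-in for the remote POST/DB call
    return {"rows": [1, 2, 3]}

collected = []
with ThreadPoolExecutor(max_workers=1) as pool:
    # Asynchronous style, like .map { response => ... }: attach a
    # continuation that runs when the future completes.
    f1 = pool.submit(fetch_from_db)
    f1.add_done_callback(lambda f: collected.append(f.result()["rows"]))

    # Blocking style, like Await.result(future, 5 seconds):
    f2 = pool.submit(fetch_from_db)
    result = f2.result(timeout=5)

assert result == {"rows": [1, 2, 3]}
assert collected == [[1, 2, 3]]
```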

Is this correct way to implement GMail widget in Lift?

I'm trying to implement a GMail widget similar to those on iGoogle or Netvibes, to practice using Comet in the Lift web framework.
Currently what I have is the following code; it's short and works amazingly well.
But I'm not sure this is the best way to implement it, because retrieving mails from GMail is a time-consuming job, and the following code has only one GMailListener, which blocks while getting mails from GMail.
I guess that means that if there are two users on my website, for example UserA and UserB, and they are both on the page using this Comet, then even though the following code is thread-safe, UserB still has to wait until UserA's mail is processed to get his own result, right?
What is the best way to avoid the blocking?
import net.liftweb.actor.LiftActor
import net.liftweb.util.Schedule
import net.liftweb.util.Helpers._
import net.liftweb.http.CometActor
import net.liftweb.http.js.JsCmds.SetHtml
import net.liftweb.http.js.jquery.JqJsCmds._

case class FetchGMail(userID: Int, sender: CometActor)
case class NewStuffs(mails: List[Stuff])

object GMailListener extends LiftActor
{
  def getMails(userID: Int) = {
    // Get mails from GMail
  }

  def messageHandler = {
    case FetchGMail(userID, sender) =>
      println("Get FetchMail request")
      sender ! NewStuffs(getMails(userID))
      Schedule.schedule(this, FetchGMail(userID, sender), 5 minutes)
  }
}

class Inbox extends CometActor with JSImplicit
{
  def render = <div>Empty Inbox</div>

  GMailListener ! FetchGMail(1, this)

  override def lowPriority = {
    case NewStuffs(mails) =>
      println("get new mails")
      partialUpdate(AppendHtml("mails", <div>{mails}</div>))
  }
}
Just keep in mind that an actor can only process one message at a time and only consumes resources while it is processing messages. Your GMailListener is a singleton, so it can be a bottleneck right now, but there is no reason you can't create an instance of GMailListener for each user. Each instance will only wake up and use a thread for GMail lookups when your schedule call dictates. Just make sure that you shut the corresponding GMailListener down when the Inbox shuts down; take a look at net.liftweb.http.CometListener, which should help with that.
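The one-message-at-a-time property the answer relies on can be sketched with a minimal mailbox-plus-thread actor in Python (an illustrative model with hypothetical names, not Lift's LiftActor):

```python
import queue
import threading

class MiniActor:
    """One mailbox, one thread: messages are handled strictly one at a
    time. A singleton actor therefore serializes all users' fetches,
    while one actor per user lets them proceed independently."""
    def __init__(self, handler):
        self.mailbox = queue.Queue()
        self.handler = handler
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while True:
            msg = self.mailbox.get()
            self.handler(msg)
            self.mailbox.task_done()

    def send(self, msg):
        self.mailbox.put(msg)

processed = []
actor = MiniActor(processed.append)
actor.send("FetchGMail(user=1)")
actor.send("FetchGMail(user=2)")
actor.mailbox.join()  # wait until the mailbox has been drained
assert processed == ["FetchGMail(user=1)", "FetchGMail(user=2)"]
```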

store and setRequest

I have a job-queue mechanism in ZF.
The job queue simply stores the function call (class, method and params) and later executes it as a CLI daemon. The daemon works; however, in places the application looks for information from the request object, and when called from the CLI these places fail or get no info.
I would like to store the original request object together with the job, and when the job is processed, set the request object back as if the job were run by the original request, something along the lines of the following pseudo-code:
$ser_request = serialize(Zend_Controller_Front::getInstance()->getRequest());
// --> save to DB
// --> retrieve from DB
Zend_Controller_Front::getInstance()->setRequest(unserialize($ser_request));
The aim is to store the jobs and replay them later without having to change the rest of the application.
Any suggestions on how to do that?
I am not sure if this works, but here's an idea: try to implement the __sleep and __wakeup magic methods for the request object. I haven't tried it out, but maybe it's at least a starting point.
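The __sleep/__wakeup idea, dropping what can't survive serialization and restoring state on wake-up, looks like this with Python's pickle protocol (an illustrative analogy; the class and field names are hypothetical):

```python
import pickle

class FakeRequest:
    """Stand-in for a request object: params are replayable data,
    while the connection is a live resource that should not be
    serialized along with the job."""
    def __init__(self, params):
        self.params = params
        self.connection = open  # stand-in for an unserializable resource

    def __getstate__(self):         # analogous to PHP's __sleep
        # keep only the replayable data
        return {"params": self.params}

    def __setstate__(self, state):  # analogous to PHP's __wakeup
        self.params = state["params"]
        self.connection = None      # re-acquire lazily when needed

stored = pickle.dumps(FakeRequest({"controller": "job", "id": 232}))
replayed = pickle.loads(stored)
assert replayed.params == {"controller": "job", "id": 232}
assert replayed.connection is None
```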