Is there a way to access a variable from a calling function in python? - global

I'm unsure if this is possible, but I was wondering if there is a way to get a variable from an outer scope without passing it as an argument.
I've played around with globals() and inspect but I'm having issues trying to get the attribute.
Here's what I'm trying to do:
class Example:
    @staticmethod
    def query(**kwargs):
        print(f.read())

with open(__file__) as f:
    Example.query(foo='bar')

So after a while of back and forth, I have finally found a solution to my issue.
As Matthias suggested, I use globals to find the object; I decided to use inspect to add it myself, like so:
Assigning
def __enter__(self):
    inspect.stack()[1][0].f_globals["_ExampleName"] = self
Retrieving (Fixed)
@staticmethod
def _find_example():
    stack = inspect.stack()
    for stack_index in range(2, len(stack)):
        stack_globals = stack[stack_index][0].f_globals
        if "_ExampleName" in stack_globals:
            return stack_globals["_ExampleName"]
This is a bit 'dodgy', as inspect is not meant to be used in a production environment; however, it works and solves my issue.

Here is an MCVE of what you're trying to do.
class Example:
    @staticmethod
    def query(**kwargs):
        print(f.read())

with open(__file__) as f:
    Example.query(foo='bar')
Works as expected, because by the time query runs, f is a module-level global and is therefore visible inside query.

What you should do is have the Client class set a class variable to the current session.
class Client:
    last_session = None

    def Session(self):
        # code that creates a new session, in variable s
        Client.last_session = s
        return s

client = Client()
with client.Session() as s:
    Example.query(foo='bar')

class Example:
    @staticmethod
    def query(**kwargs):
        s = Client.last_session
        s.magic_stuff()

Related

Pytest setup class once before testing

I'm using pytest for testing, with the mixer library for generating model data. Now I'm trying to set up my tests once before they run. I grouped them into test classes and set my fixtures to 'class' scope, but this doesn't work for me.
@pytest.mark.django_db
class TestCreateTagModel:
    @classmethod
    @pytest.fixture(autouse=True, scope='class')
    def _set_up(cls, create_model_instance, tag_model, create_fake_instance):
        cls.model = tag_model
        cls.tag = create_model_instance(cls.model)
        cls.fake_instance = create_fake_instance(cls.model)
        print('setup')

    def test_create_tag(self, tag_model, create_model_instance, check_instance_exist):
        tag = create_model_instance(tag_model)
        assert check_instance_exist(tag_model, tag.id)
conftest.py
@pytest.fixture(scope='class')
@pytest.mark.django_db(transaction=True)
def create_model_instance():
    instance = None
    def wrapper(model, **fields):
        nonlocal instance
        if not fields:
            instance = mixer.blend(model)
        else:
            instance = mixer.blend(model, **fields)
        return instance
    yield wrapper
    if instance:
        instance.delete()

@pytest.fixture(scope='class')
@pytest.mark.django_db(transaction=True)
def create_fake_instance(create_related_fields):
    """
    Function for creating a fake instance of a model (fake means the instance doesn't exist in the DB).
    Args:
        related (bool, optional): Flag which indicates whether to create related objects. Defaults to False.
    """
    instance = None
    def wrapper(model, related=False, **fields):
        with mixer.ctx(commit=False):
            instance = mixer.blend(model, **fields)
            if related:
                create_related_fields(instance, **fields)
            return instance
    yield wrapper
    if instance:
        instance.delete()

@pytest.fixture(scope='class')
@pytest.mark.django_db(transaction=True)
def create_related_fields():
    django_rel_types = ['ForeignKey']
    def wrapper(instance, **fields):
        for f in instance._meta.get_fields():
            if type(f).__name__ in django_rel_types:
                rel_instance = mixer.blend(f.related_model)
                setattr(instance, f.name, rel_instance)
    return wrapper
But I'm catching an exception in mixer's gen_value method: "Database access not allowed, use the django_db mark" (which I'm already using). Do you have any ideas how this can be implemented?
You can set things up once before a run by returning the results of the setup, rather than modifying the testing class directly. From my own attempts, it seems any changes to the class made within class-scoped fixtures are lost when the individual tests are run. So here's how you should be able to do this. Replace your _set_up fixture with these:
@pytest.fixture(scope='class')
def model_instance(self, tag_model, create_model_instance):
    return create_model_instance(tag_model)

@pytest.fixture(scope='class')
def fake_instance(self, tag_model, create_fake_instance):
    return create_fake_instance(tag_model)
And then these can be accessed through:
def test_something(self, model_instance, fake_instance):
    # Check that model_instance and fake_instance are as expected
    ...
I'm not familiar with Django myself though, so there might be something else with it going on. This should at least help you solve one half of the problem, if not the other.

How can I provide a custom header to a ZIO during tests

I have a service that returns a ZIO[Has[MyCustomHeader]], and I'm having trouble testing it.
Other services in our organisation are tested by converting the ZIO to a Twitter future using runtime.unsafeRunToFuture (where runtime is a Runtime[ZEnv]) and then awaiting the future, thus running the tests in blocking mode.
However, this service has a Has[] requirement, and runtime.unsafeRunToFuture doesn't handle those. So far my approach has been to try to convert my ZIO[Has[MyCustomHeader]] to a ZIO[ZEnv], but I've yet to succeed at this.
From what I gather I need to provide a ZLayer via ZIO.provideSomeLayer(), but I can't figure out how to construct a ZLayer properly.
Am I even on the right path here? And if so, how do I construct a ZLayer with a static value for MyCustomHeader to use in my tests?
This is how far along I am at trying to add a header for testing purposes. It doesn't work, but it might illustrate what I'm trying to achieve... maybe... I'm pretty confused myself:
object effectAwait {
  implicit class ZioEffect[A](private val value: ZIO[Has[EnvironmentHeader], RequestFailure, A]) extends AnyVal {
    final def await(implicit runtime: Runtime[ZEnv] = Runtime.default): A = {
      val zmanaged = ZManaged.fromEffect(value).provide(Has(EnvironmentHeader("test")))
      val layered = value.provideSomeLayer(zmanaged.toLayer)
      val sf = runtime.unsafeRunToFuture(layered)
      Await.result(sf, 10.seconds)
    }
  }
}
this however gives me the error:
could not find implicit value for izumi.reflect.Tag[A]. Did you
forget to put on a Tag, TagK or TagKK context bound on one of the
parameters in A? e.g. def x[T: Tag, F[_]: TagK] = ...
<trace>:
deriving Tag for A, dealiased: A:
could not find implicit value for Tag[A]: A is a type parameter without an implicit Tag!
val layered = value.provideSomeLayer(zmanaged.toLayer)
I think you can just use provideLayer (instead of provideSomeLayer) here :)
Also, there's runtime.unsafeRun, which will wait for the result as well, so you don't necessarily have to convert it to a Future. And instead of relying on an implicit runtime, there's always zio.Runtime.default that you can use anywhere (it's a Runtime[ZEnv], so it should work just as well, unless you've otherwise customized the runtime's behavior).
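For reference, a minimal sketch of what that could look like (assuming ZIO 1.x and the EnvironmentHeader / RequestFailure types from the question; ZLayer.succeed builds the Has[EnvironmentHeader] layer directly, which sidesteps the Tag[A] error that toLayer produced):
object effectAwait {
  implicit class ZioEffect[A](private val value: ZIO[Has[EnvironmentHeader], RequestFailure, A]) extends AnyVal {
    // Satisfy the Has[EnvironmentHeader] requirement with a fixed test header,
    // then run the effect synchronously; no Future or Await needed.
    final def await: A = {
      val testHeader = ZLayer.succeed(EnvironmentHeader("test"))
      Runtime.default.unsafeRun(value.provideLayer(testHeader))
    }
  }
}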

How do I test code that requires an Environment Variable?

I have some code that requires an environment variable to run correctly. But when I run my unit tests, it bombs out once it reaches that point unless I specifically export the variable in the terminal. I am using Scala and sbt. My code does something like this:
class something() {
  val envVar = sys.env("ENVIRONMENT_VARIABLE")
  println(envVar)
}
How can I mock this in my unit tests so that whenever sys.env("ENVIRONMENT_VARIABLE") is called, it returns a string or something like that?
If you can't wrap the existing code, you can modify the unmodifiable map behind System.getenv() for tests:
def setEnv(key: String, value: String) = {
  val field = System.getenv().getClass.getDeclaredField("m")
  field.setAccessible(true)
  val map = field.get(System.getenv()).asInstanceOf[java.util.Map[java.lang.String, java.lang.String]]
  map.put(key, value)
}

setEnv("ENVIRONMENT_VARIABLE", "TEST_VALUE1")
If you need to test console output, you may use a separate PrintStream.
You can also implement your own PrintStream:
val baos = new java.io.ByteArrayOutputStream
val ps = new java.io.PrintStream(baos)

Console.withOut(ps)(
  // your test code
  println(sys.env("ENVIRONMENT_VARIABLE"))
)

// Get the output and verify it
val output: String = baos.toString(java.nio.charset.StandardCharsets.UTF_8.toString)
println("Test Output: [%s]".format(output))
assert(output.contains("TEST_VALUE1"))
Ideally, environment access should be rewritten to retrieve the data in a safe manner. Either with a default value ...
scala> scala.util.Properties.envOrElse("SESSION", "unknown")
res70: String = Lubuntu
scala> scala.util.Properties.envOrElse("SECTION", "unknown")
res71: String = unknown
... or as an option ...
scala> scala.util.Properties.envOrNone("SESSION")
res72: Option[String] = Some(Lubuntu)
scala> scala.util.Properties.envOrNone("SECTION")
res73: Option[String] = None
... or both [see envOrSome()].
I don't know of any way to make it look like any/all random env vars are set without actually setting them before running your tests.
You shouldn't test it in a unit test.
Just extract it out:
class Foo(val param: String) {
  ...
}
In your prod code you do:
new Foo(sys.env("ENVIRONMENT_VARIABLE"))
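To illustrate why this helps, here is a hedged sketch (the greet method and the ScalaTest suite are hypothetical, just to show that the test never touches the environment):
class Foo(val param: String) {
  // hypothetical logic that depends on the configured value
  def greet: String = s"Hello, $param"
}

// production wiring reads the environment once, at the edge:
//   new Foo(sys.env("ENVIRONMENT_VARIABLE"))

// the test just passes a plain value
class FooSpec extends org.scalatest.funsuite.AnyFunSuite {
  test("greet uses the injected value") {
    assert(new Foo("TEST_VALUE1").greet == "Hello, TEST_VALUE1")
  }
}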
I would encapsulate the configuration in a small abstraction which does not expose the implementation, maybe a class ConfigValue.
I would put the implementation in a class ConfigValueInEnvVar extends ConfigValue
This allows me to test the code that relies on the ConfigValue without having to set or clear environment variables.
It also allows me to test the base implementation of storing a value in an environment variable as a separate feature.
It also allows me to store the configuration in a database, a file or anything else, without changing my business logic.
I select implementation in the application layer.
I put the environment variable logic in a supporting domain.
I put the business logic and the traits/interfaces in the core domain.
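A minimal sketch of that layering (ConfigValue and ConfigValueInEnvVar are the names from above; Greeter and FixedConfigValue are made-up placeholders):
// Core domain: business logic only sees the abstraction
trait ConfigValue {
  def value: String
}

class Greeter(config: ConfigValue) {
  def greet: String = s"Hello, ${config.value}"
}

// Supporting domain: the one place that actually touches the environment
class ConfigValueInEnvVar(name: String) extends ConfigValue {
  def value: String = sys.env(name)
}

// Tests: a fixed implementation, no environment variables involved
class FixedConfigValue(val value: String) extends ConfigValue

// The application layer picks the implementation:
//   new Greeter(new ConfigValueInEnvVar("ENVIRONMENT_VARIABLE"))   // production
//   new Greeter(new FixedConfigValue("TEST_VALUE1"))               // tests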

Modifying value returned from a method in Scala (Highcharts lib)

I am using an external library in Scala which uses a set of traits to pass around complex configuration options to different methods. This is the Highcharts Scala API, but the problem seems to be more general.
The library defines a trait (HighchartsOptions in the actual usage), which is just a data transfer object that stores a number of fields and allows them to be passed around. Code simplified and generalized for clarity looks like this:
trait Opts {
  def option1: Int = 3
  def option2: String = "abc"
  // Many more follow, often of more complex types
}
As long as the complete set of options can be generated in one place, this allows for a neat syntax:
val opts = new Opts() {
  override val option1 = 5
  // And so on for more fields
}
doSomething(opts)
However, there are a few situations where one piece of code prepares such a configuration but another piece of code needs to adjust just one option extra. It would be nice to be able to pass some Opts instance to a method and let the method modify a value or two.
Since the original trait is based on defs rather than vars, it's easy to override an option's value only if the type of the object is known, like in the example above. If a method receives only an instance of some anonymous subclass of Opts, how can it create another instance or modify the received one so that a call to e.g. option2 could return a different value? The desired operation is similar to what Mockito's spy does, however I feel there should be some less contrived way than using a mocking framework to achieve this effect.
PS: Actually I am a bit surprised by the use of such an interface by the library's authors, so perhaps I'm missing something and there is some completely different way of achieving my goal of building a single set of options from several different places in the code (e.g. some builder object that is mutable and that I can pass around instead of the finished HighchartsOptions)?
I would first check whether using the Opts trait (solely) is an absolute necessity. Hopefully it isn't, and then you can just extend the trait, overriding defs with vars, like you said.
When Opts is mandatory and you have an instance of it that you want to copy while modifying some fields, here's what you could do:
Write a wrapper for Opts which extends Opts but delegates every call to the wrapped Opts, excluding the fields that you want to modify; set those fields to the values you want.
Writing such a wrapper for a broad-interface trait can be a boring task, so you may consider using http://www.warski.org/blog/2013/09/automatic-generation-of-delegate-methods-with-macro-annotations/ to let macros generate most of it automatically.
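A hand-written sketch of such a wrapper, using the simplified Opts trait from the question (the class name and the choice of option2 are arbitrary):
// Wraps an existing Opts, delegating everything except the field being replaced.
class OptsWithOption2(underlying: Opts, newOption2: String) extends Opts {
  override def option1: Int = underlying.option1   // delegated
  override def option2: String = newOption2        // replaced
  // ...and so on for every remaining member of Opts
}

def adjust(opts: Opts): Opts = new OptsWithOption2(opts, "xyz")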
The shortest, simplest way.
Define a case class:
case class Options(
  option1: Int,
  option2: String
  /* ... */
) extends Opts
and implicit conversion from Opts to your Options
object OptsConverter {
  implicit def toOptions(opts: Opts) = Options(
    option1 = opts.option1,
    option2 = opts.option2
    /* ... */
  )
}
That way you get all copy methods (generated by compiler) for free.
You can use it like that:
import OptsConverter.toOptions

def usage(opts: Opts) = {
  val improvedOpts = opts.copy(option2 = "improved")
  /* ... */
}
Note, that Options extends Opts, so you can use it whenever Opts is required. You'll be able to call copy to obtain a modified instance of Opts in every place where you import the implicit conversion.
The simplest solution is to allow the trait to define its own "copy" method, and allow its subclasses (or even the base class) to work with that. However, the parameters can really only match the base class unless you recast it later. Incidentally this doesn't work as "mixed in", so your root might as well be an abstract class, but it works the same way. The point of this is that the subclass type keeps getting passed along as it's copied.
(Sorry I typed this without a compiler so it may need some work)
trait A {
  type myType <: A
  def option1: Int
  def option2: String
  def copyA(option1_ : Int = option1, option2_ : String = option2): A = new A {
    def option1 = option1_
    def option2 = option2_
  }
}

trait B extends A { me =>
  type myType = B
  def option3: Double
  // callable from A but properly returns B
  override def copyA(option1_ : Int = option1, option2_ : String = option2): myType = new B {
    def option1 = option1_
    def option2 = option2_
    def option3 = me.option3
  }
  // this is only callable if you've cast to type B
  def copyB(option1_ : Int = option1, option2_ : String = option2, option3_ : Double = option3): myType = new B {
    def option1 = option1_
    def option2 = option2_
    def option3 = option3_
  }
}

In Scala, is there a way to have two overloaded methods that only differ in whether an implicit is available?

I'm writing a Scala application that accesses a database. Most of the time, there will be a connection available, but sometimes there won't be. What I'd like to do is something like the following:
object User {
  def authenticate(username: String, password: String)
                  (implicit conn: Connection): Option[User] = {
    // use conn to grab user from db and check that password matches
    // return Some(user) if so, None if not
  }

  def authenticate(username: String, password: String): Option[User] = {
    implicit val conn = DB.getConnection()
    authenticate(username, password)
  }
}
What I hoped would happen is that, if there's an implicit value of type Connection available, the compiler would use the first method. If not, it would use the second. Unfortunately, I've discovered that the compiler isn't quite that smart or, if it is, I'm not telling it what to do in the right way.
So, my basic question is: is there a way to write a method that expects an implicit argument, and then provide an overloaded version of the same method that creates an acceptable value of the implicit parameter's type if there isn't one available?
You might say, "Why would you want to do such a thing? If you can create an acceptable value of the appropriate type, why not just always do it?" And that's mostly true, except that if I have an open database connection, I'd prefer to go ahead and use it rather than creating a new one. However, if I don't have an open database connection, I know where to get one.
I mean, the simple answer is to just give the two methods different names, but I shouldn't have to, gosh-darn-it. But maybe I do...
Thanks!
Todd
You don't need overloaded methods. Just give your implicit parameter a default value, i.e.
object User {
  def authenticate(username: String, password: String)(implicit conn: Connection = null): Option[User] = {
    val real_conn = Option(conn).getOrElse(DB.getConnection())
    // do the rest with the real_conn
  }
}
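For illustration, a hedged usage sketch (Connection, DB and User come from the question; the call sites are hypothetical):
// With an implicit Connection in scope, it is picked up and the default is never used.
implicit val conn: Connection = DB.getConnection()
User.authenticate("todd", "secret")   // uses conn

// In a scope with no implicit Connection, the default null kicks in and
// authenticate falls back to DB.getConnection() internally:
//   User.authenticate("todd", "secret")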
The cleanest solution I can think of is using nested methods and, as someone suggested, default values for implicits.
class Testclass {
  def myMethod(a: Int)(implicit b: Option[Int] = None): Int = {
    def myMethodInternal(a: Int, b: Int): Int = {
      // do something
      a + b
    }
    val toUse = b.getOrElse(30)
    myMethodInternal(a, toUse)
  }
}
Inside your method you define a myMethodInternal, which takes no implicits, only explicit parameters. This method will be visible only inside myMethod, and you prepare your second parameter like the following:
val toUse = b.getOrElse(30)
And finally call your method with explicit parameters:
myMethodInternal(a,toUse)