How to mock a nested Function in python having some parameters in it - pytest

I'm new to pytest. I want to mock a function toMockFunction that takes parameters a and b; these are initialized from system arguments in another .py file, and the function is called inside a parent function nestFunction. How can I mock toMockFunction from a pytest file?
# This function is present in File A.py
def nestFunction():
    abc = 1
    returned_var = toMockFunction(a, b)
    return returned_var
# This function is present in File B.py
def toMockFunction(a, b):
    data = pd.read_csv("a.csv")
    ### Some operations ###
    return data
# This test is present in File test_A.py
def test_nestFunction(mocker):
    df = pd.read_csv("someFile.csv")
    mocker.patch('somepath.B.toMockFunction', return_value=df)
    output = nestFunction()
    pd.testing.assert_frame_equal(df, output)
# Present in File A.py
if __name__ == '__main__':
    a = sys.argv[1]
    b = sys.argv[2]
Here sys.argv[1] and sys.argv[2] are passed as parameters from the run config tab.
I tried the exact code above, but the error is: name 'a' and 'b' are not defined.
That is, it is not able to mock the nested function, and it is also asking for the input parameters.

The error has nothing to do with pytest. The error, as you describe it, states that name 'a' and 'b' are not defined. This much is true, since the function definition contains the following:
def nestFunction():
    abc = 1
    returned_var = toMockFunction(a, b)
At no point in this function is a or b defined; only abc is defined. Define a and b properly and you should get past this error.
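For completeness, here is a minimal sketch of one way to fix it, assuming you are willing to make nestFunction take a and b as parameters. The module names A and B follow the question; the patch target assumes A.py does "from B import toMockFunction", in which case the name has to be patched where it is looked up, i.e. in A:

# A.py (hypothetical fixed version)
import sys
from B import toMockFunction

def nestFunction(a, b):
    abc = 1
    returned_var = toMockFunction(a, b)
    return returned_var

if __name__ == '__main__':
    a = sys.argv[1]
    b = sys.argv[2]
    print(nestFunction(a, b))

# test_A.py (hypothetical)
import pandas as pd
from A import nestFunction

def test_nestFunction(mocker):
    df = pd.DataFrame({"col": [1, 2]})
    # Patch the name that A looks up, not B.toMockFunction.
    mocker.patch('A.toMockFunction', return_value=df)
    output = nestFunction("dummy_a", "dummy_b")
    pd.testing.assert_frame_equal(df, output)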

Related

Using fixtures at collect time in pytest

I use testinfra with the ansible transport. It provides a host fixture which has ansible, so I can do host.ansible.get_variables().
Now I need to parametrize a test based on a value from this inventory.
Inventory:
foo:
  hosts:
    foo1:
      somedata:
        - data1
        - data2
I want to write a test which tests each of the 'data' entries from somedata for each host in the inventory. The 'each host' part is handled by testinfra, but I'm struggling with the parametrization of the test:
@pytest.fixture
def somedata(host):
    return host.ansible.get_variables()["somedata"]

@pytest.fixture(params=somedata)
def data(request):
    return request.param

def test_data(host, data):
    assert 'data' in data
I've tried both ways:
@pytest.fixture(params=somedata) -> TypeError: 'function' object is not iterable
@pytest.fixture(params=somedata()) -> Fixture "somedata" called directly. Fixtures are not meant to be called directly...
How can I do this? I understand that I can't change the number of tests at test time, but I'm pretty sure I have the same inventory at collection time, so, theoretically, it should be doable...
After reading a lot of source code I have come to the conclusion that it's impossible to call fixtures at collection time. There are no fixtures at collection time, and any parametrization has to happen before any tests are called. Moreover, it's impossible to change the number of tests at test time (so no fixture could change that).
Answering my own question on using an Ansible inventory to parametrize a test function: it's possible, but it requires manually reading the inventory, hosts, etc. There is a special hook for that: pytest_generate_tests (it's a function, not a fixture).
My current code to get any test parametrized by the host_interface fixture is:
import testinfra.utils.ansible_runner

def cartesian(hosts, ar):
    for host in hosts:
        for interface in ar.get_variables(host).get("interfaces", []):
            yield (host, interface)

def pytest_generate_tests(metafunc):
    if 'host_interface' in metafunc.fixturenames:
        inventory_file = metafunc.config.getoption('ansible_inventory')
        ansible_config = testinfra.utils.ansible_runner.get_ansible_config()
        inventory = testinfra.utils.ansible_runner.get_ansible_inventory(ansible_config, inventory_file)
        ar = testinfra.utils.ansible_runner.AnsibleRunner(inventory_file)
        hosts = ar.get_hosts(metafunc.config.option.hosts)
        metafunc.parametrize("host_interface", cartesian(hosts, ar))
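A test consuming that parametrization could then look roughly like the sketch below; the tuple unpacking is an assumption based on what cartesian yields:

def test_interface_is_up(host_interface):
    # host_interface is one (host, interface) pair produced by cartesian() above
    host, interface = host_interface
    assert interface  # placeholder: replace with a real assertion about the interface on that host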
You should use a helper function instead of a fixture to parametrize another fixture. Fixtures cannot be used as decorator parameters in pytest.
def somedata(host):
    return host.ansible.get_variables()["somedata"]

@pytest.fixture(params=somedata())
def data(request):
    return request.param

def test_data(host, data):
    assert 'data' in data
This assumes that host is not a fixture.
If host is a fixture, there is a hacky way to get around the problem: write the parameters to a temp file or an environment variable and read them with a helper function.
import os
import pytest

@pytest.fixture(autouse=True)
def somedata(host):
    os.environ["host_param"] = host.ansible.get_variables()["somedata"]

def get_params():
    return os.environ["host_param"]  # do some clean-up to return a list instead of a string

@pytest.fixture(params=get_params())
def data(request):
    return request.param

def test_data(host, data):
    assert 'data' in data
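Since environment variable values must be plain strings, the "clean-up" mentioned in the comment could be JSON serialization. A rough sketch of just that part (the helper names store_params and get_params are hypothetical, and get_params() is still evaluated at collection time, before any fixture has run, which is the core limitation discussed above):

import json
import os

def store_params(values):
    # os.environ only accepts strings, so serialize the list.
    os.environ["host_param"] = json.dumps(values)

def get_params():
    # Returns an empty list if nothing has been stored yet.
    return json.loads(os.environ.get("host_param", "[]"))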

How to pass fixture when parametrizing a test

I am trying to parametrize my test.
In the setup method, which returns a list, I am calling a fixture (app_config).
Now I want to call the setup so that the list can be used as parameter values inside the test.
The problem I am running into is that I cannot pass the app_config fixture when calling setup in the parametrize decorator.
def setup(app_config):
    member = app_config.membership
    output = app_config.plan_data
    ls = list(zip(member, output))
    return ls

@pytest.mark.parametrize('member, output', setup(app_config))
def test_concentric(app_config, member, output):
    ....
    ....
Is there an elegant way to pass the setup method in the parametrize decorator, or is there another way to approach this?
Unfortunately, starting with pytest version 4, it has become impossible to call fixtures like regular functions.
https://docs.pytest.org/en/latest/deprecations.html#calling-fixtures-directly
https://github.com/pytest-dev/pytest/issues/3950
In your case I can recommend not using fixtures and switching to normal functions.
For example, it might look like this:
import pytest

def app_config():
    membership = ['a', 'b', 'c']
    plan_data = [1, 2, 3]
    return {'membership': membership,
            'plan_data': plan_data}

def setup_func(config_func):
    data = config_func()
    member = data['membership']
    output = data['plan_data']
    ls = list(zip(member, output))
    return ls

@pytest.mark.parametrize('member, output', setup_func(app_config))
def test_concentric(member, output):
    print(member, output)
    ....
NB! Avoid the setup() function/fixture name because it will conflict with pytest.runner's internals.
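If you want the parametrization to happen at collection time without calling any fixture at all, another option (a sketch of my own, not part of the answer above) is the pytest_generate_tests hook combined with a plain data-building helper, for example in conftest.py:

# conftest.py (hypothetical)
def build_cases():
    membership = ['a', 'b', 'c']
    plan_data = [1, 2, 3]
    return list(zip(membership, plan_data))

def pytest_generate_tests(metafunc):
    # Only parametrize tests that declare these argument names.
    if {'member', 'output'} <= set(metafunc.fixturenames):
        metafunc.parametrize('member, output', build_cases())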

Passing arguments to instantiate object in Pytest

I have a class which I would like to instantiate with different sets of input parameters, comparing a property on the resulting object to a passed-in value.
I am using the indirect flag on @pytest.mark.parametrize for the arguments which are sent to the class constructor. I am trying to unpack kwargs in the constructor, unsuccessfully. This is the error:
TypeError: type object argument after ** must be a mapping, not SubRequest
Code:
import pytest

class MyClass:
    def __init__(self, a):
        self.a = a

@pytest.fixture
def my_object(request):
    yield MyClass(**request)

# first element = arguments to MyClass, second element = value to compare test to
TEST_CASES = [({"a": 1}, 1)]

@pytest.mark.parametrize("test, expected", TEST_CASES, indirect=["test"])
def test_1(my_object, test, expected):
    assert my_object.a == expected
My goal is to have the object arguments and the value to compare against in one structure (TEST_CASES) for easy inspection.
I suggest a working example below. The problem was in the test code design: the parameter indirect should be True, indirect parametrization with multiple fixtures should be done as described in the docs, and the fixture receives all of its params in the request.param attribute.
import pytest

class MyClass:
    def __init__(self, a):
        self.a = a

@pytest.fixture
def test_case(request):
    params, expected = request.param
    yield MyClass(**params), expected

# first element = arguments to MyClass, second element = value to compare test to
TEST_CASES = [({"a": 1}, 1)]

@pytest.mark.parametrize("test_case", TEST_CASES, indirect=True)
def test_1(test_case):
    my_object, expected = test_case
    assert my_object.a == expected
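A small optional refinement (my addition, not part of the answer): the ids argument of parametrize gives each generated test a readable name, which helps once TEST_CASES grows:

TEST_CASES = [({"a": 1}, 1), ({"a": 2}, 2)]

@pytest.mark.parametrize(
    "test_case",
    TEST_CASES,
    indirect=True,
    ids=["a=1", "a=2"],  # reported as test_1[a=1], test_1[a=2]
)
def test_1(test_case):
    my_object, expected = test_case
    assert my_object.a == expected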

Dynamically add CLI arguments in pytest tests

I'd like to run specific tests in pytest with dynamically added CLI arguments, e.g.:
import ast
import sys

class TestHyperML:
    def some_test(self):
        # Set up some CLI arguments such as --some_arg 3 --some_other_arg 12
        my_class = SomeClass()

class SomeClass:
    def parse_cli_arguments(self):
        # here I want to fetch my arguments from sys.argv
        parameters = {}
        name = None
        for x in sys.argv[1:]:
            if name:
                parameters[name] = {'default': ast.literal_eval(x)}
                name = None
            elif x.startswith('-'):
                name = x.lstrip('-')
        return parameters
I understand there is a way to do that by running pytest test_something.py --somearg from the command line, but I would like to do it programmatically from inside the test.
Is it possible? Thanks!
Thanks to the answers posted above, and to similar SO questions, here is the solution that I used:
import mock

def test_parsing_cli_arguments(self):
    args = 'main.py --my_param 1e-07 --my_other_param 2'.split()
    with mock.patch('sys.argv', args):
        parser = ConfigParser("config.yaml")
        # Inside this block, sys.argv contains the arguments set above.
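An alternative that avoids importing mock is pytest's built-in monkeypatch fixture; a minimal sketch, assuming ConfigParser is the asker's own class that reads sys.argv during construction:

import sys

def test_parsing_cli_arguments_via_monkeypatch(monkeypatch):
    args = 'main.py --my_param 1e-07 --my_other_param 2'.split()
    # monkeypatch restores the original sys.argv automatically after the test.
    monkeypatch.setattr(sys, 'argv', args)
    parser = ConfigParser("config.yaml")  # sees the patched sys.argv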

coffeescript scope of variable assignment vs property assignment in object's other properties

I'm writing some widgets for Ubersicht. It uses a node.js server and treats each .coffee file as a standalone widget object. I'm having issues defining constant settings to be used throughout one file. Currently I know of two ways to define this type of constant at the top of the file.
# Way 1
foo_1 = true
bar_1 = false
# Way 2
foo_2: true
bar_2: false
Further down in the same file either a property is assigned as a string or as a function. Each of the above two ways of defining an option only works in one of the two property types.
staticProperty: """Output #{foo_1} works here
but output of #{foo_2} doesn't work
"""
methodProperty: (input) ->
  if foo_1   # Raises "foo_1 is not defined"
  if @foo_1  # @foo_1 is undefined, which is expected
  if @foo_2  # This works fine
I understand that way 2 adds to the object's properties, but I'm not too sure how the way 1 assignment works, given that the file is essentially defining an object. Can you explain this?
Also, is there a way to define a variable that can be accessed from both places?
We'll look at a big ugly example to see what's going on:
class C
  a: 6
  b: @::a
  c = 11
  d: c
  @e = 23
  f: @e
  g: -> @a
  h: -> C::b
  i: -> c
  j: -> @constructor.e
a is a normal old property, in JavaScript it looks like:
C.prototype.a = 6;
b is also a normal old property that is attached to the prototype; here:
b: @::a
@ is the class itself, so in JavaScript this is:
C.prototype.b = C.prototype.a
and everything works just fine.
c is sort of a private variable. In JavaScript it looks like this:
var C = (function() {
  function C() {}
  var c = 11;
  //...
})();
I've included more JavaScript context here so that you can see c's scope. c is visible to anything inside the definition of C but nowhere else.
d is another property that is on the prototype and looks like this in JavaScript:
C.prototype.d = c
This assignment happens inside the SIF (self-invoking function) wrapper that is used to build the class, so var c = 11 is visible here.
e is a class property and in JavaScript is just:
C.e = 23;
f is another property on the prototype. @ is the class itself in this context (just like in b):
f: @e
so we can get at e as @e, and the JavaScript looks like:
C.prototype.f = C.e;
The g and h methods should be pretty clear. The i method works because it is a closure inside the SIF that is used to define C:
C.prototype.i = function() { return c; };
The j method works because it uses the standard constructor property to get back to C itself.
Demo: http://jsfiddle.net/ambiguous/tg8krgh2/
Applying all that to your situation,
class Pancakes
  foo_1 = true
  foo_2: true
We see that you can use either approach if you reference things properly:
staticProperty: """Output #{foo_1} works here
and so does #{@::foo_2}
"""
methodProperty: (input) ->
  # foo_1 should work fine in here.
  # @foo_1 is undefined, which is expected
  # @foo_2 works fine
I'm not sure why you're having a problem referencing foo_1 inside your methodProperty; it should work fine, and does work fine, with the current version of CoffeeScript.