Is it possible to read Python documentation at the IPython terminal? - ipython

For example, I want to know how the 'reshape' function used below is defined:
v.reshape(1,3)
At an IPython terminal, can I easily pull up the documentation of this function to read?

You can use the ? and ?? help shortcuts:
In [7]: class Foo:
   ...:     def bar(self):
   ...:         return 5
   ...:
In [8]: f = Foo()
In [9]: f.bar?
Signature: f.bar()
Docstring: <no docstring>
File:      /tmp/<ipython-input-7-092982d55a54>
Type:      method
In [10]: f.bar??
Signature: f.bar()
Docstring: <no docstring>
Source:
def bar(self):
    return 5
File:      /tmp/<ipython-input-7-092982d55a54>
Type:      method
?f.bar and f.bar? work identically. The source code of functions and modules is only viewable if they're written in Python. Otherwise, you'll just see the docstring.
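For the NumPy example in the question, the same shortcuts apply (a minimal sketch; the array name v comes from the question):
import numpy as np

v = np.array([1, 2, 3])

# At the IPython prompt:
#   v.reshape?     shows the docstring for ndarray.reshape
#   np.reshape??   np.reshape is a thin Python wrapper, so ?? also shows its source
# ndarray.reshape itself is implemented in C, so v.reshape?? typically falls back to the docstring.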

Related

Question about type hinting in Python 3.9+

I have a method that takes a callback function as an argument.
I want to specify a type hint for the callback function signature.
The problem is the signature of the callback function is:
def callback_function(event, token, *args)
where
type(event) = EventClass
type(token) = str
type(args) = tuple # of str
I could write this as:
Callable[..., returntype]
but I would like to tighten the type checking to be something useful, at least to making sure event and token are specified correctly.
Suggestions, please?
Consider using a Protocol for this, as suggested in earlier answers.
For your specific case it will look like this:
from typing import Protocol, TypeAlias

returntype: TypeAlias = int

class MyCallable(Protocol):
    def __call__(self, event: EventClass, token: str, *args: str) -> returntype: ...

def some_function(callback: MyCallable): ...
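A hedged usage sketch (EventClass is just a placeholder for the questioner's event type; note that typing.TypeAlias requires Python 3.10, so on 3.9 the annotation can simply be dropped). A plain function whose signature matches __call__ satisfies the protocol under mypy or a similar checker:
class EventClass: ...  # placeholder for the real event type

def callback_function(event: EventClass, token: str, *args: str) -> int:
    return len(args)

some_function(callback_function)  # accepted: the signature matches MyCallable.__call__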

python 3.7: Does @dataclass call super().__init__(...)?

Suppose I have a complicated class with a lot of inputs. This class is not a dataclass. Further, if I instantiate it explicitly, I would like it to complain when I do not provide all of the arguments.
However, for interfacing purposes and clean code, I would like to define default values for the constructor and pass around the arguments for the complex class as, more or less, a 'defined' dict.
This seems like a good task for a dataclass, so I have defined one containing all the arguments with defaults, which I then modify and manipulate.
@dataclass
class ComplicatedClassArgs:
    arg1: int
    arg2: bool = False
    ...
My question amounts to: can I write the following and expect and/or tell the dataclass to call super().__init__(...) with all the named attributes I have defined?
@dataclass
class ComplicatedClassArgs(ComplicatedClass):
    arg1: int
    arg2: bool = False

    def some_meta_arg_manipulation_function(...):
        pass

    def some_other_arg_related_function(...):
        pass
Such that I know I have composed a more advanced inner class behavior with a dataclass entry point?
I might have misunderstood your use case, but it looks to me like inheritance is the wrong tool for the job here. How about a simple @classmethod?
@dataclass
class ComplicatedClassArgs:
    arg1: int
    arg2: bool

    @classmethod
    def from_dict(cls, kwargs=None):
        """ComplicatedClassArgs factory.

        This method provides sensible default values that
        may or may not be replaced through the input kwargs.
        """
        if kwargs is None:
            kwargs = {}
        default_params = {
            'arg1': 1,
            'arg2': False,
        }
        return cls(**{**default_params, **kwargs})
>>> ComplicatedClassArgs()
Traceback (most recent call last):
...
TypeError: __init__() missing .. required positional arguments
>>> ComplicatedClassArgs.from_dict()
ComplicatedClassArgs(arg1=1, arg2=False)
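To cover the original goal of handing the collected arguments to the complicated class, one option (a sketch, assuming ComplicatedClass accepts these fields as keyword arguments) is to unpack the dataclass with dataclasses.asdict rather than relying on inheritance:
from dataclasses import asdict

args = ComplicatedClassArgs.from_dict({'arg1': 42})
obj = ComplicatedClass(**asdict(args))  # ComplicatedClass.__init__ still receives every argument explicitly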

Dynamically add CLI arguments in pytest tests

I'd like to run specific tests in pytest with dynamically added CLI arguments, e.g.:
class TestHyperML:
    def some_test(self):
        # Set up some CLI arguments such as --some_arg 3 --some_other_arg 12
        my_class = SomeClass()

class SomeClass:
    def parse_cli_arguments(self):
        # here I want to fetch my arguments from sys.argv.
        parameters = {}
        name = None
        for x in sys.argv[1:]:
            if name:
                parameters[name] = {'default': ast.literal_eval(x)}
                name = None
            elif x.startswith('-'):
                name = x.lstrip('-')
        return parameters
I understand there is a way to pass arguments on the command line by running pytest test_something.py --somearg, but I would like to do it programmatically from inside the test.
Is it possible? Thanks!
Thanks to answers posted above, and similar SO questions, here is the solution that I used:
import mock  # or: from unittest import mock

def test_parsing_cli_arguments(self):
    args = 'main.py --my_param 1e-07 --my_other_param 2'.split()
    with mock.patch('sys.argv', args):
        parser = ConfigParser("config.yaml")
        # Inside parser, sys.argv will contain the arguments set here.
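Put together with the SomeClass from the question, a self-contained sketch might look like this (monkeypatch is pytest's built-in fixture and restores sys.argv automatically; the expected dict simply mirrors the parsing logic above):
import ast
import sys

class SomeClass:
    def parse_cli_arguments(self):
        # Pair each "--name" with the value that follows it, as in the question.
        parameters = {}
        name = None
        for x in sys.argv[1:]:
            if name:
                parameters[name] = {'default': ast.literal_eval(x)}
                name = None
            elif x.startswith('-'):
                name = x.lstrip('-')
        return parameters

class TestHyperML:
    def test_parse_cli_arguments(self, monkeypatch):
        monkeypatch.setattr(sys, 'argv', ['main.py', '--some_arg', '3', '--some_other_arg', '12'])
        assert SomeClass().parse_cli_arguments() == {
            'some_arg': {'default': 3},
            'some_other_arg': {'default': 12},
        }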

autoreload and function decorator

I am fairly new to decorators but am experiencing unexpected behavior revolving around autoreload in an interactive workflow with decorated functions. It's best explained by example (note these are all cells in a Jupyter notebook):
The decorator:
%%file testdec.py
def decorated(func):
    print("decorating")
    def wrapped(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapped
Where the decorator is used:
%%file testmod.py
from testdec import decorated

@decorated
def thisfunc():
    print("old output")

def _thisfunc1():
    print("old output 1")

thisfunc1 = decorated(_thisfunc1)
I would use the following to call the decorated functions:
from testmod import *
thisfunc()
thisfunc1()
outputs:
decorating
decorating
old output
old output 1
Now updating testmod.py with:
%%file testmod.py
from testdec import decorated

@decorated
def thisfunc():
    print("new output")

def _thisfunc1():
    print("new output 1")

thisfunc1 = decorated(_thisfunc1)
and calling the functions again:
thisfunc()
thisfunc1()
gives the following; note the old output from the first function:
decorating
decorating
old output
new output 1
However, explicitly reimporting from this module:
from testmod import *
thisfunc()
thisfunc1()
results in:
new output
new output 1
Ideally the @decorated function (i.e. the one using the @ syntax, not the second approach) would autoreload transparently, as the second one does. Is there something I can do to achieve this? What am I missing for decorated functions? For now we're manually disabling decorators when editing interactively in order to keep the benefits of autoreload.
Thanks.

How to pretty print object representation in IPython

IPython outputs list representation in a pretty way by default:
In [1]: test_list
Out[1]:
[<object_1>,
 <object_2>,
 <object_3>]
I have object like:
class TestObject(object):
    def __init__(self):
        self._list = [<object_1>, <object_2>, <object_3>]

    def __repr__(self):
        return self._list.__repr__()

test_object = TestObject()
And the IPython representation of this is:
In [2]: test_object
Out[2]: [<object_1>, <object_2>, <object_3>]
Is there a way to get list's way of the representation for my object?
To get the pretty-printed repr of a list, or any object, call:
from IPython.lib.pretty import pretty
pretty(obj)
If you want your object to have a pretty repr separate from its normal repr (e.g. so it doesn't need IPython to repr() it), you can define a _repr_pretty_(self, p, cycle) method. The cycle parameter will be true if the representation recurses - e.g. if you put a container inside itself.
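A minimal sketch for the TestObject above (the constructor is adapted to take the items; p is the printer object IPython passes in, and p.pretty delegates to the list's own multi-line formatting):
from IPython.lib.pretty import pretty

class TestObject:
    def __init__(self, items):
        self._list = items

    def __repr__(self):
        return repr(self._list)        # plain repr, usable without IPython

    def _repr_pretty_(self, p, cycle):
        if cycle:
            p.text('TestObject(...)')  # guard against self-referencing containers
        else:
            p.pretty(self._list)       # reuse IPython's pretty list formatting

print(pretty(TestObject(list(range(3)))))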