How to pass params into the timeit.timeit function - python-3.7

I have to test my function:
def change_min_and_max(array):
    some_code
And I need to use timeit.timeit for it. But I can't pass an array into my function:
import random
array = [random.randint(-1000, 1000) for _ in range(100)]
import timeit
print(timeit.timeit('change_min_and_max(array)', number=100))
NameError: name 'change_min_and_max' is not defined

The problem is that timeit executes the code in a separate namespace, so it doesn't know about the function and array definitions. You can specify a namespace with the globals parameter. If you simply want to use the current namespace, you can pass the result of the globals() function like this:
import random, timeit
def change_min_and_max(array):
    ...
array = [...]
timeit.timeit("change_min_and_max(array)", globals=globals())
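Alternatively, timeit.timeit accepts a callable instead of a statement string, so you can close over the array with a lambda and sidestep the namespace issue entirely. A minimal runnable sketch (the body of change_min_and_max here is a stand-in, since the question elides it):

```python
import random
import timeit

def change_min_and_max(array):
    # Stand-in body: swap the smallest and largest elements in place.
    i, j = array.index(min(array)), array.index(max(array))
    array[i], array[j] = array[j], array[i]

array = [random.randint(-1000, 1000) for _ in range(100)]

# A callable is invoked directly by timeit, so no string evaluation
# (and therefore no separate namespace) is involved.
elapsed = timeit.timeit(lambda: change_min_and_max(array), number=100)
print(elapsed)
```

The callable form adds a small per-iteration call overhead compared to a compiled statement string, which is usually negligible unless the timed function is extremely cheap.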

Related

How to use udf functions in pyspark

I am analysing the following piece of code:
from pyspark.sql.functions import udf, col, desc
def error(value, pred):
    return abs(value - pred)
udf_MAE = udf(lambda value, pred: MAE(value=value, pred=pred), FloatType())
I know a udf is a user-defined function, but I don't understand what that means here. And udf wasn't defined anywhere earlier in the code?
User Defined Functions (UDFs) are useful when you need to define logic specific to your use case and when you need to encapsulate that solution for reuse. They should only be used when there is no clear way to accomplish a task using built-in functions. — Azure Databricks documentation
Create your function (after you have made sure there is no built-in function that performs a similar task):
def greatingFunc(name):
    return f'hello {name}!'
Then you need to register your function as a UDF by designating the following:
A name for access in Python (myGreatingUDF)
The function itself (greatingFunc)
The return type for the function (StringType)
myGreatingUDF = spark.udf.register("myGreatingUDF", greatingFunc, StringType())
Now you can call your UDF anytime you need it:
guest = 'John'
print(myGreatingUDF(guest))
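Before wiring a function into the UDF machinery, it can help to sanity-check it as plain Python, because udf() only wraps an ordinary function so Spark can apply it to column values row by row. Using the error function from the question (no Spark required):

```python
def error(value, pred):
    # The plain Python function that would be wrapped as a UDF.
    return abs(value - pred)

# udf(error, FloatType()) in Spark would apply exactly this logic
# to each row; the function itself is ordinary Python.
print(error(10.0, 7.5))  # -> 2.5
```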

Why can't I extend a scipy rv_discrete class successfully?

I'm trying to extend scipy's rv_discrete class; extending a class is supposed to work in the general case.
I just want to add a couple of instance attributes.
from scipy.stats import rv_discrete
class Distribution(rv_discrete):
    def __init__(self, realization):
        self._realization = realization
        self.num = len(realization)
        # stuff to obtain random alphabet and probabilities from realization
        super().__init__(values=(alphabet, probabilities))
This should allow me to do something like this :
realization = #some values
dist = Distribution(realization)
print(dist.mean())
Instead, I receive this error
ValueError: rv_discrete.__init__(..., values != None, ...)
If I simply create a new rv_discrete object as in the following line of code
dist = rv_discrete(values=(alphabet,probabilities))
It works just fine.
Any idea why? Thank you for your help
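For context on what the working one-liner computes: mean() for a distribution built with values=(alphabet, probabilities) is the expectation, the sum of x * p(x). A dependency-free sketch with hypothetical alphabet and probabilities (the question elides the real ones, which are derived from realization):

```python
# Hypothetical values standing in for the ones the question derives
# from `realization`.
alphabet = [1, 2, 3]
probabilities = [0.2, 0.5, 0.3]

# rv_discrete(values=(alphabet, probabilities)).mean() returns the
# expectation E[X] = sum(x * p(x)); computed here without scipy.
mean = sum(x * p for x, p in zip(alphabet, probabilities))
print(mean)  # approximately 2.1
```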

scala eval function in string form

I'm trying to define a DSL that will dictate how to parse CSVs. I would like to define simple functions that transform values as they are extracted from the CSV. The DSL will be defined in a text file.
For example, if the CSV looks like this:
id,name,amt
1,John Smith,$10.00
2,Bob Uncle,$20.00
I would like to define the following function (and please note that I would like to be able to execute arbitrary code) on the amt column
(x: String) => x.replace("$", "")
Is there a way to evaluate the function above and execute it for each of the amt values?
First, please consider that there's probably a better way to do this. For one thing, it seems concerning that your external DSL contains Scala code. Does this really need to be a DSL?
That said, it's possible to evaluate arbitrary Scala using ToolBox:
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
val code = """(x: String) => x.replace("$", "")"""
val toolbox = runtimeMirror(getClass.getClassLoader).mkToolBox()
val func = toolbox.eval(toolbox.parse(code)).asInstanceOf[String => String]
println(func("$10.50")) // prints "10.50"

How can I use Scala's runtime reflection to inspect a passed anonymous function?

Assuming I have a method like the following:
def getInfo[T](func: () => T) = {
    // Code goes here.
}
How could I use the runtime reflection of Scala 2.11.1 to inspect the passed anonymous function func?
I'm especially interested in getting an AST (abstract syntax tree) of func and, if possible, the location (line number, file) where the method was first defined.
All I have accomplished so far is to get information about the type of parameter func, not the function itself.
I am aware of the fact that there have been similar questions on SO, but they mainly target other Scala versions.
As Ben mentioned in the comments, this can be done using Scala macros at compile time.
A possible option is to expand the original call into a macro, which collects the necessary information and then calls an internal getInfoInternal method that does something with it.
Example for getInfo function:
import scala.language.experimental.macros

// getInfo method which gets expanded to a macro
def getInfo(func: => Any): Unit = macro getInfoMacro

def getInfoInternal(info: Any): Unit = {
  // Do something with the collected information
}
Example for macro:
// Macro declaration
def getInfoMacro(c: Context)(func: c.Tree): c.Expr[Unit] = {
  import c.universe._
  // Extract information, like enclosingPosition or symbols, from the function tree.
  val functionInfo = getFunctionInfo(func)
  // Call the internal method
  c.Expr[Unit](q"getInfoInternal($functionInfo)")
}
More on Symbols, Trees and Types for analysis.
More complete example of a macro which finds all variables bound from another scope in a function by applying the same technique.

Scala reflection on function parameter names

I have a class which takes a function
case class FunctionParser1Arg[T, U](func: (T => U))
def testFunc(name1: String): String = name1
val res = FunctionParser1Arg(testFunc)
I would like to know the type signature information on the function from inside the case class. I want to know both the parameter name and type. I have had success in finding the type using the runtime mirror objects, but not the name. Any suggestions?
Ok, let's say you got the symbol for the instance func points to:
import scala.reflect.runtime.universe._
import scala.reflect.runtime.{currentMirror => m}
val im = m reflect res.func // Instance Mirror
You can get the apply method from its type members:
val apply = newTermName("apply")
val applySymbol = im.symbol.typeSignature member apply
And since we know it's a method, make it a method symbol:
val applyMethod = applySymbol.asMethod
Its parameters can be found through paramss, and since we know there's only one parameter in one parameter list, we can get the first parameter of the first parameter list:
val param = applyMethod.paramss(0)(0)
Then what you are asking for is:
val name = param.name.decoded // if you want "+" instead of "$plus", for example
val tpe = param.typeSignature // `type` is a keyword in Scala, so use another name
It's possible that you think that's the wrong answer because you got x$1 instead of name1, but what is passed to the constructor is not the named method testFunc but an anonymous function representing it, created through a process called eta expansion. You can't find out the method's parameter name because you aren't passing the method itself.
If that's what you need, I suggest you use a macro instead. With a macro, you'll be able to see exactly what is being passed at compile time and get the name from it.