I'm following the documentation at http://mxnet.io/how_to/new_op.html on how to define a new neural network layer in MXNet in Python by subclassing the mx.operator.CustomOp class. The example is of a loss layer that has no learned parameters. So how do the learned parameters get into the forward and backward methods?
I figured this out. Learned parameters are passed like any other input to the op; they're declared in the list_arguments method. From the docs page on writing custom symbols:
Note that list arguments declares both input and parameter and we
recommend ordering them as ['input1', 'input2', ... , 'weight1', 'weight2', ...]
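For illustration, here is a minimal sketch of a custom op with one learned weight, following that recommendation. The op itself (a plain matrix product) and all names like DenseOp are hypothetical, not from the original docs:

import mxnet as mx

class DenseOp(mx.operator.CustomOp):
    def forward(self, is_train, req, in_data, out_data, aux):
        x, w = in_data[0], in_data[1]  # the weight arrives like any other input
        self.assign(out_data[0], req[0], mx.nd.dot(x, w))

    def backward(self, req, out_grad, in_data, out_data, in_grad, aux):
        x, w = in_data[0], in_data[1]
        dy = out_grad[0]
        self.assign(in_grad[0], req[0], mx.nd.dot(dy, mx.nd.transpose(w)))  # grad w.r.t. data
        self.assign(in_grad[1], req[1], mx.nd.dot(mx.nd.transpose(x), dy))  # grad w.r.t. weight

@mx.operator.register("denseop")
class DenseOpProp(mx.operator.CustomOpProp):
    def __init__(self):
        super(DenseOpProp, self).__init__(need_top_grad=True)

    def list_arguments(self):
        return ['data', 'weight']  # inputs first, then learned parameters

    def list_outputs(self):
        return ['output']

    def infer_shape(self, in_shape):
        data_shape, weight_shape = in_shape
        return [data_shape, weight_shape], [(data_shape[0], weight_shape[1])], []

    def create_operator(self, ctx, shapes, dtypes):
        return DenseOp()

Because 'weight' is listed in list_arguments, the executor treats it as a learnable argument: it allocates and updates it like any other weight, and it shows up in in_data inside both forward and backward.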
I'm building an interface package that can feed model parameters into the simulation loop.
To connect the interface package to the simulation model, I used the ControlBus from the Modelica Standard Library, version 3.2.2.
Checking the model was OK, but when I simulate it, an error like the one in the picture below pops up.
And here are the equations related to this model:
Omega_e = Omega_d * N_t[N];
Alpha_d = der(Omega_d);
To solve the differential equations, I think the solver needs a specific value for the parameter N_t.
So I set the parameters in the interface model and sent them using the ControlBus component from the Modelica Standard Library.
As shown in the picture above, I definitely set the parameters.
(The specific parameter values are removed because they are confidential.)
I can't figure out what is causing this error.
Please help me. Thank you very much.
Based on the incomplete model it's a bit tricky to say what happened, but:
Sending parameters through the control bus (or a connector in general) is somewhat complicated and generally discouraged.
It should be possible by declaring the "computed parameter" as parameter Integer N(fixed=false); with initial equation N = myBus.N;, and not declaring it as a parameter in the connector.
If you don't declare it as a parameter, Dymola will try (and fail) to differentiate it.
If you declare it as a parameter in the connector, it will not be propagated (connecting two parameters leads to an assertion).
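A minimal Modelica sketch of that pattern; the bus type MyControlBus and the signal name myBus.N are placeholders for whatever your bus actually carries:

model Consumer "Receives N over the bus as a computed parameter"
  MyControlBus myBus;                 // hypothetical bus connector carrying N
  parameter Integer N(fixed=false);   // computed parameter: assigned at initialization
initial equation
  N = myBus.N;                        // read once from the bus during initialization
equation
  // ... equations using N, e.g. Omega_e = Omega_d * N_t[N];
end Consumer;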
This is my sample network and my idea for trying out custom routing instead of just the shortest path (the route I want to follow is marked by the pink arrows).
What am I missing here to make my predefined function work?
Fixing the basics
As explained here, you can't instantiate a Java List, because it is an interface. You can, however, instantiate any implementing class of List, for example an ArrayList.
With this in mind your code will look like this:
List<Path> myPath = new ArrayList<Path>();
myPath.add(path14);
myPath.add(path8);
myPath.add(path);
myPath.add(path1);
myPath.add(path4);
myPath.add(path13);
return myPath;
So much for the basics.
Where to go from here
To get it to consider your actual source and destination for the route planning, define both as input parameters of type ILocation in the properties of the function.
Now comes the really tricky part: writing your own or importing a routing algorithm that can give you that list of paths automatically based on criteria that you define. This is however a topic too broad for this question. The basic steps will be:
Create a graph that represents your AnyLogic path network
Solve the graph routing problem with a solving algorithm (e.g. Dijkstra's algorithm), using the graph, the start point, and the end point
Convert the solution you get from the solver back again to an ArrayList that you can work with in AnyLogic
You can do these steps on your own, e.g. by implementing Dijkstra's algorithm yourself (see the sketch below), or you can import into AnyLogic one of the available Java graph libraries such as JUNG or GraphHopper. In this article I explain step by step how to do so with JUNG.
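For the do-it-yourself route, here is a minimal, self-contained Dijkstra sketch covering step 2. Nodes are plain Strings and adj is an adjacency map you would build from your path network in step 1; you would then map the returned node sequence back to Path objects in step 3. All names here are illustrative, not AnyLogic API:

import java.util.*;

public class RoutingSketch {
    // adj: node -> (neighbor -> edge length)
    static List<String> shortestRoute(Map<String, Map<String, Double>> adj,
                                      String start, String goal) {
        Map<String, Double> dist = new HashMap<>();
        Map<String, String> prev = new HashMap<>();
        // queue entries are (node, distance-at-insertion) pairs
        PriorityQueue<Map.Entry<String, Double>> pq =
                new PriorityQueue<>(Map.Entry.comparingByValue());
        dist.put(start, 0.0);
        pq.add(new AbstractMap.SimpleEntry<>(start, 0.0));
        while (!pq.isEmpty()) {
            Map.Entry<String, Double> top = pq.poll();
            String u = top.getKey();
            if (top.getValue() > dist.get(u)) continue; // stale queue entry
            if (u.equals(goal)) break;
            for (Map.Entry<String, Double> e : adj.getOrDefault(u, Collections.emptyMap()).entrySet()) {
                double alt = dist.get(u) + e.getValue();
                if (alt < dist.getOrDefault(e.getKey(), Double.MAX_VALUE)) {
                    dist.put(e.getKey(), alt);
                    prev.put(e.getKey(), u);
                    pq.add(new AbstractMap.SimpleEntry<>(e.getKey(), alt));
                }
            }
        }
        // walk back from goal to start to recover the node sequence
        LinkedList<String> route = new LinkedList<>();
        for (String n = goal; n != null; n = prev.get(n)) route.addFirst(n);
        return route;
    }
}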
I'm porting a large Simulink model from Simulink R2010a → R2017b.
The main model is basically a glue-layer for many interwoven reference models. My objective is to generate a standalone executable out of this main model using Coder.
Parameter tunability in this context is not done via the Signals and Parameters section on the Optimization tab in the Model Configuration Parameters dialog (as is the case in stand-alone models), but rather, via constructing Simulink.Parameter objects in the base workspace, and referencing those in the respective referenced models, or in their respective model workspaces.
Now, AFAIK, in R2010a it was enough to set
new_parameter.RTWInfo.StorageClass = 'Auto';
new_parameter.RTWInfo.CustomStorageClass = 'Define';
to make the parameter non-tunable and convert it into a #define in the generated code. In R2017b, this is no longer allowed; the StorageClass must be 'Custom' if you set a non-empty CustomStorageClass:
new_parameter.CoderInfo.StorageClass = 'Custom'; % <- can't be 'Auto'
new_parameter.CoderInfo.CustomStorageClass = 'Define';
But apparently, this does not make the parameter non-tunable:
Warning: Parameter 'OutPortSampleTime' of '[...]/Rate Transition1' is non-tunable but refers to tunable variables (Simulation_compiletimeConstant (base workspace))
I can't find anything in the R2017b documentation on making parameters non-tunable programmatically; I can only find how to do it in stand-alone models via the dialog, but that's not what I want here.
Can anyone point me in the right direction?
NOTE: Back in the day, Simulink Coder was called Real-Time Workshop (well, Real-Time Workshop was split into Simulink Coder and several other things), hence the difference between RTWInfo and CoderInfo. Note that RTWInfo still works in R2017b, but it issues a warning and is converted into CoderInfo automatically.
In the generated code it should appear as a #define, the way you specified it.
https://www.mathworks.com/help/rtw/ug/choose-a-built-in-storage-class-for-controlling-data-representation-in-the-generated-code.html
By the way, yes, it's a bit confusing: in an M-file you specify CustomStorageClass = 'Define';, in the GUI you specify the storage class as Define (custom), but the documentation calls it Defined.
I am not sure why the warning about tunability shows up.
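For reference, a complete definition in R2017b syntax might look like the sketch below. The value, data type, and header file name are illustrative assumptions; the base workspace variable name is taken from the warning in the question:

% Create a compile-time constant that becomes a #define in the generated code
new_parameter = Simulink.Parameter;
new_parameter.Value = 42;                                % illustrative value
new_parameter.DataType = 'int32';                        % illustrative type
new_parameter.CoderInfo.StorageClass = 'Custom';
new_parameter.CoderInfo.CustomStorageClass = 'Define';
new_parameter.CoderInfo.CustomAttributes.HeaderFile = 'my_defines.h';  % optional
assignin('base', 'Simulation_compiletimeConstant', new_parameter);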
I would like to create my own operations to merge networks. So I've taken a look at the code, and I modified engine/topology.py to create my new operation.
I didn't modify layers/wrappers.py because it's only for RNNs, and when I modify it I get an error.
Are there other files/classes to modify? Don't I have to do something else somewhere else to specify what to do during the backward pass?
You don't have to change any other files if you have implemented your operation properly with backend operations only. The backend is clever and takes care of computing the gradients for backpropagation by itself.
This means that all parameters that will change over time have to be defined with K.variable, and you may only use mathematical operations defined in keras.backend. Otherwise the backend will not be able to perform the backpropagation properly.
Side note: instead of modifying the Keras source code, you could implement your own class that extends the Merge class and overrides the call function for your custom operation.
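In that spirit, here is a minimal sketch of a custom merge-style layer built only from backend operations, so the backward pass comes for free. The layer name, the learned alpha, and the weighted-sum operation are all illustrative, and exact method names differ between Keras versions (e.g. get_output_shape_for in Keras 1 vs compute_output_shape in Keras 2):

from keras import backend as K
from keras.engine.topology import Layer

class WeightedSum(Layer):
    # merges two tensors as alpha * x + (1 - alpha) * y, with alpha learned

    def build(self, input_shape):
        # trainable state must be a backend variable so gradients can flow
        self.alpha = K.variable(0.5, name='alpha')
        self.trainable_weights = [self.alpha]
        super(WeightedSum, self).build(input_shape)

    def call(self, inputs, mask=None):
        x, y = inputs
        # only keras.backend operations here, so autodiff handles backward
        return self.alpha * x + (1.0 - self.alpha) * y

    def compute_output_shape(self, input_shape):
        return input_shape[0]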
I am currently coding a new library with several models in it (I am used to MATLAB, but not to Simulink). I am able to create a model with block parameters, let's say a parameter 'p', and a callback function (initfct) which uses this parameter to compute specific values used inside my model (let's say a simple gain K = K(p)).
My problem is that my parameters 'p' and 'K' are available directly in the workspace, which I don't want. Moreover, if I use this model two or more times in a system, the instances always share the same 'K', which I also don't want.
So how can I make the variables 'p' and 'K' independent for each use of my custom model, and prevent them from being visible in the workspace?
Should I use referenced models? I am not familiar with this feature...
Thanks for your answer,
Michael
Within a callback, gcb returns the path to the block which currently executes the callback. Having the path, you can use get_param to access the parameters.
Just for demonstration purposes, insert the following into the MoveFcn of a Delay block:
set_param(gcb,'DelayLength',num2str(randi(10)))
It will randomly change the delay whenever the block is moved.
I am not sure if my answer explains everything you need. It might be that you also need a Mask. If you think this answer is incomplete, please update your question and include a small example model demonstrating your problem.
Thanks, with your help I was able to solve the problem.
To be more specific, in case someone else has the same problem: in your mask, you also need to declare the internal variables used by the callback function. Uncheck the relevant options so that they do not appear as standard input parameters of your model.
My problem was also that I used num2str instead of mat2str (needed when the gain is a matrix acting on multiple inputs).
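Putting the pieces together, a sketch of a mask initialization callback following the gcb/get_param pattern above; the parameter name 'p', the computation of K, and the inner block named 'Gain' are all illustrative:

% Runs per block instance, so each copy gets its own p and K, and
% neither variable is created in the base workspace.
p = str2double(get_param(gcb, 'p'));            % this instance's own mask parameter
K = 2 * p;                                      % hypothetical computation K = K(p)
set_param([gcb '/Gain'], 'Gain', mat2str(K));   % mat2str also handles matrix gains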