MATLAB Production Server: add methods dynamically to deployable archive

I am developing a web site where users can test MATLAB methods, including their own.
I am using the MATLAB Production Server RESTful API, and I am linking it to Node.js.
I want to know if there is a way to add MATLAB files to the deployable archive
while the MATLAB server is running.

No, you can't do that.
The workflow is that you write your code, use MATLAB Compiler SDK to compile it into a .ctf file (the deployable archive), and deploy that file to MATLAB Production Server. Once the code is compiled into a .ctf file it is fixed and can't be modified.
If you need to modify it, you need to modify the code, recompile to a new .ctf file, and redeploy the new file.
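For reference, a redeploy cycle looks something like this (a minimal sketch; mymagic.m and the instance path are placeholders, and auto_deploy is the hot-deployment folder of a server instance):
mcc -W CTF:mymagic mymagic.m
copy mymagic.ctf C:\mps\instance_1\auto_deploy\
Dropping a new or updated .ctf into auto_deploy redeploys it without restarting the server, but the contents of an already-compiled archive still cannot be changed.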

Related

Altera FPGA EP4CE55F23C7N

I am currently working on an Altera FPGA. In this project, we were given only the software source code, built using the Nios II Software Build Tools for Eclipse. I would like to run and build the original code before making any changes to it. However, only a .pof programming file and a .cdf file were given to us. When setting up the hardware design, please correct me if I'm wrong, I need the SOPC file and the .sof file to configure the hardware? I tried using the .pof or .cdf files in various ways but with no success. Is there a way I can compile and build this application using the .pof file or some other way, and thus be able to create a BSP project and load the .elf file onto the FPGA? Or do I need the .sopcinfo and .sof files? Thank you.

How to create stub/services files with MATLAB grpc plugin?

I'm using MatlabWithProtoV3 to create protoc.exe with matlab_out in a Windows environment.
I was able to create protoc, and when I run
protoc.exe user.proto --matlab_out=./
it only creates MATLAB files for the proto messages (the files can be found in the attachment at the bottom); it does not create MATLAB files for the services (client and server).
Then I read about plugins, included the generator and plugin files in the gRPC source to create a MATLAB plugin, and built grpc_matlab_plugin.exe successfully.
Now, when I execute
protoc.exe user.proto --matlab_out=./ --grpc_out=./ --plugin=protoc-gen-grpc="D:\grpc\cmake\build\Debug\grpc_matlab_plugin.exe"
I'm getting
pb_descriptor_LoginRequest.m: Tried to write the same file twice.
pb_read_LoginRequest.m: Tried to write the same file twice.
pb_descriptor_APIResponse.m: Tried to write the same file twice.
pb_read_APIResponse.m: Tried to write the same file twice.
pb_descriptor_Empty.m: Tried to write the same file twice.
pb_read_Empty.m: Tried to write the same file twice.
as error messages, and no files are created.
In the gRPC repo, I could find cpp_plugin.h for the C++ compiler, which contains code to create the service-related files, but a similar file for MATLAB is not available in the repositories I checked.
Can you please let me know how to create MATLAB files for services?
Attached are the files created when I execute the above-mentioned commands: sample_files.zip. I have also opened a GitHub issue.
Thanks
protobuf-matlab is just a protobuf plugin: it generates code to read and write protocol buffers.
Unfortunately it does not implement a gRPC plugin, which is what would build the client stub and server.
If you are able to call your MATLAB code from another language, you could host the gRPC server externally, e.g. create a gRPC server in .NET and use COM to call your MATLAB code.
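As an illustration of the same idea in Python rather than .NET, a gRPC server could wrap MATLAB through the MATLAB Engine API for Python. This is only a sketch: the service name, RPC name, message fields, and validate_user.m are assumptions based on the message names in the question's user.proto; user_pb2 and user_pb2_grpc would be generated with grpcio-tools.
from concurrent import futures
import grpc
import matlab.engine
import user_pb2        # generated: python -m grpc_tools.protoc -I. --python_out=. --grpc_python_out=. user.proto
import user_pb2_grpc

class UserService(user_pb2_grpc.UserServiceServicer):   # service name is an assumption
    def __init__(self):
        # one shared MATLAB session; serialize access in real code, engine calls are not thread-safe
        self.eng = matlab.engine.start_matlab()

    def Login(self, request, context):                  # RPC and field names are assumptions
        ok = self.eng.validate_user(request.name, request.password)  # validate_user.m is hypothetical
        return user_pb2.APIResponse(success=bool(ok))

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    user_pb2_grpc.add_UserServiceServicer_to_server(UserService(), server)
    server.add_insecure_port("[::]:50051")
    server.start()
    server.wait_for_termination()

if __name__ == "__main__":
    serve()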

Export OSB resources without using export wizard on JDeveloper

Using JDeveloper to create and manage Oracle Service Bus 12c resources, I am able to export the required resources into a .jar file using the Resources Export Wizard of JDeveloper, selecting the needed resources one by one under the tree of each project.
What I want, though, is to find a way to export a .jar file based on a resources list given in a file of a commonly used format (JSON, CSV, etc.), as this can save time for a large number of resources. My first thought was to search for whether JDeveloper provides such a way, or to attempt to do this programmatically, yet my search has not given me any information on how to do either.
Is there an alternative way of doing this?
If you have Oracle OSB 11.1.1.7.0 or higher you can automate the compilation process for OSB at the project level using configjar. Here's a whole example of an implementation which includes compilation using configjar, and automation of the task by retrieving the code from Git using Jenkins and a Python script.
You can also do it using ANT; here's a good Oracle document explaining that. (I've tried it, but found configjar easier to use; ANT is the only option for versions below 11.1.1.7.0.)
After setting up either of those compilation methods, you can create a CSV file, parse it with Python, and loop the compilation.
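A minimal sketch of that loop (the CSV layout, the configjar path, and the per-project settings files are assumptions; configjar is driven by a settings file):
import csv
import subprocess

CONFIGJAR = "/u01/oracle/osb/tools/configjar/configjar.sh"  # path is an assumption

with open("resources.csv", newline="") as f:
    for project, settings_file in csv.reader(f):
        # each row: project name, path to that project's configjar settings file
        subprocess.run([CONFIGJAR, "-settingsfile", settings_file], check=True)
        print("built jar for", project)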

Packaging SF service into a single file

I am working through how to automate the build and deploy of my Service Fabric app. Currently I'm working on the package step, and while it creates files within the pkg subfolder, it always creates a folder hierarchy of files, not a true package in a single file. I would swear I've seen a .SFPKG file (or something similarly named) that has everything in one file (a zip, maybe?). Is there some way to create such a file with msbuild?
Here's the command line I'm using currently:
msbuild myservice.sfproj "/p:Configuration=Dev;Platform=AnyCPU" /t:Package /consoleloggerparameters:verbosity=minimal /maxcpucount
I'm concerned about not having a single file because it seems inefficient when sending a new package up to my clusters, and it's harder for me to manage a bunch of files on a build automation server.
I believe you read about the .sfpkg at
https://azure.microsoft.com/documentation/articles/service-fabric-get-started-with-a-local-cluster
Note that internally we do not yet support provisioning a .sfpkg file. This is a feature that will be coming soon (date TBD). Instead, we upload each file in the application package.
Update (SF 6.1 - April 2018)
Since 6.1 it is possible to create a ZIP file (*.sfpkg) and upload it to an external store. Service Fabric executes a GET operation to download the sfpkg application package. For more info see https://learn.microsoft.com/en-us/azure/service-fabric/service-fabric-package-apps#create-an-sfpkg
NOTE: This only works with external provisioning; the Azure image store still doesn't support sfpkg files.
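Since an .sfpkg is just a zip of the application package folder with a different extension, packaging the msbuild output can be scripted. A minimal sketch (the pkg\Dev path and file names are assumptions based on the msbuild command above):
import shutil

pkg_dir = r"pkg\Dev"                                    # folder produced by msbuild /t:Package
zip_path = shutil.make_archive("myservice", "zip", pkg_dir)
shutil.move(zip_path, "myservice.sfpkg")                # .sfpkg is a renamed zip
print("created myservice.sfpkg")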

How to setup a DotNetNuke Development Environment with Source Control?

My team is developing a new DotNetNuke web application and would like to know what is recommended for setting up a development environment with source control and automated builds. We would like to keep the DNN source code separate from the source code of our custom modules and extensions.
The DotNetNuke Compiled Module template for Visual Studio wants us to store the source code in the DesktopModules directory of the DNN source code and output to the DNN bin directory. Is this the recommended structure? I would rather keep the files in different locations, but then it becomes more difficult to run and debug locally, as it would require an install of the module for each change. Also, how should an automated build deploy any changes?
How have others set this up? Is there a recommended best practice?
For my source control, I develop modules in their own project. This contains the module code, test code, data provider code (if applicable) and anything else. This is checked into source control like any other project. Note that the module project contains no links to a specific DNN website; DNN references are made in the project to a common "bin" directory that holds your target build. For example, in my projects folder I have \bin460, \bin480, \bin510, \bin520, etc. Each of these folders holds a set of binaries for a specific DNN version. That way you can build against a particular version but test against any version you like.
The problems with source-controlling a module in place in a DNN install are:
- sometimes not all of the module code is easily isolated under a single parent directory
- it doesn't lend itself well to a PA (Private Assembly) module approach
- it's not easy to shift the project to a different DNN version for development or testing
- it's easy to inadvertently source-control parts of the DNN solution, particularly with integrated VS source control solutions.
This approach compiles quickly because you're not trying to compile the entire DNN solution. For test deployment I have a build script that copies the various parts of the module into a target website. This can be done via the compile (link in the build script) or just run after a successful compile in a cmd window. My build script has a 'target' environment switch, so that I can say 'dnn520' to deploy the build to my test dnn520 install. Note that you need to manually create the module configuration before this will work, but this is a one-time effort, and you can use the export feature to create your .dnn module manifest.
To build your module package, invest the time in a comprehensive script which takes the various parts from your source directory and zips them into an install package. Keep all of the parts in your source control folder, copy them into a temp directory, then run a command-line zip utility (I use an ancient version of pkzip) to pack it into an installable file.
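A minimal sketch of such a packaging script, in Python rather than pkzip (the module name, the list of parts, and the manifest file name are assumptions):
import shutil, tempfile, zipfile
from pathlib import Path

src = Path("MyModule")                                      # source-controlled module folder
parts = ["MyModule.dnn", "bin/MyModule.dll", "View.ascx"]   # .dnn manifest plus module parts

with tempfile.TemporaryDirectory() as tmp:
    for part in parts:
        dest = Path(tmp) / part
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy(src / part, dest)                       # stage each part in the temp directory
    with zipfile.ZipFile("MyModule_01.00.00_Install.zip", "w", zipfile.ZIP_DEFLATED) as z:
        for file in sorted(Path(tmp).rglob("*")):
            if file.is_file():
                z.write(file, file.relative_to(tmp))        # store paths relative to the temp root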
The benefits of this approach are:
- separation of module code from installed code
- a simple way of keeping only the module code in source control (you don't have to exclude all the website code)
- the ability to quickly test modules in different DNN versions
- the packaging script allows you to quickly and easily build a new version of a module for install testing/deployment
The drawbacks are:
- you can't use the magic green 'go' button in VS (you have to manually attach the debugger)
- more setup time than developing in place
We typically stick to keeping the module code in a folder under DesktopModules and building to the website's bin directory.
In source control, we just map the individual modules, rather than the entire website. Depending on what we're working on, a module may be an entire project in source control, or we may have multiple related modules in the same project, living next to each other.
Automatically deploying changes is somewhat difficult in DNN. It's highly recommended to have a build script that packages your module into an installable form. You can then copy installable packages into the website's Install/Module folder and request the URL /Install/Install.aspx?mode=InstallResources, which will install any packages in that folder.
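A minimal sketch of that deployment step (the site path, host name, and package name are assumptions):
import shutil
import urllib.request

# drop the built package into the site's bulk-install folder
shutil.copy("MyModule_01.00.00_Install.zip", r"C:\websites\dnndev\Install\Module")
# trigger installation of everything waiting in that folder
urllib.request.urlopen("http://dnndev.me/Install/Install.aspx?mode=InstallResources")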
In response to bduke's answer: you shouldn't, and don't want to, build projects in the DesktopModules folder.
That's where all of the out-of-the-box source code for the site goes.
That's where your modules will be "installed", and thus if someone "updates" or re-installs one, it will be overwritten.
It can make upgrading your application far more difficult. Many developers don't understand the idea of not touching the original source code files to modify their behavior, because those changes will just be overwritten when you perform an upgrade.
If you want to build modules, create a solution folder called Modules and place your separate module projects there.
If you want to debug them, make the debug target output point to the web\bin folder.
If you want to install/deploy them, build in release mode and install them through the Module/Extension filter.