How to attach a file to Buildbot MailNotifier

I know I can attach logs to Buildbot's notification emails using a flag. How do I attach another file (a zip, for instance)? This doesn't seem to be a default option.

Unfortunately, Buildbot cannot attach arbitrary files to its notification emails out of the box; only logs and patches are supported.
However, it builds its messages with the Python standard library's email package, so you can customize mail.py by writing your own plugin to get the behaviour you need:
https://github.com/buildbot/buildbot/blob/master/master/buildbot/reporters/mail.py
See the createEmail function.
The Python docs for working with email:
https://docs.python.org/3/library/email.html
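As a starting point, a subclass along these lines might work. This is only a sketch: the createEmail signature differs between Buildbot versions (check your version's reporters/mail.py), and the zip path is a placeholder.

# Sketch: subclass MailNotifier and bolt an extra attachment onto the
# message the base class builds. createEmail's signature varies across
# Buildbot versions, so *args/**kwargs keeps this version-agnostic;
# the zip path below is a placeholder.
from email.mime.application import MIMEApplication

from buildbot.reporters.mail import MailNotifier

class ZipAttachingMailNotifier(MailNotifier):
    def createEmail(self, *args, **kwargs):
        msg = super().createEmail(*args, **kwargs)  # a MIME multipart message
        with open('/path/to/artifact.zip', 'rb') as f:
            part = MIMEApplication(f.read(), _subtype='zip')
        part.add_header('Content-Disposition', 'attachment',
                        filename='artifact.zip')
        msg.attach(part)
        return msg

You would then register this class in master.cfg in place of the stock MailNotifier.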


How to create stub/services files with MATLAB grpc plugin?

I'm using MatlabWithProtoV3 to create protoc.exe with matlab_out in a Windows environment.
I was able to build protoc, and when I run
protoc.exe user.proto --matlab_out=./
it only creates MATLAB files for the proto messages (files can be found in the attachment at the bottom) and does not create MATLAB files for the services (client and server).
Then I read about plugins, included the generator and plugin files in the gRPC source tree to create a MATLAB plugin, and built grpc_matlab_plugin.exe successfully.
Now, when I execute
protoc.exe user.proto --matlab_out=./ --grpc_out=./ --plugin=protoc-gen-grpc="D:\grpc\cmake\build\Debug\grpc_matlab_plugin.exe"
I get
pb_descriptor_LoginRequest.m: Tried to write the same file twice.
pb_read_LoginRequest.m: Tried to write the same file twice.
pb_descriptor_APIResponse.m: Tried to write the same file twice.
pb_read_APIResponse.m: Tried to write the same file twice.
pb_descriptor_Empty.m: Tried to write the same file twice.
pb_read_Empty.m: Tried to write the same file twice.
as the error message, and no files are created.
In the gRPC repo I could find cpp_plugin.h, which contains the code that creates the service-related files for the C++ compiler, but I could not find a similar file for MATLAB.
Can you please let me know how to create MATLAB files for services?
I have attached the files created when I executed the commands above (sample_files.zip); see also the linked GitHub issue.
Thanks!
protobuf-matlab is just a protobuf plugin: it generates code to read and write protocol buffer messages.
Unfortunately, it does not implement a gRPC plugin, which is what would generate the client stub and server.
If you are able to call your MATLAB code from another language, you could host the gRPC server externally, e.g. create a gRPC server in .NET and use COM to call your MATLAB code.
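The answer's suggestion is .NET plus COM; purely as an illustration of the same shape, here is a hypothetical Python version. Every name in it is an assumption: it presumes user.proto was compiled with grpcio-tools into user_pb2/user_pb2_grpc, that the service defines a Login RPC whose APIResponse has a success field, and that the MATLAB Engine for Python is installed.

# Hypothetical sketch: host the gRPC service outside MATLAB and bridge
# calls into it. user_pb2/user_pb2_grpc, UserService, Login, the
# 'success' field, and login.m are all illustrative names.
from concurrent import futures

import grpc
import matlab.engine

import user_pb2
import user_pb2_grpc

class UserService(user_pb2_grpc.UserServiceServicer):
    def __init__(self):
        # One shared MATLAB session for the lifetime of the server.
        self.engine = matlab.engine.start_matlab()

    def Login(self, request, context):
        # Delegate the real work to a MATLAB function, e.g. a login.m
        # on the engine's path (hypothetical).
        ok = self.engine.login(request.username, request.password)
        return user_pb2.APIResponse(success=bool(ok))

def serve():
    server = grpc.server(futures.ThreadPoolExecutor(max_workers=4))
    user_pb2_grpc.add_UserServiceServicer_to_server(UserService(), server)
    server.add_insecure_port('[::]:50051')
    server.start()
    server.wait_for_termination()

if __name__ == '__main__':
    serve()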

Sending a mail in Groovy (without additional libs)

I'm writing several scripts which are shared within the company. I currently have the requirement that, under some conditions, a mail should be sent.
This is pretty easy using the javax.mail.* packages, which unfortunately are not part of the default library.
The scripts are shared via VCS and mostly used by business people who open them via double click (Groovy files on their systems are automatically opened with groovy.bat).
It would be a huge effort to install the libs on all their systems. Groovy 2.4.5 and Java 8 are configured everywhere.
Is there an easier way of sending a mail? For example, by
loading the lib dynamically?
or sending the mail without the lib?
Thanks in advance.
You can fetch libraries dynamically with the @Grab annotation, but you have to ensure that your script will be able to load the lib from a repository.
http://www.groovy-lang.org/Grape
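A minimal sketch of that approach (the SMTP host and addresses are placeholders, and it assumes the machines can reach Maven Central or an internal mirror configured for Grape):

// Sketch: fetch JavaMail via Grape at runtime; host/addresses are placeholders.
@Grab('javax.mail:mail:1.4.7')
import javax.mail.*
import javax.mail.internet.*

def props = new Properties()
props.put('mail.smtp.host', 'smtp.example.com')
def session = Session.getInstance(props)

def message = new MimeMessage(session)
message.from = new InternetAddress('scripts@example.com')
message.setRecipients(Message.RecipientType.TO, 'someone@example.com')
message.subject = 'Notification from script'
message.text = 'One of the configured conditions was met.'
Transport.send(message)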

Running IPython Notebook viewer locally

I'm trying to introduce the IPython Notebook at work. One way I want to do that is by sharing my own work as notebooks with my colleagues, so they can see how easy it is to create sophisticated reports and share them.
I obviously can't use the public Notebook Viewer, since most of our work is confidential. I'm trying to set up a notebook viewer locally. I read this question and followed the instructions there, but now that nbconvert is part of IPython, those instructions are no longer valid.
Can anybody help with that?
You have a couple of options:
1. As described above, convert the notebooks to HTML and then serve them using a simple server, e.g. python -m "SimpleHTTPServer". You can even set up a little Python script that watches a directory: whenever a notebook is added or changed, the script runs nbconvert and moves the HTML file to the folder you are serving from (see the sketch after this list). To reach the server, browse to yourip:port, e.g. 10.0.0.2:8888 (see the output when you run the ipython notebook command). (If you can serve over the network, you might just as well look into point 2 below.)
2. If your computers are networked, you can serve your work over the LAN by sharing your IP address and port with your colleagues. This will, however, give them editing access, which may or may not be a problem. They will navigate to your IPython server, see the notebooks, and be able to run your files.
3. Host your notebooks on an online server such as Linode; entry-level servers are cheap. Some work is needed to add a password, though.
4. Convert to PDF and mail it to them.
5. Convert to a slideshow (possible since version 1.00) and serve it via option 1 or 2, or just share the HTML file with them.
6. Let them all run ipython notebook themselves and check your files into a private repo at Bitbucket (it offers free private Git repos). They can then get your files there and run them on their own machines, or you can just mail the files to them. Better yet, if they won't make changes, share a Dropbox folder with everyone; if they run ipython notebook in that folder, they will see your files (dangerous, though).
7. Get them in a boardroom and show them. :)
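A rough sketch of the helper from option 1, simplified to a one-shot convert-then-serve (directory names and the port are placeholders; this uses the current jupyter nbconvert invocation, so on an old IPython install substitute ipython nbconvert):

# Sketch: convert every notebook in NOTEBOOK_DIR to HTML, then serve the
# results over HTTP. Directory names and the port are placeholders.
import http.server
import os
import subprocess

NOTEBOOK_DIR = 'notebooks'   # placeholder: where the .ipynb files live
SERVE_DIR = 'html'           # placeholder: where the HTML output goes
PORT = 8000

os.makedirs(SERVE_DIR, exist_ok=True)
for name in os.listdir(NOTEBOOK_DIR):
    if name.endswith('.ipynb'):
        subprocess.check_call([
            'jupyter', 'nbconvert', '--to', 'html',
            os.path.join(NOTEBOOK_DIR, name),
            '--output-dir', SERVE_DIR,
        ])

# Serve SERVE_DIR; colleagues browse to http://yourip:8000
os.chdir(SERVE_DIR)
http.server.HTTPServer(('', PORT),
                       http.server.SimpleHTTPRequestHandler).serve_forever()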

TFS 2010: run powershell script stored in source control

We've started using TFS 2010 at the company I work for. We create e-commerce web applications (shopping sites). I'm creating a custom template to deploy web projects after a build.
I've looked at the Web Deploy tool, but MSDN seems to indicate that it can only do initial deployments, and I need to be able to do incremental deployments with the same script.
I'm thinking of using the InvokeProcess activity in the template to let PowerShell do the job: an FTP script would copy the build output to a designated FTP site and then run the SQL (upgrade) scripts if needed, via SSH or a PowerShell remoting interactive session (possibly specified in a separate SQL script).
There are some unknowns for me which I can't resolve through Google:
When queuing a build, will the user be able to specify a script present in source control (e.g. $(source)\scripts\ftpscript.ps1) as the script to be used? Will PowerShell be able to access/use that file, or should I copy it to the build directory and specify it when I run it? (I don't know how to set up the template to get files from source control, so a pointer to some helpful info on how to do that would be very much appreciated.)
If the previous just doesn't work at all, could I create a \scripts\ folder in my website project, commit it to source control, and then use BuildDetail.DropLocationRoot & "\scripts\" as the location for the script, forcing a copy of the script files by enabling the force copy option?
To run a PowerShell script I think you can use the InvokeProcess activity, which would trigger something like this:
%windir%\system32\windowspowershell\v1.0\powershell.exe "$(SolutionRoot)\test.ps1"
And yes, you can reach a script file present in source control using the "SourcesDirectory" keyword.

What would be the best way to use jammit and publish files on amazon S3?

I'm using Jammit to package the JS and CSS files for a Rails project.
I would now like to upload the files to Amazon S3 and use CloudFront for delivery.
What would be the best way to deal with new versions?
My ideal solution would be a Capistrano recipe to deal with it.
Has anyone already done something like that?
You could simply create a Capistrano task that triggers the copy to S3 after deploying.
You might use s3cmd as the command-line tool for that; see the sketch below.
Alternatively, you could create a folder mounted via FuseOverAmazon and configure it as the package_path in your Jammit assets.yml. Make sure to run the rake task that generates the asset packages manually or in your deploy recipe.
http://s3tools.org/s3cmd
http://code.google.com/p/s3fs/wiki/FuseOverAmazon