Unable to run experiment on Azure ML Studio after copying from a different workspace - PowerShell

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. The experiment runs fine on the original workspace (let's call it Workspace1).
Now I need to move this experiment as-is to another workspace (call it Workspace2) using PowerShell and be able to run it there.
I am currently using this library - https://github.com/hning86/azuremlps
Problem:
When I copy the experiment using 'Copy-AmlExperiment' from Workspace1 to Workspace2, the experiment and all its properties get copied except the Azure Table account key.
The copied experiment runs fine if I manually enter the account key for the Import/Export modules on studio.azureml.net,
but I am unable to do this via PowerShell. If I export (Export-AmlExperimentGraph) the copied experiment from Workspace2 as JSON, insert the account key into the JSON file, and import it (Import-AmlExperiment) back into Workspace2, the experiment fails to run.
In PowerShell I get an "Internal Server Error: 500".
When running it on studio.azureml.net, I get the notification "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there any way to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem is something to do with how the experiment handles the account key. When I enter it manually, it is converted into a JSON array consisting of RecordKey and IndexInRecord. But when I upload the JSON experiment with the account key inserted, it remains unchanged and does not get resolved into RecordKey and IndexInRecord.

For me, publishing the experiment as a private experiment to the Cortana Intelligence Gallery is one of the most useful options. Only people with the link can see the experiment and add it to their workspace from the gallery. In the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/

When the experiment is copied, the password is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that what you are setting is a plain-text password, not an encrypted one. If you export the experiment in JSON format, you can easily figure this out.
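A minimal PowerShell sketch of that round trip, using the azuremlps cmdlets already mentioned in the question. The cmdlet parameter names, the shape of the exported graph, and the IsPlainTextCredential flag name are assumptions on my part - check Get-Help and your own exported JSON for the real ones:

# Sketch only: cmdlet parameters and JSON property names below are assumptions.
Export-AmlExperimentGraph -ExperimentId $expId -OutputFile 'exp.json'
$graph = Get-Content 'exp.json' -Raw | ConvertFrom-Json
# Walk the module parameters and patch the Account Key entry (hypothetical shape).
foreach ($module in $graph.Graph.ModuleNodes) {
    foreach ($param in $module.ParameterValues) {
        if ($param.Name -eq 'Account Key') {
            $param.Value = $newAccountKey   # inject the key in plain text
            # Hypothetical metadata flag meaning "this is a plain-text password":
            $param | Add-Member -NotePropertyName IsPlainTextCredential -NotePropertyValue $true -Force
        }
    }
}
$graph | ConvertTo-Json -Depth 32 | Set-Content 'exp.json'
Import-AmlExperiment -InputFile 'exp.json'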

I think I found the issue behind why you are unable to get the credentials back in.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will also notice that the credentials are stored as 'Placeholders' instead of 'Literals', so it makes sense to change them to Literals.
You can do this by traversing the JSON to find the relevant parameters you need to update.
Here is a brief illustration of changing the Placeholder to a Literal.
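For example, inside the same kind of loop over the exported graph as in the sketch further above, the change amounts to something like this (ValueType, 'Placeholder' and 'Literal' are illustrative names - match them against the JSON you actually exported):

# Illustrative only: flip the credential parameter from a placeholder reference to a literal value.
if ($param.Name -eq 'Account Key' -and $param.ValueType -eq 'Placeholder') {
    $param.ValueType = 'Literal'      # was: 'Placeholder'
    $param.Value     = $accountKey    # the literal key to inject
}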

Azure DevOps - Can we reuse the value of a key in the same variable group?

I have lots of URL values and their keys, but there is no way to batch-import the variables, and the "value" controls on the Variable Groups page are not plain text boxes, so browser-extension-assisted find and replace in Chrome doesn't work either.
If this is possible, what is the syntax to refer to another key?
As in, I have a variable App.URL : www.contoso.com.
I am using the key to substitute the value into my next variable, like this: Login.URL : $(App.URL)\Login, and this doesn't work.
GitHub link: https://github.com/MicrosoftDocs/vsts-docs/issues/3902#issuecomment-489694654
This isn't currently available, and I'm not sure if it ever will be. Can you create a task early in your pipeline that sets the variables you need in subsequent tasks/steps? This gives you more control, as you can store the script along with your source. You could then use a pipeline variable for the environment you're in and let your script use that to set values appropriately.
See Set variables in scripts in the MS docs.
If it's not possible to re-architect your app to concatenate the URL strings in the application, then, as the previous commenter said, creating a simple script to do that for you would be the way to go, i.e.:
#!/bin/bash
# Build the full login URL. Azure DevOps exposes the pipeline variable App.URL
# to scripts as the environment variable APP_URL (dots become underscores);
# LOGINSUFFIX is assumed to hold the "\Login" part.
fullLoginUrl="${APP_URL}${LOGINSUFFIX}"
# Make it available to subsequent steps as Login.URL
echo "##vso[task.setvariable variable=Login.URL]$fullLoginUrl"
Otherwise, playing around with runtime vs. compile-time variables in YAML pipelines might be worth trying: macro syntax such as $(App.URL) is expanded at run time, while template expressions such as ${{ variables['App.URL'] }} are expanded when the pipeline is compiled.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#understand-variable-syntax
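If your agents run Windows, the same logging command can be emitted from a PowerShell step instead of bash; a rough equivalent of the snippet above (the LOGINSUFFIX variable is assumed, as before):

# Azure DevOps maps the pipeline variable App.URL to the environment variable APP_URL.
$fullLoginUrl = "$env:APP_URL$env:LOGINSUFFIX"
# Expose it to subsequent steps as Login.URL
Write-Host "##vso[task.setvariable variable=Login.URL]$fullLoginUrl"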

B&R Automation Studio transfer post event

Is there any way to execute a post-transfer event when transferring a project to a PLC?
I want to automatically change the value of a variable, e.g. via the PVI interface, every time I do a transfer.
I am not entirely sure what the use case is for this. However, the easiest way to get some kind of post-transfer script would be to use the Runtime Utility Center (RUC).
In the RUC, you can define instruction lists for a B&R PLC with an online connection. This includes instructions for transferring projects and setting values of process variables (PVs).
To transfer the project with the RUC, you need to create a RUC package. This can be done under Settings / Export to Runtime Utility Center; you can also do this from the command line. More details are in the help under Project management / Project installation / Performing project installation / Export RUC (help topic GUID: cfe34190-f436-4c14-b06d-3a4ca39be7e7).
This will create a zip, which you can then use in the RUC. For the transfer command there is a wizard, which opens when you double-click the command Transfer to target under Project installation. The result is a line in the instruction list, which might look like this:
Transfer "C:\path\to\your\zip\project.zip", "InstallMode=Consistent InstallRestriction=AllowUpdatesWithoutDataLoss KeepPVValues=1 ExecuteInitExit=1"
After transferring, you can write your PV. Under Process variable functions in the RUC you can find the command Write process variable. Here, too, there is a wizard, and the result looks like this:
WriteVariable "taskname\VariableName", "USINT", "2"
I am using AS 4.4.6; there might be slight differences when using another version.

Recovering data from Firebird database partially-encrypted by ransomware

Our test server was hacked and ransomware (Cry36) was installed, for which there is no solution to date. We also didn't keep any snapshots up to date (lesson learned).
Since it's only a test server, I am not too worried, but we had a bunch of work stored in our Firebird DB (v2.5) which I would like to save.
Looking at the database in a hex editor, I can see that the data is encrypted up until offset 00006430.
Looking at the structure of the Firebird database, this means that all the headers are encrypted (header page, PIP, ..., data page).
All the data is still there.
I've tried gfix and even copying the headers from an older version of the DB, but while that does fix the DB, the headers are wrong and most of the newer pages are removed.
Does anyone have any idea how to restore the database or extract the tables?
Regards
I have used the following method to restore files encrypted on hard drives by ransomware: renaming the file in question back to its original filename and extension. You may be able to apply the same method to revert your database file(s) back to the pre-encrypted version.
From my testing:
The ransomed file is compressed and/or simply renamed; the encryption is either not actually applied (only implied), or a containing/renamed copy is encrypted while the original file is never touched. Simply rename the file back to its original name and you can access it as you could before the attack. Example:
This is the Ransomed file:
Adobe Acrobat XI Pro 11.0.20.zip.id[42AF04FF-2275].[supportcrypt2019#cock.li].Adame
This is the Ransomed file, renamed and fixed:
Adobe Acrobat XI Pro 11.0.20.zip
The removed portion of the FileName is:
.id[42AF04FF-2275].[supportcrypt2019#cock.li].Adame
When renaming the file, you will be prompted for approval to change the file type, i.e. which application will open it (back to its original association, as determined by the extension that follows the filename). The reason the file doesn't work while ransomed is the final extension added by the renaming scheme: in this case .Adame is not a real file type, just a made-up one, and no program will or can open it. Thus, the file cannot be opened as named.
You would need to do this for each file individually. Could you post more information on the database file and the encryption details? This should work for you as well, since the ransom methodology should be the same. I cannot identify the naming scheme used on your system without more information about any unusual or unidentified additions made to the files on your instance.
For renaming multiple files, you could try an application such as "Advanced Renamer" for bulk processing.
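A minimal PowerShell sketch of that bulk rename, using the suffix pattern from the example above (the folder path is made up; adjust the pattern to whatever suffix your ransomware appended, and work on copies first):

# Strip the appended ".id[...].[...].Adame" suffix from every matching file,
# restoring the original name and extension. This does not decrypt anything.
Get-ChildItem -Path 'D:\recovered' -File -Recurse |
    Where-Object { $_.Name -match '\.id\[[^\]]+\]\.\[[^\]]+\]\.Adame$' } |
    ForEach-Object {
        $newName = $_.Name -replace '\.id\[[^\]]+\]\.\[[^\]]+\]\.Adame$', ''
        Rename-Item -Path $_.FullName -NewName $newName
    }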

Jenkins - How can I upload a text file and use it as a parameter

I have a txt file that holds a string. I want to be able to use this string in one of my scripts, so I'm wondering if there is a way to set the content of the file as one of the build properties or parameters, which I could then use in my scripts the same way as one of the build environment properties.
For example: ${JOB_NAME} holds the job name; in the same way, I want to access the content of the file, which holds some value.
Is it possible?
You can upload a file from your computer to the workspace through the File parameter of the job.
You can use an Extended Choice plugin parameter to read value(s) from a file and display them as a dropdown/radio-button/checkbox for the user to select dynamically every time the build is triggered.
You can use the EnvInject plugin to read value(s) from a file and inject them into the build as environment variables, so that they can be used by the rest of the build steps/scripts.
Your question is very unclear about what you are trying to do. Pick one of the three methods above based on what you need, or clarify your question.
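For instance, combining the File parameter with the EnvInject suggestion above, a Windows PowerShell build step could read the uploaded file and write a properties file for EnvInject to pick up (the file name input.txt and the variable name MY_VALUE are made up for illustration):

# Read the file uploaded via the job's File parameter (assumed to land in the workspace as input.txt)
# and turn its content into a properties file that EnvInject can inject as environment variables.
$value = (Get-Content "$env:WORKSPACE\input.txt" -Raw).Trim()
"MY_VALUE=$value" | Set-Content "$env:WORKSPACE\injected.properties"
# Configure the "Inject environment variables" build step to read injected.properties;
# later build steps can then use MY_VALUE just like ${JOB_NAME}.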

How to write an Enterprise Library dataConfig.config file?

I have 'inherited' a test harness application which uses Enterprise Library for its SQL data access. In the app.config file (enterpriselibrary.configurationSettings), it references a "configurationSection" with a path to "dataConfig.config", which is encrypted. I would like to change the database connection properties, but EntLibConfig.exe will not open the dataConfig.config or app.config (I have the FileKeyAlgorithmPairStorageProviderData file).
The test harness application runs, so it's configured OK.
In code, using Microsoft.Practices.EnterpriseLibrary.Data.Configuration.ConfigurationManager.GetConfiguration("dataConfiguration"), I can read the data configuration and navigate all the instances and connection strings (security isn't an issue for this test harness). I can dump everything I need to a hand-crafted XML file (using GetType().AssemblyQualifiedName to get the full names of the classes that read the config file) and then change the app.config to read my new, unencrypted XML dataConfig file.
All is fine, I can now change my database config settings.
However... given that ConfigurationManager.GetConfiguration("dataConfiguration") returns a fully populated instance of a DatabaseSettings object, is there not a method I can call which will write the XML file (dataConfig.config) for me?
I appreciate that this is probably a really big-hammer way to edit the data configuration, but after half a day of trying, I fell back on the old coding maxim... if you can't find the tool to do what you want, write your own!
Thanks
Well... it turns out that it's not that hard.
I added a new "configurationSection" to my app.config (dataConfiguration2), with encrypt set to false and a path pointing to a new empty text file (dataConfiguration.config2). I then copied my encrypted dataConfiguration details using the following code:
using Entlib = Microsoft.Practices.EnterpriseLibrary.Configuration;
using Microsoft.Practices.EnterpriseLibrary.Data.Configuration;

// ...

// Read the (encrypted) "dataConfiguration" section into a DatabaseSettings object,
// then write it back out through the unencrypted "dataConfiguration2" section.
DatabaseSettings settings = (DatabaseSettings)Entlib.ConfigurationManager.GetConfiguration("dataConfiguration");
Entlib.ConfigurationManager.WriteConfiguration("dataConfiguration2", settings);
...and it filled the empty file with the (unencrypted) configuration details.