How to write an Enterprise Library dataConfig.config file?

I have 'inherited' a test harness application which uses Enterprise Library for its SQL data access. In the app.config file (enterpriselibrary.configurationSettings), it references a "configurationSection" with a path to "dataConfig.config", which is encrypted. I would like to change the database connection properties, but EntLibConfig.exe will not open the dataConfig.config or app.config (I have the FileKeyAlgorithmPairStorageProviderData file).
The test harness application runs, so it's configured OK.
In code, using Microsoft.Practices.EnterpriseLibrary.Data.Configuration.ConfigurationManager.GetConfiguration("dataConfiguration"), I can read the data configuration and navigate all the instances and connection strings (security isn't an issue for this test harness). I can dump everything I need to a hand-crafted XML file (using GetType().AssemblyQualifiedName to get the full names of the classes which read the config file) and then change the app.config to read my new, unencrypted, XML dataConfig file.
All is fine, I can now change my database config settings.
However... given that ConfigurationManager.GetConfiguration("dataConfiguration") returns a fully populated instance of a DatabaseSettings object, is there not a method I can call which will write the XML file (dataConfig.config) for me?
I appreciate that this is probably a really big hammer way to edit the data configuration, but after half a day of trying, I fell back on the old coding maxim... if you can't find the tool to do what you want, write your own!
Thanks

Well... it turns out that it's not that hard.
I added a new "configurationSection" to my app.config (dataConfiguration2), with encrypt set to false and a path pointing to a new, empty text file (dataConfiguration.config2). I then copied my encrypted dataConfiguration details using the following code:
using Entlib = Microsoft.Practices.EnterpriseLibrary.Configuration;
using Microsoft.Practices.EnterpriseLibrary.Data.Configuration;
// ...
// Read the encrypted "dataConfiguration" section into a settings object.
DatabaseSettings settings = (DatabaseSettings)Entlib.ConfigurationManager.GetConfiguration("dataConfiguration");
// Write the same settings back out through the unencrypted "dataConfiguration2" section.
Entlib.ConfigurationManager.WriteConfiguration("dataConfiguration2", settings);
...and it filled the empty file with the (unencrypted) configuration details.
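For reference, the app.config wiring ended up looking roughly like this (reconstructed from memory of the EntLib 1.x configuration schema, so treat the exact type and attribute names as approximate):
<enterpriselibrary.configurationSettings ...>
  <configurationSections>
    <!-- original, encrypted section -->
    <configurationSection name="dataConfiguration" encrypt="true">
      <storageProvider xsi:type="XmlFileStorageProviderData" name="XML File Storage Provider" path="dataConfig.config" />
      <dataTransformer xsi:type="XmlSerializerTransformerData" name="Xml Serializer Transformer" />
    </configurationSection>
    <!-- new, unencrypted copy written by the code above -->
    <configurationSection name="dataConfiguration2" encrypt="false">
      <storageProvider xsi:type="XmlFileStorageProviderData" name="XML File Storage Provider" path="dataConfiguration.config2" />
      <dataTransformer xsi:type="XmlSerializerTransformerData" name="Xml Serializer Transformer" />
    </configurationSection>
  </configurationSections>
</enterpriselibrary.configurationSettings>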

Related

"How to embed resources" or "How to access a Resource"

I am struggling with embedded resources, or resources in general, in Dynamics365. My goal is to add an XML file as a resource to a model and use that resource in some test code.
I tried to add the XML as a resource element, but it seems this does not embed the XML into the compiled DLL, so I don't know how to pick up that XML file in my test code. Currently my test code loads the XML from "C:\Temp\test.xml", where I copied my XML to, but that's not a viable solution, and I thought adding the XML as a resource would be OK. Or is there a better approach to this scenario?
You can use the SysResource class to interact with resources. I used the following code in one of my unit tests to load the content of a file resource into a file and create a CommaStreamIo instance from that file. You should be able to modify it to do your stuff with an XML file.
// Locate the resource node by its name.
ResourceNode textFileResourceNode = SysResource::getResourceNode(resourceStr(MyTextFileResourceName));
// Save the resource content to a temporary file.
str textFilename = SysResource::saveToTempFile(textFileResourceNode);
// Open the temporary file for reading as comma-separated data.
CommaStreamIo commaStreamIo = CommaStreamIo::constructForRead(File::UseFileFromURL(textFilename));
Also take a look at reading a resource into a string.
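For the XML case, a minimal adaptation of the code above (my sketch, untested; MyXmlResource is a hypothetical resource name) could save the resource to a temp file and read it back into a string:
// Locate the XML resource and save it to a temporary file.
ResourceNode xmlResourceNode = SysResource::getResourceNode(resourceStr(MyXmlResource));
str xmlFilename = File::UseFileFromURL(SysResource::saveToTempFile(xmlResourceNode));
// Read the whole file into a string via .NET interop.
str xmlContent = System.IO.File::ReadAllText(xmlFilename);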
You could also take a look at how some of the standard resources are used. For example, there are several .xslt file resources that are used to transform bank statement formats.

Is there a way to know filenames generated with MultiResourceItemWriter?

I'm writing a spring-batch application with spring-boot support and I'm looking for a way to know which files were generated by MultiResourceItemWriter. The first solution I have in mind is to have a folder containing only the generated files and check its contents, but if there is something already implemented in spring-batch, that would be great!
The intention is to encrypt and then upload each file to an sftp server.
The file names generated by the MultiResourceItemWriter are the combination of the resource name + the suffix created by the ResourceSuffixCreator. For example, if you create the writer like the following:
MultiResourceItemWriter<String> writer = new MultiResourceItemWriter<>();
writer.setResource(new FileSystemResource(new File("data.txt")));
// The suffix creator must include the separator itself: ".part1", ".part2", ...
writer.setResourceSuffixCreator(index -> ".part" + index);
Then the generated files will be data.txt.part1, data.txt.part2, etc.
MultiResourceItemWriter doesn't perform writes directly but delegates that job to other components.
All those components implement ResourceAwareItemWriterItemStream, so you can write your own delegating ResourceAwareItemWriterItemStream, intercept the setResource() method, and store each resource in the current step's execution context as a collection; see the sketch below.
If you want to pass this list of resources to the next steps, you can use an ExecutionContextPromotionListener.
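A minimal sketch of such a delegating writer (my own illustration, not a class shipped with the framework; the name ResourceTrackingItemWriter is made up, and it assumes the Spring Batch 4 List-based write signature):
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ExecutionContext;
import org.springframework.batch.item.ItemStreamException;
import org.springframework.batch.item.file.ResourceAwareItemWriterItemStream;
import org.springframework.core.io.Resource;

public class ResourceTrackingItemWriter<T> implements ResourceAwareItemWriterItemStream<T> {

    private final ResourceAwareItemWriterItemStream<T> delegate;
    private final ArrayList<String> writtenFiles = new ArrayList<>();

    public ResourceTrackingItemWriter(ResourceAwareItemWriterItemStream<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void setResource(Resource resource) {
        // MultiResourceItemWriter calls this each time it rolls over to a new file,
        // so record the file name before handing the resource to the real writer.
        try {
            writtenFiles.add(resource.getFile().getAbsolutePath());
        } catch (IOException e) {
            throw new ItemStreamException("Could not resolve resource file name", e);
        }
        delegate.setResource(resource);
    }

    @Override
    public void open(ExecutionContext executionContext) {
        delegate.open(executionContext);
    }

    @Override
    public void update(ExecutionContext executionContext) {
        // Expose the collected file names in the step execution context so a later
        // step (encrypt/upload) can read them, e.g. after promotion by an
        // ExecutionContextPromotionListener.
        executionContext.put("writtenFiles", writtenFiles);
        delegate.update(executionContext);
    }

    @Override
    public void close() {
        delegate.close();
    }

    @Override
    public void write(List<? extends T> items) throws Exception {
        delegate.write(items);
    }
}
You would then wrap the real FlatFileItemWriter in this tracker, pass the tracker to multiResourceItemWriter.setDelegate(...), and after the step completes read the "writtenFiles" entry from the step execution context.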

Unable to run experiment on Azure ML Studio after copying from different workspace

My simple experiment reads from an Azure Storage Table, selects a few columns, and writes to another Azure Storage Table. This experiment runs fine in the first workspace (let's call it WorkSpace1).
Now I need to move this experiment as-is to another workspace (call it WorkSpace2) using PowerShell and need to be able to run the experiment.
I am currently using this Library - https://github.com/hning86/azuremlps
Problem :
When I copy the experiment using 'Copy-AmlExperiment' from WorkSpace1 to WorkSpace2, the experiment and all its properties get copied except the Azure Table Account Key.
Now, this experiment runs fine if I manually enter the Account Key for the Import/Export modules on studio.azureml.net.
But I am unable to do this via PowerShell. If I export (Export-AmlExperimentGraph) the copied experiment from WorkSpace2 as JSON, insert the AccountKey into the JSON file, and import (Import-AmlExperiment) it back into WorkSpace2, the experiment fails to run.
In PowerShell I get an "Internal Server Error : 500".
When running on studio.azureml.net, I get the notification "Your experiment cannot be run because it has been updated in another session. Please re-open this experiment to see the latest version."
Is there any way to move an experiment with external dependencies to another workspace and run it?
Edit: I think the problem is something to do with how the experiment handles the AccountKey. When I enter it manually, it's converted into a JSON array comprising RecordKey and IndexInRecord. But when I upload the JSON experiment with the AccountKey, it remains unchanged and does not get resolved into RecordKey and IndexInRecord.
For me, publishing the experiment as a private experiment in the Cortana Gallery is one of the most useful options. Only people with the link can see and add the experiment from the gallery. At the link below I've explained the steps I followed.
https://naadispeaks.wordpress.com/2017/08/14/copying-migrating-azureml-experiments/
When the experiment is copied, the pwd is wiped for security reasons. If you want to programmatically inject it back, you have to set another metadata field to signal that this is a plain-text password, not an encrypted password that you are setting. If you export the experiment in JSON format, you can easily figure this out.
I think I found the reason why you are unable to import the credentials back.
Export the JSON graph to your local disk, then update whatever parameter has to be updated.
You will notice that the credentials are stored as 'Placeholders' instead of 'Literals', so it makes sense to change them to Literals.
You can do this by traversing the JSON to find the relevant parameters you need to update.
Here is a brief illustration of changing the Placeholder to a Literal.
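Schematically, the edit in the exported JSON looks something like this (an illustrative fragment only; the exact property names in your exported graph may differ):
Before (the credential parameter is exported as a placeholder):
  "Account Key": { "ValueType": "Placeholder", "Value": "" }
After (changed to a plain-text literal so the import can resolve it):
  "Account Key": { "ValueType": "Literal", "Value": "<your-account-key>" }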

Recovering data from Firebird database partially-encrypted by ransomware

Our test server was hacked and ransomware (Cry36) was installed, for which there is no solution to date. We also didn't keep any snapshots up to date (lesson learned).
Since it's only a test server, I am not too worried. But we had stored a bunch of work in our Firebird DB (v2.5) which I would like to save.
Looking at the database in a hex editor, I can see that the data is encrypted up until offset 00006430.
Looking at the structure of the Firebird database, it appears that all the headers are encrypted (header page, PIP, ..., data page).
All the data is still there.
I've tried gfix and even copying the headers from an older version of the DB. But while that does fix the DB, the headers are wrong and most of the new pages are removed.
Does anyone have any idea how to restore the database or extract the tables?
Regards
I have used this method to restore files encrypted on hard drives by ransomware that works by renaming: rename the file in question back to its original filename and extension. You may be able to apply the same method to revert the database file back to its pre-encrypted version.
From my testing:
The ransomed file is compressed and/or simply renamed; either the encryption is not actually applied but only implied, or a containing/renamed copy is encrypted while the original file is never touched. Simply rename it back to the original and you can access the file as you could before the attack. Example:
This is the Ransomed file:
Adobe Acrobat XI Pro 11.0.20.zip.id[42AF04FF-2275].[supportcrypt2019#cock.li].Adame
This is the Ransomed file, renamed and fixed:
Adobe Acrobat XI Pro 11.0.20.zip
The removed portion of the FileName is:
.id[42AF04FF-2275].[supportcrypt2019#cock.li].Adame
Upon renaming the file, you will be prompted for approval to change the file type back to its original state, which determines what application will open it (its original designation, as determined by the extension after the file name). The reason the file doesn't work when ransomed is the final extension in the renaming scheme: in this case .Adame is not a real file type but a made-up one, and no program can open it. Thus, the file cannot be opened as named.
You would need to do this for each file individually. Could you post more information on the database file and the encryption? This should work for you as well, as the ransom methodology should be the same; I cannot identify the naming scheme used on your system without more information about any unusual or unidentified additions to your files.
For renaming multiple files, you could try an application such as "Advanced Renamer" for bulk processing.
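Alternatively, a small script can strip the appended suffix in bulk. A minimal sketch (my own, not from any recovery tool; the folder path is hypothetical, and the pattern matches the ".id[...].[...].Adame" tail shown above, so adjust it to your variant):
import re
from pathlib import Path

# Matches the ransomware-appended tail, e.g. ".id[42AF04FF-2275].[...].Adame"
SUFFIX = re.compile(r"\.id\[[^\]]+\]\.\[[^\]]+\]\.Adame$", re.IGNORECASE)

for path in Path(r"C:\ransomed").iterdir():
    new_name = SUFFIX.sub("", path.name)
    if new_name != path.name:
        # Rename the file back to its original name and extension.
        path.rename(path.with_name(new_name))
        print(f"{path.name} -> {new_name}")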

Error showing at tAccessInput component

While using the tAccessInput component in my job, it shows an error like:
"It may not be a database that your application recognizes, or the file may be corrupt."
But all the connections and database/table names in my job are valid.
What can be the problem, and how can I resolve it?
Mentioning the exact DB Version is important for tAccessInput/tAccessOutput, e.g. Access 2003 or Access 2007, along with the associated database file name.