Biml Book Chapter 3, Unable to Generate SSIS Packages from BIML via BimlExpress 2018

I am going through The BIML Book.
I'm in chapter 3, page 90 (based on my PDF, purchased from Apress).
I'm using SSDT/Visual Studio 2013 (Target SSIS Version: 2014)
BIML Express 2018
I've got 3 BIML files that I've manually troubleshot:
1-2-CreateEnvironment.biml
1-2-CreateBimlTableObject.biml
x-2-CreateLoadPackage.biml
I've already done build 1 to generate the DeployTable.dtsx file (which I subsequently executed).
I'm trying to get all of the staging load packages described at the bottom of page 90, and in figure 3-34.
The issue is that the packages never get generated. They never show up in my solution.
The BIML Compiler output window shows:
Expanding Biml
Biml expansion completed.
There are no errors in VS.
I've tried running VS as admin (I thought it might be a permissions issue writing to disk).
I've added an additional BIML file for logging the BIML Compiler details to a file.
That BIML is as follows:
<#@ template tier="1" #>
<#@ import namespace="Varigence.Utility.Logging" #>
<#
var loggingManager = new LoggingManager(Logging.LoggingMode.File) {
    IsEnabled = true, LogFilePath = @"C:\temp\log.txt" };
LoggingManager.RegisterDefaultLoggingManager(loggingManager);
#>
Just wondering what I'm doing wrong, if anything. Do I need to use VS 2017?

Finally! Starting to grok BIML and its tools. It's a different paradigm, IMO.
I was able to resolve the issue. It had to do with an invalid connection string parameter. Classic "extra space/missing space" issue.
The best thing from this, apart from getting it to work, was that I was able to use IntelliSense in the BIML file to emit results in the BIML Preview pane, showing that the collection of table nodes returned by Connection.GetDatabaseSchema() had a Count() of 0 (see the line starting with <!-- below).
<#@ template tier="20" #>
<#@ import namespace="Varigence.Biml.CoreLowerer.SchemaManagement" #>
<#@ code file="DebuggerUtilities.cs" #>
<#
var sourceConnection = RootNode.OleDbConnections["Source"];
var includedSchemas = new List<string> { "HumanResources", "Person", "Production", "Purchasing", "Sales" };
var importResult = sourceConnection.GetDatabaseSchema(includedSchemas, null, ImportOptions.ExcludeForeignKey | ImportOptions.ExcludeColumnDefault | ImportOptions.ExcludeViews);
#>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Tables>
        <!--<#=importResult.TableNodes.Count()#>-->
        <# foreach (var table in importResult.TableNodes) { #>
        <Table Name="<#=table.Name#>" SchemaName="Staging.<#=table.Schema#>">
            <Columns>
                <# foreach (var column in table.Columns) { #>
                <#=column.GetBiml()#>
                <# } #>
                <Column Name="LoadDateTime" DataType="DateTime2"/>
            </Columns>
        </Table>
        <# } #>
    </Tables>
</Biml>
So the preview was emitting:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Tables>
        <!--0-->
    </Tables>
</Biml>
Which led me back to the Connection.
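For anyone hitting the same symptom, the connection definition in question looks something like this (a sketch; the provider, server, and catalog values are illustrative, not my actual setup). In my case a malformed keyword in the ConnectionString produced no compiler error at all, just an empty schema import:
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <!-- An extra or missing space in a keyword here was enough to make -->
        <!-- GetDatabaseSchema() silently return zero tables. -->
        <OleDbConnection Name="Source"
            ConnectionString="Provider=SQLNCLI11;Data Source=localhost;Initial Catalog=AdventureWorks2014;Integrated Security=SSPI;" />
    </Connections>
</Biml>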

Related

Pester's XML report contains only the file name, not full path

I'm new to Pester; I was testing things locally when I ran into this problem. When I generate the XML report in Pester, it contains this line:
<package name="code"><class name="code/cal" sourcefilename="cal.ps1">
but when I generate the HTML file using ReportGenerator, it reports "file does not exist (any more)". When I changed the above snippet as follows, it started giving the result without any error:
<package name="code"><class name="code/cal" sourcefilename=".\code\cal.ps1">
I have the below directory structure:
root
|_ code #contains all the main scripts
|_ test #contains all the test scripts
I even tried adding this code in the AfterAll block, but the XML file gets generated only after the run finishes, so it gives a file-not-found error:
AfterAll {
    $script_name = "cal.ps1"
    $script_path = ".\code\"
    # prepend the relative path to the bare file name in the report
    (Get-Content "coverage.xml").Replace($script_name, ($script_path + $script_name)) | Set-Content "coverage.xml"
}
So, is there any way to make that change automatically?
Thanks in advance
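One possible workaround (a sketch, assuming Pester v4, where coverage.xml is fully written by the time Invoke-Pester returns) is to do the rewrite in the runner script that calls Invoke-Pester, rather than in AfterAll:
# run-tests.ps1 (sketch): generate the coverage report, then patch the paths.
# File and folder names follow the layout above.
Invoke-Pester -Script .\test `
    -CodeCoverage .\code\cal.ps1 `
    -CodeCoverageOutputFile .\coverage.xml
# coverage.xml is complete at this point, so the rewrite cannot hit a missing file
(Get-Content .\coverage.xml).Replace(
    'sourcefilename="cal.ps1"',
    'sourcefilename=".\code\cal.ps1"') | Set-Content .\coverage.xml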

Publish SSRS by Octopus

I'm building the setup to deploy my SSRS reports through Octopus Deploy. I found an Octopus Library template and I'm working with it, but I've had some issues:
1. Error message (the path is alright, but it keeps giving the same warning):
WARNING: Unable to find datasource SalesDrivers in /Sales Drivers/Data Sources
2. The method doesn't exist:
Method invocation failed because [Microsoft.PowerShell.Commands.NewWebserviceProxy.AutogeneratedTypes.WebServiceProxy3er_ReportService2005_asmx_wsdl.ReportingService2005] doesn't contain a method named 'LoadReportDefinition'.
The PowerShell function from the template/library that is throwing the error can be seen below:
#region Update-ReportParameters()
Function Update-ReportParameters($ReportFile)
{
    # declare local variables
    $ReportParameters = @();
    # necessary so that when attempting to use the report execution service, it doesn't puke on you when it can't find the data source
    $ReportData = (Remove-SharedReferences -ReportFile $ReportFile)
    # get just the report name
    $ReportName = $ReportFile.SubString($ReportFile.LastIndexOf("\") + 1)
    $ReportName = $ReportName.SubString(0, $ReportName.IndexOf("."))
    # create warnings object
    $ReportExecutionWarnings = $null
    # load the report definition
    Write-Host "*********************************************"
    #Write-Host $ReportData
    #(Remove-SharedReferences -ReportFile $ReportFile)
    #Write-Host $ReportExecutionWarnings
    $ExecutionInfo = $ReportExecutionProxy.LoadReportDefinition($ReportData, [ref] $ReportExecutionWarnings);
    # loop through the report execution parameters
    foreach ($Parameter in $ExecutionInfo.Parameters)
    {
        # create new item parameter object
        $ItemParameter = New-Object "$ReportServerProxyNamespace.ItemParameter";
        # fill in the properties except valid values, that one needs special processing
        Copy-ObjectProperties -SourceObject $Parameter -TargetObject $ItemParameter;
        # fill in the valid values
        $ItemParameter.ValidValues = Convert-ValidValues -SourceValidValues $Parameter.ValidValues;
        # add to list
        $ReportParameters += $ItemParameter;
    }
    # force the parameters to update
    Write-Host "Updating report parameters for $ReportFolder/$ReportName"
    if ($IsReportService2005) {
        $ReportServerProxy.SetReportParameters("$ReportFolder/$ReportName", $ReportParameters);
    }
    elseif ($IsReportService2010) {
        $ReportServerProxy.SetItemParameters("$ReportFolder/$ReportName", $ReportParameters);
    }
    else { Write-Warning 'Report Service Unknown in Update-ReportParameters method. Use ReportService2005 or ReportService2010.' }
}
Does anyone know how I could sort it out?
I have solved a similar problem but took a slightly different approach. Rather than using PowerShell and Octopus directly, I used the useful open-source tool RSBuild to deploy the reports. It is pretty easy to bundle the rsbuild.exe executable (it is tiny) and a deploy.config along with your reports inside the Octopus package. Then you can use Octopus's substitution feature to rewrite the config file, and a PowerShell function to execute the executable. This also has the advantage that you can deploy easily without Octopus: the config for data sources and reports is declarative in XML rather than procedural in PowerShell, and the smarts of your scripted deployment live alongside your reports rather than buried in Octopus.
So my config looks a bit like:
<?xml version="1.0" encoding="utf-8" ?>
<Settings>
    <Globals>
        <Global Name="CollapsedHeight">0.5in</Global>
    </Globals>
    <ReportServers>
        <ReportServer Name="RS1" Protocol="http" Host="${ReportServer}" Path="${ReportServerPath}" Timeout="30" />
    </ReportServers>
    <DataSources>
        <DataSource Name="Source1" Publish="true" Overwrite="true" TargetFolder="Data Sources" ReportServer="RS1">
            <ConnectionString>data source=${ReportServer};initial catalog=${DatabaseName}</ConnectionString>
            <CredentialRetrieval>Store</CredentialRetrieval>
            <WindowsCredentials>False</WindowsCredentials>
            <UserName>${ReportsUser}</UserName>
            <Password>${ReportsPassword}</Password>
        </DataSource>
    </DataSources>
    <Reports>
        <ReportGroup Name="Details" DataSourceName="Source1" TargetFolder="Reports"
                     ReportServer="RS1" CacheTime="10080">
            <Report Name="BusinessReportABC">
                <FilePath>reports\BusinessReportABC.rdl</FilePath>
            </Report>
            <!--More reports here-->
        </ReportGroup>
    </Reports>
</Settings>
My deployed OctoPack'd artefacts contain RSBuild.Core.dll, RSBuild.exe, deploy.config and the report files.
Then I simply call the executable using PowerShell:
PS> rsbuild deploy.config
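The Octopus step itself then only needs a few lines of PowerShell (a sketch; the system variable name assumes a package deployment step has already run and that rsbuild.exe and deploy.config sit in the package root):
# Sketch of the Octopus PowerShell step; substitution has already rewritten deploy.config
$packageDir = $OctopusParameters["Octopus.Action.Package.InstallationDirectoryPath"]
& (Join-Path $packageDir "rsbuild.exe") (Join-Path $packageDir "deploy.config")
if ($LASTEXITCODE -ne 0) { throw "rsbuild failed with exit code $LASTEXITCODE" }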

Entity Framework Power Tools Reverse Engineer Code First into a folder

I'm using Entity Framework Power Tools Reverse Engineer Code First to generate my POCO classes, mapping files, and context from the database. I was able to change the T4 templates to generate a different namespace based on my database schema, but I can't find how to create a folder based on the table's schema and place the related POCO classes in that folder.
Could somebody help?
Thanks
The folders for the model (and the mappings) are hard-coded in the tool. Reverse-engineering EfPowerTools.dll shows the following lines in method ReverseEngineerCodeFirst of ReverseEngineerCodeFirstHandler:
string str3 = str2 + ".Models";
string path1_1 = Path.Combine(directory.FullName, "Models");
string str4 = str3 + ".Mapping";
string path1_2 = Path.Combine(path1_1, "Mapping");
So, too bad, you can't change the name and location of these folders.
I'd have to add another answer, as I have tried the approach suggested in my previous one and that didn't work. I have changed EF Power Tools in order to output files to a different folder or project.
You need to install the following EF Power Tools Extension (https://entityframework.codeplex.com/SourceControl/network/forks/khorvat/EFPowerToolsEx)
Use this code to accomplish the export:
var efHost = (EfTextTemplateHost)Host;
var code = new CodeGenerationTools(this);
var dte = efHost.DTE;
EnvDTE.Project ModelProject = null;
foreach (EnvDTE.Project dteProject in dte.Solution)
{
    if (dteProject.Name.Equals("YourModelProjectName"))
        ModelProject = dteProject;
}
var ModelProjectDirectory = new FileInfo(ModelProject.FullName).Directory;
var ModelProjectNamespace = (string)ModelProject.Properties.Item("RootNamespace").Value;
string ModelNameSpace = ModelProjectNamespace + ".Model";
string outputPath = Path.Combine(ModelProjectDirectory + ModelExportPath + @"Generated\I" + efHost.EntityType.Name + ".cs");
Directory.CreateDirectory(Path.GetDirectoryName(outputPath));
if (ModelProject.DTE.SourceControl.IsItemUnderSCC(outputPath) && !ModelProject.DTE.SourceControl.IsItemCheckedOut(outputPath))
    ModelProject.DTE.SourceControl.CheckOutItem(outputPath);
File.WriteAllText(outputPath, this.GenerationEnvironment.ToString());
ModelProject.ProjectItems.AddFromFile(outputPath);
this.GenerationEnvironment.Clear();
With this you will be able to export output to another file, folder and even a project.
Update
As mentioned in the other answer, this approach won't work, so it no longer applies.
You can try resolving the output path and creating a folder by doing the following:
<#@ import namespace="System.IO" #>
var efHost = (EfTextTemplateHost)Host;
var outputPath = Path.Combine(Path.GetDirectoryName(efHost.TemplateFile), "YourFolder");
if (!Directory.Exists(outputPath))
    Directory.CreateDirectory(outputPath);
Now, to output to a different folder, you can try using the GenerationEnvironment, similar to this:
<#@ dte processor="T4Toolbox.DteProcessor" #>
<#@ TransformationContext processor="T4Toolbox.TransformationContextProcessor" #>
<#@ assembly name="System.Xml" #>
<#@ assembly name="EnvDTE" #>
<#@ import namespace="T4Toolbox" #>
ProcessOutputTemplate template = new ProcessOutputTemplate(this.GenerationEnvironment.ToString());
template.Output.File = outputPath;
template.Render();
this.GenerationEnvironment.Clear();
Note: this approach requires the T4 Toolbox installed in VS 2012/13 - http://www.olegsych.com/t4toolbox/ (http://www.olegsych.com/t4toolbox/gettingstarted/)
I have modified the EFPowerTools extension to support namespace-based directory structure creation and created a pull request on the EF 6.x project at CodePlex. I have also created an experimental branch on GitHub for testing purposes. (There is surely room for fixes/enhancements, which can be added and tested before sending an updated pull request.)
You can download the extension installer with the proposed fix from here (see the install dir in the source).

TFS Command Line not executing when output redirected

Why is the 'Resolve Conflict' window not showing when I redirect the output to a text file, as shown below?
d:\tfstest\tf resolve >myfile.txt
It shows when I choose not to redirect the output, as follows:
d:\tfstest\tf resolve
Why is this happening?
When you use a redirect, it doesn't show UI, because redirects are something you normally do in a script that chains things together. So instead the expectation is that you call tf resolve for each conflict and pass the advanced parameters to tell it how to handle these conflicts (in combination with /noprompt):
Microsoft (R) TF - Team Foundation Version Control Tool, Version 11.0.60315.1
Copyright (c) Microsoft Corporation. All rights reserved.

Resolves conflicts between changed items in your workspace and the
latest or destination versions of items on the server.

tf resolve [itemspec]
           [/auto:(AutoMerge|AutoMergeForced|TakeTheirs|KeepYours|
                   OverwriteLocal|DeleteConflict|KeepYoursRenameTheirs)]
           [/preview] [/recursive] [/newname:path] [/noprompt]
           [(/overridetype:overridetype | /converttotype:converttype)]
           [/properties:(@valuefile|name1=value1[;name2=value2;name3=@valuefile;...])]
           [/login:username,[password]]
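So a redirect-friendly, scripted equivalent looks something like this (a sketch; the itemspec and the /auto choice are placeholders for your situation):
d:\tfstest\tf resolve $/MyProject /recursive /auto:TakeTheirs /noprompt >myfile.txt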

From an MSI, how to get the list of files packed in each feature?

We have used WiX to create our MSIs. Each MSI has 1, 2, or 3 features, such as an Appserver feature, a Webserver feature, and a DB server feature.
Now I've been asked to get the list of config files present in each feature.
It is tough to find the list of web.config files associated with each feature through the .wxs file.
Is it possible to find the list of files associated with a feature using a particular search pattern?
For example: find all the web.config files packed in the Appserver feature.
Is there an easy way (querying, or some other automated script such as PowerShell) to get the list?
WiX comes with a .NET SDK referred to as the DTF ("Deployment Tools Foundation"). It wraps the Windows msi.dll, among other things. You can find these .NET Microsoft.Deployment.*.dll assemblies in the SDK subdirectory of the WiX Toolset installation directory. The documentation is in dtf.chm and dtfapi.chm in the doc subdirectory.
As shown in the documentation, you can use this SDK to write code which queries the msi database with SQL. You will be interested in the Feature, FeatureComponents and File tables.
If you haven't explored the internals of an MSI before, you can open it with Orca to get a feel for it.
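For example, here is a PowerShell sketch using the DTF assembly (the WiX SDK path, MSI file name, and feature name are illustrative and will differ on your machine):
# Sketch: list web.config files belonging to one feature via DTF's SQL interface
Add-Type -Path "C:\Program Files (x86)\WiX Toolset v3.11\SDK\Microsoft.Deployment.WindowsInstaller.dll"
$db = New-Object Microsoft.Deployment.WindowsInstaller.Database("Product.msi")
# Files belong to components, and components are tied to features via FeatureComponents
$files = $db.ExecuteStringQuery(
    "SELECT File.FileName FROM File, FeatureComponents " +
    "WHERE File.Component_ = FeatureComponents.Component_ " +
    "AND FeatureComponents.Feature_ = 'AppserverFeature'")
# FileName can be in short|long form; keep the long name, then filter by pattern
$files | ForEach-Object { ($_ -split '\|')[-1] } | Where-Object { $_ -like 'web.config' }
$db.Close()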
You can do it by making slight modifications to the Get-MsiProperties function described in this PowerShell article.
Please read the original article and create the prescribed comObject.types.ps1xml file.
function global:Get-MsiFeatures {
    PARAM (
        [Parameter(Mandatory=$true,ValueFromPipelineByPropertyName=$true,HelpMessage="MSI Database Filename",ValueFromPipeline=$true)]
        [Alias("Filename","Path","Database","Msi")]
        $msiDbName
    )
    # A quick check to see if the file exists
    if (!(Test-Path $msiDbName)) {
        throw "Could not find " + $msiDbName
    }
    # Create an empty hashtable to store features in
    $msiFeatures = @{}
    # Create the WindowsInstaller COM object and open the MSI database
    $wiObject = New-Object -com WindowsInstaller.Installer
    $wiDatabase = $wiObject.InvokeMethod("OpenDatabase", (Resolve-Path $msiDbName).Path, 0)
    # Open a view over the Feature table
    $view = $wiDatabase.InvokeMethod("OpenView", "SELECT * FROM Feature")
    $view.InvokeMethod("Execute")
    # Loop through the table
    $r = $view.InvokeMethod("Fetch")
    while ($r -ne $null) {
        # Add the feature name and title to the hash table
        $msiFeatures[$r.InvokeParamProperty("StringData",1)] = $r.InvokeParamProperty("StringData",2)
        # Fetch the next row
        $r = $view.InvokeMethod("Fetch")
    }
    $view.InvokeMethod("Close")
    # Return the hash table
    return $msiFeatures
}
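Calling it is then a one-liner (the MSI path is illustrative):
Get-MsiFeatures .\Product.msi
To go from features to files with this same COM approach, change the query to join the File and FeatureComponents tables, as in the DTF sketch in the other answer.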