List Analysis Services Tabular instance tables in PowerShell

I need to list Tabular SSAS (Compatibility version 1500) tables from PowerShell.
The Invoke-ASCmd cmdlet in the SqlServer PowerShell module looks promising; however, I'm a bit lost in the documentation.
I can see that the following query from the examples lists the data sources of a tabular instance:
Invoke-ASCmd -Database:"Adventure Works DW 2008R2" -Query:
"<Discover xmlns='urn:schemas-microsoft-com:xml-analysis'>
<RequestType>DISCOVER_DATASOURCES</RequestType>
<Restrictions></Restrictions><Properties></Properties>
</Discover>"
It looks like the RequestType parameter is what I'm after. I didn't find any documentation on its possible values, so I tried guessing DISCOVER_TABLES, LIST_TABLES and TABLES, all of which were rejected.
TMSL (which is what compatibility level 1500 supports, according to this link) has commands for altering and deleting tables; however, I cannot find anything on querying or listing them.
Dynamic Management Views sound like a possible solution, but I cannot figure out the syntax.
From "Script Administrative Tasks in Analysis Services":
You can create a standalone MDX script file that queries data or system information. For example, Dynamic Management Views (DMV) that expose information about local server operations and server health are accessed via the MDX Select statement.
Found this discussion and tried
Invoke-ASCmd -Server "localhost" -Database:"database" -Query:"SELECT * FROM DBSCHEMA_TABLES"
however I am getting an error:
-1055522771 "Either the user X does not have permission to access the referenced mining model, DBSCHEMA_TABLES, or the object does not exist."

I use this to show all tables in a tabular model database:
<Discover xmlns='urn:schemas-microsoft-com:xml-analysis'>
<RequestType>TMSCHEMA_TABLES</RequestType>
<Restrictions>
<RestrictionList>
<SystemFlags>0</SystemFlags>
</RestrictionList>
</Restrictions>
<Properties>
<PropertyList>
<CATALOG>YOUR_TABULAR_MODEL_DATABASE_NAME</CATALOG>
</PropertyList>
</Properties>
</Discover>
Hope this helps. For full reference see here or here.
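For completeness, here is a sketch of sending that Discover request from PowerShell with Invoke-ASCmd (the server and database names are placeholders, and I haven't verified the exact shape of the returned XML, so treat the final parsing line as an assumption):

```powershell
# Sketch: run the TMSCHEMA_TABLES Discover request via Invoke-ASCmd.
# Replace "localhost" and the catalog name with your own values.
$query = @"
<Discover xmlns='urn:schemas-microsoft-com:xml-analysis'>
  <RequestType>TMSCHEMA_TABLES</RequestType>
  <Restrictions>
    <RestrictionList>
      <SystemFlags>0</SystemFlags>
    </RestrictionList>
  </Restrictions>
  <Properties>
    <PropertyList>
      <CATALOG>YOUR_TABULAR_MODEL_DATABASE_NAME</CATALOG>
    </PropertyList>
  </Properties>
</Discover>
"@

# Invoke-ASCmd returns the XMLA response as a string; cast it to XML
# and pull the Name column out of each returned row (assumed layout).
[xml]$result = Invoke-ASCmd -Server "localhost" -Database "YOUR_TABULAR_MODEL_DATABASE_NAME" -Query $query
$result.return.root.row | Select-Object -ExpandProperty Name
```

As an aside on the DMV attempt in the question: SSAS DMVs are queried through the MDX endpoint with a $SYSTEM schema prefix, e.g. SELECT * FROM $SYSTEM.TMSCHEMA_TABLES. The missing $SYSTEM prefix is likely why the bare DBSCHEMA_TABLES query was rejected.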


Making a determination whether DB/Server is down before I kick-off a pipeline

I want to check whether the database/server is online before I kick off a pipeline. If the database is down, I want to cancel the pipeline processing. I would also like to log the results in a table with the format (columns): DBName, Status, Date.
If a DB/server is down, I want to send an email to the concerned team with a formatted table showing which DBs/servers are down.
Approach:
Run a query on each of the servers. If there is a result, format the output as shown above. I am using an ADF pipeline to achieve this. My issue is how to combine the various outputs from the different servers.
For e.g.
Server1:
DBName: A Status: ONLINE runDate:xx/xx/xxxx
Server2:
DBName: B Status: ONLINE runDate:xx/xx/xxxx
I would like to combine them as follows:
Server DBName Status runDate
1 A ONLINE xx/xx/xxxx
2 B ONLINE xx/xx/xxxx
Use this to update the logging table as well as in the email if I were to send one out.
Is this possible using the Pipeline activities or do I have to use mapping dataflows?
I did similar work a few weeks ago. We built an API holding all the server-related settings or URL endpoints that we need to ping.
You don't need to store a SQL Server username and password at all. When you ping the SQL Server, the connection will time out if it isn't online; if it is online, you will get a credential-related error instead. This way you can easily figure out whether it's up and running.
AFAIK, if you are using Azure DevOps you can use your service account to log into the SQL Server. If you have set up AD to log into DevOps, this can be done in the build script.
Either way, you will be able to tell whether the SQL Server is up and running or not.
You can have all the actions as tasks in a YAML pipeline. You need something like the below:
steps:
- task: Check database status
  register: result
- task: Add results to a file
  shell: "echo text >> filename"
- task: Send e-mail
  when: some condition is met
There are several modules that can achieve what you need; it is a matter of finding the right ones. You can control the flow of tasks by registering results and using the when clause.
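To illustrate the combining step the question asks about, here is a minimal sketch outside ADF (the helper names, the TCP-reachability check, and the port are all hypothetical; inside ADF itself this would typically be an Append Variable activity inside a ForEach):

```python
# Sketch: check per-server reachability and combine the results into
# one table of (Server, DBName, Status, runDate) rows.
import socket
from datetime import date

def check_server(host, port=1433, timeout=3):
    """Return True if a TCP connection to the SQL Server port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def combine_results(results):
    """Turn [(server, dbname, online), ...] into combined table rows."""
    run_date = date.today().isoformat()
    rows = []
    for server, dbname, online in results:
        status = "ONLINE" if online else "OFFLINE"
        rows.append((server, dbname, status, run_date))
    return rows
```

The combined rows can then feed both the logging table and the e-mail body.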

How to insert data into my SQLDB service instance (Bluemix)

I have created an SQLDB service instance and bound it to my application. I have created some tables and need to load data into them. If I write an INSERT statement in Run DDL, I receive a SQL -104 error. How can I run INSERT statements against my SQLDB service instance?
If you need to run your SQL from an application, there are several examples (sample code included) of how to accomplish this at the site listed below:
http://www.ng.bluemix.net/docs/services/SQLDB/index.html#run-a-query-in-java
Additionally, you can execute SQL in the SQL Database Console by navigating to Manage -> Work with Database Objects. More information can be found here:
http://www.ng.bluemix.net/docs/services/SQLDB/index.html#sqldb_005
// s is a java.sql.Statement created from an open java.sql.Connection
s.executeUpdate("CREATE TABLE MYLIBRARY.MYTABLE (NAME VARCHAR(20), ID INTEGER)");
s.executeUpdate("INSERT INTO MYLIBRARY.MYTABLE (NAME, ID) VALUES ('BlueMix', 123)");
Full Code
Most people do initial database population or migrations when they deploy their application. Often these database commands are programming-language specific. The poster didn't include the programming language. You can accomplish this in two ways.
Append a bash script that calls the database scripts you uploaded. This project shows how you can call that bash script from within your manifest file as part of doing a cf push.
Some frameworks offer a file type or service that is automatically used to populate the database on initial deploy or when you migrate/sync the db. For example, Python's Django offers "fixtures" files: JSON files that are automatically loaded to populate your database tables.
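As an illustration of the fixtures approach, a minimal Django fixture file might look like the following (the app, model and field names here are hypothetical; the file would live somewhere like myapp/fixtures/initial_data.json and be loaded with `python manage.py loaddata initial_data`):

```json
[
  {
    "model": "myapp.mytable",
    "pk": 1,
    "fields": {
      "name": "BlueMix"
    }
  }
]
```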

How do I use Puppet's ralsh with resource types provided by modules?

I have installed the postgresql module from Puppetforge.
How can I query PostgreSQL resources using ralsh?
None of the following works:
# ralsh postgresql::db
# ralsh puppetlabs/postgresql::db
# ralsh puppetlabs-postgresql::db
I was hoping to use this to get a list of databases (including attributes such as character sets) and user names/passwords from the current system in a form that I can paste into a puppet manifest to recreate that setup on a different machine.
In principle, any Puppet client gets the current state of your system from another program called Facter. You could create a custom fact (a Facter module) and include it in your Puppet client; afterwards, I think you could call this custom fact from ralsh.
More information about creating a custom fact can be found here.
In creating your own fact, you should execute your SQL query and then save the result into a particular variable.
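To sketch that idea, a custom fact listing PostgreSQL databases might look like the following (the fact name, the query, the use of the older Facter::Util::Resolution API, and the assumption that psql authenticates without a prompt are all assumptions, not a tested implementation):

```ruby
# Hypothetical custom fact, e.g. lib/facter/postgres_databases.rb in a module.

# Parse `psql -At` output (one database name per line) into an array.
def parse_db_list(output)
  output.to_s.split("\n").reject(&:empty?)
end

# Only register the fact when running under Facter, so the parsing
# helper can also be exercised standalone.
if defined?(Facter)
  Facter.add(:postgres_databases) do
    setcode do
      parse_db_list(
        Facter::Util::Resolution.exec(
          "psql -At -c 'SELECT datname FROM pg_database WHERE NOT datistemplate'"
        )
      )
    end
  end
end
```

Note that this yields a list of database names only; attributes such as character sets and passwords would need additional queries, and ralsh would still not know how to emit them as postgresql::db resources.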

Is the Entity Framework compatible with SQL Azure?

I cannot get the Entity Framework to work against SQL Azure. Is this just me or is it not intended to be compatible? (I have tried the original release of EF with VS2008 as well as the more recent VS2010 Beta 2 version)
To check this I created the simplest scenario possible. Add a single table to a local SQL Server 2008 instance. The table has two columns, a primary key of type integer and a string column. I add a single row to the table with values of (1, foobar). I then added exactly the same setup to my SQL Azure database.
I created a console application and generated an EF model from the local database. Run the application and all is good: the single row can be returned from a trivial query. Update the connection string to connect to SQL Azure and now it fails. It connects to the SQL Azure database without problems, but the query fails when processing the result.
I tracked the initial problem down using the exception information. The conceptual model had the attribute Schema="dbo" set for the entity set of my single defined entity. I removed this attribute and now it fails with another error...
"Invalid object name 'testModelStoreContainer.Test'."
Where 'Test' is of course the name of the entity I have defined and so it looks like it's trying to create the entity from the returned result. But for some unknown reason cannot work out this trivial scenario.
So either I am making a really fundamental error or SQL Azure is not compatible with the EF? And that seems just crazy to me. I want to use the EF in my WebRole and then RIA Services to the Silverlight client end.
While I haven't done this myself, I'm pretty sure that members of the EF team have, as has Kevin Hoffman.
So it is probably just that you went astray on one step in your porting process.
It sounds like you tried to update the EDMX (XML) by hand, starting from one that works against a local database.
If you do this, most of the changes will be required in the StorageModel element of the EDMX (aka SSDL), but it sounds like you've been making changes in the ConceptualModel (aka CSDL) element.
My guess is you simply need to replace all references to the dbo schema in the SSDL with whatever schema your SQL Azure database uses.
Hope this helps
Alex
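For illustration, the relevant SSDL change is the Schema attribute on the EntitySet element (a sketch only; the entity name matches the question, but "MyAzureSchema" is a placeholder for whatever schema the SQL Azure database actually uses):

```xml
<!-- Before: as generated against the local SQL Server database -->
<EntitySet Name="Test" EntityType="testModel.Store.Test" Schema="dbo" />

<!-- After: pointing at the schema used in the SQL Azure database -->
<EntitySet Name="Test" EntityType="testModel.Store.Test" Schema="MyAzureSchema" />
```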
To answer the main question: yes, at least Entity Framework v4 can be used with SQL Azure. I honestly haven't tried with the initial version (from .NET Framework 3.5 SP1).
A little while back I did a complete project and blogged about the experience: http://www.sanderstechnology.com/?p=9961 Hopefully this might help a little bit!
Microsoft's Windows Azure documentation contains How to: Connect to Windows Azure SQL Database Using the ADO.NET Entity Framework.
After creating your model, these instructions describe how to use SQL Azure with the Entity Framework:
Migrate the School database to SQL Database by following the instructions in How to: Migrate a Database by Using the Generate Scripts Wizard (Windows Azure SQL Database).
In the SchoolEFApplication project, open the App.Config file. Change the connection string so that it connects to your SQL Database.
<connectionStrings>
<add name="SchoolEntities"
connectionString="metadata=res://*/SchoolDataModel.csdl|res://*/SchoolDataModel.ssdl|res://*/SchoolDataModel.msl;provider=System.Data.SqlClient;provider connection string="Data Source=<provideServerName>.database.windows.net;Initial Catalog=School;Integrated Security=False;User ID=<provideUserID>;Password=<providePassword>;MultipleActiveResultSets=True;Encrypt=True;TrustServerCertificate=False""
providerName="System.Data.EntityClient"/>
</connectionStrings>
Press F5 to run the application against your SQL Database.

Unable to import data into ArcSDE (9.3.1) and PostgreSQL (8.3.0)

I've just installed ArcGIS Server Enterprise Advanced with ArcSDE and PostgreSQL, on a virtual Windows Server 2008 box.
After installing, I've been trying to import a feature class (stored in a shapefile) into the geodatabase.
In order to do this I've created a connection to ArcSDE (not a direct database connection) using ArcCatalog -> Database Connections -> Add Spatial Database Connection. I've tested the connection successfully.
However, when I run the tool "Feature Class to Geodatabase", I get the following error message: Failed to convert DNorthEnergyRiskMaps\RiskMapsLibraryTests\Resources\ProbabilityTools\TestFacies.shp. ERROR 000210: Cannot create output Database Connections\s2008NE.sde\arcgis.sde.TestFacies
Failed to execute (CopyFeatures).
According to this blog post, this error is a generic "catch-all".
The blog post suggests some debugging steps, which I've followed. I've had ArcMap create an intercept file; however, I'm none the wiser after looking at it (users on the ESRI forum say there are no errors in the intercept file). Maybe someone with more experience could interpret it better...
Also, I've scanned through the ArcSDE and PostgreSQL logs. The only reported errors are in the latter log; multiple SELECT queries are failing because the target tables don't exist. Some examples:
2009-09-29 13:33:38 CEST ERROR: relation "sde.sdb_surveydatasets" does not exist
2009-09-29 13:33:38 CEST STATEMENT: SELECT 1 FROM arcgis.sde.SDB_SurveyDatasets WHERE 1 = 0
2009-09-29 13:33:38 CEST ERROR: relation "sde.sch_dataset" does not exist
2009-09-29 13:33:38 CEST STATEMENT: SELECT 1 FROM arcgis.sde.SCH_DATASET WHERE 1 = 0
Help would be much appreciated.
Yes, ArcView is restricted to editing in file and personal geodatabases. You need an ArcEditor or higher license to edit ArcSDE.
See the section "Editing with ArcView" on this page.
Try the 'Feature Class to Feature Class' geoprocessor tool instead of 'Feature Class to Geodatabase'. Sometimes the individual geoprocessor tools execute differently or report errors differently.
If that doesn't work, try creating a new feature class directly in the SDE workspace and import the schema from the shapefile. Once it is successfully created, import data into the feature class from the shapefile.
I recommend trying to create a new feature class from scratch and seeing if that works in your PostgreSQL environment first and then work on importing.