Clean up of old dev models in OLAP folder of SSAS server - ssas-tabular

I have several SSAS servers that I maintain. When a user creates a tabular database in Visual Studio, SSDT creates a development copy of their database with this naming convention: ModelName_UserID_GUID. Many users have left the group, but these dev copies are still in the \OLAP folder. Is there an official way to clean these out, or can I just delete the folders on the server? I don't have storage issues at the moment, but I don't like crud lurking in my servers.

#Mrizza,
The easy way would be to connect to the Analysis Services instance (e.g. in SSMS), right-click the unwanted databases, and delete them.
That should also clean up the corresponding folders on disk :)
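If there are a lot of them, the same cleanup can be scripted. Below is a minimal PowerShell sketch using the AMO client library (installed with SSMS/SSAS); the instance name and the GUID-suffix pattern are assumptions you would adjust for your environment, and it is worth a dry run (comment out the Drop call) to check the matches first.

    # Drop SSDT workspace databases (ModelName_UserID_GUID) from a tabular instance.
    # Assumes the AMO assembly is available; "MYSERVER\TABULAR" is a placeholder.
    Add-Type -AssemblyName "Microsoft.AnalysisServices"

    $server = New-Object Microsoft.AnalysisServices.Server
    $server.Connect("MYSERVER\TABULAR")

    # Workspace databases created by SSDT end in a GUID
    $guidSuffix = '_[0-9a-fA-F]{8}(-[0-9a-fA-F]{4}){3}-[0-9a-fA-F]{12}$'

    foreach ($db in @($server.Databases | Where-Object { $_.Name -match $guidSuffix })) {
        Write-Host "Dropping workspace database: $($db.Name)"
        $db.Drop()   # also removes the database folder under the instance's data directory
    }

    $server.Disconnect()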

Related

Is there any way to move an individual entity from one server to another in Master Data Services?

I have a master data model with some entities, and it is deployed on the production server.
Now I have created two more new entities on the development server and want to move only these two entities.
If anyone has any ideas, please share them with me.
Thanks!
You have two options.
Web app (easiest): on your dev server, go to System Administration, click Deployment, and create a package. You then deploy this package by going to the production server and following the same steps, but choose deploy instead of create under the 'Deployment' button.
The alternative is to use MDSModelDeploy.exe. You can find it on the server by going to the appropriate folder; generally it's in this path: C:\Program Files\Microsoft SQL Server\130\Master Data Services\Configuration.
I recommend you use this method, as you have more control. You can choose to deploy with data or without, or clone your model. You can read more here: https://learn.microsoft.com/en-us/sql/master-data-services/deploy-a-model-deployment-package-by-using-mdsmodeldeploy
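For reference, a typical round trip with MDSModelDeploy.exe looks roughly like this (the model, version, service, and path names are placeholders; 'MDSModelDeploy listservices' shows the service name to use):

    # On the development server: create a package for one model, including its master data
    cd "C:\Program Files\Microsoft SQL Server\130\Master Data Services\Configuration"
    .\MDSModelDeploy createpackage -model "Customer" -version "VERSION_1" -service "MDS1" -package "C:\Temp\Customer.pkg" -includedata

    # On the production server: deploy it as a new model...
    .\MDSModelDeploy deploynew -package "C:\Temp\Customer.pkg" -model "Customer" -service "MDS1"

    # ...or update an existing model with the same name
    .\MDSModelDeploy deployupdate -package "C:\Temp\Customer.pkg" -version "VERSION_1" -service "MDS1"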
I can also recommend you consider the ModelPackageEditor when your model starts getting big. Then you have control over what you need to deploy, such as entities, views, business rules, etc.
You need to have a deployment strategy in place, because if your development and production environments are not exactly the same, you run into deployment errors. It normally happens when you create, for example, business rules on the environment to which you are deploying that are not in your dev environment. MDS uses copious amounts of IDs, and if the models are not in sync, you run into problems.

Is it possible to deploy code to an Azure Cloud Service without a build step?

We've got some legacy on-premises apps that we're evaluating moving off-site, and we are evaluating all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit, a build is performed and the build is deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, classic ASP files do not have a "build" process; they're like PowerShell scripts that are interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject; however, this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to do this without setting up a solution in Visual Studio--a solution that wouldn't be used for anything.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to package the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted definition file. Your ASP files would be treated as 'content' in this case.
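As a rough sketch of that approach (the role name, folder names, and service name below are placeholders, and it assumes the Azure SDK tools plus the classic Azure PowerShell cmdlets are installed): you hand-write a ServiceDefinition.csdef declaring a single web role whose site points at the folder containing the classic ASP files, then let cspack wrap that folder into a package and deploy it with a matching ServiceConfiguration.cscfg:

    # Package the hand-crafted definition plus the content folder into a .cspkg
    cspack ServiceDefinition.csdef /role:WebRole1;.\SiteContent /sites:WebRole1;Web;.\SiteContent /out:LegacyApp.cspkg

    # Deploy the package and configuration, e.g. with the classic service management cmdlets
    New-AzureDeployment -ServiceName "my-legacy-app" -Package ".\LegacyApp.cspkg" -Configuration ".\ServiceConfiguration.cscfg" -Slot Production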
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your web roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the overhead for simple things, like deploying a new CSS file or an image, by simply copying it over the old-school way.
Again, not sure if this helps you, but old-school techniques work just as well. :-)

SQL Server Data Tools and Edmx

So we're using the new SSDT Microsoft released, pretty cool stuff. We are keeping a database project under version control with all the schemas, plus an offline database for development, and we can later deploy to a SQL Azure database. We're using EF in development, so my question is where the edmx fits in: should we update the edmx file from the offline database or directly from the online SQL Azure database? What's the best practice on this?
I would say that in your case "the production database is the truth", so I would update from SQL Azure. There's no right answer, though, really.
Incidentally, in the early betas of SSDT it was possible to have a reference from an EDMX to an SSDT project, thus your source code became the truth (which, in my opinion, is preferable) and the EDMX knew it was always working against "the truth". Unfortunately they ditched this feature and there are no signs of it returning.
For EF to work correctly, the EDMX file has to be in sync with the database you are connecting to. It's hard to answer your question without knowing the development process you follow, but I would imagine you use SQL Azure in production and develop against an on-premises database. Therefore one copy of the edmx file will be used on the production server. In the development environment you have a "living" copy of the edmx file that is changed as needed when the local database changes. When you get to the point where you are ready to ship, you deploy your app (including the edmx file) to a production environment that uses SQL Azure.
If, in your development environment, you update the edmx file from SQL Azure, then stuff will break or will not work correctly if the schema of the database in Azure is different from the schema of your local database.

Discovering SQL Server installed on the machine

I'm going to deploy an application with ClickOnce and wondered what the best technique is for dealing with the database. Since ClickOnce can install SQL Server for me, I was wondering how I determine which SQL Server is installed, so that when the application is first run it can see it and create the SQL Server database.
I would not use a Windows Installer setup to distribute this application, because it will have multiple updated versions, and that would be easier with ClickOnce.
Which flavor of SQL Server are you using? Here's some helpful information.
If it is SQL Server Express, you need to add the database to your project, set the build action to 'Content', and set 'Copy to Output Directory' to 'Copy if newer'. This will ensure the database is included in the deployment.
Next, go into the prerequisites dialog and select SQL Server Express. When the user runs the setup.exe, it will check whether it is installed, and if not, will install it.
If you want the newest version of SQL Server Express, you can find out how to create a bootstrapper package here -- Microsoft doesn't provide one, but this article provides the XML you need and links to the SQL Server Express downloads.
If you are using SQL CE, you need to add the database (*.sdf) to your project and set the properties as noted above. However, you do not need to publish this as a prerequisite; you can just include the DLLs in your project as noted here.
When you publish a new version, if the database has changed, ClickOnce will put the new database in the DataDirectory and put the old one in the \pre subfolder of the DataDirectory, and you have to write code to handle that. This sounds appealing, but I think it's dangerous. If you even so much as open your database to look at the structure, it will change the date/time stamp, and ClickOnce will think it's new and publish it, and you will get calls from your customers about their data missing, unless you handle this.
So I usually recommend you copy the database to LocalApplicationData when the user first installs your application, and handle any updates to the structure programmatically after that. There is an article about how to do that here.

Restoring an Ingres database from one system to another

We want to restore, in our development environment, a database that we got from the client as a backup, but we are unable to restore the database successfully. Can anyone help us with the steps involved in this restore process? Thanks in advance.
Vijay, if you plan to make a new database out of checkpoints (+journals) made on another (physical) server, then I must disappoint you - it is going to be a painful process. Follow these instructions: http://docs.actian.com/ingres/10.0/migration-guide/1375-upgrading-using-upgradedb . The process is basically the same as for upgradedb. However, if the architecture of the development server is different (say the backup was made on a 32-bit system and the development machine is, say, POWER6-based), then it is impossible to make your development copy of the database using this method.
On top of all this, this method of restoring backups is not officially supported by Actian.
My recommendation is to use the 'unloaddb' tool on the production server, export the database into some directory, SCP that directory to your development server, and then use the generated 'copy.in' file to create the development database. NOTE: this is the way supported by Actian, and you can find more details on this page: http://docs.actian.com/ingres/10.0/migration-guide/1610-how-you-perform-an-upgrade-using-unloadreload . This is the preferred way of migrating databases across platforms.
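Roughly, that workflow looks like this (database names, directories, and host names are placeholders; the script names generated by unloaddb differ between Unix and Windows, so check the files actually produced in the unload directory):

    # On the production server: generate the unload/reload scripts and export the data
    mkdir /tmp/proddb_unload
    cd /tmp/proddb_unload
    unloaddb proddb
    sh unload.ing              # runs the generated copy.out export (a .bat file on Windows)

    # Copy the whole directory to the development server
    scp -r /tmp/proddb_unload devserver:/tmp/

    # On the development server: create an empty database and reload it
    createdb proddb
    cd /tmp/proddb_unload
    sh reload.ing              # runs the generated copy.in reload (a .bat file on Windows)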
It really depends on how the database has been backed up and provided to you.
In Ingres there is a snapshot (called a checkpoint) that can be restored into a congruent environment, but that can be quite involved.
There is also output from copydb and unloaddb commands which can be reloaded into another database. Things to look out for here are a change in machine architecture or paths that may have been embedded into the scripts.
Do you know how the database was backed up?