What automation tools are used for data warehouse testing? - sql-server-2008-r2

In my recent interviews I have come across a common question: did you automate your data warehouse test scripts?
I googled this but didn't find any specific tool names used for automating DWH tests. My test scripts are SQL queries that check counts, sums, and so on, and I don't understand how automation is possible with SQL scripts.
Has anyone in the group done such automation? If yes, which tools did you use?
Your help is much appreciated.
Regards,
Geeme

FitNesse with DbFit is one of the best free tools available on the market.
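Beyond DbFit, the same checks can be driven from any ordinary test framework. Below is a minimal sketch, assuming a SQL Server ODBC connection and hypothetical staging/fact tables, of wrapping count and sum checks in Python's pytest with pyodbc so they can run unattended:

    # Minimal sketch: wrapping data-warehouse SQL checks in pytest so they can run
    # unattended (e.g. from a CI job). Connection string, table and column names
    # are placeholders -- adjust for your own warehouse.
    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=my-dwh-server;DATABASE=MyDWH;Trusted_Connection=yes;"
    )

    def scalar(sql):
        """Run a query that returns a single value and return it."""
        with pyodbc.connect(CONN_STR) as conn:
            return conn.cursor().execute(sql).fetchone()[0]

    def test_row_counts_match():
        # Count check: staging and target should hold the same number of rows.
        src = scalar("SELECT COUNT(*) FROM stg.Sales")
        tgt = scalar("SELECT COUNT(*) FROM dw.FactSales")
        assert src == tgt

    def test_amount_totals_match():
        # Sum check: the total amount should survive the load unchanged.
        src = scalar("SELECT SUM(Amount) FROM stg.Sales")
        tgt = scalar("SELECT SUM(Amount) FROM dw.FactSales")
        assert src == tgt

Running this with pytest from a scheduler or build server is what "automating" SQL-based checks usually amounts to: the queries stay the same, the framework only asserts on the results and reports failures.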

Related

Custom Tool for Confluence to SharePoint Online and Azure DevOps Migration

We were using Confluence in our company, but management has now decided to move to SharePoint Online and Azure DevOps. I am looking to write a custom tool using PowerShell and the REST API to do this job. I understand there are many feature gaps, including the modern page UI in SharePoint, text formatting, etc.
Has anyone worked on something similar? If yes, what was your approach, and what issues should I be prepared for? Being a non-coder, I would highly appreciate it if someone could share a code snippet too.
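The asker plans to use PowerShell; purely as a rough illustration of the export half, here is a Python sketch against the Confluence Cloud REST API (base URL, credentials and space key are placeholders; the SharePoint/Azure DevOps import side is not covered):

    # Rough sketch of the export half only: pull page bodies out of Confluence Cloud
    # via its REST API. Base URL, credentials and space key are placeholders.
    import requests

    BASE = "https://your-domain.atlassian.net/wiki"
    AUTH = ("you@example.com", "api-token")   # Confluence Cloud uses email + API token

    def fetch_pages(space_key, limit=25):
        """Yield (title, storage-format HTML) for every page in a space."""
        start = 0
        while True:
            resp = requests.get(
                f"{BASE}/rest/api/content",
                params={
                    "spaceKey": space_key,
                    "type": "page",
                    "expand": "body.storage",
                    "start": start,
                    "limit": limit,
                },
                auth=AUTH,
            )
            resp.raise_for_status()
            data = resp.json()
            for page in data["results"]:
                yield page["title"], page["body"]["storage"]["value"]
            if len(data["results"]) < limit:
                break
            start += limit

    for title, html in fetch_pages("DOCS"):
        print(title, len(html))

The awkward part is usually not the export but converting Confluence storage-format HTML into SharePoint modern pages and Azure DevOps wiki markdown, which is where the feature gaps show up.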

Generating MIS reports and dashboards using open-source technologies

I need your suggestions for the scenario below:
One of our clients has 8 Postgres DB servers used for OLTP and now wants to generate MIS reports/dashboards integrating all the data from these servers.
- There are around 100 reports to be generated
- Around 50k rows would be added to each of these databases
- The reports are to be generated once every month
- They run their entire setup on bare metal
- They don't want to use Hadoop/Spark, since they think the maintenance burden will be higher
- They want to use open-source tech to accomplish this task
With all that said, one approach would be to write scripts to bring aggregated data into one server and then manually code the reports with front-end JavaScript.
Is there a better approach using ETL tools like Talend, Pentaho, etc.? Which ETL tool would be best suited for this? Would the community edition of any ETL tool suffice for the above requirements? I know for a fact that the commercial offering of any of these ETL tools will not be in the budget.
Could you please let me know your views on this?
Thanks in advance,
Deepak
Sure, yes. I have done similar things successfully a dozen times.
My suggestion is to use Pentaho Data Integration (or Talend) to collect the data in one place and then filter, aggregate, and format it. The data volume is not an issue as long as you have a decent server.
For the reports, I suggest producing them with Pentaho Report Designer so that they can be sent by mail (with Pentaho DI) or distributed with a Pentaho BI Server.
You can also build a JavaScript front end with Pentaho CDE.
All of these tools are mature, robust, easy to use, have community editions, and are well supported by the community.
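For comparison, here is a minimal sketch of the hand-rolled alternative the question mentions (scripts that bring aggregated data into one server), in Python with psycopg2; the host names, source query, and reporting table are all hypothetical:

    # Sketch of the scripted approach: pull a monthly aggregate from each OLTP
    # server into one reporting database. Host names, the orders table and the
    # reporting table are placeholders.
    import psycopg2

    OLTP_HOSTS = ["oltp1.internal", "oltp2.internal"]  # ... up to all 8 servers
    REPORT_DSN = "host=reporting.internal dbname=mis user=etl password=secret"

    # Previous calendar month, aggregated at the source so little data moves.
    AGG_SQL = """
        SELECT date_trunc('month', order_date) AS month,
               count(*)                        AS orders,
               sum(amount)                     AS revenue
        FROM   orders
        WHERE  order_date >= date_trunc('month', now()) - interval '1 month'
        AND    order_date <  date_trunc('month', now())
        GROUP  BY 1
    """

    with psycopg2.connect(REPORT_DSN) as report_conn:
        with report_conn.cursor() as out:
            for host in OLTP_HOSTS:
                with psycopg2.connect(f"host={host} dbname=app user=readonly") as src:
                    with src.cursor() as cur:
                        cur.execute(AGG_SQL)
                        for month, orders, revenue in cur.fetchall():
                            out.execute(
                                "INSERT INTO monthly_sales (source, month, orders, revenue) "
                                "VALUES (%s, %s, %s, %s)",
                                (host, month, orders, revenue),
                            )
    # psycopg2 commits when the connection's `with` block exits without an exception.

At this data volume either route works; the ETL tool mainly buys you logging, retries, and a visual pipeline instead of a pile of scripts like the one above.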

Scheduled export of data from Dynamics CRM

I hope someone can help me. I would like to find a way to do a scheduled export of selected data from Microsoft Dynamics CRM (Online),
preferably to a CSV file, with the export automated at a recurring time (at least once a day) so that it writes to a specified location without any user interaction.
I'm aware of Scribe, for example, but that is very expensive and I need a cheaper solution. Any ideas for scheduled, unattended exporting from Dynamics?
As @Guido Preite mentioned, your best bet is to get the CRM SDK. Since cost is an issue with turnkey third-party software, the SDK is a good alternative if you have a little time to get familiar with it. There are a lot of good examples straight from MSDN and the SDK documentation to get you up and running quickly; start here. Basically, you could create a simple console app that queries the data you need and then saves it off to a file. This could then be scheduled via Task Scheduler.
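The answer above assumes the .NET SDK; as a rough sketch of the same pattern (query, dump to CSV, schedule), here is an illustrative Python version against the Dynamics 365 Web API, assuming you already have an OAuth bearer token (the org URL, entity, columns, and output path are placeholders):

    # Rough illustration of the same idea (query, dump to CSV, run on a schedule),
    # here against the Dynamics 365 Web API instead of the .NET SDK. The org URL,
    # entity, columns and bearer token are placeholders; acquiring the token
    # (e.g. via an Azure AD app registration) is not shown.
    import csv
    import requests

    ORG = "https://yourorg.crm.dynamics.com"
    TOKEN = "..."  # OAuth access token

    resp = requests.get(
        f"{ORG}/api/data/v9.2/accounts",
        params={"$select": "name,accountnumber,createdon"},
        headers={"Authorization": f"Bearer {TOKEN}",
                 "OData-MaxVersion": "4.0",
                 "Accept": "application/json"},
    )
    resp.raise_for_status()
    rows = resp.json()["value"]

    with open(r"C:\exports\accounts.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["name", "accountnumber", "createdon"])
        writer.writeheader()
        for row in rows:
            writer.writerow({k: row.get(k) for k in writer.fieldnames})
    # Schedule this script with Windows Task Scheduler (or cron) for unattended runs.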
Scribe is a good solution but, as you say, it isn't cheap.
I've used KingswaySoft to do scheduled data imports and exports with CRM. See http://www.kingswaysoft.com/products/ssis-integration-toolkit-for-microsoft-dynamics-crm.
It's a good product and is cheaper than Scribe. No coding is required, although you'll need some experience of SQL Server Integration Services (SSIS).
Flatly lets you auto-export data from Microsoft Dynamics Online to CSV (placed in cloud storage) every 10 minutes, hourly, or daily. It takes 5 minutes to set up. flatly.io
Disclosure: I work at Flatly.

IBM DB2 9.7 free monitoring tools

Any good advice on what tools to use to monitor a DB2 database? I have used the db2top command but was wondering if there are more verbose tools out there. Our DB is running on 64-bit Linux.
As tools, you can use the memory tracker db2mtrk and the problem determination tool db2pd. But you can also use many other things, such as:
table functions
administrative views
GET SNAPSHOT
creating and activating event monitors
It really depends on what you are going to do.
Also, IBM Optim Performance Expert is a good tool for finding bottlenecks or issues in the database, as is DBI's Brother-Panther. Finally, the Data Studio Web Console is a basic tool for monitoring a few elements.
What do you want to monitor? With the help of a cron job that executes a script, you can probably do many things.
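As a concrete example of the cron idea, here is a minimal sketch of a script that snapshots db2pd output to timestamped log files (the database name and paths are placeholders):

    # Minimal sketch of the cron idea: snapshot a couple of db2pd views to a
    # timestamped log file. Database name and output path are placeholders.
    import subprocess
    from datetime import datetime
    from pathlib import Path

    DB_NAME = "SAMPLE"
    LOG_DIR = Path("/var/log/db2mon")

    def snapshot():
        LOG_DIR.mkdir(parents=True, exist_ok=True)
        stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
        out_file = LOG_DIR / f"db2pd_{stamp}.log"
        # db2pd ships with DB2; -applications and -locks are standard options.
        result = subprocess.run(
            ["db2pd", "-db", DB_NAME, "-applications", "-locks"],
            capture_output=True, text=True,
        )
        out_file.write_text(result.stdout)

    if __name__ == "__main__":
        snapshot()

    # Example crontab entry (every 5 minutes), run as the instance owner:
    # */5 * * * * /usr/bin/python3 /opt/scripts/db2_snapshot.py

The same pattern works for queries against the SYSIBMADM administrative views if you prefer SQL over parsing db2pd output.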

I'm designing a thick UI diagnostic tool; should it have direct integration with PowerShell?

My team and I are designing a diagnostic test tool as part of our next product. The test tool will exercise a request/response API and display asynchronous events. As part of the diagnostic tool suite, we will also be providing cmdlets for the entire product API.
Is it worth embedding PowerShell execution into the tool's UI? What are other development teams doing?
The scripts can still run standalone in any PowerShell window or tool. From a user's perspective, they would gain the ability to launch scripts from our UI. And, since the UI can be monitoring the same devices that the scripts act on, it brings some unity to executing a script and monitoring the results. Embedding script execution brings more work to the project, and I'm not sure how we want to handle displaying the results of the scripts.
Do most PowerShell users expect to run their scripts from their own shell environments, or within tools that come from their product vendors? Note, our diagnostic tool will not be automatically generating scripts for the users as some Microsoft tools do (that might be valuable for inexperienced PowerShell users, but we expect most scripts to be fairly simple, like executing a command on a series of devices).
Fortunately, embedding the PowerShell engine, executing commands/scripts, and getting the results back is pretty trivial. That said, I'm not sure your scenario is one where I would embed PowerShell. You ask whether folks prefer to run scripts from their own shells or from within the tool vendor's environment. I can't speak for everybody, but the shells and editors that I use support some nifty features for debugging, code folding, syntax highlighting, multiple runspaces, etc. I'm not sure you would want to go through the effort of providing similar capabilities.
One reason to embed PowerShell is to execute the same PowerShell cmdlets as part of your core diagnostics and monitoring engine. That way you don't have to duplicate functionality between your diagnostic tool's engine and the cmdlets that your customers use for automation. It sounds like the code you use for diagnostics and monitoring in the app is different from the code in the cmdlets? Or is there common code shared between the app and the cmdlets?
Another reason to embed PowerShell is to allow the app itself to be scriptable but this doesn't appear to fit your scenario.
Another reason to embed PowerShell is if you are implementing a new host - i.e., you provide some unique editing or shell functionality. Some apps that do this are PowerGUI (which allows you to launch scripts, IIRC) and PowerShell Plus.
Yet another reason I have embedded PowerShell in an application is that I knew I could get certain results in much less code than the equivalent C#. This is a weaker reason, and I probably wouldn't do it in a commercial app, but I have used it for one-off programs.
I agree with both Jaykul and Keith Hill - the answer is yes.
There are several approaches you could take. But in general, I'd recommend that you a) create key cmdlets as part of the UI for your app and b) build the GUI on top of PowerShell (in the same way the Exchange team has done).
Doing this follows Microsoft's lead (all applications should have a PowerShell interface), one that is also being taken up by others (e.g. VMware, and even Symantec, leverage PowerShell in their applications).
Creating cmdlets (and possibly a provider) is pretty straightforward - a great cmdlet designer was recently released for this (see http://blogs.msdn.com/powershell/archive/2009/10/16/announcing-open-source-powershell-cmdlet-and-help-designer.aspx).
Hope this helps!
Yeah, the main reason I'd consider actually embedding PowerShell in that scenario is if your UI could generate PowerShell scripts for the actions users take in the UI, so they could see what was happening and easily learn how to automate it. That would require designing the UI around PowerShell from the beginning ... so it sounds to me like you're better off just providing the cmdlets and samples ;)