How to export bug list (or any custom query) from TFS to Excel from a command line? - command-line

I need to export list of bugs from our Team Foundation Server to Excel. It's trivial to do it manually, but I need a command line version since the task needs to be automated.
Does anyone know how to do this?

To answer your original question:
Add a new query in TFS, create your query and click Save. This should give you the option to save the query either on the server or locally. If you choose to save it locally and then change the extension from .wiql to .txt, you will have the query text available to you :-)
I hope you are aware that you have the option of using the 'Team' tab in your Excel/Project ribbon. Having said that, you can create a macro in Excel that invokes the Refresh or Publish button on the Team ribbon. Have a look at this sample macro http://blogs.msdn.com/b/teams_wit_tools/archive/2007/03/15/how-to-invoke-tfs-add-in-controls-from-macro-code.aspx
HTH.
Cheers, Tarun
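For a fully scripted export, one option is to run the saved WIQL through the TFS client object model from PowerShell and write a CSV that Excel can open. This is only a minimal sketch, assuming Team Explorer (which provides the client assemblies) is installed; the server URL, project name, fields, and output path below are placeholders:

Add-Type -AssemblyName "Microsoft.TeamFoundation.Client"
Add-Type -AssemblyName "Microsoft.TeamFoundation.WorkItemTracking.Client"
# Connect to the collection (URL is a placeholder)
$tfs = [Microsoft.TeamFoundation.Client.TfsTeamProjectCollectionFactory]::GetTeamProjectCollection([Uri]"http://tfsserver:8080/tfs/DefaultCollection")
$store = $tfs.GetService([Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItemStore])
# Paste the WIQL you saved from Team Explorer (the .wiql file) here
$wiql = "SELECT [System.Id], [System.Title], [System.State] FROM WorkItems WHERE [System.TeamProject] = 'MyProject' AND [System.WorkItemType] = 'Bug'"
$store.Query($wiql) |
    ForEach-Object { [pscustomobject]@{ Id = $_.Id; Title = $_.Title; State = $_.State } } |
    Export-Csv -Path "C:\temp\bugs.csv" -NoTypeInformation

The resulting CSV can then be opened in Excel or picked up by the rest of the automation.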

Can Google Actions be imported/exported from the Actions Console?

I'm just starting to learn how to build an Action with Scenes and Intents. Very basic. I see that prompts can be defined in JSON. Is it possible to export the whole Action in JSON format so I can edit it outside of the UI, and then reimport it?
The idea is that if I have a very simple Action (with lots of scenes, but simple intents) I can define the whole Action in a much simpler format and automatically generate the JSON that could be imported. That would be much faster for me than defining each scene in the Actions Console.
Sort of.
You can use the gactions command line tool to export and import the configuration files. These completely represent the same things that you can edit using the web-based graphical editor.
However... these files are in YAML, not JSON. Semantically, they're identical, so you will still be able to create something that generates the files.
To download your configuration into the current directory you'd use the gactions pull command and specify your project ID with the --project-id parameter:
gactions pull --project-id some-project-4242
You'll see that the Scenes and Intents each have their own folders under custom where you'll be doing your editing.
Once you've made the changes, you can upload the configuration with
gactions push
(Note that you can't specify the project ID, since this is in one of the configuration files.)
You can then reload the test simulator and test your changes.
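For reference, the whole round trip looks roughly like this (the project ID and file names are only examples, and the exact layout can vary by gactions/SDK version):
gactions pull --project-id some-project-4242
# edit the pulled YAML, for example custom/scenes/Welcome.yaml or custom/intents/OrderPizza.yaml
gactions push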

DB2: Procedures comments are not deployed via Data Studio

I created a new stored procedure that includes comments, e.g. /* this is a comment */. But if I look at the Source tab of this procedure in Data Studio, that line is gone.
What's going wrong here?
kind regards
Ralf
You can also consider using -- line comments instead of /* */ block comments.
The Db2 command line processor (CLP) preserves comments by default in current Db2 versions, i.e. you don't have to use Data Studio for deployment activities.
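As a rough illustration (database name, file name, and the @ statement terminator are just examples), a CLP deployment keeps the comments in the stored routine source:
db2 connect to MYDB
db2 -td@ -vf my_proc.sql
db2 connect reset
# my_proc.sql contains the CREATE PROCEDURE statement with its -- and /* */ comments;
# the CLP stores the source as written, so the comments remain visible afterwards.
On Windows these commands would typically run in a Db2 command window (db2cmd); on Linux/UNIX a normal shell works.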
In Data Studio, remember to tick the 'Deploy Option' "Deploy source to the database", as by default it is not ticked. In my 4.1.3 version of Data Studio, this preserves comments in SQL procedures.
Losing comments was reported in an old version of Data Studio with a workaround.
The workaround for the issue is to use the Routines Editor to create the stored procedure, then select the DEPLOY button. That will preserve comments.
Data Project Explorer
Right click on Stored Procedures
Select New->Stored Procedure
Select any of the templates and click Finish
Edit the generated code and press the DEPLOY button at the top right.
If you still have a problem and if your company has an IBM support contract then open a ticket with IBM to get it resolved.

SqlPackage - How do I stop it from turning off my query store?

I have a database project in Visual Studio 2017. Our database project is managed just like any other library of code where multiple developers can update the project as necessary. To ease the pain of deployments, I have built a custom deployment task in our TFS 2018 (vNext) Build process that is a powershell script that calls sqlPackage.exe. SqlPackage compares our compiled database project (*.dacpac file) to our target database (in Dev, QA, etc.). I have the custom step configured so that it will write the expected changes to disk so I have a record of what was changed, then sqlPackage runs a second pass to apply the changes to the intended target database.
My DBA enabled the Query Store in our SQL Server 2016 database. During my sqlPackage deployment, one of the initial steps is to turn the Query Store off, which makes my DBA unhappy. He wants the ability to compare pre- and post-deployment changes, but if the Query Store gets turned off, we lose the history.
I have tried several of the switches in the documentation (https://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx#Publish%20Parameters,%20Properties,%20and%20SQLCMD%20Variables) but I can't seem to find the magic parameter.
How do I stop SqlPackage from turning off the query store?
My current script:
sqlPackage.exe /Action:Script /SourceFile:"myPath\MyDatabaseName.dacpac" /OutputPath:"myPath\TheseAreMyChangesThatWillBeApplied.sql" /TargetConnectionString:"Integrated Security=true;server=MyServer;database=MyDatabase;" /p:DropObjectsNotInSource=false /p:DropPermissionsNotInSource=false /p:DropRoleMembersNotInSource=false /p:BlockOnPossibleDataLoss=True /Variables:"CrossDatabaseRefs=CrossDatabaseRefs"
Is there a better way? I am surprised that I had to write a custom TFS build task to do this, which makes me think that I might be doing it the hard way. (But this has worked pretty well for me for the last several years.) I love database projects; I love that they enforce references and ensure that we don't break other objects when a column is dropped (for instance).
Any insight would be greatly appreciated.
Either disable the scripting of database properties using /p:ScriptDatabaseOptions=false, or update the database properties in the project to reflect the desired Query Store settings.
To set the Query Store settings in the project, right-click the database project in Solution Explorer and open Properties. From there, in the Project Settings tab find the "Database Settings..." button. In the Database Settings dialog, click the Operational tab and scroll down to find the Query Store settings.
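As a sketch of the first option, the script action from the question could simply be run with the property added (the paths and connection string are the same placeholders as above):
sqlPackage.exe /Action:Script /SourceFile:"myPath\MyDatabaseName.dacpac" /OutputPath:"myPath\TheseAreMyChangesThatWillBeApplied.sql" /TargetConnectionString:"Integrated Security=true;server=MyServer;database=MyDatabase;" /p:ScriptDatabaseOptions=false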
Apparently, all we needed to do was add a post-deployment script to re-enable the Query Store. Hope this helps someone out there...
USE Master
ALTER DATABASE [MyDbName] SET QUERY_STORE = ON

Reuse recipe in Google Dataprep

I am trying to reuse an existing recipe from one dataset with another. Unfortunately, I am unable to locate the step-by-step process in the Google Cloud documentation.
Could someone assist with the steps?
Thank you!
There are a couple of ways to do that. If you want to keep your recipe/dataset combinations in separate flows, you can make a copy of the flow (there's an option in its menu), go into the copied flow, choose "Replace" from the menu of the dataset linked to the recipe you want to reuse, and point it at your new dataset as input. If, on the other hand, you'd rather keep everything together in the same flow, you can go into that flow, make a copy of the recipe you want to reuse, and then choose "Change input" from that copy's menu.

Talend Open Studio - How to create brand new project

I need to create a brand new, empty Talend Open Studio project and then simply import some jobs from another, already existing project.
Can someone help me with this, please?
Many thanks!
Open Talend Studio
You will be prompted with a small window that offers a few options, such as
Select an existing project
Create a new project
and others. Click on Create a new project and then give the project a name.
Then the main Talend window will open. On the left side you will have options like Business Models, Job Designs, Contexts, etc. Right-click on Job Designs and you will get the option to Import items, along with others such as Export items. Click on it. You will get a window asking for the root directory of your jobs. If you already have the jobs as an archive file you can point to that instead, but I guess that is not the case here; for a normal job go with the Select root directory option and browse to the directory that contains your jobs. You can also import just a particular job. The buttons on the right (Select All, Deselect All, etc.) control which parameter files, schema repository items, and other resources associated with the job you are importing are included; select or deselect them according to your needs. That's pretty much all it takes to import a simple job.
There are many other options to consider when importing jobs and creating new projects, depending on the environment in which you are working with Talend, but for simple jobs in Talend Open Studio the instructions above should be enough.
PS - I'm sorry I can't provide any screenshots, but if you follow the instructions above it will work.