Crystal Reports Server folder backup

In the SAP Crystal Server Java BI Launch Pad, under Documents > Folders > Public Folders, I created a new folder and imported (many, many) reports into it.
I don't know what happened to the server, but all of its settings have been reset and the folder is no longer there - so I need to recreate it and import all my reports back (a process that takes a long time, since I have to import one report at a time).
Is there a way to back up/export that folder, so that next time something like this happens I can simply restore the folder with all its reports from a backup file?
Thanks

You can use Promotion Management (aka LifeCycle Management or LCM, as it was known in previous iterations) to export any set of objects (documents, universes, …). You can access this (web) application via the CMC.
Instead of a live-to-live promotion (i.e. from one live system to another live system), you'd choose a live-to-BIAR promotion, where the selected content is exported to an .lcmbiar file. An .lcmbiar file is basically a ZIP archive which contains the actual objects (e.g. reports, …) as you would find them on the FRS share, as well as metadata.
To restore an .lcmbiar file, you'd upload it back to the server through Promotion Management.
Alternatively, you could use the CLI to generate the file without having to go through the web interface. This is especially useful if your promotion job contains a very large number of objects (100+) or the selection criteria are quite specific.
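As a rough sketch, a CLI-driven export might look like the following. The install path is the typical BI 4.x default and the property keys are illustrative assumptions - the exact keys vary by platform version, so verify them against the admin guide before relying on this:

REM Promotion Management command-line tool (path is the BI 4.x default)
cd "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64\scripts"
lcm_cli.bat -lcmproperty "C:\lcm\export_myfolder.properties"

where export_myfolder.properties would contain entries along the lines of (key names assumed, check the guide):

action=export
exportLocation=C:\lcm\backups\myfolder.lcmbiar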
You can find more information on Promotion Management in the Business Intelligence Platform Administrator Guide, chapter 15.


How do I add columns to SharePoint to show a files version count and the size of the sum of those versions?

We've got an issue with SharePoint where we keep running out of space on our Team Drive. We know why: with versioning turned on, all changes are stored, and with some of our larger files (e.g. MS Access databases) that can quickly add up.
What I need is a way to add some columns showing the number of versions each file has and the total size of those versions, so I can sort and find the offending files quickly. Or are there any other solutions to quickly track down which files are adding bloat due to versioning? PowerShell? I should add that I don't have network admin rights, only local admin on the machine.
The interface I'm working with is a standard document library view; there are obviously files within each folder that these columns would reference.
This would require writing custom code to generate such a report. Please refer to the following article for more details, and adapt it to your requirements:
https://www.sharepointdiary.com/2013/01/document-versions-size-report-powershell.html
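For a rough idea of the approach, here is a minimal sketch using PnP.PowerShell against SharePoint Online. It assumes the FileVersion objects expose a Size property (recent SharePoint Online CSOM only) and that the library is called "Documents"; for on-premises farms, follow the linked article instead:

# Minimal sketch, assuming SharePoint Online + PnP.PowerShell,
# and that FileVersion.Size is available in your CSOM version.
Connect-PnPOnline -Url "https://contoso.sharepoint.com/sites/TeamSite" -Interactive
$items = Get-PnPListItem -List "Documents" -PageSize 500
$report = foreach ($item in $items) {
    if ($item.FileSystemObjectType -ne "File") { continue }
    $file     = Get-PnPProperty -ClientObject $item -Property File
    $versions = Get-PnPProperty -ClientObject $file -Property Versions
    [pscustomobject]@{
        Name         = $file.Name
        VersionCount = $versions.Count
        VersionsMB   = [math]::Round(($versions | Measure-Object Size -Sum).Sum / 1MB, 2)
    }
}
# Largest version bloat first
$report | Sort-Object VersionsMB -Descending | Export-Csv VersionSizeReport.csv -NoTypeInformation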

SqlPackage - How do I stop it from turning off my query store?

I have a database project in Visual Studio 2017. Our database project is managed just like any other library of code, where multiple developers can update the project as necessary. To ease the pain of deployments, I have built a custom deployment task in our TFS 2018 (vNext) build process: a PowerShell script that calls sqlPackage.exe. SqlPackage compares our compiled database project (*.dacpac file) to our target database (in Dev, QA, etc.). I have the custom step configured so that it first writes the expected changes to disk, so I have a record of what was changed, and then sqlPackage runs a second pass to apply the changes to the intended target database.
My DBA enabled the Query Store in our SQL Server 2016 database. During my sqlPackage deployment, one of the initial steps turns the Query Store off, which makes my DBA unhappy. He wants the ability to compare pre- and post-deployment changes, but if the Query Store gets turned off, we lose the history.
I have tried several of the switches in the documentation (https://msdn.microsoft.com/en-us/library/hh550080(v=vs.103).aspx#Publish%20Parameters,%20Properties,%20and%20SQLCMD%20Variables) but I can't seem to find the magic parameter.
How do I stop SqlPackage from turning off the query store?
My current script:
sqlPackage.exe /Action:Script /SourceFile:"myPath\MyDatabaseName.dacpac" /OutputPath:"myPath\TheseAreMyChangesThatWillBeApplied.sql" /TargetConnectionString:"Integrated Security=true;server=MyServer;database=MyDatabase;" /p:DropObjectsNotInSource=false /p:DropPermissionsNotInSource=false /p:DropRoleMembersNotInSource=false /p:BlockOnPossibleDataLoss=True /Variables:"CrossDatabaseRefs=CrossDatabaseRefs"
Is there a better way? I am surprised that I had to write a custom TFS Build Task to do this. Which makes me think that I might be doing it the hard way. (But this has worked pretty well for me for the last several years). I love database projects, I love that they enforce references and ensure that we don't break other objects when a column is dropped (for instance).
Any insight would be greatly appreciated.
Either disable the scripting of database properties using /p:ScriptDatabaseOptions=false, or update the database properties in the project to reflect the desired Query Store settings.
To set the Query Store settings in the project, right-click the database project in Solution Explorer and open Properties. From there, in the Project Settings tab find the "Database Settings..." button. In the Database Settings dialog, click the Operational tab and scroll down to find the Query Store settings.
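For example, applied to the script command from the question (everything else unchanged, just the extra property):

sqlPackage.exe /Action:Script /SourceFile:"myPath\MyDatabaseName.dacpac" /OutputPath:"myPath\TheseAreMyChangesThatWillBeApplied.sql" /TargetConnectionString:"Integrated Security=true;server=MyServer;database=MyDatabase;" /p:ScriptDatabaseOptions=false /p:DropObjectsNotInSource=false /p:DropPermissionsNotInSource=false /p:DropRoleMembersNotInSource=false /p:BlockOnPossibleDataLoss=True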
Apparently, all we needed to do was add a post-deployment script to re-enable the Query Store. Hope this helps someone out there...
USE Master
ALTER DATABASE [MyDbName] SET QUERY_STORE = ON

Version Control with Jaspersoft Server Queries / Input Controls

As anyone who has used Jasper Reports would know, a single report in Studio does not always encapsulate everything that a report is. There are Input Controls that can exist outside of the report as well as Queries that may be responsible for populating the controls.
Jaspersoft does not make it easy to obtain and version the entities that exist outside of the normal report JRXML files that you design in Studio. So far, the only way I've been able to even somewhat accomplish this is as follows:
Create Query and Input Control in DEV environment
Export the ZIP file that Jasper generates for said Query and Control
Extract the files and blow away or hand-modify the connection information in the Data Source, because Jasper packages everything associated with an item.
Add necessary files to SCM.
ZIP files up.
Import ZIP file in to QA environment, and then Staging, and so on.
While this can be somewhat automated, it seems like anyone working with Jasper has to jump through hoops to actually version their reports. Am I missing something, or is this really just the nature of things in this space?
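For what it's worth, the export/import legs of that process can be scripted with the js-export/js-import tools in the buildomatic directory of a JasperReports Server install; the repository URI and file names below are placeholders:

cd <js-install>\buildomatic
REM export the folder holding the report, input controls and queries to a ZIP
js-export.bat --uris /reports/MyFolder --output-zip myExport.zip
REM (strip or fix the data source connection info, commit the files to SCM,
REM  re-zip, then on the target server:)
js-import.bat --input-zip myExport.zip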

Simple and easy to use tool for managing different versions of files

I want to manage different sets of file versions locally on a machine without using complex version control tools like TFS/Git/SVN, etc. Here is my use case:
I have a Windows virtual machine that contains many XML, XSLT, XSL, TXT, etc. files; the virtual machine gets updated with every release of my product.
Often I need to analyze errors in this virtual machine, so I change many files, run the product, and start analyzing; let us call these file changes FileChangeSet1.
Based on the results above, I need to change other files, and maybe some of the files in FileChangeSet1, and do another test.
Again, based on the results, I need to change more files; eventually I end up with FileChangeSet1, FileChangeSet2, ..., FileChangeSet(n).
I want to:
be able to switch between these file change sets easily and quickly, e.g. have a GUI that shows me my tree of FileChangeSets, then click one of them and all files of that change set are used.
create file change sets from other file change sets, e.g. copy FileChangeSet1 into FileChangeSet2 and change only one file in set 2.
I don't want to configure and install a complex version/source control system like TFS/Git/SVN where I have to create a database of all my files first.
Making snapshots of the virtual machine is not an option because it is extremely slow.
I think you would not get much advantage from version control tools anyway, because they are made to version text files. For binary files, I think you would end up managing several different copies of the binary files anyway (at least with older tools such as CVS and SVN).
If you are running on Linux, you may want to use the cmp/diff tools. Take a look at incremental diff and diff tools such as patchutils.
Also consider creating a checksum of huge files to avoid comparing them unnecessarily.
PS: also take a look at this - http://jojodiff.sourceforge.net/ - I haven't tried it, but it seems simple to use and promising.
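For example, a minimal diff/patch round trip (directory names are placeholders):

# record the differences between a pristine copy and the modified tree
diff -ruN baseline/ modified/ > filechangeset1.patch
# later, re-apply that change set onto a fresh copy of the baseline
patch -p1 -d fresh-copy/ < filechangeset1.patch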
Mercurial is the right tool for me. With it I can solve my business case easily as follows:
Install Mercurial on Windows; it integrates into the Windows File Explorer.
Create a local Mercurial repository by right-clicking my root folder.
Now I can open all my files under my root folder in different text editors, e.g. Notepad++, and modify these files.
When I want to save/remember a specific state, I simply commit the files to Mercurial by right-clicking the root folder; I can provide a commit note.
Later I can change my files in a different way and test how my system reacts to them; again, I can commit these files locally.
Over time I have a history of change sets in Mercurial; I can go back to any change set, branch it, merge it, etc.
I have a huge and complex system that contains thousands of files - my root folder is actually the C:\ drive - and I can easily and quickly turn it into a version-controlled folder using Mercurial.
All with a simple and intuitive GUI, no command line learning needed.
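For reference, the same workflow maps onto a handful of Mercurial commands if you ever want to script it (the commit messages are just examples):

hg init                        # turn the root folder into a repository
hg add                         # start tracking the existing files
hg commit -m "FileChangeSet1"
# ...modify files, run the product, analyze...
hg commit -m "FileChangeSet2"
hg update -r 0                 # switch the working copy back to FileChangeSet1
hg log                         # list all recorded change sets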

Best practices for deploying data to a custom folder

Sometimes when we issue an upgrade to our application we need to install some files to the application's Data folder. We want to make it possible for the users to move this folder to a place of their liking. But how to deal with this at install time?
I was thinking of deploying to the user's AppData folder and have the application somehow check there for new files at startup.
Any advice or references would be very welcome!
We use InnoSetup for a VB6 application if that matters for your answer.
Generally the best solution I've found is to allow the user to move the folder from within the application.
This allows the application to keep track of where its data is being kept (by adding a reference to it in a file or registry entry which it accesses at load time) and to access it seamlessly in the future.
Your update routines can then also access this information to determine where to place the update files.
Alternatively, make sure the folder name is as distinctive as possible and add a search routine to look for the directory in a number of sensible places at load time. Then write your manual specifying that the data folder can be moved to one of those locations ONLY.
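Since you're using InnoSetup, one way to seed that registry entry at install time is a [Registry] section entry; the key and value names here are only an illustration, not a standard location:

; minimal sketch - key/value names are assumptions, adjust to your app
[Registry]
Root: HKCU; Subkey: "Software\MyCompany\MyApp"; ValueType: string; ValueName: "DataFolder"; ValueData: "{app}\Data"; Flags: createvalueifdoesntexist

The application then reads DataFolder at load time (and rewrites it when the user moves the folder from within the application), and your update routine reads the same value to know where to place new data files.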
Wouldn't the users just run an update or patch package? I'm not sure why they'd want or need to see such files. It's pretty rare for commercial software to offer users the option of where to store program settings and other internal-use files.
Give some thought to this before putting a lot of stuff into users' roaming profiles. You might want LocalAppData instead.