I'm looking for a way to bulk update media items with Sitecore PowerShell, but I'm happy to hear any other suggestions.
I need a way of swapping out the blob value of one Sitecore media item for another media item's. Is this even possible with Sitecore PowerShell?
With the 4.0 release this week, there will be a great way to do this using the new Remoting module functionality that Michael added as part of this issue. If you cannot hang on for a few more days, feel free to contact me directly and I can pass you the release binaries earlier.
Honestly, I wouldn't send remote files to the server with the previous implementations, as the files went through heavy serialization/deserialization in the process.
However, if your files are already on the server, there is another way to ingest files from the server file system into the media library, which you can check out in a gist I wrote a while back here.
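For the blob swap itself, here is a minimal sketch of the kind of thing SPE lets you do (the item paths are placeholders, and I'm assuming both media items live in the master database):
# Sketch only: item paths are placeholders; both items are assumed to exist in master
$target = [Sitecore.Data.Items.MediaItem](Get-Item -Path "master:/sitecore/media library/Images/Old")
$source = [Sitecore.Data.Items.MediaItem](Get-Item -Path "master:/sitecore/media library/Images/New")
$stream = $source.GetMediaStream()
try {
    # Replace the target item's blob with the source item's stream
    $media = [Sitecore.Resources.Media.MediaManager]::GetMedia($target)
    $media.SetStream($stream, $source.Extension)
}
finally {
    $stream.Close()
}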
When you export a batch configuration in Kofax 10.2 through the UI, it generates a cab file.
There are a bunch of binary files, like DLLs, in that cab file, which pretty much kills the ability to store it in a version control system.
Having those configuration files in version control would allow better/easier code sharing, testing, deployment, and automation.
So I have 3 questions:
Is there a way to export a version-control-friendly batch configuration?
Is there a way to integrate Kofax with version control directly?
Are there any plans to add this functionality in future versions?
Thanks.
Unfortunately the short answers to all of your questions are No.
Despite the fact that it has no granularity, you should store the whole cab file in source control, since that is what you would use if you needed to restore your configuration to a previous state.
Within the cab file, the primary item that holds the batch configuration is the admin.xml file. If you really felt the need, you could extract the contents of the cab file and store these in source control as well. If you were to diff versions of admin.xml, you might be able to work out what changed in the batch class. However, you would still only be able to restore the full cab file.
Additionally, you mentioned DLLs in the cab file, so I assume that you have validation scripts or something similar. Not only the built DLLs but also the source code would be within the cab, in folders like Scripts\00000001[DocumentClassName]. So again, keeping the extracted contents in source control might be a good way to diff changes, etc. (an extraction sketch follows below). But you still need to keep the full cab, since that is the only way you can import the batch class configuration.
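For example, expand.exe (which ships with Windows) can pull everything out of the cab from PowerShell; the paths below are placeholders:
# Paths are placeholders; expand.exe ships with Windows
$cab  = "C:\Export\BatchClass.cab"
$dest = "C:\Export\BatchClass"
New-Item -ItemType Directory -Path $dest -Force | Out-Null
# -F:* extracts every file in the cab into $dest
expand.exe -F:* $cab $dest
The extracted admin.xml and script sources can then be committed next to the cab for diffing.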
Everything that Stephen said in his answer, and...
For some of the types of configuration management, version control, and troubleshooting tasks in the Kofax environment, I have found Beyond Compare by Scooter Software to be supremely helpful in comparing the contents of two .cab files and reconciling differences between them.
I'm speaking specifically of comparing cab files containing Kofax batch classes, which also contain the document class information for the document types in the batch class, as well as other things like assigned users, etc.
This will work best if each of your cab files contains only one batch class, and the same one in each, e.g., before and after snapshots of the same batch class.
In Beyond Compare (BC) (I'm using the 4.x version), from Windows Explorer you select one .cab file for the left side, and the .cab file you are comparing it to for the right side. BC will show you the files inside each cab file, and as Stephen said, the admin.xml is the one with the details.
You can actually copy XML lines from one side to the other in BC, and save the result, but the real value is in seeing what settings changed between versions of the batch class.
If Kofax had some sort of scriptable automation API for the admin module, that would be amazing and potentially enable many of the capabilities you describe, but if Kofax does have such an API, I am unaware of it. I'm currently running Kofax Capture 10.1.
In Kofax version 11, they did add some features for automatically keeping versions of batch classes, so you can audit changes that were made in the admin module. I didn't notice anything about an automation API for the admin module in Kofax 11, though.
I am working at a company that is using TFS 2012 to keep track of PDF versions. The problem is that they are running out of space VERY quickly. I suppose this is because TFS uses SQL Server, and SQL Server treats every PDF as a BLOB object.
The question is: is it possible to use TFS to do version control on files that do not contain code (images, PDFs, videos, etc.)?
It is quite possible to use TFS for this purpose. Make sure that the space situation is due to an increase in data space, not log space.
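A quick way to check is to look at the file sizes of the collection database; the instance and database names below are placeholders, and this assumes the SqlServer PowerShell module is available:
# Instance and database names are placeholders
# sys.database_files reports size in 8 KB pages, hence the conversion to MB
Invoke-Sqlcmd -ServerInstance "TFSSQL01" -Database "Tfs_DefaultCollection" -Query "SELECT name, type_desc, size * 8 / 1024 AS size_mb FROM sys.database_files"
If the log file dominates, a more aggressive log backup schedule may solve the problem without touching the version control data.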
We've decided to create a custom bootstrapper for our deployment solution. We are currently re-writing and re-designing our deployment strategy for all of our products. Sadly, none of us are deployment experts.
Here's what we have so far:
A. The MSI packages will be authored in InstallShield. We will use whatever features InstallShield offers (IIS integration, COM registration, registry, etc.). The dialogs created by InstallShield will not be used (that is what the bootstrapper is for). The MSIs will be installed silently.
B. Whenever we need to write custom actions for stuff that InstallShield can't handle, we will write them in managed code (C#) using DTF. We will be creating a "Custom Action Framework" that will "standardize" how we use custom actions.
C. We will create a custom bootstrapper (the "setup.exe") in C# to "handle" the installation.
We have decided to go with a multiple-MSI approach and use MSI transactions to "chain" the installation from the bootstrapper (inspired by the Office 2007 installer).
The bootstrapper we envision is inspired by Visual Studio's and SQL Server's bootstrappers. It will be responsible for the following:
Prerequisite installation: Each application requires prerequisites. These prerequisites are listed in an XML file placed in the same folder as the MSI (inspired by the Office 2007 installer), along with other metadata. Depending on the current state of the system, the bootstrapper will decide which prerequisites to install.
Feature selection: We are planning to structure the "internal" MSIs' features in such a way that it would not be appropriate to display them to the end user right away. We will have features labeled "Core_Files", "Vista_Only", or "64bit_Only". Depending on the metadata in the XML file (from item 1) and the target system, the bootstrapper will be responsible for populating a "feature tree" that the user can customize (also inspired by the Office 2007 bootstrapper).
Pre-installation checks: The bootstrapper will be responsible for checking whether the system is ready to receive the installation. For instance, whether a machine needs to reboot prior to installation, or whether the user needs to manually install a service pack, patch, or Windows component. Anything that needs to be done that requires user intervention should be displayed here. Think of it as a checklist (a listbox) with checks and Xs (inspired by SQL Server's bootstrapper). The "rules" will be written in C#.
Application configuration: For applications that need to be "configured" prior to installation, these "parameters" (user configuration) will be passed to the respective MSI via MSI properties.
Actual installation: The bootstrapper will then perform the installation. Proper "transactions" should be observed when necessary. All "products" that should be grouped together shall be displayed as one product in Add/Remove Programs (by messing with the ARP entries). Also, proper progress shall be reported by each MSI being installed. (A rough sketch of this chaining appears below.)
-- That's what we have so far.
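To make item 5 concrete, here is a rough PowerShell sketch of the chaining logic we have in mind; prereqs.xml, its schema, and all names and properties below are hypothetical:
# Hypothetical manifest: <Packages><Package Name="..." Path="..." InstallDir="..."/></Packages>
[xml]$manifest = Get-Content ".\prereqs.xml"
foreach ($package in $manifest.Packages.Package) {
    # Install silently, passing user configuration as public MSI properties
    $argList = "/i `"$($package.Path)`" /qn /norestart INSTALLDIR=`"$($package.InstallDir)`""
    $proc = Start-Process msiexec.exe -ArgumentList $argList -Wait -PassThru
    switch ($proc.ExitCode) {
        0       { Write-Host "$($package.Name): installed." }
        3010    { Write-Host "$($package.Name): installed, reboot required." }
        default { throw "$($package.Name) failed with exit code $($proc.ExitCode)." }
    }
}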
I think there are a couple of out-of-the-box solutions for creating a custom bootstrapper, like dotNetInstaller and BMG. We've looked into them, but they're not as flexible as we'd hoped. There's also Burn, but we're not sure if it's ready for prime time.
So here we are... we've decided to create our own custom bootstrapper.
Question:
Are we crazy? Shouldn't we be creating our own bootstrapper? Which ideas listed above are not realistic? Is there a better approach?
Any input regarding our situation will be greatly appreciated. Also, if you have any questions, please don't hesitate to ask.
Frankly, Burn isn't going to be done for at least a year. You already have InstallShield, and IMO it has the best off-the-shelf bootstrapper currently available. I'd scope your requirements back and make them fit the box. Pretty much everything I read from you can be done using InstallShield if you learn to push it to its limits.
I would go for Burn anyway, or some already existing solution.
I'm sure that after some time you'll face new problems that you can't really imagine now.
If you face them, that means Burn's developers have already faced them and probably solved them. If not, Burn has a large community that will fix a potential bug faster than you can.
Focus on the software you're developing, not on writing an installer/bootstrapper.
If I were in your shoes, I would give Burn a try. I'd give myself a couple of days and see if it meets my requirements.
I do a lot of bug fixing and implementing new features for several different customers. These customers all report their bugs, change requests, and new feature requests into our Trac system.
Sometimes these requests result in me creating SQL change scripts; sometimes there are Excel documents or Access databases with test data, Word documents from the customer, and so on. A lot of files that are used to fix one ticket and can then be deleted when the ticket is closed.
I usually do this by creating folders in the file system like this: /customerXX/TicketNNNNN and then just dumping everything in there.
How do you organize your workfiles? Have you found some fantastic tool to do this?
I would say for scripts or files that are related to a particular ticket, the best thing to do would be to attach the file to that ticket in your issue tracking software - almost all issue trackers that I've worked with will allow you to do this. That way, you can look back and a) see exactly what you did in case something goes wrong, or b) do exactly the same thing if the issue comes up again later. That's almost certainly the best place to keep files with extra info from the customer, too (or at least the first place most people will look).
For frequently re-used scripts that aren't specific to a particular ticket, I would create a scripts/ or bin/ directory in the associated project, and keep them in there.
I also have a small handful of useful files that I keep in src/misc/ off my home directory, with things like SQL queries to get readable "explain" output out of Oracle and such, that aren't specific to any particular project. The number of these is small enough that subdirectories aren't necessary, though - I suspect if you ended up with a large number of these files, many of them could/should be moved to specific projects or your issue tracking system.
JIRA has been quite helpful for this at my site. It supports issue tracking and file attachments, and you can easily customize and categorize your projects and issues.
I use FogBugz and I add all files to the case. I believe that no matter what application you use, the important thing is to keep these files for future reference. If your bug-tracking tool does not let you attach files, then add the files to version control.
We use CaWeb4 and find it very easy to use for our bug tracking.
Is it possible to use PowerShell to script out SQL Server Reporting Services RDL files in SQL Server 2008? If so, can someone provide a code example of doing this? This would be a useful replacement for using a 3rd-party tool to script out RDL files created by business users outside of my Business Intelligence department.
CLARIFICATION OF THE TERM "SCRIPT OUT"
By "script out", I mean I would like to automatically generate the underlying RDL file for each report on the server. For instance, when you code report in BIDS, you are generating a RDL file. When you deploy the file to the server, the file is somehow imported into the SQL Server ReportServer database and it is no longer a separate physical RDL file. I would like to extract all the reports from the server in a RDL file format.
I've used the RSScripter tool to extract the reports as RDL files, so I know it is possible using tools other than PowerShell. I would specifically like to know if it is possible to do it using PowerShell and, if so, get a sample of the code to do it.
CLARIFICATION ON WHY I WANT TO GENERATE RDL VERSIONS OF REPORTS
Why is it important to "script out" the reports to RDL files? I would like to check the RDL files in to my source control system once a night to keep track of all reports created by users outside of my Business Intelligence department. I already keep track of all reports generated by my department, since we develop our reports in BIDS, but I can't keep track of version history on reports built in the online Report Builder tool.
CLARIFICATION ON WHY POWERSHELL AND NOT SOMETHING ELSE
Curiosity. I have a problem that I know can be solved by one of two methods (the API or RSScripter), and I would like to know if it can be solved by a third method.
Opportunity to expand my problem-solving toolbelt via PowerShell. Using PowerShell to solve this problem may provide the foundation for learning how to use PowerShell to solve other problems that I haven't tried to solve yet.
PowerShell is easier to understand for my team and me. In general, my team members and I can understand PowerShell code more easily than .NET code. Although I know this problem can be solved with some .NET code using the API (that's how RSScripter works after all), I feel it will be easier for us to code and maintain a PowerShell script. I also realize a PowerShell script will probably use .NET code, but I'm hoping PowerShell will already be able to treat the reports like objects in some way so I won't have to use the Reporting Services API to extract the files.
RSScripter doesn't support 2008 yet. In the past, I've used RSScripter to script out reports. Unfortunately, it doesn't appear to support 2008 yet. This means I have to write code against the API right now, since that's the only way I presently know how to extract the files in an automated, unattended manner.
A little late, but here you go.
This PowerShell script:
1. Connects to your report server
2. Creates the same folder structure you have in your report server
3. Downloads all the SSRS report definition (RDL) files into their respective folders
https://sqlbelle.wordpress.com/2011/03/28/how-to-download-all-your-ssrs-report-definitions-rdl-files-using-powershell/
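In case the link goes stale, the core of such a script looks roughly like this (the server name and output folder are placeholders, and this version flattens the folder structure instead of recreating it):
# Server name and output folder are placeholders
$rs = New-WebServiceProxy -Uri "http://myserver/ReportServer/ReportService2005.asmx?WSDL" -UseDefaultCredential
# ListChildren with $true recurses the whole catalog; keep only reports
$reports = $rs.ListChildren("/", $true) | Where-Object { $_.Type -eq "Report" }
foreach ($report in $reports) {
    $bytes = $rs.GetReportDefinition($report.Path)
    $file  = Join-Path "C:\RdlExport" (($report.Path.TrimStart("/") -replace "/", "_") + ".rdl")
    [System.IO.File]::WriteAllBytes($file, $bytes)
}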
PowerShell doesn't provide any native, PowerShell-esque functionality for this, no. You can do this in PowerShell (as noted in the previous answer) only because PowerShell can access the underlying Framework classes. As you noted in your comment to the previous answer, it's no different from using the API in C# or VB.
The SQL Server team has not yet provided much in the way of PowerShell-specific stuff. They're primarily relying on .NET and T-SQL as "scripting languages."
I just realized the Content column in the ReportServer.dbo.Catalog table contains the definition in an Image format. I wrote the following code to convert it to readable text:
SELECT CONVERT(VARCHAR(MAX), CONVERT(NVARCHAR(MAX), CONVERT(XML, CONVERT(VARBINARY(MAX), Content))))
FROM [ReportServer].[dbo].[Catalog]
WHERE Type = 2
With the above code, I can now automate writing the results to a flat file and then import the file into my version control system.
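For reference, a sketch of that automation (the instance name and output folder are placeholders; this assumes the SqlServer module for Invoke-Sqlcmd):
# Instance name and output folder are placeholders
$query = "SELECT Name, CONVERT(VARCHAR(MAX), CONVERT(NVARCHAR(MAX), CONVERT(XML, CONVERT(VARBINARY(MAX), Content)))) AS Rdl FROM ReportServer.dbo.Catalog WHERE Type = 2"
# -MaxCharLength stops Invoke-Sqlcmd from truncating large definitions at the 4000-character default
$rows = Invoke-Sqlcmd -ServerInstance "MYSQLSERVER" -Query $query -MaxCharLength 10000000
foreach ($row in $rows) {
    # One .rdl file per report, named after the catalog entry
    Set-Content -Path (Join-Path "C:\RdlExport" ($row.Name + ".rdl")) -Value $row.Rdl
}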
Anything that supports .NET can do this. See this Stack Overflow posting for some links to the API docs. The process is actually fairly straightforward: the API has a call to upload or download the .rdl file.
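For the upload direction, for example, the 2008 ReportService2005 endpoint exposes CreateReport; here is a minimal sketch with placeholder names:
# Server, folder, and report names are placeholders; $null skips optional catalog properties
$rs = New-WebServiceProxy -Uri "http://myserver/ReportServer/ReportService2005.asmx?WSDL" -UseDefaultCredential
$definition = [System.IO.File]::ReadAllBytes("C:\RdlExport\MyReport.rdl")
# CreateReport returns any warnings produced while publishing the definition
$warnings = $rs.CreateReport("MyReport", "/Uploaded", $true, $definition, $null)
$warnings | ForEach-Object { Write-Warning $_.Message }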
Report models are a bit more fiddly. You have to get the dependent reports (again, an API call) and re-connect the data source if you upload a new report model. Again, not terribly strenuous.
PowerShell should do this fine. I've variously done this with IronPython and C#. There's also a tool called rs.exe that takes a VB.NET script, tops and tails it with some includes, and compiles and runs it behind the scenes.