Jaspersoft Server V5.5 Catalogue Import Creates Directories in Repository But No Reports, Input Controls, Or Queries Are Imported

I have a zip file created using the Export function in Jaspersoft Server V5.5. This zip file contains a catalogue of all reports, input controls, and queries from a development server.
When I try to import the catalogue to V5.5 on my production server by logging in as superuser and going to "Manage > Server Settings > Import", selecting the zip file, and clicking Submit, I receive an "Import Succeeded" message.
When I check the repository, however, most of the directories have been created (though not the input control or query folders), but none of the actual reports, input controls, or queries have been imported.
What am I doing wrong? Do I need to modify any of the files within the zip file?

Related

Trying to import CSV file to Analysis Services Tabular project

I have Microsoft Office Professional Plus 2019 and the Access database engine, both 64-bit, installed on my machine, and I was trying to import a CSV file into an Analysis Services Tabular project. However, I keep getting the error message shown below.
"Failed to save modifications to the server. Error returned: 'OLE DB or ODBC error: The Microsoft Access database engine cannot open or write to the file 'BatchInfo.csv'. It is already opened exclusively by another user, or you need permission to view and write its data.; 3051."
Any help is appreciated.
Either you or someone else has the file open. Sometimes you have to completely close out of the program that created the file, or the last program used to read, write, or save the file.
If it's something on your computer, you may be able to find what has it opened with the Process Explorer tool. Install it, and then use Find > Find Handle or DLL... and search on "BatchInfo.csv".
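If you would rather script the check than hunt through Process Explorer, here is a minimal sketch. It assumes nothing about the Access engine; it simply tries to open the file for read/write, which on Windows typically fails with a PermissionError when another process holds the file exclusively (the file name is just an example):

```python
import os

def is_locked(path):
    """Best-effort check: can we open the file for read/write?

    On Windows, a file opened exclusively by another process (as in the
    Access engine error above) typically fails here with an OSError.
    On POSIX systems this is a much weaker signal, since exclusive opens
    are rarely enforced.
    """
    try:
        fd = os.open(path, os.O_RDWR)
    except OSError:
        return True
    os.close(fd)
    return False

# A file nobody else has open should report as not locked:
with open("BatchInfo_sample.csv", "w") as f:
    f.write("batch,info\n")
```

Note this only tells you *that* the file is inaccessible, not *who* has it; for the latter, Process Explorer's handle search is still the right tool.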

Users are not showing up when extracted as .BAR File

I'm currently working on row-level security and trying to study the file contents of OBIEE dashboards once exported as a .BAR file.
The issue I'm facing is that newly created user names are not showing up in the contents of the extracted .BAR file.
(Folder path: <.BAR File Root>\content\catalog\root\users)
To elaborate, I created two new users, User_1 and User_2, and implemented row-level security for both. After exporting to .BAR and unzipping it, the User_1 folder is present in that folder location but the User_2 folder is not.
Note: row-level security works fine for both users when they log in.
I have tried stopping and restarting the services, but it hasn't made any difference.
Any suggestions to resolve this issue, or other ways to make the changes made in OBIEE take effect, would be appreciated.
Thanks in advance.
Users and credentials are not exported in BAR files.
They must be exported/imported separately using the WebLogic console.
To export security data to a file:
In the left pane, select Security Realms and then select the realm whose security data you want to export (for example, myrealm).
Expand Migration > Export.
In the Export Directory on Server field, specify the directory on the Administration Server to which to export the security data.
Click Save.
The security data is saved in a file in the location you specified.
It looks like you are exporting user folders, not users.
Oracle docs: https://docs.oracle.com/cd/E24329_01/apirefs.1211/e24401/taskhelp/security/ExportDataFromSecurityRealms.html
The OBIEE documentation mentions that support for extracting dashboard .BAR files was removed in version 12.2.1.4. I'm able to export .BAR files in version 12.2.1.3, which carries a disclaimer that support for extracting .BAR files will be dropped in the next version. So if you plan to export .BAR files, use a version earlier than 12.2.1.4.

Pentaho Biserver-ce 4.8: Export / Import Datasources

We've created some datasources (and customized their measures and categories) and some reports on DEV environment and we want to "deploy" them on PRODUCTION environment. How could we do it?
Copying the datasources files (*.xmi, *.mondrian) directly to /opt/pentaho/biserver-ce-4.8.0/biserver-ce/pentaho-solutions/admin/resources/metadata doesn't work.
Copying the report files to pentaho-solutions seems to work (after "manually" creating the datasource).

Where does Enterprise-Architect store user defined Scripts?

I'm going to create a script for my EA project. To do so, it is necessary to create a new "group", and within this group you can add your own scripts.
I have found the local scripts on my hard disk; they reside in EA-install-dir/Scripts.
But where can I find my additional scripts?
EA scripts are stored in one of three locations: in the installation directory, in the project itself and in MDG Technologies.
Scripts in the installation directory are available in any project you access from that machine. They show up in the EA script group Local Scripts.
Scripts can also be stored in the project itself. Each EA project is a database (an .EAP file simply contains a JET database), and scripts stored in the project are found in the table t_scripts, as are the script groups you define to organize them.
This is where scripts land when you create them, and while you can export a script from the editor to a file (Save As), AFAIK there is no way to import them in the corresponding manner. But you don't need to save the script to a file in order to use it, and EA doesn't use the file, only the entry in t_scripts.
Scripts from t_scripts are only available in the project where they are stored. If that project is accessed by several users (.EAP file on network drive or external database repository), they can all use the scripts regardless of the machine from which they access the project.
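Since an .EAP file is just a JET database, you can peek at t_script directly. The sketch below assumes the Access ODBC driver is installed (Windows only) and that t_script has ScriptName, Notes, and Script columns; verify both against your EA version before relying on it:

```python
# Sketch: reading scripts straight out of an .EAP file's t_script table.
# Column names (ScriptName, Notes, Script) are assumptions; check your
# EA version's schema before use.

def eap_connection_string(eap_path):
    # .EAP files contain a JET database, so the Access ODBC driver can
    # open them (Windows only; the driver name may differ on your machine).
    return ("Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
            f"DBQ={eap_path};")

SCRIPT_QUERY = "SELECT ScriptName, Notes, Script FROM t_script"

# Usage (requires the pyodbc package and the Access driver):
# import pyodbc
# with pyodbc.connect(eap_connection_string(r"C:\models\project.eap")) as cn:
#     for name, notes, body in cn.execute(SCRIPT_QUERY):
#         print(name)
```

This is read-only exploration; writing to t_script behind EA's back is not recommended.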
Finally, scripts can be included in an MDG Technology, which is EA's way of bundling adaptations that are primarily modelling-related (eg UML profiles and document templates, as opposed to Add-Ins which contain arbitrary functionality). When deployed, an MDG technology consists of an XML file in which the scripts (and all other bundled adaptations) can be found.
MDG-deployed scripts are available in any EA session where you have enabled that MDG Technology (Settings - MDG Technologies), and appear in a script group with the same name as the MDG Technology. (The script group EAScriptLib is in fact an MDG Technology.) If the MDG Technology is deployed on a network drive, you can use the scripts from any machine and in any project.
I stumbled upon this when searching for a way to easily export and import my scripts, and I found an easier way:
Project -> Data Management -> Export Reference Data...
Then check "Automation scripts" in the window that appears and click Export; you'll get an XML file containing your custom scripts.
To import them into another project: Project -> Data Management -> Import Reference Data...
The "Data Management" menu could be elsewhere depending on your EA version (12 here).
For EA 9.x it is Project->Model Export/Import->Import Reference Data
For EA 13 and later it's Configure -> Model -> Transfer -> Export Reference Data, then choose Automation Scripts near the bottom of the list.
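If you want to see what landed in the exported reference-data XML (for a diff, or to check a script into version control on its own), a short parser works. The DataSet/DataRow/Column layout and the t_script/ScriptName names below are assumptions based on typical EA reference-data exports; inspect your own export and adjust:

```python
import xml.etree.ElementTree as ET

def list_scripts(refdata_xml):
    """List script names from an EA 'Export Reference Data' XML string.

    Assumes a RefData/DataSet/DataRow/Column layout with a DataSet for
    table t_script; element and attribute names may vary by EA version.
    """
    root = ET.fromstring(refdata_xml)
    names = []
    for dataset in root.iter("DataSet"):
        if dataset.get("table") != "t_script":
            continue
        for row in dataset.iter("DataRow"):
            for col in row.iter("Column"):
                if col.get("name") == "ScriptName":
                    names.append(col.get("value"))
    return names

# Synthetic example matching the assumed layout:
sample = """<RefData version="1.0">
  <DataSet name="Automation scripts" table="t_script">
    <DataRow><Column name="ScriptName" value="HelloWorld"/></DataRow>
    <DataRow><Column name="ScriptName" value="ExportDiagrams"/></DataRow>
  </DataSet>
</RefData>"""

print(list_scripts(sample))  # ['HelloWorld', 'ExportDiagrams']
```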

How to use version control with JasperReports

We're about to start development of a number of reports using Jasper Server Reports version 3.7.0 CE.
Does anyone have any recommendations as to how best to manage version control with this development, given that the structure of the report units is managed in the database and through either iReport or the web front end?
In fact you can import/export to a directory structure using the js-import/js-export scripts, but then you can't edit these files directly with iReport.
Does anyone have any pointers?
This is problematic. I have established a subversion repository to allow standard reports delivery to be versioned but it is a real pain because jasper does not make this even a little bit easy.
I created a maven project with an assembly descriptor so that "src/main/xml/resources" (Reports, adhoc, Domains, etc.) can be packaged up in a zip that is pushed to our maven repository.
The biggest problem is that you can't just develop adhoc and input controls merely by modifying XML files. The developer has to import what is in source control into a working jasper server, modify the reports or add new ones (after making sure that his organization and datasources are configured) and once he's satisfied that the report(s) works, export the resources to a directory or zip file, manually modify all references in the exported files from datasources and organization specific resource locations back to "generic" before checking in his changes.
When importing into jasper, the same process has to be done in reverse. The generic paths and organization values have to be converted to the developer's organization so they can be easily imported/updated and he can prove out that the full "round trip" works properly before checking in.
To make the export/subversion checkin easier, I created an ant build file which lives in the maven project's root dir. The build prompts (or will read a properties file) to determine the exported zip location, the organization id of the exported tree. It then opens exported zip file from jasper, explodes it, performs text replacements on the files, resets the "createdDate" and "updatedDate" elements to something standard (so that the developer does not end up checking in files that haven't actually changed since jasper does not preserve the date values), and then copy the files into the subversion tree.
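The normalization step described above (swapping organization-specific paths for generic ones and pinning the export timestamps) can be sketched in a few lines. The createdDate/updatedDate element names come from the export format mentioned above; the organization id and the generic placeholder are assumptions you would adapt to your own repository layout:

```python
import re
from pathlib import Path

# Assumed values; substitute your own organization id and placeholder.
ORG_ID = "dev_org"
GENERIC = "GENERIC_ORG"
EPOCH = "2010-01-01T00:00:00"

def normalize(text):
    # Swap the developer's organization id back to a generic token ...
    text = text.replace(f"/organizations/{ORG_ID}", f"/organizations/{GENERIC}")
    # ... and pin the export timestamps so files that haven't really
    # changed don't show up as modified in version control.
    text = re.sub(r"<createdDate>[^<]*</createdDate>",
                  f"<createdDate>{EPOCH}</createdDate>", text)
    text = re.sub(r"<updatedDate>[^<]*</updatedDate>",
                  f"<updatedDate>{EPOCH}</updatedDate>", text)
    return text

def normalize_tree(export_dir):
    """Apply normalize() to every XML file under an exploded export."""
    for xml_file in Path(export_dir).rglob("*.xml"):
        xml_file.write_text(normalize(xml_file.read_text()))
```

The import direction is the same substitution run in reverse (generic token back to the target organization id).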
For the import process (from the subversion tree into jasper) we have a script that takes as input the organization id and then modifies the versioned xml files to the appropriate values so that the entire tree can be easily imported/updated into their organization.
The reason this level of complexity is required is to allow us to create the same standard reports in a multi-tenant environment, plus jasper's notion of deploying reports is absolutely bizarre. I'm not sure it would be possible to make this process more difficult if you were intending to do so.
If I was in your position I would have established this kind of process:
end of development session: export all reports to a directory structure in a project under version control
commit the project
before next development session: synchronize the project with svn repository
import directory structure to Jasper Server Reports
continue development
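The export/import steps above can be wrapped around the js-export/js-import scripts that ship in the server's buildomatic directory. The flag names below (--uris, --output-dir, --input-dir, --update) are from the import-export tool's help output; double-check them against your server version before scripting:

```python
# Build the command lines for the repository import-export tool.
# Separating command construction from execution keeps this testable.

def export_cmd(uris, output_dir):
    """Export the given repository URIs to a directory structure."""
    return ["js-export.sh", "--uris", ",".join(uris),
            "--output-dir", output_dir]

def import_cmd(input_dir):
    """Import a directory structure, updating existing resources."""
    return ["js-import.sh", "--input-dir", input_dir, "--update"]

# Usage with subprocess, from the buildomatic directory:
# import subprocess
# subprocess.run(export_cmd(["/reports"], "export/"), check=True)
# subprocess.run(import_cmd("export/"), check=True)
```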
Not sure if anyone has posted the solution yet.
This is what I have done for existing reports:
export the reports from JasperReports Server
rename the files from .data to .jrxml
modify the subreport references to add the extension (e.g., A.jrxml should refer to its subreport as B.jrxml)
add .jrxml to the data file, label, and name entries in the report unit XML files.
If you are creating a new report on JasperReports Server, it's simple:
give the name and label a .jrxml extension while adding the JRXML file. That's it.
Now you can work on the same files locally and import them back to JasperReports Server.
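The rename-and-patch steps for existing reports can be sketched as follows. The file layout (.data files plus XML descriptors referencing them by name) is an assumption based on the export format described above; adjust to what your export actually contains:

```python
import tempfile
from pathlib import Path

def rename_and_patch(export_dir):
    """Rename exported .data files to .jrxml and patch references.

    Walks the exploded export, renames each .data file, then rewrites
    any XML descriptor that still points at the old file names.
    """
    export_dir = Path(export_dir)
    renamed = []
    for data_file in list(export_dir.rglob("*.data")):
        target = data_file.with_suffix(".jrxml")
        data_file.rename(target)
        renamed.append((data_file.name, target.name))
    for xml_file in export_dir.rglob("*.xml"):
        text = xml_file.read_text()
        for old, new in renamed:
            text = text.replace(old, new)
        xml_file.write_text(text)
    return renamed

# Tiny demo on a scratch directory:
demo = Path(tempfile.mkdtemp())
(demo / "B.data").write_text("<jasperReport/>")
(demo / "unit.xml").write_text("<dataFile>B.data</dataFile>")
result = rename_and_patch(demo)
```

After running this, the unit.xml in the demo directory references B.jrxml, mirroring steps 2-4 above.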