Using Jaspersoft Studio, my .jrxml file contains a subdataset, and I want it to use a different data adapter than the main dataset.
This data adapter is stored in a JasperServer repository, to which I also want to publish the .jrxml file.
To do that, I used the property
<property name="net.sf.jasperreports.data.adapter" value="repo:/[path to adapter]"/>
inside of <subDataset>.
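For context, the relevant part of the .jrxml looks roughly like this (the dataset name and query are shortened to placeholders):
<subDataset name="MySubDataset">
    <property name="net.sf.jasperreports.data.adapter" value="repo:/[path to adapter]"/>
    <queryString><![CDATA[ ... subdataset query ... ]]></queryString>
    <!-- fields, variables, ... -->
</subDataset>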
This should be an absolute path to the data adapter in the repository; however, when I tested it in Jaspersoft Studio I got the following error:
Resource not found at: JasperReports_Server/some/folders/repo:/[path to adapter]
Between JasperReports_Server and repo:, the path to the location where I publish my .jrxml files has been inserted.
Do I need additional property tags inside <subDataset>, or is the repo string missing something?
If I copy an API/content type from one Strapi project to another it works, but if I try to add another content type I get an error.
I use MongoDB and used mongodump/mongorestore to transfer the data. The databases have the same name.
Error message: "an error has occured"
The new content type is listed under CONTENT TYPES on the navigation bar, but not within content type builder.
What is the correct way to copy an API?
Detailed description:
Computer 1: Started a new Strapi project with "strapi new" and chose MongoDB. Added content types after "strapi start". Created content types and data. Tested and used it together with React. Mongodumped the db. Uploaded the project to a private git repository, with the node_modules folder git-ignored.
Computer 2: Cloned the git repository to another computer where I also use Strapi. Ran yarn to install dependencies; yarn is up to date. Started Strapi successfully. Needed to add a new username/password, the same as used on the other computer. The content types were empty, so I ran mongorestore against the db; the content types now have the same data as on computer 1. Needed to set roles & permissions because those were not transferred. Tested together with React. Then I tried to add a new content type through the Content Type Builder. At first I briefly saw something in green like "content type saved successfully" (normally this takes longer). After reloading the page the error occurred.
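For reference, the data transfer itself was done roughly like this (strapi-db is a placeholder for the database name, which is the same on both computers):
# on computer 1
mongodump --db strapi-db --out ./dump
# on computer 2, after copying ./dump across
mongorestore --db strapi-db ./dump/strapi-db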
We have TFS version control, with the following folder structure:
$/DetBarShapeEngine/Main/ProjectSource (in the TempProjects collection on server delta)
Here delta is the TFS server name, TempProjects is the project collection, and DetBarShapeEngine is one of the team projects. Main is the branch name, and ProjectSource is a special folder under which all files managed by the development team are stored. We label at the ProjectSource level, so it is effectively the root folder for every project's source.
When I ran tf labels /collection:"https://delta:443/tfs/TempProjects" /owner:*, I saw a huge list of labels from every team project in the collection, including DetBarShapeEngine.
When I modified the command line to include the team project in the project collection URL, tf labels /collection:"https://delta:443/tfs/TempProjects/DetBarShapeEngine" /owner:*, I received:
TF31002: Unable to connect to this Team Foundation Server .... ....
Technical information (for administrator): The remote server returned
an error: (404) Not Found.
I tried extending the project collection URL deeper down the path, all the way to ProjectSource, but I still get the same 404 error.
Question:
How do I get labels of a sub folder under a team project?
Note:
I cannot change this folder structure for any reason whatsoever.
We use TFS 2013.
After struggling a lot, I stumbled on a blog post showing a sample that used a scope string with another tf command. I tried that scope string with the tf labels command and it worked!
Here is the command line that I used:
tf labels /collection:"https://delta:443/tfs/TempProjects" *#$/DetBarShapeEngine/Main/ProjectSource /owner:*
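If you also want to see which items each label covers, the same scoped command accepts the standard /format option of tf labels (brief is the default):
tf labels /collection:"https://delta:443/tfs/TempProjects" *#$/DetBarShapeEngine/Main/ProjectSource /owner:* /format:detailed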
Hi all, I am very new to SSIS. I have an SSIS package that was developed by someone else; it reads data from flat files and stores it in a database after mapping.
Flow:
1) First, the package extracts records from the flat file and stores them in a table.
2) Then it calls a child package using an Execute Package Task.
3) Then the child package does some calculations and updates the database table.
The package uses an environment variable to get the database connection information.
Everything is working fine, but now I want to deploy this package to my client's server.
Question: Do I just need to copy the files from the bin folder and paste them onto the client's machine?
What I tried: I copied the files from the bin folder onto my local computer, then created a job in MSSQL and ran it. The package ran perfectly. But later I changed the location of my project and the problem started: the job stopped working.
Issue: The error says the location of the child package is not available (since I changed the location of my project files).
Kindly suggest what to do.
I am going to make several assumptions here, so please correct me if I get any wrong.
The problem, I am guessing, is that in the connection manager your Package.dtsx is currently linked to the package location inside the project folder. You want to move it to another location, but the connection in the connection manager is still pointing at the project location.
If I were you I would do the following:
Create a string variable
PackageFolderPath - C:\CurrentPackagePath\DBPackage.dtsx
Now go to the package connection in the connection manager and, under its properties, add an expression for ConnectionString with the following: @[User::PackageFolderPath]. If you evaluate the expression it should give you the location you set up in your variable.
Please note, however, that if you want this to work on the development system, point the variable at the project location.
Once you have those set up, copy the files across to the new server, then in the SQL Agent job go to the Set Values tab and add the following:
\Package.Variables[User::PackageFolderPath].Properties[Value]
Under the value, put wherever the package is now located.
This should now pick up the new location of the package when it is run.
A better way to do this would be to make use of the deployment utility and an XML package configuration, but this way should work.
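If you want to test the same override outside of the SQL Agent job, the equivalent dtexec call looks roughly like this (both paths are placeholders for wherever the packages end up on the client's server):
dtexec /FILE "D:\Deployed\ParentPackage.dtsx" /SET \Package.Variables[User::PackageFolderPath].Properties[Value];"D:\Deployed\DBPackage.dtsx"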
I am developing a J2EE application with Crystal Reports. It works fine when deployed on Tomcat, but when I deploy it on WebLogic I get the following error. I don't know how to fix it. Any advice?
The viewer was unable to find the resources required to render the
report. Please check the following to resolve the issue.
Verify that ../../crystalreportviewers120/ is accessible to your WebApp and is the correct path to the viewer resources.
You may customize this location by altering the crystal_image_uri and crystal_image_use_relative properties in the web.xml.
Validate that the file crv.js exists at ../../crystalreportviewers120/js/crviewer/crv.js.
When you use the Report Viewer JSP Wizard to create a JSP that calls the report viewer, a folder named crystalreportviewers is automatically created under WebContent; it contains all the resources required to render the rpt file. Its location has to be declared in the web.xml file. The Developer Guide for the Report Viewer explains how to set up both the crystal_image_uri and crystal_image_use_relative properties: for the first, declare the relative path to that folder (with its exact name); for the second, set whether crystal_image_uri is interpreted relative to the web page, the application, or the server. Note that the folder name has changed from one version to another: while the error message refers to crystalreportviewers120, in CR 2011 the folder is named without the "120" suffix. Put the correct name in the crystal_image_uri parameter.
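As a sketch, the corresponding web.xml entries would look something like this (the folder name and the relative mode are the parts you have to adapt to your version):
<context-param>
    <param-name>crystal_image_uri</param-name>
    <param-value>/crystalreportviewers120</param-value>
</context-param>
<context-param>
    <param-name>crystal_image_use_relative</param-name>
    <param-value>webapp</param-value>
</context-param>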
Copy the crystalreportviewers120 directory (found in C:\Program Files\Common
Files\Business Objects\3.0) to a subdirectory of the same name directly
underneath the Project directory (as a peer to WEB-INF). Ensure that all
contents, both files and subdirectories, are copied—there should be about 150
files in all.
Reference:
Crystal Reports XI for J2EE Startup Guide.pdf
We're about to start development of a number of reports using Jasper Server Reports version 3.7.0 CE.
Does anyone have any recommendations as to how best to manage version control with this development, given that the structure of the report units is managed in the database and through either iReport or the web front end?
In fact you can import/export to a directory structure using the js-import/js-export scripts, but then you can't edit these files directly with iReport.
Does anyone have any pointers?
This is problematic. I have established a subversion repository to allow standard reports delivery to be versioned but it is a real pain because jasper does not make this even a little bit easy.
I created a maven project with an assembly descriptor so that "src/main/xml/resources/Reports,adhoc,Domains, etc" can be packaged up in a zip that is pushed to our maven repository.
The biggest problem is that you can't develop adhoc reports and input controls merely by modifying XML files. The developer has to import what is in source control into a working JasperServer, modify the reports or add new ones (after making sure that his organization and datasources are configured), and once he's satisfied that the reports work, export the resources to a directory or zip file and manually modify all references in the exported files from datasource- and organization-specific resource locations back to "generic" values before checking in his changes.
When importing into Jasper, the same process has to be done in reverse: the generic paths and organization values have to be converted to the developer's organization so the tree can be easily imported/updated, and he can prove that the full "round trip" works properly before checking in.
To make the export/subversion check-in easier, I created an ant build file which lives in the maven project's root dir. The build prompts (or reads a properties file) to determine the exported zip location and the organization id of the exported tree. It then opens the exported zip file from Jasper, explodes it, performs text replacements on the files, resets the "createdDate" and "updatedDate" elements to a standard value (so that the developer does not end up checking in files that haven't actually changed, since Jasper does not preserve the date values), and then copies the files into the subversion tree.
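A minimal sketch of the kind of ant steps involved (the target name, directory and organization token are assumptions of mine, not the actual build file):
<target name="normalize-export">
    <!-- point organization-specific resource paths back at a generic token -->
    <replaceregexp match="/organizations/dev_org/" replace="/organizations/GENERIC_ORG/" flags="g">
        <fileset dir="exploded-export" includes="**/*.xml"/>
    </replaceregexp>
    <!-- reset createdDate so unchanged resources do not show up as modified (repeat for updatedDate) -->
    <replaceregexp match="&lt;createdDate&gt;[^&lt;]*&lt;/createdDate&gt;"
                   replace="&lt;createdDate&gt;2010-01-01T00:00:00&lt;/createdDate&gt;" flags="g">
        <fileset dir="exploded-export" includes="**/*.xml"/>
    </replaceregexp>
</target>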
For the import process (from the subversion tree into jasper) we have a script that takes as input the organization id and then modifies the versioned xml files to the appropriate values so that the entire tree can be easily imported/updated into their organization.
The reason this level of complexity is required is to allow us to create the same standard reports in a multi-tenant environment, plus jasper's notion of deploying reports is absolutely bizarre. I'm not sure it would be possible to make this process more difficult if you were intending to do so.
If I were in your position I would establish this kind of process:
end of development session: export all reports to a directory structure in a project under version control (example commands after this list)
commit the project
before next development session: synchronize the project with svn repository
import directory structure to Jasper Server Reports
continue development
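The export/import steps above are just the js-export/js-import scripts already mentioned; as a sketch (URIs, paths and exact flags may differ in 3.7.0 CE, so check the scripts' usage output):
./js-export.sh --uris /reports/our_reports --output-dir ./jasper-resources
./js-import.sh --input-dir ./jasper-resources --update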
Not sure if someone has already posted the solution.
This is what I have done for existing reports:
export the reports from JasperReports Server
rename the exported files from .data to .jrxml (see the example after this list)
modify the subreport references to add the extension (e.g. in A.jrxml the subreport should be referenced as B.jrxml)
add .jrxml to the datafile, label and name entries in the report unit XML files
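For the rename step, something along these lines does the bulk of it (assuming a Unix shell and that the export lives under ./export):
find ./export -name '*.data' | while read -r f; do mv "$f" "${f%.data}.jrxml"; done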
If you are creating a new report on JasperReports Server, it is simple: give the .jrxml name to both the name and the label while adding the jrxml file. That's it.
Now you can work on the same files locally and import them back to JasperReports Server.