Jenkins ClearCase plugin config spec - loading files

I am using Jenkins to build my project. My source code is in ClearCase. I want to get the latest code from the local (LCL) streams and make a build. All of the setup worked initially.
ClearCase repository structure (2 projects):
Main_Projects
-Project_1
--Project_1_Ver1
---Project_1_Ver1_TST
----Project_1_Ver1_LCL
--Project_1_Ver2
---Project_1_Ver2_TST
----Project_1_Ver2_LCL
-ProjectTwo
--Project_2_Ver1
---Project_2_Ver1_TST
----Project_2_Ver1_LCL
JENKINS Project - Source Management
Base ClearCase
view tag: Jenkins_Project_1_Ver1_LCL
view path: Jenkins_Project_1_Ver1_LCL
This is the working config spec:
element * CHECKEDOUT
element * .../Project_1_Ver1_LCL/LATEST
element * .../Project_2_Ver1_LCL/LATEST
element * /main/LATEST
Load Rules
\Proj1
\proj2
I had to point to a new version for one project, so I updated the config spec as follows:
view tag: Jenkins_Project_1_Ver2_LCL
view path: Jenkins_Project_1_Ver2_LCL
element * CHECKEDOUT
element * .../Project_1_Ver2_LCL/LATEST
element * .../Project_2_Ver1_LCL/LATEST
element * /main/LATEST
Load Rules
\Proj1
\proj2
After this change, the Jenkins ClearCase plugin no longer gets the code for the Project_1_Ver2_LCL stream.

In case you have a parent folder labelled with V1 only and not V2 (which would make all V2 sub-elements inaccessible), add a V1 rule (after the V2 one):
element * .../Project_1_Ver2_LCL/LATEST
element * .../Project_1_Ver1_LCL/LATEST
Another approach would be to make sure the baseline V2 is a full one, not an incremental one. That way, its associated label would be present on all the elements of the component.
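For illustration, the full config spec with that fallback rule would look like the sketch below. ClearCase evaluates selection rules top-down and the first match wins, so the Ver2 rule must stay above the Ver1 one:
element * CHECKEDOUT
element * .../Project_1_Ver2_LCL/LATEST
element * .../Project_1_Ver1_LCL/LATEST
element * .../Project_2_Ver1_LCL/LATEST
element * /main/LATEST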

GitHub latest release is not the same as the newest release

This is the situation:
30 minutes ago, I made a release with the tag name v4.2.4.
Then, just now, I made a new release with the tag name 2022-07-18-0013 (this tag name is just a date; my company sometimes uses this style of versioning).
As far as I know, the latest release should be the newest release, but in my case only the semantic-version tag (v4.2.4) gets the latest label.
Why did this happen?
I cannot find any rule saying that only semantic versions are eligible to be the latest release.
(I want to know why this happens, because I use the GitHub latest-release API.)
------------- EDIT ----------------
git log --oneline prints the result below:
0bc82b8 Merge pull request #166 from devstefancho/feature/0718_test1
2e85d9a add
6cc313e add
4c7e5b2 Merge pull request #165 from devstefancho/feature/0717_test2
f018fca test
b403615 Merge pull request #163 from devstefancho/feature/0717_test2
e7dd66f test
git log --graph --oneline
* 0bc82b8 Merge pull request #166 from devstefancho/feature/0718_test1
|\
| * 2e85d9a add
|/
* 6cc313e add
* 4c7e5b2 Merge pull request #165 from devstefancho/feature/0717_test2
|\
| * f018fca test
* | b403615 Merge pull request #163 from devstefancho/feature/0717_test2
|\|
| * e7dd66f test
|/
------------------- Solved --------------------
Thanks for the great answer, I finally figured it out!
Reason: same-day timestamps.
If tags are not created on the same day, then the newest tag (by time) will be the latest one.
This information was provided by a GitHub staff member:
Releases are based on Git tags, which mark a specific point in your repository’s history. The sort order of tags is as follows:
Tags are sorted by the timestamp of the underlying commit that they point to.
If those commits are created on the same day, then the sorting is based on the Semantic Versioning of the tag name (https://semver.org/).
If the Semantic Versioning is the same, they are sorted by second of creation.
Pre-release versions have a lower precedence than the associated normal version.
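You can verify this yourself by checking what the API returns and the commit timestamps the sorting is based on; a quick sketch, where OWNER/REPO stands in for your repository:
curl https://api.github.com/repos/OWNER/REPO/releases/latest
git log -1 --format=%ci v4.2.4
git log -1 --format=%ci 2022-07-18-0013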

Jenkins - tagging a build fails with NoHeadException

When I try to tag a specific build through Jenkins, I get the following error:
ERROR: Error tagging repo 'refs/remotes/origin/master' :
org.eclipse.jgit.api.errors.NoHeadException: Tag on repository without
HEAD currently not supported hudson.plugins.git.GitException:
org.eclipse.jgit.api.errors.NoHeadException: Tag on repository without
HEAD currently not supported at
org.jenkinsci.plugins.gitclient.JGitAPIImpl.tag(JGitAPIImpl.java:509)
at
hudson.plugins.git.GitTagAction$TagWorkerThread.perform(GitTagAction.java:199)
at hudson.model.TaskThread.run(TaskThread.java:129) Caused by:
org.eclipse.jgit.api.errors.NoHeadException: Tag on repository without
HEAD currently not supported at
org.eclipse.jgit.api.TagCommand.call(TagCommand.java:137) at
org.jenkinsci.plugins.gitclient.JGitAPIImpl.tag(JGitAPIImpl.java:507)
... 2 more Trying next branch Completed
When trying to tag in the workspace it works fine: HEAD is in fact attached and the git refs look fine. Could the issue be that Jenkins is looking in the wrong working directory when it tries to tag?
Is there a way to pull more verbose logs showing how it is trying to tag?
FYI: using Jenkins 2.81, Swarm Linux agents, and the latest Git plugin.
Consider the actual code throwing the exception:
try (RevWalk revWalk = new RevWalk(repo)) {
    // if no id is set, we should attempt to use HEAD
    if (id == null) {
        ObjectId objectId = repo.resolve(Constants.HEAD + "^{commit}"); //$NON-NLS-1$
        if (objectId == null)
            throw new NoHeadException(
                JGitText.get().tagOnRepoWithoutHEADCurrentlyNotSupported);
        // ...
Double-check your configuration: see "Jenkins Git plugin detached HEAD". You need to make sure Jenkins actually checks out a branch.
Try, for instance, adding a simple test step with a git status in it to validate that.
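A minimal diagnostic step along those lines (the branch name master is an assumption; use whichever branch the job builds):
git status
git symbolic-ref -q HEAD || echo "HEAD is detached"
# if HEAD turns out to be detached, check the branch out explicitly before tagging
git checkout master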

Static resource reload with akka-http

In short: is it possible to reload static resources using akka-http?
A bit more:
I have a Scala project.
I'm using an App object to launch my Main class.
I'm using getFromResourceDirectory to locate my resource folder.
What I would like is to hot-swap my static resources during development.
For example, I have index.html or application.js, which I change, and I want to see the changes after refreshing my browser, without restarting my server. What is the best practice for doing such a thing?
I know that Play! allows that, but I don't want to base my project on Play! only because of that.
Two options:
Easiest: when running locally, use the getFromDirectory directive instead and point it to the path where the files you want to 'hotload' are. It serves them directly from the file system, so every time you change a file and load it through Akka HTTP you get the latest version.
getFromResourceDirectory loads files from the classpath; the resources are available because SBT copies them into the class directory under target every time you build (copyResources). You could configure sbt using unmanagedClasspath to make it include the static resource directory in the classpath, as in the sketch below. If you want to package the resources in the artifact when running package, however, this would require some more sbt trickery (if you just put src/resources in unmanagedClasspath, whether the copied or the modified files are used will depend on classpath ordering).
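A minimal build.sbt sketch of that unmanagedClasspath approach, assuming the default src/main/resources layout (the ordering caveat above still applies):
// serve resources straight from the source tree at run time, ahead of the copies under target
unmanagedClasspath in Runtime += baseDirectory.value / "src" / "main" / "resources"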
I couldn't get it to work by adding to unmanagedClasspath, so I used getFromDirectory instead. You can use getFromDirectory as a fallback if getFromResourceDirectory fails, like this:
val route =
  pathSingleSlash {
    getFromResource("static/index.html") ~
      getFromFile("../website/static/index.html")
  } ~
    getFromResourceDirectory("static") ~
    getFromDirectory("../website/static")
First it tries to look up the file in the static resource directory, and if that fails, it checks whether ../website/static has the file.
The code below tries to find the file in the directory "staticContentDir". If the file is found, it is sent back to the client. If it is not found, it falls back to fetching the file from the directory "site" on the classpath.
The user-facing URL is http://server:port/site/path/to/file.ext, where /site/ comes from "staticPath".
val staticContentDir = calculateStaticPath()
val staticPath = "site"
val routes = pathPrefix(staticPath) {
  entity(as[HttpRequest]) { requestData =>
    val fullPath = requestData.uri.path
    encodeResponse {
      if (Files.exists(staticContentDir.resolve(fullPath.toString().replaceFirst(s"/$staticPath/", "")))) {
        getFromBrowseableDirectory(staticContentDir.toString)
      } else {
        getFromResourceDirectory("site")
      }
    }
  }
}
I hope it is clear.

XML Import Warning: Informatica

I am getting the following warning message when I try to import an XML file into the Informatica repository.
Warning: Unexpected condition at: Wcursor.cpp: 305
Contact Informatica technical support for assistance
Continuing may result in damage to your repository.
The XML file is around 70 MB and has around 4,500 objects in it. I am migrating an entire application from one server to another.
I am not sure why this issue happens. I tried several times, and from another client system as well, but no luck.
To import the XML via the command line using the "pmrep" command, we need a control file. But I don't have any control file for this XML, so I can't go with that option.
It would be great if somebody can help me sort out this issue.
Details:
Infa version 9.1
Mounted on a Unix environment.
I had the same problem some time ago. XML parsing takes a lot of memory and/or the GUI can't handle it. My solution was to use the pmrep command line tool. It worked for me; my workflow was composed of around 3,600 objects, AFAIR.
If you don't have a control file - create one! Here's a very simple template:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE IMPORTPARAMS SYSTEM "impcntl.dtd">
<!--IMPORTPARAMS This inputs the options and inputs required for import operation -->
<!--CHECKIN_AFTER_IMPORT Check in objects on successful import operation -->
<!--CHECKIN_COMMENTS Check in comments -->
<!--APPLY_LABEL_NAME Apply the given label name on imported objects -->
<!--RETAIN_GENERATED_VALUE Retain existing sequence generator, normalizer and XML DSQ current values in the destination -->
<!--COPY_SAP_PROGRAM Copy SAP program information into the target repository -->
<!--APPLY_DEFAULT_CONNECTION Apply the default connection when a connection used by a session does not exist in the target repository -->
<IMPORTPARAMS CHECKIN_AFTER_IMPORT="YES" CHECKIN_COMMENTS="PMREP_IMPORT_TEST" RETAIN_GENERATED_VALUE="NO" COPY_SAP_PROGRAM="NO" APPLY_DEFAULT_CONNECTION="NO">
<!--FOLDERMAP matches the folders in the imported file with the folders in the target repository -->
<FOLDERMAP SOURCEFOLDERNAME="YOUR FIRST SOURCE FOLDER NAME" SOURCEREPOSITORYNAME="REP_DEV" TARGETFOLDERNAME="YOUR FIRST SOURCE FOLDER NAME" TARGETREPOSITORYNAME="REP_TEST"/>
<FOLDERMAP SOURCEFOLDERNAME="YOUR SECOND TARGET FOLDER NAME" SOURCEREPOSITORYNAME="REP_DEV" TARGETFOLDERNAME="YOUR SECOND TARGET FOLDER NAME" TARGETREPOSITORYNAME="REP_TEST"/>
<!--Import will only import the objects in the selected types in TYPEFILTER node -->
<!--TYPENAME type name to import. This should conform to the element name in powermart.dtd, e.g. SOURCE, TARGET, etc.-->
<!--RESOLVECONFLICT allows you to specify the resolution for conflicting objects during import. A combination of the child nodes below can be supplied -->
<RESOLVECONFLICT>
<!--TYPEOBJECT allows objects of certain type to apply replace/reuse upon conflict-->
<!--TYPEOBJECT = ALL conflict resolution for ALL types of objects -->
<TYPEOBJECT OBJECTTYPENAME="ALL" RESOLUTION="REPLACE"/>
<!--SPECIFICOBJECT allows a particular object(name, typename etc.) to apply replace/reuse upon conflict -->
<!--NAME Object name-->
<!--EXTRANAME Source DBD name - required for source object to identify uniquely-->
<!--OBJECTTYPENAME Object type name-->
<!--FOLDERNAME Folder which the object belongs to-->
<!--REPOSITORYNAME Repository name that this object belongs to-->
<!--RESOLUTION Resolution to apply for the object in case of conflict-->
<!--SPECIFICOBJECT NAME="your_object" OBJECTTYPENAME="your_object_type" FOLDERNAME="your_source_folder" REPOSITORYNAME="your_source_repo" RESOLUTION="REPLACE"/-->
</RESOLVECONFLICT>
</IMPORTPARAMS>
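With the control file saved, the import itself would look roughly like this (a sketch; the domain, user, and file names are placeholders):
pmrep connect -r REP_TEST -d Domain_Name -n your_user -x your_password
pmrep ObjectImport -i exported_application.xml -c control_file.xml -l import.log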

Documentation Generation is disabled

I did everything specified in the tutorial - Doxygen Plugin.
Here are the Doxygen entries in the sonarqube-4.5.1/conf/sonar.properties file:
# Doxygen
sonar.doxygen.generateDocumentation=enable
sonar.doxygen.deploymentPath=D:\Downloads\sonarqube-4.5.1\web
sonar.doxygen.deploymentUrl=http://localhost:9000/sonar/documentation
The output of the SonarQube Runner:
16:07:16.265 INFO - ANALYSIS SUCCESSFUL
16:07:16.266 DEBUG - Post-jobs : org.sonar.plugins.doxygen.DoxygenPostJob#28bda649
16:07:16.266 INFO - Executing post-job class org.sonar.plugins.doxygen.DoxygenPostJob
16:07:16.271 INFO - === SUPPRESS PREVIOUS GENERATION ===
16:07:16.395 INFO - === DOXYGEN EXECUTION ===
16:07:16.396 INFO - ### Generating configuration ###
16:07:16.427 INFO - ### Generating documentation ###
Also, in the specified \web folder there is a documentation folder which seems to contain the correct Doxygen documentation output.
Yet I keep getting this "Documentation Generation is disabled." message in the SonarQube web interface.
UPDATE
This is what my sonar-project.properties file contains now (using a Unix-style path):
#Doxygen
sonar.doxygen.generateDocumentation=enable
sonar.doxygen.deploymentPath=/Downloads/sonarqube-4.5.1/web
sonar.doxygen.deploymentUrl=http://localhost:9000/sonar/documentation
The output remains the same, same issue.
What do I need to do in order to see the documentation in the web server interface?
This seems to be a server linkage problem, because the documentation is being generated correctly, at this location: /Downloads/sonarqube-4.5.1/web/documentation.
I have also found this content:
core,true,sonar-core-plugin-4.5.1.jar|9289fc1067c31372c0b020aa01163087
emailnotifications,true,sonar-email-notifications-plugin-4.5.1.jar|bb35818e4a655a3ba2cff2afc65a296b
findbugs,false,sonar-findbugs-plugin-2.4.jar|bb0bf263ef1e0d56f569878732060cc9
java,false,sonar-java-plugin-2.4.jar|a105d018165ddeb2c0f5074100768660
cpd,true,sonar-cpd-plugin-4.5.1.jar|e11ff5066c9e2308036838510d87a6fe
dbcleaner,true,sonar-dbcleaner-plugin-4.5.1.jar|a444b3b4571791e1cde146ffa5132ee4
design,true,sonar-design-plugin-4.5.1.jar|0c6476994a44904307cfa8b8a08bbddd
doxygen,false,sonar-doxygen-plugin-0.1.jar|d86e1ab81c3ac34e6b31aa1da28d7f72
l10nen,true,sonar-l10n-en-plugin-4.5.1.jar|c21d53f67901cf6df3da1b4dd48a441b
in sonarqube-4.5.1\web\deploy\plugins\index.txt.
It looks like doxygen has false associated with it. If I try to edit it (to true) and restart the server, nothing changes: the file is overwritten by the sonar-runner.
sonar.doxygen.generateDocumentation is a project property, not a server property. You have to set it in your sonar-project.properties file if you run your analysis with the SonarQube Runner, or in your pom.xml file if you run the analysis with Maven.
Here is how I solved this:
Stopped the SonarQube server.
Replaced the old sonar-doxygen-plugin-0.1.jar in /Downloads/sonarqube-4.5.1/extensions/plugins with the updated Doxygen plugin from https://github.com/SonarCommunity/sonar-doxygen/releases/download/1.0/sonar-doxygen-plugin-1.0-SNAPSHOT.jar.
Commented out the old Doxygen configuration entries in the project's sonar-project.properties file and replaced them with the following entries:
sonar.doxygen.url=http://localhost:8000/
sonar.doxygen.enable=true
Used a simple Python script to serve the documentation HTML on that server (http://localhost:8000/).
Restarted the SonarQube server.
Ran sonar-runner.bat again.
The documentation is in its place now.
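The answer doesn't include that Python script; assuming Python 2 (typical for a setup of that era), a one-liner like this would serve the generated docs at that address:
cd /Downloads/sonarqube-4.5.1/web/documentation
python -m SimpleHTTPServer 8000
# on Python 3 the equivalent is: python3 -m http.server 8000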