Eclipse e4 RCP part activation event sent multiple times

In my Eclipse e4 RCP application, I'm using annotations to get notified when a part is activated, using the approach proposed here:
http://www.vogella.com/tutorials/Eclipse4ModelEvents/article.html
@Inject
@Optional
public void subscribeTopicPartActivation(@UIEventTopic(UIEvents.UILifeCycle.ACTIVATE) Event event) {
    Object element = event.getProperty(EventTags.ELEMENT);
    if (!(element instanceof MPart)) {
        return;
    }
    MPart part = (MPart) element;
    System.out.println("Part activated: " + part.getLabel());
}
It works fine, but I've noticed that the activate event is triggered more than once for the same part in cases where I would expect a single activate event (for example, a simple switch to the part). The event messages seem to be identical (same target, same topic). Am I missing something? Is this normal behavior for the event framework?

Yes, it appears to be normal. There is a difference in the 'tags' value of the part between the two events. In the second event 'active' has been added to the tags.
This will show the difference:
System.out.println("part " + part.getElementId() + " tags " +
part.getTags().stream().collect(Collectors.joining(", ")));
You can also use the EPartService addPartListener method to listen for just part changes.
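For completeness, a minimal sketch of the EPartService route (the class name is illustrative; it assumes the service is injected into a part or addon of your own):
import javax.annotation.PostConstruct;
import javax.inject.Inject;

import org.eclipse.e4.ui.model.application.ui.basic.MPart;
import org.eclipse.e4.ui.workbench.modeling.EPartService;
import org.eclipse.e4.ui.workbench.modeling.IPartListener;

public class PartActivationListener {

    @Inject
    EPartService partService;

    @PostConstruct
    void registerListener() {
        // only part lifecycle callbacks arrive here, no generic UI events to filter
        partService.addPartListener(new IPartListener() {
            @Override
            public void partActivated(MPart part) {
                System.out.println("Part activated: " + part.getLabel());
            }
            @Override public void partBroughtToTop(MPart part) {}
            @Override public void partDeactivated(MPart part) {}
            @Override public void partHidden(MPart part) {}
            @Override public void partVisible(MPart part) {}
        });
    }
}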

WebFlux - ignore 'cancel' signal

I've tried to get the answer from other SO Q&As and directly from the Reactor and Spring WebFlux documentation, but somehow I'm still not sure how to achieve this :(
Say I have a simple controller as follows:
@RestController
class DemoController {

    @GetMapping("/demo/{text}")
    public Flux<String> getDemo(@PathVariable String text) {
        return Flux.fromArray(text.split(""))
                .map(String::toUpperCase)
                .delayElements(Duration.ofSeconds(1L))
                .doOnNext(s -> System.out.println("S: " + s + " -> " + LocalDateTime.now().getSecond()))
                .doOnCancel(() -> System.out.println("Cancelled"));
    }
}
and I would like the processing to ignore the "cancel" signal that normally comes when the client disconnects. I tried to achieve that via nginx's proxy_ignore_client_abort on; but that does not work properly.
I can .subscribe(...) with a custom subscriber that, for example, ignores cancel signals, but then I saw the processing is executed twice - which is quite obvious and not what I want.
Another option is to extend Flux<T> and just override:
@Override
public void cancel() {
    System.out.println("Cancelled / ignored");
}
but that somehow looks more like hacking.
To sum up:
Could someone please advise what the "proper" way is to run a reactive pipeline, triggered via a controller, that can ignore the cancel signal?
The current WebFlux behavior is correct by design. However, your requirement is different: you want to keep emitting even when the subscriber is no longer interested in the data. In that case, just make the source hot.
@RestController
class DemoController {

    @GetMapping("/demo/{text}")
    public Flux<String> getDemo(@PathVariable String text) {
        return Flux.fromArray(text.split(""))
                .map(String::toUpperCase)
                .delayElements(Duration.ofSeconds(1L))
                .doOnNext(s -> System.out.println("S: " + s + " -> " + LocalDateTime.now().getSecond()))
                .doOnCancel(() -> System.out.println("Cancelled"))
                .cache(); // hot source
    }
}
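A quick way to convince yourself of this behavior outside of WebFlux is a standalone sketch (not part of the answer's code; class name and timings are arbitrary): cancel the only subscription after a moment and watch the upstream keep emitting, because cache() does not propagate the cancellation to the source.
import java.time.Duration;

import reactor.core.Disposable;
import reactor.core.publisher.Flux;

public class CacheCancelDemo {

    public static void main(String[] args) throws InterruptedException {
        Flux<String> flux = Flux.fromArray("abc".split(""))
                .map(String::toUpperCase)
                .delayElements(Duration.ofSeconds(1L))
                .doOnNext(s -> System.out.println("emitted: " + s))
                .doOnCancel(() -> System.out.println("Cancelled")) // never prints: the cancel stops at cache()
                .cache();

        Disposable subscription = flux.subscribe(s -> System.out.println("received: " + s));
        Thread.sleep(1500);
        subscription.dispose();   // simulates the client disconnect
        Thread.sleep(3000);       // "emitted: ..." lines keep appearing
    }
}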

Eclipse Validator: Listen to inherited Resource

I have some custom (logic) validators for Eclipse, but they all face the same problem:
The actual implementation of some logic can take place in any of the parent classes.
So, whenever the resource in question changes, I walk up the inheritance tree to see if the required code is present in any of the parent classes - if not, I place a marker in the validated file itself.
This marker is of course re-verified whenever the resource in question changes. But how can I trigger a revalidation if one of the parent classes changes?
Is it possible to place some kind of "validator callback" into other files, which will trigger a validation of the given file?
For example, three classes: A extends B, B extends C. Now the validator notes on A that neither A nor B nor C extends X, which is required for A due to annotations (the logic contract).
Now the error marker in A should be re-evaluated as soon as B or C is modified. (Currently I'm using a delta builder, which of course only invokes validation when A itself changes...)
In a nutshell: I want to place a "marker" that gets revalidated whenever any one of X resources changes.
After I had some time to play around, I finally found a 99% solution to this issue.
Good news: It is possible.
Bad news: It is DIY. I was not able to find any useful method to achieve what I want.
The following post outlines the solution to my problem. There might be better, more efficient or easier solutions, but unfortunately the documentation around the validation framework of Eclipse is very thin - so I'd like to share my solution - take it, improve it - or ignore it :)
First, you should understand the actual problem I tried to solve. Therefore I'm providing short snippets, without going into too much detail of the validator or my code layout:
In my project, I'm using Hibernate - and therefore a lot of classes annotated with @Entity. When using Hibernate along with lazy loading, one needs to (manually) ensure that PersistentBags are initialized when accessed.
In our application, there is a method called initializeCollection(Collection c) in the class LazyEntity, which handles everything around it.
This leads to a logic contract my validator should validate:
Whenever there is a class annotated with @Entity, AND this class is using FetchType.LAZY on any of its collections, two constraints must be met:
A.) the class - or any of its ancestors - needs to extend LazyEntity.
B.) the getter of the collection in question needs to call initializeCollection() before returning the collection.
I'll focus on point B, because this is where the validation problem kicks in: how do you revalidate an actual entity when its ancestor changes?
I modified the actual validation method to take two IFiles as parameters:
public void checkLazyLoadingViolations(IFile actualFile, IFile triggeredFile) {
    // validation of actualFile takes place here
}
The delta builder as well as the incremental builder invoke this method like this:
class LazyLoadingResourceVisitor implements IResourceVisitor {
    public boolean visit(IResource resource) {
        if (resource instanceof IFile) {
            checkLazyLoadingViolations((IFile) resource, (IFile) resource);
        }
        // return true to continue visiting children.
        return true;
    }
}
Now - in a first step - the validator itself takes care of validating the actualFile and bubbling up the inheritance tree to validate any parent file as well. If the validator hits the top-most parent without finding the required extension, an additional marker is placed.
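The post does not show how that walk up the inheritance tree is implemented; a hedged sketch of one way to do it with JDT (assuming org.eclipse.jdt.core is available and the parents are Java sources in the workspace) could look like this:
// Collect the workspace files of all superclasses of the types in the given file.
private List<IFile> collectParentFiles(IFile file) throws JavaModelException {
    List<IFile> parents = new ArrayList<IFile>();
    ICompilationUnit unit = JavaCore.createCompilationUnitFrom(file);
    if (unit == null) {
        return parents; // not a Java source file
    }
    for (IType type : unit.getTypes()) {
        ITypeHierarchy hierarchy = type.newSupertypeHierarchy(new NullProgressMonitor());
        for (IType superType : hierarchy.getAllSuperclasses(type)) {
            IResource resource = superType.getResource();
            if (resource instanceof IFile) {
                parents.add((IFile) resource); // binary types (e.g. Object) have no IFile
            }
        }
    }
    return parents;
}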
The placement of the markers happens in the following method. In case the file where the marker should be placed differs from the file on which Eclipse invoked the validation, the IMarker.USER_EDITABLE attribute is used to store a path to the file on which the validation was invoked (in order to retrigger validation):
/**
 * Creates a marker in the given IFile, considering whether it should be a direct
 * marker, or a dependency marker, which our validator can resolve later.
 *
 * @param actualFile
 *            The file that is currently validated.
 * @param triggeredFile
 *            The file on which Eclipse invoked the validation.
 * @param message
 *            The message for the marker.
 * @param lineNumber
 *            The line number where the marker should appear.
 * @param severity
 *            The severity of the marker.
 * @param callbackPath
 *            The full path to the file that should be validated when the
 *            marker is revalidated.
 */
public void addMarker(IFile actualFile, IFile triggeredFile,
        String message, int lineNumber, int severity, String callbackPath) {
    try {
        IMarker marker = actualFile.createMarker(MARKER_TYPE);
        marker.setAttribute(IMarker.MESSAGE, message);
        marker.setAttribute(IMarker.SEVERITY, severity);
        marker.setAttribute(IMarker.LINE_NUMBER, lineNumber == -1 ? 1 : lineNumber);
        if (callbackPath != null) {
            marker.setAttribute(IMarker.USER_EDITABLE, "callback:"
                    + triggeredFile.getFullPath());
        }
    } catch (CoreException e) {
        // eclipse made a boo boo.
    }
}
Now the validation errors are set: each actual class contains its errors - and in case the validation fails somewhere in the inheritance tree, the parent contains a marker as well:
(AdminUser extends TestUser in this example)
When the validator gets triggered on the "parent" file due to changes, it grabs all markers and checks whether the marker provides the callback attribute. If so, the validator invokes the validation of the referenced resource instead of just validating the parent:
IMarker[] markers = null;
try {
    markers = actualFile.findMarkers(IMarker.PROBLEM, true, IResource.DEPTH_INFINITE);
    for (IMarker m : markers) {
        // Our marker type?
        if (m.getType().equals(MARKER_TYPE)) {
            // dependent resource defined?
            if (m.getAttribute(IMarker.USER_EDITABLE) != null) {
                if (m.getAttribute(IMarker.USER_EDITABLE).toString().startsWith("callback:")) {
                    String otherFile = m.getAttribute(IMarker.USER_EDITABLE)
                            .toString().replace("callback:", "");
                    // call validation for that file as well. (recursion)
                    String relOther = otherFile.split(Pattern.quote(
                            actualFile.getProject().getFullPath().toString()))[1];
                    IFile fileToValidateAsWell = actualFile.getProject().getFile(relOther);
                    // remove old markers!
                    deleteMarkers(fileToValidateAsWell);
                    // revalidate - which has no impact but triggering the root-resource again!
                    // this will recreate the just deleted markers if problem is not resolved.
                    checkLazyLoadingViolations(fileToValidateAsWell, triggeredFile);
                }
            }
        }
    }
} catch (CoreException e1) {
}
Finally, a change to the root file in the inheritance tree causes the leaf file to be revalidated.
And of course, when the initializeCollection method is called properly, there is no error at all :)
The only trade-off so far: IF the parent file is valid - and gets modified so that it becomes invalid - it will not re-trigger the validation of the leaf, because there is no error marker containing any callback information.
The error will then appear the first time a full build is performed. - For the moment, I can live with that.
Logic in a nutshell
If the validated resource is a leaf:
Place markers in the leaf
Place marker in the top-most-parent, while adding a callback link as marker attribute.
If the validated resource is a root:
Remove markers
Call validation on the linked leaf provided by the callback link inside the existing marker, which in turn will be case 1 and recreate all markers if applicable.

How to restrict a component to add only once per page

How can I restrict a CQ5/custom component so it can be added only once per page? I want to prevent the drag and drop of the component into the page when the author tries to add the same component a second time to the same page.
One option is to include the component directly in the JSP of the template and exclude it from the list of available components in the sidekick. To do so, add the component directly to your JSP (foundation carousel in this example):
<cq:include path="carousel" resourceType="foundation/components/carousel" />
To hide the component from the sidekick, either set:
componentGroup: .hidden
or exclude it from the list of "Allowed Components" using design mode.
If you need to allow users to create a page without this component you can provide a second template with the cq:include omitted.
Thanks Rampant, I have followed your method and the link you stated.
Posting the link again: please follow this blog
It was really helpful. I am posting the implementation of whatever I have done.
It worked fine for me. One can definitely improve the code quality; this is raw code and is just for reference.
1. Servlet Filter
Keep in mind that if any resource gets refreshed, this filter will execute, so you need to filter the content on your end for further processing.
P.S. chain.doFilter(request, response); is a must, or CQ will hang and nothing will be displayed.
@SlingFilter(generateComponent = false, generateService = true, order = -700,
        scope = SlingFilterScope.REQUEST)
@Component(immediate = true, metatype = false)
public class ComponentRestrictorFilter implements Filter {

    private final Logger logger = LoggerFactory.getLogger(ComponentRestrictorFilter.class);

    @Reference
    private ResourceResolverFactory resolverFactory;

    public void init(FilterConfig filterConfig) throws ServletException {}

    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain)
            throws IOException, ServletException {
        WCMMode mode = WCMMode.fromRequest(request);
        if (mode == WCMMode.EDIT) {
            SlingHttpServletRequest slingRequest = (SlingHttpServletRequest) request;
            PageManager pageManager = slingRequest.getResource().getResourceResolver().adaptTo(PageManager.class);
            Page currentPage = pageManager.getContainingPage(slingRequest.getResource());
            logger.error("***mode" + mode);
            if (currentPage != null) {
                // RESTRICTED_COMPONENT is the resource type of the component to restrict
                ComponentRestrictor restrictor = new ComponentRestrictor(currentPage.getPath(), RESTRICTED_COMPONENT);
                restrictor.removeDuplicateEntry(resolverFactory, pageManager);
            }
        }
        // must always be called, or CQ will hang and nothing will be displayed
        chain.doFilter(request, response);
    }

    public void destroy() {}
}
2. ComponentRestrictor class
public class ComponentRestrictor {

    private String targetPage;
    private String component;
    private Pattern pattern;
    private Set<Resource> duplicateResource = new HashSet<Resource>();
    private Logger logger = LoggerFactory.getLogger(ComponentRestrictor.class);
    private Resource resource = null;
    private ResourceResolver resourceResolver = null;
    private ComponentRestrictorHelper helper = new ComponentRestrictorHelper();

    public ComponentRestrictor(String targetPage_, String component_) {
        targetPage = targetPage_ + "/jcr:content";
        component = component_;
    }

    public void removeDuplicateEntry(ResourceResolverFactory resolverFactory, PageManager pageManager) {
        pattern = Pattern.compile("([\"']|^)(" + component + ")(\\S|$)");
        findReference(resolverFactory, pageManager);
    }

    private void findReference(ResourceResolverFactory resolverFactory, PageManager pageManager) {
        try {
            resourceResolver = resolverFactory.getAdministrativeResourceResolver(null);
            resource = resourceResolver.getResource(this.targetPage);
            if (resource == null)
                return;
            search(resource);
            helper.removeDuplicateResource(pageManager, duplicateResource);
        } catch (LoginException e) {
            logger.error("Exception while getting the ResourceResolver " + e.getMessage());
        }
        if (resourceResolver != null) {
            resourceResolver.close();
        }
    }

    private void search(Resource parentResource) {
        searchReferencesInContent(parentResource);
        for (Iterator<Resource> iter = parentResource.listChildren(); iter.hasNext();) {
            Resource child = iter.next();
            search(child);
        }
    }

    private void searchReferencesInContent(Resource resource) {
        ValueMap map = ResourceUtil.getValueMap(resource);
        for (String key : map.keySet()) {
            if (!helper.checkKey(key)) {
                continue;
            }
            String[] values = map.get(key, new String[0]);
            for (String value : values) {
                if (pattern.matcher(value).find()) {
                    logger.error("resource**" + resource.getPath());
                    duplicateResource.add(resource);
                }
            }
        }
    }
}
3. To remove the node/resource
Whichever resource you want to remove/delete, just use the PageManager API:
pageManager.delete(resource, false);
That's it!!! You are good to go.
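The ComponentRestrictorHelper itself is not shown in the answer; purely as an illustration (the method name comes from the class above, but the keep-the-first-by-path policy is my assumption, not the original code), its removeDuplicateResource method could look roughly like this:
public void removeDuplicateResource(PageManager pageManager, Set<Resource> duplicateResources) {
    // Sort by path so the surviving instance is deterministic; which instance to keep
    // is a project decision - the answer above only shows the delete call itself.
    List<Resource> sorted = new ArrayList<Resource>(duplicateResources);
    Collections.sort(sorted, new Comparator<Resource>() {
        public int compare(Resource a, Resource b) {
            return a.getPath().compareTo(b.getPath());
        }
    });
    for (int i = 1; i < sorted.size(); i++) { // keep sorted.get(0)
        try {
            pageManager.delete(sorted.get(i), false);
        } catch (WCMException e) {
            // log and continue with the remaining duplicates
        }
    }
}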
None of the options looks easy to implement. The best approach I found is to use the ACS Commons Implementation which is very easy and can be adopted into any project.
Here is the link and how to configure it:
https://github.com/Adobe-Consulting-Services/acs-aem-commons/pull/639
Enjoy coding !!!
You can't prevent that without doing some massive hacking of the UI code, and even then you've only prevented it in one aspect of the UI. There's still CRXDE, and the ability to POST content directly.
If this is truly a requirement, the best approach might be the following:
Have the component check for a special value in the pageContext object (use REQUEST_SCOPE)
If the value is not found, render the component and set the value
Otherwise, print out a message that the component can only be used once
Note that you can't prevent a dialog from showing, but at the very least the author has an indication that that particular component can only be used once.
It sounds like there needs to be clarification of requirements (and understanding why).
If the authors can be trained, let them manage limits of components through authoring and review workflows.
If there is just 1 fixed location the component can appear, then the page component should include the content component, and let the component have an "enable" toggle property to determine if it should render anything. The component's group should be .hidden to prevent dragging from the sidekick.
If there is a fixed set of locations for the component, the page component can have a dropdown of the list of locations (including "none"). The page render component would then conditionally include the component in the correct location. Again, prevent dragging the component from the sidekick.
In the "hard to imagine" case that the component can appear anywhere on the page, added by authors, but limited to only 1 instance - use a wrapper component to manage including the (undraggable) component. Let the authors drag the wrapper on the page as many times as they want, but the wrapper should query the page's resources and determine if it is the first instance, and if so, include the end component. Otherwise, the wrapper does nothing.
In our experience (>2 years on CQ), implementing this type of business rule via code creates a brittle solution. Also, requirements have a habit of changing. If enforced via code, development work is required instead of letting authors make changes faster and more elegantly.
None of these options are that great. If you truly want a robust solution to this problem (limiting the number of items on the page without hardcoding the location), then the best way is a servlet filter chain OSGi service where you can administer the number of instances and then use a resource resolver to remove offending instances.
The basic gist is:
Refresh the page on edit using cq:editConfig
Create an OSGI service implementing javax.servlet.Filter that encapsulates your business rules.
Use the filter to remove excess components according to business rules
Continue page processing.
For more details see here:
Using a servlet filter to limit the number of instances of a component per page or parsys
This approach will let you administer the number of items per page or per parsys and apply other possibly complex business rules in a way that the other offered solutions simply cannot.

Tridion Workflows - How to get the Component at the Activity in Event Handler

I need to get the component associated with an activity in the event system.
I tried to get the component ID using:
public void OnActivityInstanceFinishPost(ActivityInstance activityInstance, string finishMessage, string nextActivity, string dynamicAssignee)
{
    if (activityInstance.ProcessInstance.ProcessDefinition.Title.Equals("Component Process IESE"))
    {
        if (activityInstance.ActivityDefinition.Title.Equals("Create or Edit Component"))
        {
            WFE workflow = tdse.GetWFE();
            try
            {
                Component comp = (Component)activityInstance.ProcessInstance.Item;
                XMLReadFilter filter = new XMLReadFilter();
                String processHistoryId = activityInstance.ProcessInstance.ID.Replace("131076", "131080");
                ProcessHistory hist = (ProcessHistory)tdse.GetObject(activityInstance.ProcessInstance.ID, EnumOpenMode.OpenModeView, Constants.URINULL, filter);
            }
            catch (Exception e)
            { }
        }
    }
}
We tried different options:
Component comp = (Component)activityInstance.ProcessInstance.Item;
But this solution returns null.
Then I found the following solution on the internet:
XMLReadFilter filter = new XMLReadFilter();
String processHistoryId = activityInstance.ProcessInstance.ID.Replace("131076", "131080");
ProcessHistory hist = (ProcessHistory)tdse.GetObject(activityInstance.ProcessInstance.ID, EnumOpenMode.OpenModeView, Constants.URINULL, filter);
Component comp = hist.Item as Component;
But the ProcessHistory object is null.
How can I determine the component associated to the activityInstance?
Thank you.
After reviewing the functionality needed by Guskermitt, I've shown him a neater way to do what he needs to do. In short, EventSystem is not needed in this case.
His goal is to send an email after a component has been approved; the approach will be the following:
Add to workflow a new automatic activity.
Create a new .NET assembly, in this case a C# class to do what he needs to do.
Register the assembly in the GAC.
Add logic in the new automatic activity in workflow to use the .NET assembly.
Step 2:
[ProgId("WfHelper")]
[ComVisible(true)]
public class Helper
{
public void SendMail(string workItemId)
{
var session = new Session();
.
.
.
Step 4:
dim helper
set helper = CreateObject("WfHelper")
call helper.SendMail(CurrentWorkItem.ID)
set helper = nothing
FinishActivity "Email has been sent"
ActivityInstance has a WorkItems property (inherited from Activity) that contains a reference to your Component.
OnActivityInstanceFinishPost means that your activity is finished. Therefore there is no more work item associated with it. However, you are getting the process instance and the work item associated with that. If you get null there, then it suggests your workflow process is done and the component has moved out of workflow. From looking at your code, it is quite likely that your ProcessInstance is completed (it won't be null, but it won't have any item associated with it).
I suspect that you've read this post http://www.tridiondeveloper.com/autopublishing-on-workflow-finish suggesting to look in the history. Have you looked into the history via the CM GUI, is the history item there? If it isn't, that's why you get null. A workflow process gets moved to history when it is completed. So double check that you are indeed on the last workflow activity before looking at the history.
By looking at your code, the error seems to be that you are trying to get a history object using activityInstance.ProcessInstance.ID. GetObject() should return an item, but your cast to a ProcessHistory should break and then you quietly eat the exception. You need to pass in the History ID, not the ProcessInstance ID as follows:
ProcessHistory hist = (ProcessHistory)tdse.GetObject(processHistoryId, EnumOpenMode.OpenModeView, Constants.URINULL, filter);

How to hide the output of a console view?

I'm writing an Eclipse plug-in in which the user can interact with another process via the Console view (in this case, an interpreter), for example, evaluate expressions and so on.
Sometimes the program needs to ask the interpreter for certain values. These interactions however, shouldn't be shown in the console view to the user.
I have the following instances:
private IProcess process;
private ILaunch launch;
private IStreamsProxy proxy;
The queries my program does are made by adding an IStreamListener to the proxy:
proxy.getOutputStreamMonitor().addListener(new IStreamListener() {
    @Override
    public void streamAppended(String response, IStreamMonitor arg1) {
        doSomeStuffWithTheResponse(response);
    }
});
While the listener is listening to the OutputStreamMonitor of the proxy, I don't want the response to pop up in the console view of the plugin.
How can I do that?
Okay, here is how I did it.
The launch system of Eclipse works as follows:
1. Implement an ILaunchConfigurationDelegate; the only method in this interface is launch, which receives an ILaunchConfiguration, a mode, an ILaunch and an IProgressMonitor.
In my program, launch starts an inferiorProcess using DebugPlugin.exec() with a command line argument. Then a new process is created by calling DebugPlugin.newProcess() with the ILaunch, the inferiorProcess, the name for the interpreter and some attributes.
This method creates a new RuntimeProcess and adds it to the ILaunch and vice versa.
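A hedged sketch of such a delegate (the interpreter path and label are placeholders, not the values from my plug-in):
public class MyLaunchConfigurationDelegate implements ILaunchConfigurationDelegate {

    @Override
    public void launch(ILaunchConfiguration configuration, String mode,
            ILaunch launch, IProgressMonitor monitor) throws CoreException {
        // start the external interpreter (command line is a placeholder)
        Process inferiorProcess = DebugPlugin.exec(
                new String[] { "/path/to/interpreter" }, null);
        // wrap it in an IProcess and attach it to the launch
        DebugPlugin.newProcess(launch, inferiorProcess, "MyExternalProgram");
    }
}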
2. Define a LaunchConfigurationType by using the extension point org.eclipse.debug.core.launchConfigurationTypes and add it to the plugin.xml:
<extension
      point="org.eclipse.debug.core.launchConfigurationTypes">
   <launchConfigurationType
         delegate="myplugin.MyLaunchConfigurationDelegate"  (1)
         id="myplugin.myExternalProgram"                    (2)
         modes="run"                                        (3)
         name="MyExternalProgram"                           (4)
         public="false">                                    (5)
   </launchConfigurationType>
</extension>
The extension point gives the exact path to the ILaunchConfigurationDelegate class created as above (1) and a unique identifier (2) to retrieve the instance of ILaunchConfigurationType from the LaunchManager used to launch the program. (3) defines the modes it can run in, run and debug. The name (4) is later shown in the top bar of the console view. If you only want to access and launch your external program programmatically in your plug-in (and not via the Run drop-down menu), (5) must be set to false.
3. Create a class that stores the instances of IProcess, ILaunch and IStreamsProxy and which calls appropriate methods to start the process and to write onto the streams proxy. A method for starting the process could look like this:
// is the process already running?
public boolean isRunning() {
    boolean result = false;
    try {
        if (this.process != null) {
            result = true;
            this.process.getExitValue();
            result = false;
        }
    }
    catch (DebugException exception) {
        // getExitValue() throws while the process is still running, so result stays true
    }
    return result;
}

// start the process
public void start() {
    try {
        if (!isRunning()) {
            // get the ILaunchConfigurationType from the platform
            ILaunchConfigurationType configType = DebugPlugin.getDefault().getLaunchManager()
                    .getLaunchConfigurationType("myplugin.myExternalProgram");
            // the ILaunchConfigurationType can't be changed or worked with, so get a WorkingCopy
            ILaunchConfigurationWorkingCopy copy = configType.newInstance(null, "myExternalProgram");
            this.launch = copy.launch(ILaunchManager.RUN_MODE, new NullProgressMonitor());
            IProcess[] processes = this.launch.getProcesses();
            if (processes.length > 0) {
                // get the IProcess instance from the launch
                this.process = this.launch.getProcesses()[0];
                // get the streamsproxy from the process
                this.proxy = this.process.getStreamsProxy();
            }
        }
    }
    catch (CoreException exception) {
        // launching failed
    }
    if (isRunning())
        // bring up the console and show it in the workbench
        showConsole();
}

public void showConsole() {
    if (this.process != null && this.process.getLaunch() != null) {
        IConsole console = DebugUITools.getConsole(this.process);
        ConsolePlugin.getDefault().getConsoleManager().showConsoleView(console);
        IWorkbenchPage page = PlatformUI.getWorkbench().getActiveWorkbenchWindow().getActivePage();
        IViewPart view = page.findView("org.eclipse.ui.console.ConsoleView");
        if (view != null)
            view.setFocus();
    }
}
Now to the initial problem of the question:
The IStreamListener of the console view, which listens to the OutputStreamMonitor of the IStreamsProxy, could not be retrieved and thus could not be stopped from listening, so prints to the console could not be prevented. OutputStreamMonitor doesn't provide methods to get the current listeners, and it is not possible to just subclass it and override/add some methods, because the important fields and methods are private.
http://www.java2s.com/Open-Source/Java-Document/IDE-Eclipse/debug/org/eclipse/debug/internal/core/OutputStreamMonitor.java.htm
Just copy the code and add a get-method for the fListeners field and change some method modifiers to public.
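For illustration, the additions to the copied class could be as small as this (assuming the fListeners field is the ListenerList from the original Eclipse source):
// added to the copied OutputStreamMonitor
public List<IStreamListener> getListeners() {
    List<IStreamListener> result = new ArrayList<IStreamListener>();
    for (Object listener : fListeners.getListeners()) {
        result.add((IStreamListener) listener);
    }
    return result;
}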
In order to get your own OutputStreamMonitor into the system, you need to create your own IStreamsProxy. Again, subclassing alone won't work; you need to copy the code again and make some changes.
http://www.java2s.com/Open-Source/Java-Document/IDE-Eclipse/debug/org/eclipse/debug/internal/core/StreamsProxy.java.htm
Important:
public class MyStreamsProxy implements IStreamsProxy, IStreamsProxy2 {

    /**
     * The monitor for the output stream (connected to standard out of the process)
     */
    private MyOutputStreamMonitor fOutputMonitor;

    /**
     * The monitor for the error stream (connected to standard error of the process)
     */
    private MyOutputStreamMonitor fErrorMonitor;

    (...)

    public MyStreamsProxy(Process process) {
        if (process == null) {
            return;
        }
        fOutputMonitor = new MyOutputStreamMonitor(process.getInputStream());
        fErrorMonitor = new MyOutputStreamMonitor(process.getErrorStream());
        fInputMonitor = new InputStreamMonitor(process.getOutputStream());
        fOutputMonitor.startMonitoring();
        fErrorMonitor.startMonitoring();
        fInputMonitor.startMonitoring();
    }
The only thing remaining is providing your own IProcess that uses your IStreamsProxy. This time subclassing RuntimeProcess and overriding the method createStreamsProxy() is enough:
public class MyProcess extends RuntimeProcess {

    public MyProcess(ILaunch launch, Process process, String name, Map attributes) {
        super(launch, process, name, attributes);
    }

    @Override
    protected IStreamsProxy createStreamsProxy() {
        String encoding = getLaunch().getAttribute(DebugPlugin.ATTR_CONSOLE_ENCODING);
        return new MyStreamsProxy(getSystemProcess());
    }
}
MyProcess is integrated by creating a new instance of it in the launch method in the ILaunchConfigurationDelegate instead of using DebugPlugin.newProcess().
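In the delegate's launch method that boils down to something like this (the label is a placeholder; pass whatever attribute map you used before, here simply null):
// instead of DebugPlugin.newProcess(launch, inferiorProcess, "MyExternalProgram"):
IProcess process = new MyProcess(launch, inferiorProcess, "MyExternalProgram", null);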
Now it is possible to hide and expose the output of the console view.
/**
 * Storage field for the console listener
 */
private IStreamListener oldListener;

/**
 * Hides the output coming from the process so the user doesn't see it.
 */
protected void hideConsoleOutput() {
    MyOutputStreamMonitor out
            = (MyOutputStreamMonitor) this.process.getStreamsProxy().getOutputStreamMonitor();
    List<IStreamListener> listeners = out.getListeners();
    // the console listener
    this.oldListener = listeners.get(0);
    out.removeListener(this.oldListener);
}

/**
 * Reverts the changes made by hideConsoleOutput() so the user sees the response from the process again.
 */
protected void exposeConsoleOutput() {
    MyOutputStreamMonitor out
            = (MyOutputStreamMonitor) this.process.getStreamsProxy().getOutputStreamMonitor();
    out.addListener(oldListener);
    this.oldListener = null;
}
The hide and expose methods have to be called before any other listeners are added. There might be a better solution, however, this works.
The previous answer does the trick, and I was going with something similar at first after hours of trying to solve this. In the end I did something a bit simpler, but also somewhat nastier... basically:
...
ILaunch launch = launcconf.launch(ILaunchManager.RUN_MODE, monitor);
DebugUIPlugin.getDefault().getProcessConsoleManager().launchRemoved(launch);
...
So I'm basically telling the console manager's listener methods that this launch has already been removed, and it removes the console. Seems to do the trick for me, at least.
"I don't want the response to pop up in the console view of the plugin. How can I do that?"
Well, since that is your actual concern, just toggle the button on the console view called "Show console when standard output changes". That is a much simpler approach than all of this, and it can be turned back on/off.