I am looking for a way to (re)deploy an exploded bundle (meaning not jarred but in a folder) to a running Apache Felix OSGi container from within Eclipse, preferably using a launch task.
I found this question, which has an answer that comes close, but it depends on typing commands into a Gogo shell, which is not convenient for long-term development use. I'd like to use Eclipse's launch task mechanism for this, but if there are alternatives that are equally fast and convenient, I am open to those as well.
Now I think that if I could fire Gogo shell commands from an Eclipse launch task, that would be a solution, but I can't get my head around how to do that either. I presume I need the Remote Shell bundle for that, right?
I am starting to think about writing a telnet client in Java that can connect to the Remote Shell bundle and execute Gogo commands in an automated fashion. I have seen some examples of that already which I could modify to suit my needs... However, I am getting a 'reinventing the wheel' kind of feeling from that. Surely there is a better way?
Some background to help you understand what I am doing:
I have set up an Eclipse 'OSGiContainer' project which basically contains the Apache Felix jar and the third-party bundles I want to deploy (like the Gogo shell), similar to the project setup described here. Then I created a second 'MyBundle' project that contains my bundle. I want to start the OSGi container by launching the OSGiContainer project, and then just develop my bundle and test my changes by launching the MyBundle project into the OSGiContainer, which I want to keep running the whole time during development.
Project layout:
OSGiContainer
bin (contains felix jar)
bundles (third party bundles)
conf (Felix' config.properties file)
MyBundle
src
target
classes
I am then able to deploy my bundle to the OSGi container by invoking these commands on the Gogo shell:
install reference:file:../MyBundle/target/classes
start <bundleId>
To re-deploy, I invoke these commands:
stop <bundleId>
uninstall <bundleId>
install reference:file:../MyBundle/target/classes
start <bundleId>
You can imagine that having to invoke 4 commands on the shell each time is not much fun... So even if you can give me a way to boil this down to fewer commands to type, it would already be a great improvement.
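For context, those four shell commands correspond roughly to the following calls on the plain OSGi API. This is only a sketch to illustrate what the shell is doing for me; it assumes code that already runs inside the framework and has a BundleContext, which is not my situation here:

import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;

// Sketch only: the rough API equivalent of stop/uninstall/install/start.
public class Redeployer {
    public static void redeploy(BundleContext context, long oldBundleId) throws BundleException {
        Bundle old = context.getBundle(oldBundleId);
        if (old != null) {
            old.stop();
            old.uninstall();
        }
        Bundle fresh = context.installBundle("reference:file:../MyBundle/target/classes");
        fresh.start();
    }
}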
UPDATE
I hacked around a bit and came up with the class below. It's an adaptation of the telnet example with some small changes and a main method with the necessary commands to uninstall a bundle and then re-install and start it. The path to the bundle should be given as an argument to the program and would look like:
reference:file:../MyBundle/target/classes
I still very much welcome answers to this question, as I don't really like this solution at all. I have however verified that this works:
import java.io.IOException;
import java.io.InputStream;
import java.io.PrintStream;
import java.net.SocketException;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.LinkedBlockingQueue;
import org.apache.commons.net.telnet.TelnetClient;
public class GogoDeployer {
static class Responder extends Thread {
private StringBuilder builder = new StringBuilder();
private final GogoDeployer checker;
private CountDownLatch latch;
private String waitFor = null;
private boolean isKeepRunning = true;
Responder(GogoDeployer checker) {
this.checker = checker;
}
boolean foundWaitFor(String waitFor) {
return builder.toString().contains(waitFor);
}
public synchronized String getAndClearBuffer() {
String result = builder.toString();
builder = new StringBuilder();
return result;
}
@Override
public void run() {
while (isKeepRunning) {
String s;
try {
s = checker.messageQueue.take();
} catch (InterruptedException e) {
break;
}
synchronized (Responder.class) {
builder.append(s);
}
if (waitFor != null && latch != null && foundWaitFor(waitFor)) {
latch.countDown();
}
}
System.out.println("Responder stopped.");
}
public String waitFor(String waitFor) {
synchronized (Responder.class) {
if (foundWaitFor(waitFor)) {
return getAndClearBuffer();
}
}
this.waitFor = waitFor;
latch = new CountDownLatch(1);
try {
latch.await();
} catch (InterruptedException e) {
e.printStackTrace();
return null;
}
String result = null;
synchronized (Responder.class) {
result = builder.toString();
builder = new StringBuilder();
}
return result;
}
}
static class TelnetReader extends Thread {
private boolean isKeepRunning = true;
private final GogoDeployer checker;
private final TelnetClient tc;
TelnetReader(GogoDeployer checker, TelnetClient tc) {
this.checker = checker;
this.tc = tc;
}
@Override
public void run() {
InputStream instr = tc.getInputStream();
try {
byte[] buff = new byte[1024];
int ret_read = 0;
do {
if (instr.available() > 0) {
ret_read = instr.read(buff);
}
if (ret_read > 0) {
checker.sendForResponse(new String(buff, 0, ret_read));
ret_read = 0;
}
} while (isKeepRunning && (ret_read >= 0));
} catch (Exception e) {
System.err.println("Exception while reading socket:" + e.getMessage());
}
try {
tc.disconnect();
checker.stop();
System.out.println("Disconnected.");
} catch (Exception e) {
System.err.println("Exception while closing telnet:" + e.getMessage());
}
}
}
private static final String prompt = "g!";
private static GogoDeployer client;
private String host;
private BlockingQueue<String> messageQueue = new LinkedBlockingQueue<String>();
private int port;
private TelnetReader reader;
private Responder responder;
private TelnetClient tc;
public GogoDeployer(String host, int port) {
this.host = host;
this.port = port;
}
public void stop() {
responder.isKeepRunning = false;
reader.isKeepRunning = false;
try {
Thread.sleep(10);
} catch (InterruptedException e) {
}
responder.interrupt();
reader.interrupt();
}
public void send(String command) {
PrintStream ps = new PrintStream(tc.getOutputStream());
ps.println(command);
ps.flush();
}
public void sendForResponse(String s) {
messageQueue.add(s);
}
public void connect() throws SocketException, IOException {
tc = new TelnetClient();
tc.connect(host, port);
reader = new TelnetReader(this, tc);
reader.start();
responder = new Responder(this);
responder.start();
}
public String waitFor(String s) {
return responder.waitFor(s);
}
private static String exec(String cmd) {
String result = "";
System.out.println(cmd);
client.send(cmd);
result = client.waitFor(prompt);
return result;
}
public static void main(String[] args) {
try {
String project = args[0];
client = new GogoDeployer("localhost", 6666);
client.connect();
System.out.println(client.waitFor(prompt));
System.out.println(exec("uninstall " + project));
String result = exec("install " + project);
System.out.println(result);
int start = result.indexOf(":");
int stop = result.indexOf(prompt);
String bundleId = result.substring(start + 1, stop).trim();
System.out.println(exec("start " + bundleId));
client.stop();
} catch (SocketException e) {
System.err.println("Unable to conect to Gogo remote shell: " + e.getMessage());
} catch (IOException e) {
System.err.println("Unable to conect to Gogo remote shell: " + e.getMessage());
}
}
}
When I met the same requirement (deploying a bundle from target/classes as fast as I can), my first thought was also to extend my container with some shell functionality. My second thought, however, was to write a simple bundle that opens an always-on-top window onto which I can simply drag and drop any project(s) from Eclipse (or Total Commander or whatever). The code then checks whether each dropped folder has a target/classes folder and, if it does, deploys it.
The source code is available at https://github.com/everit-org/osgi-richconsole
The dependency is available from Maven Central:
<dependency>
<groupId>org.everit.osgi.dev</groupId>
<artifactId>org.everit.osgi.dev.richconsole</artifactId>
<version>1.2.0</version>
</dependency>
You can use the bundle while you develop and remove it when you set up your live server. However, that is not strictly necessary, as the pop-up window is not shown when the container runs in headless mode.
I called it richconsole as I would like to have more features in the future (not just deployment) :)
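To picture the mechanism, here is a minimal, hypothetical sketch of how a dropped project folder could be installed as a reference bundle through the OSGi API. This is not the richconsole code, just the idea; it assumes an OSGi 4.3+ framework (for BundleContext.getBundle(String location)) and that target/classes contains a valid manifest:

import java.io.File;
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import org.osgi.framework.BundleException;

// Hypothetical sketch: install (or update) a dropped project folder as a reference bundle.
public class DropDeployer {
    private final BundleContext context;

    public DropDeployer(BundleContext context) {
        this.context = context;
    }

    public void deploy(File projectDir) throws BundleException {
        File classes = new File(projectDir, "target/classes");
        if (!classes.isDirectory()) {
            return; // not a built project, ignore the drop
        }
        String location = "reference:file:" + classes.getAbsolutePath();
        Bundle existing = context.getBundle(location);
        if (existing != null) {
            existing.update();   // re-read the exploded folder
        } else {
            context.installBundle(location).start();
        }
    }
}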
Related
Is there some way to open PowerShell with ProcessBuilder in JavaFX and keep it open so that any command can be executed at any time?
Example code (executing only one command at a time):
try {
ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/c", "powershell -Command \"Add-Type -AssemblyName System.DirectoryServices.AccountManagement; [System.DirectoryServices.AccountManagement.UserPrincipal]::Current.DisplayName\"&&exit");
Process p = builder.start();
BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line = null;
while ((line = reader.readLine()) != null) {
if (!line.trim().isEmpty()) {
displayname = line;
}
}
reader.close();
p.waitFor();
p.destroy();
} catch (IOException | InterruptedException ex) {
Logger.getLogger(AccountStatus.class.getName()).log(Level.SEVERE, null, ex);
}
Reason to keep it open: loading PowerShell takes maybe 3 seconds, and loading, for example, the Active Directory module takes another 2 seconds or so every time I want to execute some command. If there is some way to preload PowerShell and send commands to the ProcessBuilder at any time, it would be very helpful. Thanks for any advice.
EDIT:
I have found a solution here: Apache Commons exec PumpStreamHandler continuous input
Thanks to MichalVales!
With this quick sample I am able to open PowerShell, keep it open, preload a module, and execute any new command at any time without loading everything again.
public class FXMLDocumentController implements Initializable {
private BufferedWriter writer;
@FXML
private void handleButtonAction(ActionEvent event) {
try {
writer.write("Import-Module ActiveDirectory -Cmdlet Get-ADUser\n");
writer.flush();
} catch (IOException ex) {
Logger.getLogger(FXMLDocumentController.class.getName()).log(Level.SEVERE, null, ex);
}
}
@FXML
private void handleButtonAction2(ActionEvent event) {
try {
writer.write("Get-ADUser somenamehere -Properties * | Select-Object LockedOut\n");
writer.flush();
} catch (IOException ex) {
Logger.getLogger(FXMLDocumentController.class.getName()).log(Level.SEVERE, null, ex);
}
}
@Override
public void initialize(URL url, ResourceBundle rb) {
ProcessBuilder builder = new ProcessBuilder("powershell.exe");
Process process;
try {
process = builder.start();
writer = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
StreamReader outputReader = new StreamReader(process.getInputStream(), System.out);
outputReader.start();
StreamReader err = new StreamReader(process.getErrorStream(), System.err);
err.start();
} catch (IOException ex) {
Logger.getLogger(FXMLDocumentController.class.getName()).log(Level.SEVERE, null, ex);
}
}
}
The StreamReader code can be found at the link from MichalVales.
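For reference, here is a minimal sketch of what such a StreamReader might look like; the real class is in the linked answer, so treat this shape as an assumption: a daemon thread that pumps a process stream to a PrintStream.

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.PrintStream;

// Assumed shape of the StreamReader used above: pumps a process stream to the given PrintStream.
class StreamReader extends Thread {
    private final InputStream in;
    private final PrintStream out;

    StreamReader(InputStream in, PrintStream out) {
        this.in = in;
        this.out = out;
        setDaemon(true); // do not keep the JVM alive just for the pump thread
    }

    @Override
    public void run() {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                out.println(line);
            }
        } catch (IOException e) {
            // the stream closes when the process ends; nothing more to do
        }
    }
}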
EDIT2:
I was trying to pass Czech characters with various writers, but without success. I think it is impossible to pass Czech characters like "ěščřžýáíé" to PowerShell without changing the system locale, but I don't want to do that. I have tried ProcessBuilder and Apache Commons Exec, and all failed, but I have found a great library which works and is really easy to use:
jPowerShell
So if you have problems keeping PowerShell alive or problems with characters, this is the best solution.
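For completeness, a minimal sketch of what using jPowerShell looks like. The package and method names (openSession, executeCommand, getCommandOutput, close) are taken from the project's documentation as I remember it, so treat them as assumptions and check the README for the exact API:

import com.profesorfalken.jpowershell.PowerShell;
import com.profesorfalken.jpowershell.PowerShellResponse;

// Sketch based on the jPowerShell documentation; names are assumptions.
public class JPowerShellExample {
    public static void main(String[] args) {
        PowerShell powerShell = PowerShell.openSession(); // keeps one PowerShell session alive
        try {
            PowerShellResponse response = powerShell.executeCommand(
                    "Get-ADUser somenamehere -Properties * | Select-Object LockedOut");
            System.out.println(response.getCommandOutput());
        } finally {
            powerShell.close(); // terminate the session
        }
    }
}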
I am learning ZooKeeper and trying out the Curator framework for service discovery. However, I am facing a weird issue that I am having difficulty figuring out. The problem is that when I register an instance via serviceDiscovery, the cacheChanged event of the serviceCache gets triggered three times. When I remove an instance, it is only triggered once, which is the expected behavior. Please see the code below:
public class DiscoveryExample {
private static String PATH = "/base";
static ServiceDiscovery<InstanceDetails> serviceDiscovery = null;
public static void main(String[] args) throws Exception {
CuratorFramework client = null;
try {
// this is the ip address of my VM
client = CuratorFrameworkFactory.newClient("192.168.149.129:2181", new ExponentialBackoffRetry(1000, 3));
client.start();
JsonInstanceSerializer<InstanceDetails> serializer = new JsonInstanceSerializer<InstanceDetails>(
InstanceDetails.class);
serviceDiscovery = ServiceDiscoveryBuilder.builder(InstanceDetails.class)
.client(client)
.basePath(PATH)
.serializer(serializer)
.build();
serviceDiscovery.start();
ServiceCache<InstanceDetails> serviceCache = serviceDiscovery.serviceCacheBuilder()
.name("product")
.build();
serviceCache.addListener(new ServiceCacheListener() {
@Override
public void stateChanged(CuratorFramework curator, ConnectionState state) {
// TODO Auto-generated method stub
System.out.println("State Changed to " + state.name());
}
// THIS IS THE PART THAT GETS TRIGGERED MULTIPLE TIMES
@Override
public void cacheChanged() {
System.out.println("Cached Changed ");
List<ServiceInstance<InstanceDetails>> list = serviceCache.getInstances();
Iterator<ServiceInstance<InstanceDetails>> it = list.iterator();
while(it.hasNext()) {
System.out.println(it.next().getAddress());
}
}
});
serviceCache.start();
BufferedReader in = new BufferedReader(new InputStreamReader(System.in));
System.out.print("> ");
String line = in.readLine();
} finally {
CloseableUtils.closeQuietly(serviceDiscovery);
CloseableUtils.closeQuietly(client);
}
}
}
AND
public class RegisterApplicationServer {
final static String PATH = "/base";
static ServiceDiscovery<InstanceDetails> serviceDiscovery = null;
public static void main(String[] args) throws Exception {
CuratorFramework client = null;
try {
client = CuratorFrameworkFactory.newClient("192.168.149.129:2181", new ExponentialBackoffRetry(1000, 3));
client.start();
JsonInstanceSerializer<InstanceDetails> serializer = new JsonInstanceSerializer<InstanceDetails>(
InstanceDetails.class);
serviceDiscovery = ServiceDiscoveryBuilder.builder(InstanceDetails.class).client(client).basePath(PATH)
.serializer(serializer).build();
serviceDiscovery.start();
// SOME OTHER CODE THAT TAKES CARES OF USER INPUT...
} finally {
CloseableUtils.closeQuietly(serviceDiscovery);
CloseableUtils.closeQuietly(client);
}
}
private static void addInstance(String[] args, CuratorFramework client, String command,
ServiceDiscovery<InstanceDetails> serviceDiscovery) throws Exception {
// simulate a new instance coming up
// in a real application, this would be a separate process
if (args.length < 2) {
System.err.println("syntax error (expected add <name> <description>): " + command);
return;
}
StringBuilder description = new StringBuilder();
for (int i = 1; i < args.length; ++i) {
if (i > 1) {
description.append(' ');
}
description.append(args[i]);
}
String serviceName = args[0];
ApplicationServer server = new ApplicationServer(client, PATH, serviceName, description.toString());
server.start();
serviceDiscovery.registerService(server.getThisInstance());
System.out.println(serviceName + " added");
}
private static void deleteInstance(String[] args, String command, ServiceDiscovery<InstanceDetails> serviceDiscovery) throws Exception {
// in a real application, this would occur due to normal operation, a
// crash, maintenance, etc.
if (args.length != 2) {
System.err.println("syntax error (expected delete <name>): " + command);
return;
}
final String serviceName = args[0];
Collection<ServiceInstance<InstanceDetails>> set = serviceDiscovery.queryForInstances(serviceName);
Iterator<ServiceInstance<InstanceDetails>> it = set.iterator();
while (it.hasNext()) {
ServiceInstance<InstanceDetails> si = it.next();
if (si.getPayload().getDescription().indexOf(args[1]) != -1) {
serviceDiscovery.unregisterService(si);
}
}
System.out.println("Removed an instance of: " + serviceName);
}
}
I would appreciate it if anyone could point out what I am doing wrong and perhaps share some good materials/examples I can refer to. The official website and the examples on GitHub do not help a lot.
I'm having a huge problem with my Java Web Start application. I have tried a lot of solutions, but none seems to work correctly in the end.
I need to write a Web Start applet that loads basic hardware info about the client computer to check whether my client can connect to our systems and use the software for our courses. I use Sigar to load the CPU and memory information, and then use JNI to load a custom C++ routine that checks the graphics card name (this one works perfectly).
I've put all my DLLs in the src/resources folder to load them into the jar. I also use what we call here "engines", which are classes that do specific operations (in our case, JniEngine, ConfigEngine and DataEngine; code below). I'm new to Web Start, so I'm not sure if this concept works well with library loading.
I've tried to add the DLLs in a jar as a library in NetBeans, and I've tried to add the DLLs to the JNLP, but each run recreates it and I can't add them via the project properties. Finally, I've built my DataEngine in a way that should extract the DLLs into the Java temp directory in case they are not there, but Sigar still doesn't want to work. I've also put my DLL on the java.library.path, correctly configured (as it works locally).
It works when I run my main class locally (with right click > Run), but when I click the Run button to launch the Web Start version, it crashes with this error message (it happens in ConfigEngine, as it extends SigarCommandBase):
JNLPClassLoader: Finding library sigar-amd64-winnt.dll.dll
no sigar-amd64-winnt.dll in java.library.path
org.hyperic.sigar.SigarException: no sigar-amd64-winnt.dll in java.library.path
Here's the code :
JniEngine (loads the C++ code for the graphics card)
package Engine;
public class JniEngine
{
static private final String nomLibJni = "JniEngine";
static private final String nomLibJni64 = "JniEngine_x64";
static
{
if (System.getProperty("os.arch").contains("86"))
{
System.loadLibrary(nomLibJni);
}
else
{
System.loadLibrary(nomLibJni64);
}
}
public native String getInfoGPU() throws Error;
}
ConfigEngine
package Engine;
import java.net.NetworkInterface;
import java.net.SocketException;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import org.hyperic.sigar.Sigar;
import org.hyperic.sigar.SigarException;
import org.hyperic.sigar.cmd.SigarCommandBase;
public class ConfigEngine extends SigarCommandBase
{
private final String nomOsAcceptes = "Windows";
static
{
DataEngine data;
}
public ConfigEngine()
{
super();
}
@Override
public void output(String[] args) throws SigarException
{
}
public HashMap<String, String> getMap() throws SigarException, SocketException
{
HashMap<String, String> hmConfig = new HashMap<>();
loadInfoCpu(hmConfig);
loadInfoRam(hmConfig);
loadInfoOs(hmConfig);
loadInfoNet(hmConfig);
loadInfoGpu(hmConfig);
return hmConfig;
}
private void loadInfoCpu(HashMap<String,String> Hashmap) throws SigarException
{
org.hyperic.sigar.CpuInfo[] configCpu = this.sigar.getCpuInfoList();
org.hyperic.sigar.CpuInfo infoCpu = configCpu[0];
long cacheSize = infoCpu.getCacheSize();
Hashmap.put("Builder", infoCpu.getVendor());
Hashmap.put("Model" , infoCpu.getModel());
Hashmap.put("Mhz", String.valueOf(infoCpu.getMhz()));
Hashmap.put("Cpus nbr", String.valueOf(infoCpu.getTotalCores()));
if ((infoCpu.getTotalCores() != infoCpu.getTotalSockets()) ||
(infoCpu.getCoresPerSocket() > infoCpu.getTotalCores()))
{
Hashmap.put("Cpus", String.valueOf(infoCpu.getTotalSockets()));
Hashmap.put("Core", String.valueOf(infoCpu.getCoresPerSocket()));
}
if (cacheSize != Sigar.FIELD_NOTIMPL) {
Hashmap.put("Cache", String.valueOf(cacheSize));
}
}
private void loadInfoRam(HashMap<String,String> Hashmap) throws SigarException
{
org.hyperic.sigar.Mem mem = this.sigar.getMem();
Hashmap.put("RAM" , String.valueOf(mem.getRam()));
Hashmap.put("Memoery", String.valueOf(mem.getTotal()));
Hashmap.put("Free", String.valueOf(mem.getUsed()));
}
private void loadInfoOs(HashMap<String,String> Hashmap) throws SigarException
{
Hashmap.put("OS", System.getProperty("os.name"));
Hashmap.put("Version", System.getProperty("os.version"));
Hashmap.put("Arch", System.getProperty("os.arch"));
}
private void loadInfoNet(HashMap<String,String> Hashmap) throws SocketException
{
List<NetworkInterface> interfaces = Collections.
list(NetworkInterface.getNetworkInterfaces());
int i = 1;
for (NetworkInterface net : interfaces)
{
if (!net.isVirtual() && net.isUp())
{
Hashmap.put("Port Name " + String.valueOf(i), net.getDisplayName());
}
i++;
}
}
private void loadInfoGpu(HashMap<String,String> Hashmap) throws SocketException
{
if (System.getProperty("os.name").contains(nomOsAcceptes))
{
JniEngine Jni = new JniEngine();
Hashmap.put("VGA", Jni.getInfoGPU());
}
}
}
Finally, my DataEngine, which tries to load all the DLLs and change the library path (most of it is temporary, as it is patches on patches):
package Engine;
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.URL;
public class DataEngine
{
static private final String nomLibSigar = "sigar-x86-winnt";
static private final String nomLibSigar64 = "sigar-amd64-winnt";
static private final String nomLibJni = "JniEngine";
static private final String nomLibJni64 = "JniEngine_x64";
static private final String NomJar86 = "lib_config_x86";
static private final String nomJar64 = "lib_config_x64";
static private final String path = "Resources\\";
static
{
try
{
if (System.getProperty("os.arch").contains("86"))
{
System.loadLibrary(nomLibJni);
System.loadLibrary(nomLibSigar);
}
else
{
System.loadLibrary(nomLibJni64);
System.loadLibrary(nomLibSigar64);
}
}
catch (UnsatisfiedLinkError ex)
{
loadJniFromJar();
loadSigarFromJar();
}
}
public static void loadSigarFromJar()
{
try
{
File dll;
InputStream is;
if (System.getProperty("os.arch").contains("86"))
{
is = DataEngine.class.getResourceAsStream(
path + nomLibSigar + ".dll");
dll = File.createTempFile(path + nomLibSigar, ".dll");
}
else
{
is = DataEngine.class.getResourceAsStream(
path + nomLibSigar64 + ".dll");
dll = File.createTempFile(path + nomLibSigar64, ".dll");
}
FileOutputStream fos = new FileOutputStream(dll);
byte[] array = new byte[1024];
for (int i = is.read(array);
i != -1;
i = is.read(array))
{
fos.write(array, 0, i);
}
fos.close();
is.close();
System.load(dll.getAbsolutePath());
System.setProperty("java.library.path", dll.getAbsolutePath());
}
catch (Throwable e)
{
}
}
public static void loadJniFromJar()
{
try
{
File dll;
InputStream is;
if (System.getProperty("os.arch").contains("86"))
{
is = DataEngine.class.getResourceAsStream(
path + nomLibJni + ".dll");
dll = File.createTempFile(path + nomLibJni, ".dll");
}
else
{
is = DataEngine.class.getResourceAsStream(
path + nomLibJni64 + ".dll");
dll = File.createTempFile(path + nomLibJni64, ".dll");
}
FileOutputStream fos = new FileOutputStream(dll);
byte[] array = new byte[1024];
for (int i = is.read(array);
i != -1;
i = is.read(array))
{
fos.write(array, 0, i);
}
fos.close();
is.close();
System.load(dll.getAbsolutePath());
}
catch (Throwable e)
{
}
}
}
I also have some problems with my main class (NetBeans doesn't want my JAppletForm to be the main class of my project), but I'll probably recreate the project anyway, since the hundreds of patches I tried have corrupted the build. My main class simply loads the HashMap with getMap() of ConfigEngine and shows it in the console if local, or in the JAppletForm if it runs with Web Start.
It's a pretty big problem, so I'll update my question with any info you need if asked.
Here's my code:
import java.io.BufferedReader;
import java.io.File;
import java.io.FileNotFoundException;
import java.io.FileReader;
import java.io.IOException;
import org.apache.lucene.index.CorruptIndexException;
public class Main
{
public static void main(String[] args)
{
//Index index = new Index();
String[] titleAndContent = parseFile("files/methode.txt");
Index index = new Index("files",null);
try
{
index.openIndex(true);
index.addDocument(titleAndContent[0], titleAndContent[1], "files/methode.txt");
}
catch (CorruptIndexException e)
{
// TODO Auto-generated catch block
e.printStackTrace();
}
catch (IOException e)
{
// TODO Auto-generated catch block
e.printStackTrace();
}
}
public static String[] parseFile(String path)
{
String[] titleAndContent = new String[2];
File file = new File(path);
try
{
FileReader fr = new FileReader(file);
BufferedReader br = new BufferedReader(fr);
String line = new String();
String content = new String();
try
{
while((line = br.readLine()) != null)
{
if (line.substring(0,min(6,line.length())).equals("title:"))
{
titleAndContent[0] = line.substring(6,line.length());
}
else
{
if (line.substring(0,min(8,line.length())).equals("content:"))
{
content += line.substring(8,line.length())+"\n";
}
else
{
content += line+"\n";
}
}
}
}
catch (IOException e1)
{
// TODO Auto-generated catch block
e1.printStackTrace();
}
titleAndContent[1] = content;
try
{
fr.close();
}
catch (IOException e)
{
// TODO Auto-generated catch block
e.printStackTrace();
}
}
catch (FileNotFoundException e)
{
// TODO Auto-generated catch block
e.printStackTrace();
}
return titleAndContent;
}
public static int max (int a, int b)
{
if (a<b)
{
return b;
}
return a;
}
public static int min (int a, int b)
{
if (a<b)
{
return a;
}
return b;
}
}
The problem is, I can't compile my Lucene project under Eclipse. It keeps telling me:
ERROR: index path not specified
Usage: java org.apache.lucene.index.CheckIndex pathToIndex [-fix] [-segment X] [-segment Y]
-fix: actually write a new segments_N file, removing any problematic segments
-segment X: only check the specified segments. This can be specified multiple
times, to check more than one segment, eg '-segment _2 -segment _a'.
You can't use this with the -fix option
**WARNING**: -fix should only be used on an emergency basis as it will cause
documents (perhaps many) to be permanently removed from the index. Always make
a backup copy of your index before running this! Do not run this tool on an index
that is actively being written to. You have been warned!
Run without -fix, this tool will open the index, report version information
and report any exceptions it hits and what action it would take if -fix were
specified. With -fix, this tool will remove any segments that have issues and
write a new segments_N file. This means all documents contained in the affected
segments will be removed.
This tool exits with exit code 1 if the index cannot be opened or has any
corruption, else 0.
I tried everything to make it work, and as the whole web says, I used
-ea:org.apache.lucene... pathToIndex -fix
as the arguments. But whatever I put instead of pathToIndex, it keeps telling me:
Unexpected argument pathToIndex (or whatever instead)
How can I get this f... project to work?
Thank you in advance.
Edit: Of course I've imported all Lucene JARs.
Actually, I started the project over by creating a simple Main class with a main method inside it and tried to compile it immediately. It worked fine this time. Note that you should have the Main class showing in the active editor tab before compiling/running in Eclipse.
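For reference, a minimal sketch of such a fresh starting point; the important part is the standard main(String[]) signature that Eclipse's "Run As > Java Application" launcher looks for:

// Minimal runnable entry point; note the String[] parameter on main.
public class Main {
    public static void main(String[] args) {
        System.out.println("Project launches; re-add the Lucene indexing code from here.");
    }
}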
I am running a JUnit test case from Eclipse 3.4.1. This test case creates a class which starts a thread to do some stuff. When the test method ends, it seems that Eclipse forcibly shuts down the thread.
If I run the same test from the command line, then the thread runs properly.
Somehow I do not remember running into such problems with Eclipse before. Is this something that was always present in Eclipse, or did they add it in 3.4.x?
Here is an example:
When I run this test from Eclipse, I get a few printouts of the cnt (up to about 1800) and then the test case is terminated automatically. However, if I run the main method, which starts JUnit's TestRunner, the thread counts indefinitely.
import junit.framework.TestCase;
import junit.textui.TestRunner;
/**
* This class shows that Eclipse's JUnit test case runner will forcibly
* terminate all running threads
*
* @author pshah
*
*/
public class ThreadTest extends TestCase {
static Runnable run = new Runnable() {
public void run() {
int cnt = 0;
while(true) System.out.println(cnt++);
}
};
public void testThread() {
Thread t = new Thread(run);
t.start();
}
public static void main(String args[]) {
TestRunner runner = new TestRunner();
runner.run(ThreadTest.class);
}
}
I adapted your code to JUnit NG and it's the same result: The thread is killed.
public class ThreadTest {
static Runnable run = new Runnable() {
public void run() {
int cnt = 0;
while (true)
System.out.println(cnt++);
}
};
@Test
public void threadRun() {
Thread t = new Thread(run);
t.start();
assertEquals("RUNNABLE", t.getState().toString());
}
}
If I use the JUnit jar (4.3.1 in my case) from the Eclipse plugin folder to execute the tests via the command line, it has the same behavior as executing it in Eclipse (which is logical :) ).
I tested JUnit 4.6 (just downloaded) on the command line and it also stops after a short time! It's exactly the same behavior as in Eclipse.
I found out that the thread is killed once the last instruction of the test is done. It's logical, if you consider how JUnit works:
For each test, a new object is created. When the test is over, it is discarded, and everything belonging to this test is discarded with it.
That means every thread must be stopped.
JUnit deals with this situation correctly: unit tests must be isolated and easy to execute, so it has to end all threads when the end of the test is reached.
You may wait until the thread has finished and then execute your assertXXX instructions. This would be the right way to test threads.
But be careful: it may kill your execution times!
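For example, here is a minimal sketch of that approach, assuming the Runnable is changed so that it terminates on its own instead of looping forever:

import static org.junit.Assert.assertFalse;

import org.junit.Test;

// Sketch: let the worker finish, wait for it with a timeout, then assert on its final state.
public class FiniteThreadTest {
    static Runnable run = new Runnable() {
        public void run() {
            for (int cnt = 0; cnt < 1000; cnt++) {
                System.out.println(cnt); // finite work instead of while(true)
            }
        }
    };

    @Test
    public void threadRunsToCompletion() throws InterruptedException {
        Thread t = new Thread(run);
        t.start();
        t.join(5000); // wait up to 5 seconds for the worker to finish
        assertFalse("worker should have finished before the test ends", t.isAlive());
    }
}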
I believe this modification will yield the desired result for unit testing various thread scenarios.
(sorry if the formatting is wonky)
public class ThreadTest {
static Runnable run = new Runnable() {
public void run() {
int cnt = 0;
while (true)
System.out.println(cnt++);
}
};
@Test
public void threadRun() {
Thread t = new Thread(run);
t.start();
//Run the thread, t, for 30 seconds total.
//Assert the thread's state is RUNNABLE, once per second
for(int i=0;i<30;i++){
assertEquals("RUNNABLE", t.getState().toString());
try {
Thread.sleep(1000);//1 second sleep
} catch (InterruptedException e) {
e.printStackTrace();
}
}
System.out.println("Done with my thread unit test.");
}
}
This works but you have to name your thread or find another way to refer to it.
protected boolean monitorSecondaryThread(String threadName, StringBuilder errorMessage, boolean ignoreFailSafe) {
int NUM_THREADS_BESIDES_SECONDARY_THREAD = 2;
int MAX_WAIT_TIME = 10000;
MyUncaughtExceptionHandler meh = new MyUncaughtExceptionHandler();
Set<Thread> threadSet = Thread.getAllStackTraces().keySet();
for (Thread t : threadSet) {
t.setUncaughtExceptionHandler(meh);
}
Date start = Calendar.getInstance().getTime();
boolean stillAlive = true;
while (stillAlive) {
for (Thread t : threadSet) {
if (t.getName().equalsIgnoreCase(threadName) && !t.isAlive()) {
stillAlive = false;
}
}
Date end = Calendar.getInstance().getTime();
if (!ignoreFailSafe && (end.getTime() - start.getTime() > MAX_WAIT_TIME || Thread.activeCount() <= NUM_THREADS_BESIDES_SECONDARY_THREAD)) {
System.out.println("Oops, flawed thread monitor.");
stillAlive = false;
}
}
if (meh.errorCount > 0) {
System.out.println(meh.error);
errorMessage.append(meh.error);
return false;
}
return true;
}
private class MyUncaughtExceptionHandler implements UncaughtExceptionHandler {
public int errorCount = 0;
public String error = "";
@Override
public void uncaughtException(Thread t, Throwable e) {
ByteArrayOutputStream bs = new ByteArrayOutputStream();
PrintStream ps = new PrintStream(bs);
e.printStackTrace(ps);
error = bs.toString();
errorCount++;
}
}
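A hypothetical usage from a test might look like this; it assumes the test class extends (or contains) the class defining monitorSecondaryThread(), and the thread name and the work it performs are placeholders:

// Hypothetical usage; "my-worker-thread" and the Runnable body are placeholders.
@Test
public void secondaryThreadFinishesWithoutUncaughtExceptions() {
    Thread worker = new Thread(new Runnable() {
        public void run() {
            // ... the work under test ...
        }
    }, "my-worker-thread");
    worker.start();

    StringBuilder errorMessage = new StringBuilder();
    boolean ok = monitorSecondaryThread("my-worker-thread", errorMessage, false);

    assertTrue(errorMessage.toString(), ok);
}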