I have completed the core functionality of my installer, but I need to add the ability for the user to launch the application after installation finishes.
I already gained elevated privileges earlier in the install process, but when the installer tries to launch the application it fails with the following error log:
java.io.IOException: Cannot run program "C:\Program Files\abc\xyz\xyz 1.2.exe" (in directory "C:\Program Files\abc\xyz"): CreateProcess error=740, The requested operation requires elevation
at java.base/java.lang.ProcessBuilder.start(Unknown Source)
at java.base/java.lang.ProcessBuilder.start(Unknown Source)
at com.install4j.runtime.installer.helper.launching.LaunchHelper.launchOnWindows(LaunchHelper.java:387)
at com.install4j.runtime.installer.helper.launching.LaunchHelper.launchApplicationDirectly(LaunchHelper.java:151)
at com.install4j.runtime.installer.helper.launching.LaunchHelper.access$000(LaunchHelper.java:33)
at com.install4j.runtime.installer.helper.launching.LaunchHelper$2.fetchValue(LaunchHelper.java:110)
at com.install4j.runtime.installer.helper.launching.LaunchHelper$2.fetchValue(LaunchHelper.java:107)
at com.install4j.runtime.installer.helper.comm.actions.FetchObjectAction.execute(FetchObjectAction.java:14)
at com.install4j.runtime.installer.helper.comm.HelperCommunication.executeActionWrapper(HelperCommunication.java:367)
at com.install4j.runtime.installer.helper.comm.HelperCommunication.access$200(HelperCommunication.java:30)
at com.install4j.runtime.installer.helper.comm.HelperCommunication$1.run(HelperCommunication.java:96)
Caused by: java.io.IOException: CreateProcess error=740, The requested operation requires elevation
at java.base/java.lang.ProcessImpl.create(Native Method)
at java.base/java.lang.ProcessImpl.<init>(Unknown Source)
at java.base/java.lang.ProcessImpl.start(Unknown Source)
... 11 more
Any ideas?
If this is a generated launcher, try selecting "As invoker" on the "Executable info->Windows manifest options" step of the launcher wizard. The "Run executable" action will then provide the privileges.
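For reference, the "As invoker" setting corresponds to a fragment like the following in the launcher's Windows manifest (a sketch only; install4j generates the manifest for you, this just illustrates what the option changes):
<requestedPrivileges>
    <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
</requestedPrivileges>
With that in place the launcher no longer requests elevation on start, so the already-elevated installer can start it without CreateProcess error 740.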
I am trying to run spark-shell on Windows 10, but I keep getting this error every time I run it.
I have tried both the latest version and spark-1.5.0-bin-hadoop2.4.
15/09/22 18:46:24 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/22 18:46:24 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
15/09/22 18:46:27 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/09/22 18:46:27 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
15/09/22 18:46:27 WARN : Your hostname, DESKTOP-8JS2RD5 resolves to a loopback/non-reachable address: fe80:0:0:0:0:5efe:c0a8:103%net1, but we couldn't find any external IP address!
java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NullPointerException
at java.lang.ProcessBuilder.start(Unknown Source)
at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
at org.apache.hadoop.util.Shell.run(Shell.java:418)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
at org.apache.hadoop.fs.FileUtil.execCommand(FileUtil.java:1097)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.loadPermissionInfo(RawLocalFileSystem.java:559)
at org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.getPermission(RawLocalFileSystem.java:534)
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:599)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
... 56 more
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
^
I used Spark 1.5.2 with Hadoop 2.6 and had similar problems. I solved it with the following steps:
Download winutils.exe from the repository to some local folder, e.g. C:\hadoop\bin.
Set HADOOP_HOME to C:\hadoop.
Create the c:\tmp\hive directory (using Windows Explorer or any other tool).
Open a command prompt with admin rights.
Run C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive
With that, I still get some warnings, but no errors, and can run Spark applications just fine. The same steps as plain commands are shown below.
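A sketch of the equivalent console commands, assuming the C:\hadoop path above (setx stores HADOOP_HOME persistently, so open a new prompt before starting spark-shell):
mkdir C:\tmp\hive
setx HADOOP_HOME C:\hadoop
C:\hadoop\bin\winutils.exe chmod 777 /tmp/hive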
I was facing a similar issue and got it resolved by putting winutils.exe inside a bin folder: HADOOP_HOME should be set to C:\Winutils, and winutils.exe should be placed in C:\Winutils\bin.
Windows 10 64-bit winutils binaries are available at https://github.com/steveloughran/winutils/tree/master/hadoop-2.6.0/bin
Also ensure that the command prompt has administrative access.
Refer to https://wiki.apache.org/hadoop/WindowsProblems
My guess is that you're running into https://issues.apache.org/jira/browse/SPARK-10528. I was seeing the same issue running on Windows 7. Initially I was getting the NullPointerException as you did. When I put winutils into the bin directory and set HADOOP_HOME to point to the Spark directory, I got the error described in the JIRA issue.
Or perhaps this link below is easier to follow:
https://wiki.apache.org/hadoop/WindowsProblems
Basically, download and copy winutils.exe to your spark\bin folder, then re-run spark-shell.
If you have not made /tmp/hive writable, please do so.
You need to grant permissions on the /tmp/hive directory to resolve this exception.
Hopefully you already have winutils.exe and have set the HADOOP_HOME environment variable. Then open the command prompt and run the following command as administrator.
If winutils.exe is present in D:\winutils\bin and \tmp\hive is also on the D drive:
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
For more details, you can refer to the following links:
Frequent Issues occurred during Spark Development
How to run Apache Spark on Windows 7 in standalone mode
You can resolve this issue by placing the MySQL connector jar in the spark-1.6.0/libs folder and restarting. It works.
The important thing is that instead of running plain spark-shell you should run:
spark-shell --driver-class-path /home/username/spark-1.6.0-libs-mysqlconnector.jar
Hope it works.
For Python, create a SparkSession in your Python code (this config section is only needed on Windows):
from pyspark.sql import SparkSession
spark = SparkSession.builder.config("spark.sql.warehouse.dir", "C:/temp").appName("SparkSQL").getOrCreate()
Copy winutils.exe into C:\winutils\bin, run the command prompt in ADMIN mode (Run as Administrator), and execute the command below:
C:\Windows\system32>C:\winutils\bin\winutils.exe chmod 777 C:/temp
My issue was having other .exe files/jars inside the winutils/bin folder, so I removed all the others and was left with winutils.exe alone. I was using Spark 2.1.1.
The issue was resolved after installing the correct Java version (in my case Java 8) and setting the environment variables. Make sure you run winutils.exe to set up the temporary directory as below:
c:\winutils\bin\winutils.exe chmod 777 \tmp\hive
The command above should not return any error. Use java -version to verify which Java version you are using before invoking spark-shell.
On Windows, you need to clone "winutils":
git clone https://github.com/steveloughran/winutils.git
And set the HADOOP_HOME variable to DIR_CLONED\hadoop-{version}.
Remember to choose the version that matches your Hadoop.
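For example (a sketch; hadoop-2.7.1 is only a placeholder, use whichever version folder in the cloned repository matches your Hadoop):
git clone https://github.com/steveloughran/winutils.git C:\winutils
setx HADOOP_HOME C:\winutils\hadoop-2.7.1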
Setting SPARK_LOCAL_HOSTNAME to localhost (on Windows 10) resolved the problem for me.
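For example, in the command prompt before starting the shell (a sketch; set only affects the current session):
set SPARK_LOCAL_HOSTNAME=localhost
spark-shell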
After I deleted a Google Cloud Storage directory through the Google Cloud Console (the directory was generated by an earlier Spark (ver 1.3.1) job), re-running the job always fails, and the directory still seems to exist as far as the job is concerned; I cannot find the directory with gsutil.
Is this a bug, or is there anything I missed? Thanks!
The error I got:
java.lang.RuntimeException: path gs://<my_bucket>/job_dir1/output_1.parquet already exists.
at scala.sys.package$.error(package.scala:27)
at org.apache.spark.sql.parquet.DefaultSource.createRelation(newParquet.scala:112)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:240)
at org.apache.spark.sql.DataFrame.save(DataFrame.scala:1196)
at org.apache.spark.sql.DataFrame.saveAsParquetFile(DataFrame.scala:995)
at com.xxx.Job1$.execute(Job1.scala:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
It appears you might be running into a known bug with the NFS list-consistency cache: https://github.com/GoogleCloudPlatform/bigdata-interop/issues/5
It was fixed in the latest release; if you upgrade by deploying a new cluster with bdutil-1.3.1 (announced here: https://groups.google.com/forum/#!topic/gcp-hadoop-announce/vstNuV0LpDc), the problem should go away. If you need to upgrade in place, you can try downloading the latest gcs-connector-1.4.1 jarfile onto your master and worker nodes under /home/hadoop/hadoop-install/lib/gcs-connector-*.jar and then restarting the Spark daemons:
sudo sudo -u hadoop /home/hadoop/spark-install/sbin/stop-all.sh
sudo sudo -u hadoop /home/hadoop/spark-install/sbin/start-all.sh
I wrote a JavaFX app (a JavaFX Windows-based form) on my desktop running
Win 7 32 Bit
Netbeans 7.3 Beta
jdk-7u9-windows-i586
and it runs successfully.
I recently got a laptop running
Win 7 64 Bit
Netbeans 7.3 Beta
jdk-7u9-windows-x64
I just copied the code over and switched the Java Platform; currently it is set to the default "Default JavaFX Platform".
If I run the app I get the following error dialog:
JavaFX launcher error - Exception while running Application
Does anyone know what I need to change here?
Full stack trace when running the application:
ant -f "C:\\DEV\\Projects\\Java Apps\\BaseAppPlatform" jfxsa-run
init:
Deleting: C:\DEV\Projects\Java Apps\BaseAppPlatform\build\built-jar.properties
deps-jar:
Updating property file: C:\DEV\Projects\Java Apps\BaseAppPlatform\build\built-jar.properties
compile:
Detected JavaFX Ant API version 1.2
Launching <fx:jar> task from C:\Program Files\Java\jdk1.7.0_09\lib\ant-javafx.jar
Signing JAR: C:\DEV\Projects\Java Apps\BaseAppPlatform\dist\BaseAppPlatform.jar to C:\DEV\Projects\Java Apps\BaseAppPlatform\dist\BaseAppPlatform.jar as nb-jfx
Warning:
The signer certificate will expire within six months.
Enter Passphrase for keystore: Enter key password for nb-jfx:
Launching <fx:deploy> task from C:\Program Files\Java\jdk1.7.0_09\lib\ant-javafx.jar
jfx-deployment-script:
jfx-deployment:
jar:
Copying 12 files to C:\DEV\Projects\Java Apps\BaseAppPlatform\dist\run858846669
jfx-project-run:
Executing com.javafx.main.Main from C:\DEV\Projects\Java Apps\BaseAppPlatform\dist\run858846669\BaseAppPlatform.jar using platform C:\Program Files\Java\jdk1.7.0_09/bin/java
java.lang.ClassNotFoundException: co.za.chrispie.LoginController
file:/C:/DEV/Projects/Java%20Apps/BaseAppPlatform/dist/run858846669/BaseAppPlatform.jar!/BaseAppPlatform/Login.fxml:9
at javafx.fxml.FXMLLoader$ValueElement.processAttribute(FXMLLoader.java:728)
at javafx.fxml.FXMLLoader$InstanceDeclarationElement.processAttribute(FXMLLoader.java:777)
at javafx.fxml.FXMLLoader$Element.processStartElement(FXMLLoader.java:182)
at javafx.fxml.FXMLLoader$ValueElement.processStartElement(FXMLLoader.java:565)
at javafx.fxml.FXMLLoader.processStartElement(FXMLLoader.java:2314)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2131)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2028)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2742)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2721)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2707)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2694)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2683)
at BaseAppPlatform.BaseAppPlatform.start(BaseAppPlatform.java:21)
at com.sun.javafx.application.LauncherImpl$5.run(LauncherImpl.java:319)
at com.sun.javafx.application.PlatformImpl$5.run(PlatformImpl.java:206)
at com.sun.javafx.application.PlatformImpl$4.run(PlatformImpl.java:173)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.access$100(WinApplication.java:29)
at com.sun.glass.ui.win.WinApplication$3$1.run(WinApplication.java:73)
at java.lang.Thread.run(Thread.java:722)
Exception in Application start method
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.javafx.main.Main.launchApp(Main.java:642)
at com.javafx.main.Main.main(Main.java:805)
Caused by: java.lang.RuntimeException: Exception in Application start method
at com.sun.javafx.application.LauncherImpl.launchApplication1(LauncherImpl.java:403)
at com.sun.javafx.application.LauncherImpl.access$000(LauncherImpl.java:47)
at com.sun.javafx.application.LauncherImpl$1.run(LauncherImpl.java:115)
at java.lang.Thread.run(Thread.java:722)
Caused by: javafx.fxml.LoadException: java.lang.ClassNotFoundException: co.za.chrispie.LoginController
at javafx.fxml.FXMLLoader$ValueElement.processAttribute(FXMLLoader.java:728)
at javafx.fxml.FXMLLoader$InstanceDeclarationElement.processAttribute(FXMLLoader.java:777)
at javafx.fxml.FXMLLoader$Element.processStartElement(FXMLLoader.java:182)
at javafx.fxml.FXMLLoader$ValueElement.processStartElement(FXMLLoader.java:565)
at javafx.fxml.FXMLLoader.processStartElement(FXMLLoader.java:2314)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2131)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2028)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2742)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2721)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2707)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2694)
at javafx.fxml.FXMLLoader.load(FXMLLoader.java:2683)
at BaseAppPlatform.BaseAppPlatform.start(BaseAppPlatform.java:21)
at com.sun.javafx.application.LauncherImpl$5.run(LauncherImpl.java:319)
at com.sun.javafx.application.PlatformImpl$5.run(PlatformImpl.java:206)
at com.sun.javafx.application.PlatformImpl$4.run(PlatformImpl.java:173)
at com.sun.glass.ui.win.WinApplication._runLoop(Native Method)
at com.sun.glass.ui.win.WinApplication.access$100(WinApplication.java:29)
at com.sun.glass.ui.win.WinApplication$3$1.run(WinApplication.java:73)
... 1 more
Caused by: java.lang.ClassNotFoundException: co.za.chrispie.LoginController
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at javafx.fxml.FXMLLoader$ValueElement.processAttribute(FXMLLoader.java:726)
... 19 more
Judging by java.lang.ClassNotFoundException: co.za.chrispie.LoginController, you missed some code or a dependent library while copying.
Find the Java or jar file responsible for the class co.za.chrispie.LoginController and make sure (see the sketch below):
it exists on your laptop
it's in the correct location package-wise
it's included in your new project (sources folder)
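For illustration, the class the FXMLLoader is looking for is whatever Login.fxml references in its fx:controller attribute; a minimal sketch of the missing file (your real controller will of course have its own fields and handlers) would be:
// src/co/za/chrispie/LoginController.java
// The package and class name must exactly match the fully qualified name used in Login.fxml,
// and the file must sit inside the project's source folder so it is compiled into the jar.
package co.za.chrispie;

public class LoginController {
    // controller logic for Login.fxml goes here
}
If that file was never copied to the laptop, adding it back (or re-adding the library that contains it) should make the ClassNotFoundException go away.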
I am very new to JBoss. I currently have a requirement to deploy an application (which is already running on Tomcat) on JBoss. I downloaded JBoss, however the directory structure is different in version 7.
I am running the bin\standalone.conf.bat file to start the server, but I am getting the error below:
Calling "C:\Program Files\jboss-as-7.1.1.Final\bin\standalone.conf.bat"
===============================================================================
JBoss Bootstrap Environment
JBOSS_HOME: C:\Program Files\jboss-as-7.1.1.Final
JAVA: C:\Program Files\Java\jdk1.6.0_30\bin\java
JAVA_OPTS: -XX:+TieredCompilation -Dprogram.name=standalone.bat -Xms64M -Xmx512M -XX:MaxPermSize=256M -Dsun.rmi.dgc.client.gcInterval=3600000 -Dsun.rmi.dgc.server.gcInterval=3600000 -Djava.net.preferIPv4Stack=true -Dorg.jboss.resolver.warning=true -Djboss.modules.system.pkgs=org.jboss.byteman -Djboss.server.default.config=standalone.xml
===============================================================================
Unable to set property fileName on class org.jboss.logmanager.handlers.FileHandler: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.logmanager.PropertyConfigurator.configureProperties(PropertyConfigurator.java:187)
at org.jboss.logmanager.PropertyConfigurator.configureHandler(PropertyConfigurator.java:312)
at org.jboss.logmanager.PropertyConfigurator.configure(PropertyConfigurator.java:128)
at org.jboss.logmanager.PropertyConfigurator.configure(PropertyConfigurator.java:86)
at org.jboss.logmanager.LogManager.readConfiguration(LogManager.java:246)
at org.jboss.logmanager.LogManager.readConfiguration(LogManager.java:231)
at java.util.logging.LogManager$2.run(LogManager.java:267)
at java.security.AccessController.doPrivileged(Native Method)
at java.util.logging.LogManager.readPrimordialConfiguration(LogManager.java:265)
at java.util.logging.LogManager.getLogManager(LogManager.java:248)
at java.util.logging.Logger.<init>(Logger.java:225)
at java.util.logging.LogManager$RootLogger.<init>(LogManager.java:1092)
at java.util.logging.LogManager$RootLogger.<init>(LogManager.java:1089)
at java.util.logging.LogManager$1.run(LogManager.java:180)
at java.security.AccessController.doPrivileged(Native Method)
at java.util.logging.LogManager.<clinit>(LogManager.java:157)
at org.jboss.modules.Main.main(Main.java:275)
Caused by: java.io.FileNotFoundException: C:\Program Files\jboss-as-7.1.1.Final\standalone\log\boot.log (The system cannot find the path specified)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(FileOutputStream.java:194)
at org.jboss.logmanager.handlers.FileHandler.setFile(FileHandler.java:152)
at org.jboss.logmanager.handlers.FileHandler.setFileName(FileHandler.java:183)
... 21 more
17:11:18,420 INFO [org.jboss.modules] JBoss Modules version 1.1.1.GA
java.lang.IllegalStateException: JBAS018704: Could not create server data directory: C:\Program Files\jboss-as-7.1.1.Final\standalone\data
at org.jboss.as.server.ServerEnvironment.<init>(ServerEnvironment.java:388)
at org.jboss.as.server.Main.determineEnvironment(Main.java:242)
at org.jboss.as.server.Main.main(Main.java:83)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.jboss.modules.Module.run(Module.java:260)
at org.jboss.modules.Main.main(Main.java:291)
Press any key to continue . . .
How can I resolve this issue and get the server to run properly?
Try running JBoss in Administrator mode, or give your JBoss folder the proper permissions so it can create the log files mentioned in the error.
If you are using Eclipse, starting it in admin mode does the trick; if not, you can start your command prompt in admin mode and then run the startup script from there.
To set admin mode:
https://technet.microsoft.com/en-us/magazine/ff431742.aspx
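Alternatively, if you would rather grant permissions than run elevated, something like the following from an elevated command prompt should work (a sketch; adjust the path and user to your setup):
icacls "C:\Program Files\jboss-as-7.1.1.Final" /grant "%USERNAME%":(OI)(CI)F /T
This gives your account full control of the JBoss directory tree so the startup script can create the standalone\log and standalone\data directories.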