I'm migrating some Java+Maven projects to Scala+sbt, but I've been having some trouble making sbt work with my Maven repository.
I'm using sbt 0.13.5 and my build.sbt looks like this:
name := "test"
version := "0.0.1"
resolvers += "uqbar-repo" at "ftp://mvn+uqbar-wiki.org:<my_password>#ftp.uqbar-wiki.org/releases"
libraryDependencies += "uqbar" % "uqbar-commons" % "1.1"
As you can see, my dependencies are hosted on an FTP server (SSH is not available, so plain FTP, NOT SFTP). Maven has no problem accessing them, but running sbt yields the following error:
[info] Updating {file:/home/nicolas/Dev/test/}test...
[info] Resolving uqbar#uqbar-commons;1.1 ...
[warn] module not found: uqbar#uqbar-commons;1.1
[warn] ==== local: tried
[warn] /home/nicolas/.ivy2/local/uqbar/uqbar-commons/1.1/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/uqbar/uqbar-commons/1.1/uqbar-commons-1.1.pom
[warn] ==== uqbar-repo: tried
[warn] ftp://mvn+uqbar-wiki.org:<my_password>#ftp.uqbar-wiki.org/releases/uqbar/uqbar-commons/1.1/uqbar-commons-1.1.pom
[info] Resolving jline#jline;2.11 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: uqbar#uqbar-commons;1.1: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: uqbar#uqbar-commons;1.1: not found
I found this weird, since the files are there (the POM can be accessed from the URL sbt tried, even from a browser), so I used a sniffer to check what was actually being sent and received. The output I got was:
25 2.887053000 31.22.4.33 192.168.1.109 FTP 386 Response: 220---------- Welcome to Pure-FTPd [privsep] [TLS] ----------
27 2.887536000 192.168.1.109 31.22.4.33 FTP 91 Request: USER mvn+uqbar-wiki.org
29 3.135330000 31.22.4.33 192.168.1.109 FTP 117 Response: 331 User mvn+uqbar-wiki.org OK. Password required
30 3.135590000 192.168.1.109 31.22.4.33 FTP 84 Request: PASS <my_password>
32 3.424510000 31.22.4.33 192.168.1.109 FTP 109 Response: 230 OK. Current restricted directory is /
33 3.424825000 192.168.1.109 31.22.4.33 FTP 74 Request: TYPE I
35 3.665227000 31.22.4.33 192.168.1.109 FTP 96 Response: 200 TYPE is now 8-bit binary
36 3.665449000 192.168.1.109 31.22.4.33 FTP 80 Request: CWD releases
37 3.897858000 31.22.4.33 192.168.1.109 FTP 106 Response: 250 OK. Current directory is /releases
38 3.898170000 192.168.1.109 31.22.4.33 FTP 77 Request: CWD uqbar
39 4.132531000 31.22.4.33 192.168.1.109 FTP 112 Response: 250 OK. Current directory is /releases/uqbar
40 4.132765000 192.168.1.109 31.22.4.33 FTP 85 Request: CWD uqbar-commons
43 4.365921000 31.22.4.33 192.168.1.109 FTP 126 Response: 250 OK. Current directory is /releases/uqbar/uqbar-commons
44 4.366217000 192.168.1.109 31.22.4.33 FTP 75 Request: CWD 1.1
45 4.606673000 31.22.4.33 192.168.1.109 FTP 130 Response: 250 OK. Current directory is /releases/uqbar/uqbar-commons/1.1
46 4.606921000 192.168.1.109 31.22.4.33 FTP 76 Request: EPSV ALL
47 4.842167000 31.22.4.33 192.168.1.109 FTP 87 Response: 500 Unknown command
48 4.842374000 192.168.1.109 31.22.4.33 FTP 72 Request: PASV
49 5.076465000 31.22.4.33 192.168.1.109 FTP 114 Response: 227 Entering Passive Mode (31,22,4,33,187,254)
54 5.309929000 192.168.1.109 31.22.4.33 FTP 94 Request: RETR uqbar-commons-1.1.pom
55 5.547075000 31.22.4.33 192.168.1.109 FTP 96 Response: 150 Accepted data connection
64 5.550301000 31.22.4.33 192.168.1.109 FTP 161 Response: 226-File successfully transferred
66 5.550465000 192.168.1.109 31.22.4.33 FTP 72 Request: QUIT
69 5.789052000 31.22.4.33 192.168.1.109 FTP 133 Response: 221-Goodbye. You uploaded 0 and downloaded 2 kbytes.
Now, my FTP is a little rusty, but that log seems to indicate that sbt found and downloaded the POM.
In case it is of any use, here is the pom content:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>uqbar</groupId>
<artifactId>uqbar-commons</artifactId>
<version>1.1</version>
<packaging>jar</packaging>
<name>uqbar-commons</name>
<parent>
<groupId>uqbar</groupId>
<artifactId>uqbar-parent-project</artifactId>
<version>1.4</version>
</parent>
<properties>
<scm.svnPath>svn/uqbar/commons/uqbar-commons</scm.svnPath>
</properties>
<scm>
<connection>scm:svn:http://uqbar.no-ip.org/svn/uqbar/commons/uqbar-commons/tags/uqbar-commons-1.1</connection>
<developerConnection>scm:svn:http://uqbar.no-ip.org/svn/uqbar/commons/uqbar-commons/tags/uqbar-commons-1.1</developerConnection>
<url>http://uqbar.no-ip.org/svn/uqbar/commons/uqbar-commons/tags/uqbar-commons-1.1</url>
</scm>
<dependencies>
<!-- UQBAR -->
<dependency>
<groupId>com.uqbar</groupId>
<artifactId>uqbar-class-descriptor</artifactId>
<version>1.1</version>
</dependency>
<dependency>
<groupId>uqbar</groupId>
<artifactId>uqbar-bttf</artifactId>
<version>2.3</version>
</dependency>
<!-- /UQBAR -->
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring</artifactId>
<version>2.5.6</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>1.2.14</version>
</dependency>
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
<version>1.7.0</version>
</dependency>
<dependency>
<groupId>cglib</groupId>
<artifactId>cglib</artifactId>
<version>2.2</version>
</dependency>
</dependencies>
</project>
I've searched the documentation but found nothing, and I have no idea where the problem may be, so any hint is welcome.
Thanks in advance!
I am not quite sure which node is causing this behaviour, and there are too many flows to reinstall from scratch; unfortunately I do not have a backup of them.
This morning I realized that I can no longer access the HTTP GUI of my Node-RED instance on my Raspberry Pi Zero. I had just edited some flows, nothing really serious.
I am trying to start Node-RED on my Raspberry Pi Zero, but no GUI or UI starts up to access the Node-RED instance. I don't know how to troubleshoot this. What I am doing is:
pi@nodered-pi:~/.node-red $ node-red-start
Start Node-RED
Once Node-RED has started, point a browser at http://192.168.1.42:1880
On Pi Node-RED works better with the Firefox or Chrome browser
Use node-red-stop to stop Node-RED
Use node-red-start to start Node-RED again
Use node-red-log to view the recent log output
Use sudo systemctl enable nodered.service to autostart Node-RED at every boot
Use sudo systemctl disable nodered.service to disable autostart on boot
To find more nodes and example flows - go to http://flows.nodered.org
Starting as a systemd service.
Started Node-RED graphical event wiring tool.
19 Aug 15:13:55 - [info]
Welcome to Node-RED
===================
19 Aug 15:13:55 - [info] Node-RED version: v0.18.7
19 Aug 15:13:55 - [info] Node.js version: v8.11.1
19 Aug 15:13:55 - [info] Linux 4.14.52+ arm LE
19 Aug 15:14:06 - [info] Loading palette nodes
19 Aug 15:14:37 - [info] Dashboard version 2.9.6 started at /ui
19 Aug 15:14:49 - [warn] ------------------------------------------------------
19 Aug 15:14:49 - [warn] [node-red-contrib-delta-timed/delta-time] 'delta' already registered by module node-red-contrib-change-detect
19 Aug 15:14:49 - [warn] ------------------------------------------------------
19 Aug 15:14:49 - [info] Settings file : /home/pi/.node-red/settings.js
19 Aug 15:14:49 - [info] User directory : /home/pi/.node-red
19 Aug 15:14:49 - [warn] Projects disabled : set editorTheme.projects.enabled=true to enable
19 Aug 15:14:49 - [info] Flows file : /home/pi/.node-red/flows_nodered-pi.json
19 Aug 15:14:50 - [info] Server now running at http://127.0.0.1:1880/
19 Aug 15:14:50 - [warn]
---------------------------------------------------------------------
Your flow credentials file is encrypted using a system-generated key.
If the system-generated key is lost for any reason, your credentials
file will not be recoverable, you will have to delete it and re-enter
your credentials.
You should set your own key using the 'credentialSecret' option in
your settings file. Node-RED will then re-encrypt your credentials
file using your chosen key the next time you deploy a change.
---------------------------------------------------------------------
19 Aug 15:14:50 - [warn] Error loading credentials: SyntaxError: Unexpected token T in JSON at position 0
19 Aug 15:14:50 - [warn] Error loading flows: Error: Failed to decrypt credentials
19 Aug 15:14:51 - [info] Starting flows
19 Aug 15:15:01 - [warn] [telegram receiver:Telegram Receiver] bot not initialized
19 Aug 15:15:01 - [warn] [telegram sender:Temperatur Wetterstation] bot not initialized.
19 Aug 15:15:01 - [error] [function:Versorge mit Information] SyntaxError: Invalid or unexpected token
19 Aug 15:15:01 - [info] Started flows
19 Aug 15:15:02 - [info] [sonoff-server:166ef3ba.0029bc] SONOFF Server Started On Port 1080
19 Aug 15:15:02 - [red] Uncaught Exception:
19 Aug 15:15:02 - Error: listen EACCES 0.0.0.0:443
at Object._errnoException (util.js:1022:11)
at _exceptionWithHostPort (util.js:1044:20)
nodered.service: Main process exited, code=exited, status=1/FAILURE
nodered.service: Unit entered failed state.
nodered.service: Failed with result 'exit-code'.
nodered.service: Service hold-off time over, scheduling restart.
Stopped Node-RED graphical event wiring tool.
Started Node-RED graphical event wiring tool.
19 Aug 15:15:20 - [info]
Welcome to Node-RED
===================
19 Aug 15:15:20 - [info] Node-RED version: v0.18.7
19 Aug 15:15:02 - Error: listen EACCES 0.0.0.0:443
at Object._errnoException (util.js:1022:11)
at _exceptionWithHostPort (util.js:1044:20)
This error implies that Node-RED could not bind to port 443. Something else may already be listening on it (an existing copy of Node-RED or another application); note also that 443 is a privileged port, so a non-root process cannot bind it. You can check which applications are listening on a port with the following command:
lsof -i :443
This will list what is listening on port 443
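If it is easier to test from code, here is a minimal sketch (the class name is my own) that checks whether a port can currently be bound; both the already-in-use and insufficient-privileges cases surface as a bind failure:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class PortCheck {
    // Returns true if this process can bind the given port right now.
    // A false result can mean either "already in use" (EADDRINUSE)
    // or "permission denied" for ports below 1024 as non-root (EACCES).
    static boolean isBindable(int port) {
        try (ServerSocket s = new ServerSocket(port)) {
            return true;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("443 bindable: " + isBindable(443));
    }
}
```

Port 0 asks the OS for any free ephemeral port, so `isBindable(0)` should normally succeed even as an unprivileged user.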
I'm doing a research project and I need to use this project written in Scala.
With Eclipse I have created a Scala project in which I put the HELLO CAPS code.
However, the project needs Maven dependencies to be set:
<dependency>
<groupId>org.opencypher</groupId>
<artifactId>spark-cypher</artifactId>
<version>0.1.5</version>
</dependency>
My question is: how do I set these Maven dependencies in this Scala project?
1) Create a project folder with a pom.xml in it.
Example:
mkdir my-project
cd my-project
touch pom.xml
2) Then add the dependencies to pom.xml.
Example:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.group-name</groupId>
<artifactId>my-project</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>2.12.6</version>
</dependency>
<dependency>
<groupId>org.opencypher</groupId>
<artifactId>spark-cypher</artifactId>
<version>0.1.5</version>
</dependency>
</dependencies>
</project>
3) That should be it. Then you can run mvn clean compile from the root of your project, which will download the dependencies for you.
Example:
mvn clean compile
You can see the downloaded dependencies in the local Maven repo:
$ ls -l ~/.m2/repository/org/opencypher/
total 0
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 ast-9.1
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 expressions-9.1
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 front-end-9.1
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 front-end-parent
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 okapi
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 okapi-api
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 okapi-ir
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 okapi-logical
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 okapi-relational
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 okapi-trees
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 parser-9.1
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 rewriting-9.1
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 spark-cypher
drwxr-xr-x 3 prayagupd 184630988 102 Jul 15 12:53 util-9.1
4) Now you can import your project using Eclipse or IntelliJ. (You can skip step 3, as IDEs can run mvn clean compile for you as well.)
Also read:
create a new maven hello-world project
I am trying to connect to Redshift from a notebook. So far I have done the following:
Configured metadata for the notebook:
"customDeps": [
"com.databricks:spark-redshift_2.10:3.0.0-preview1",
"com.databricks:spark-avro_2.11:3.2.0",
"com.databricks:spark-csv_2.11:1.5.0"
]
Checked the browser console to ensure the libraries are loaded after restarting the kernel:
ui-logs-1422> [Tue Aug 22 2017 09:46:26 GMT+0530 (IST)] [notebook.util.CoursierDeps$] Fetched artifact to:/Users/xxxx/.m2/repository/com/databricks/spark-avro_2.10/3.0.0/spark-avro_2.10-3.0.0.jar
kernel.js:978 ui-logs-1452> [Tue Aug 22 2017 09:46:26 GMT+0530 (IST)] [notebook.util.CoursierDeps$] Fetched artifact to:/Users/xxxx/.coursier/cache/v1/http/repo1.maven.org/maven2/com/databricks/spark-redshift_2.10/3.0.0-preview1/spark-redshift_2.10-3.0.0-preview1.jar
kernel.js:978 ui-logs-1509> [Tue Aug 22 2017 09:46:26 GMT+0530 (IST)] [notebook.util.CoursierDeps$] Fetched artifact to:/Users/xxxx/.coursier/cache/v1/http/repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.5.0/spark-csv_2.11-1.5.0.jar
kernel.js:978 ui-logs-1526> [Tue Aug 22 2017 09:46:26 GMT+0530 (IST)] [notebook.util.CoursierDeps$] Fetched artifact to:/Users/xxxx/.coursier/cache/v1/http/repo1.maven.org/maven2/com/databricks/spark-avro_2.11/3.2.0/spark-avro_2.11-3.2.0.jar
When I try to load a table, I run into a ClassNotFoundException:
java.lang.ClassNotFoundException: Failed to find data source: com.databricks.spark.redshift. Please find packages at http://spark.apache.org/third-party-projects.html
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:594)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:86)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:86)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:325)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
... 63 elided
Caused by: java.lang.ClassNotFoundException: com.databricks.spark.redshift.DefaultSource
at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:579)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25$$anonfun$apply$13.apply(DataSource.scala:579)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:579)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$25.apply(DataSource.scala:579)
at scala.util.Try.orElse(Try.scala:84)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:579)
Has anyone else run into this issue or solved it?
I noticed a similar issue with another dependency as well. Is there anything missing in the configuration?
Trying out the timeseries sample notebook (notebooks/timeseries/Spark-Timeseries.snb.ipynb), I noticed an existing entry in the metadata for a custom dependency:
"customDeps": [
"com.cloudera.sparkts % sparkts % 0.3.0"
]
I quickly verified the availability of this package at https://spark-packages.org/package/sryza/spark-timeseries and updated the metadata to include this line:
"com.cloudera.sparkts:sparkts:0.4.1"
After restarting the kernel, I validated that the library is loaded:
ui-logs-337> [Wed Aug 23 2017 09:29:25 GMT+0530 (IST)] [notebook.util.CoursierDeps$] Will fetch these customDeps artifacts:Set(Dependency(com.cloudera.sparkts:sparkts,0.3.0,,Set(),Attributes(,),false,true), Dependency(com.cloudera.sparkts:sparkts,0.4.1,,Set(),Attributes(,),false,true))
kernel.js:978 ui-logs-347> [Wed Aug 23 2017 09:29:37 GMT+0530 (IST)] [notebook.util.CoursierDeps$] Fetched artifact to:/Users/xxxx/.coursier/cache/v1/http/repo1.maven.org/maven2/com/cloudera/sparkts/sparkts/0.4.1/sparkts-0.4.1.jar
Error message:
<console>:69: error: object cloudera is not a member of package com
import com.cloudera.sparkts._
^
<console>:70: error: object cloudera is not a member of package com
import com.cloudera.sparkts.stats.TimeSeriesStatisticalTests
I downloaded another version of spark-notebook (this one wasn't from the master branch):
spark-notebook-0.7.0-scala-2.11.8-spark-2.1.1-hadoop-2.7.2
instead of
spark-notebook-0.9.0-SNAPSHOT-scala-2.11.8-spark-2.1.1-hadoop-2.7.2
In addition, I had to ensure the Scala, Spark & Hadoop versions are consistent across the dependencies I have configured.
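For reference, a customDeps block with the Scala versions aligned might look like this (a sketch: it assumes a _2.11 build of spark-redshift is published; adjust the artifact suffixes and versions to whatever actually matches your notebook's Scala/Spark build):

```json
"customDeps": [
  "com.databricks:spark-redshift_2.11:3.0.0-preview1",
  "com.databricks:spark-avro_2.11:3.2.0",
  "com.databricks:spark-csv_2.11:1.5.0"
]
```

Mixing _2.10 and _2.11 artifacts, as in the original metadata, is what typically produces the ClassNotFoundException above.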
In this particular example I had to set the jar file for the Amazon Redshift JDBC driver from the command line, as it was not available in the Maven repository:
export EXTRA_CLASSPATH=RedshiftJDBC4-1.2.7.1003.jar
Hope this helps others.
If you want, you can add the jar to the kernel's environment section "env" (EXTRA_CLASSPATH) like this:
cat /usr/local/share/jupyter/kernels/apache_toree_scala/kernel.json
{
"argv": [
"/usr/local/share/jupyter/kernels/apache_toree_scala/bin/run.sh",
"--profile",
"{connection_file}"
],
"interrupt_mode": "signal",
"env": {
"__TOREE_SPARK_OPTS__": "",
"PYTHONPATH": "/opt/cloudera/parcels/SPARK2/lib/spark2/python:/opt/cloudera/parcels/SPARK2/lib/spark2/python/lib/py4j-0.10.7-src.zip",
"__TOREE_OPTS__": "",
"PYTHON_EXEC": "python",
"SPARK_HOME": "/opt/cloudera/parcels/SPARK2/lib/spark2",
"DEFAULT_INTERPRETER": "Scala",
"JAVA_HOME": "/usr/java/latest",
"EXTRA_CLASSPATH": "/opt/cloudera/parcels/SPARK2/lib/spark2/jars/mysql-connector-java-5.1.15.jar"
},
"metadata": {},
"display_name": "SPARK2/Scala",
"language": "scala"
}
After migrating my Watson IoTP boilerplate application to Diego, it no longer starts. I see this in the log:
[APP/0] OUT Welcome to Node-RED
[APP/0] OUT ===================
[APP/0] OUT 18 Jan 15:43:16 - [info] Node-RED version: v0.15.3
[APP/0] OUT 18 Jan 15:43:16 - [info] Node.js version: v4.6.2
[APP/0] OUT 18 Jan 15:43:16 - [info] Linux 4.4.0-45-generic x64 LE
[APP/0] OUT 18 Jan 15:43:16 - [info] Loading palette nodes
[APP/0] OUT 18 Jan 15:43:18 - [warn] [ibm hdfs in] Deprecated call to RED.runtime.nodes.registerType - node-set name must be provided as first argument
[APP/0] OUT 18 Jan 15:43:18 - [warn] [ibm hdfs] Deprecated call to RED.runtime.nodes.registerType - node-set name must be provided as first argument
[APP/0] OUT 18 Jan 15:43:18 - [warn] [ibmpush] Deprecated call to RED.runtime.nodes.registerType - node-set name must be provided as first argument
[APP/0] OUT 18 Jan 15:43:20 - [info] Settings file : /home/vcap/app/bluemix-settings.js
[APP/0] OUT 18 Jan 15:43:20 - [info] Server now running at http://127.0.0.1:1880/red/
[APP/0] OUT 18 Jan 15:43:20 - [info] Starting flows
[APP/0] OUT 18 Jan 15:43:20 - [info] Started flows
[CELL/0] ERR Timed out after 1m0s: health check never passed.
[CELL/0] OUT Exit status 0
[CELL/0] OUT Destroying container
[API/8] OUT App instance exited with guid ca3f2bbd-ac6e-42ec-8a61-1ff704274c3e payload: {"instance"=>"", "index"=>0, "reason"=>"CRASHED", "exit_description"=>"2 error(s) occurred:\n\n* 1 error(s) occurred:\n\n* Exited with status 4\n* 2 error(s) occurred:\n\n* cancelled\n* process did not exit", "crash_count"=>2, "crash_timestamp"=>1484754267594231230, "version"=>"0361fa77-694c-4e8f-991e-0c52dd0c4c87"}
How can I fix this?
The problem is that the way apps find the port to bind to has changed with Diego. The VCAP_APP_PORT environment variable is no longer populated by default. To fix it, you need to create a git repo for your app (Overview tab -> Continuous Delivery -> Add GIT). Use jazzhub to edit the bluemix-settings.js file to change
uiPort: process.env.VCAP_APP_PORT
to
uiPort: process.env.PORT
Deploy those changes and the app should start.
I had a similar issue with an npm Express app and fixed it without having to edit the bluemix-settings.js file.
Simply remove all references to process.env.VCAP_APP_HOST (no longer set in Diego) and change process.env.VCAP_APP_PORT to process.env.PORT.
For example:
var host = 'localhost';
var port = (process.env.PORT || 1337);
app.listen(port, host);
console.log('App started on port ' + port);
You can check whether your app is running on Diego by typing:
cf has-diego-enabled APPNAME
after installing the Diego CLI plugin:
cf install-plugin Diego-Enabler -r CF-Community
I also disabled health checks:
cf set-health-check APPNAME none
I use the latest JavaMail API. I want to rename an IMAP folder which contains children (subfolders). Suppose we have inbox, and inbox has one subfolder, inbox.folder1.
folder1 has one subfolder whose full name is inbox.folder1.subfolder1.
I want to rename folder1 to folder2 and then see:
inbox.folder2.subfolder1
but after the code
// folder instance corresponds to folder1
newFolder = folder.getFolder(newName); //newName = "folder2"
folder.renameTo(newFolder);
if I connect using Outlook or Thunderbird, I have
inbox.folder1.subfolder1
inbox.folder2
There are two folders, but I expect one. Also, subfolder1 is still sitting in folder1, and you cannot enter the folder or subfolder1; they generate the error message: Reason Given: Mailbox does not exist, or must be subscribed to.
Should I use setSubscribed(true/false)? Do I have to handle subfolders separately by iterating over each one?
I solved it. First I turned on Thunderbird's debugging mode as described in:
https://wiki.mozilla.org/MailNews:Logging#Environment_Variables_to_set
Then I renamed a folder which contained many subfolders.
As you can see, you have to subscribe to the folders under the new name and unsubscribe from the old names, in the order specified in the log file below.
Without the subscribe/unsubscribe calls, you get this message in the mail server console:
68 OK Folder renamed.
but it is not enough, and you end up with a corrupted folder structure.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 68 rename "INBOX.folder1" "INBOX.folder2"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=23 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 68 OK Folder renamed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 69 subscribe "INBOX.folder2"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 69 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 70 unsubscribe "INBOX.folder1"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 70 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 71 subscribe "INBOX.folder2.subfolder1.subsubfolder1.subsubsubfolder1b"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 71 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 72 unsubscribe "INBOX.folder1.subfolder1.subsubfolder1.subsubsubfolder1b"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 72 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 73 subscribe "INBOX.folder2.subfolder1.subsubfolder1.subsubsubfolder1a"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 73 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 74 unsubscribe "INBOX.folder1.subfolder1.subsubfolder1.subsubsubfolder1a"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 74 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 75 subscribe "INBOX.folder2.subfolder3.subsubfolder3"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 75 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 76 unsubscribe "INBOX.folder1.subfolder3.subsubfolder3"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 76 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 77 subscribe "INBOX.folder2.subfolder3"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 77 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 78 unsubscribe "INBOX.folder1.subfolder3"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 78 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 79 subscribe "INBOX.folder2.subfolder1.subsubfolder1"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 79 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 80 unsubscribe "INBOX.folder1.subfolder1.subsubfolder1"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 80 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 81 subscribe "INBOX.folder2.subfolder2"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 81 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 82 unsubscribe "INBOX.folder1.subfolder2"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 82 OK Folder unsubscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 83 subscribe "INBOX.folder2.subfolder1"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=26 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 83 OK Folder subscribed.
9944[ab79a70]: 7cea000:192.168.0.104:A:SendData: 84 unsubscribe "INBOX.folder1.subfolder1"
9944[ab79a70]: ReadNextLine [stream=a6db068 nb=28 needmore=0]
9944[ab79a70]: 7cea000:192.168.0.104:A:CreateNewLineFromSocket: 84 OK Folder unsubscribed.
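The subscribe/unsubscribe sequence above boils down to a name-mapping step over the subscribed folders. A minimal sketch (the class and helper below are hypothetical and only compute the names; with JavaMail you would then call Folder.setSubscribed(true) on each new name and setSubscribed(false) on each old name after folder.renameTo):

```java
import java.util.ArrayList;
import java.util.List;

public class RenameSubscriptions {
    // For every subscribed folder at or under oldName, produce a pair:
    // { name to subscribe under the new prefix, old name to unsubscribe }.
    static List<String[]> resubscribePlan(List<String> subscribed,
                                          String oldName, String newName) {
        List<String[]> plan = new ArrayList<>();
        for (String f : subscribed) {
            if (f.equals(oldName) || f.startsWith(oldName + ".")) {
                plan.add(new String[]{ newName + f.substring(oldName.length()), f });
            }
        }
        return plan;
    }

    public static void main(String[] args) {
        for (String[] p : resubscribePlan(
                List.of("INBOX.folder1", "INBOX.folder1.subfolder1", "INBOX.other"),
                "INBOX.folder1", "INBOX.folder2")) {
            System.out.println("subscribe " + p[0] + " / unsubscribe " + p[1]);
        }
    }
}
```

Collect the subscribed names with Store.getDefaultFolder().list() (checking Folder.isSubscribed()) before the rename, since the old names are no longer listable afterwards.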