The following is a valid query in a browser (e.g. Firefox):
http://www.freesound.org/api/sounds/search/?q=barking&api_key=074c0b328aea46adb3ee76f6918f8fae
yielding a JSON document:
{
"num_results": 610,
"sounds": [
{
"analysis_stats": "http://www.freesound.org/api/sounds/115536/analysis/",
"analysis_frames": "http://www.freesound.org/data/analysis/115/115536_1956076_frames.json",
"preview-hq-mp3": "http://www.freesound.org/data/previews/115/115536_1956076-hq.mp3",
"original_filename": "Two Barks.wav",
"tags": [
"animal",
"bark",
"barking",
"dog",
"effects",
...
I am trying to perform this query with Dispatch 0.9.4. Here's a build.sbt:
scalaVersion := "2.10.0"
libraryDependencies += "net.databinder.dispatch" %% "dispatch-core" % "0.9.4"
From sbt console, I do the following:
import dispatch._
val q = url("http://www.freesound.org/api/sounds/search")
.addQueryParameter("q", "barking")
.addQueryParameter("api_key", "074c0b328aea46adb3ee76f6918f8fae")
val res = Http(q OK as.String)
But the promise always completes with the following error:
res0: dispatch.Promise[String] = Promise(!Unexpected response status: 301!)
So what am I doing wrong? Here is the API documentation in case it helps.
You can enable redirect following with the configure method on the Http executor:
Http.configure(_ setFollowRedirects true)(q OK as.String)
You could also pull the Location out of the 301 response manually, but that's going to be a lot less convenient.
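If you did want to read the header yourself, a rough sketch (untested against 0.9.4) could look like this, reusing the q defined above:
// run the request without the OK verb so the 301 isn't turned into an error,
// then read the Location header off the raw async-http-client Response
val location = Http(q).map(_.getHeader("Location"))
// and, optionally, follow it yourself
val followed = location.flatMap(loc => Http(url(loc) OK as.String))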
I'm trying to install the setpropex utility for the Android emulator on my Ubuntu VM. My build.sbt file has the following code:
import sbt.IO
import sbt.Keys._
import sbt._
name := "setpropex"
scalaVersion := "2.11.8"
exportJars := true
test in assembly := {}
libraryDependencies += "net.java.dev.jna" % "jna" % "4.2.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.6" % "test"
val dalvikTask = TaskKey[Unit]("dalvik", "build a command line program for Android")
(dalvikTask := (assembly, streams) map { (asm, s) =>
val appName = "setpropex"
val dexFileName = s"$appName.dex"
s"dx --dex --output=$dexFileName ${asm.getPath}"
val target = new File(appName)
IO.copyFile(new File("self-running.sh"), target , true)
val appData="data.tar"
s"tar -cf $appData run.sh com $dexFileName"
IO.append(target, IO.readBytes(new File(appData)))
s"chmod +x $appName"
s.log.info(s"Android cmdline app: $appName build successful!")
s.log.info("Output: " + new File(".").getCanonicalPath + "/" + appName)
s"adb push $appName /data/local/tmp"
s"adb shell /data/local/tmp/$appName"
})
When I run the command "sbt dalvik", I get the following error
error: value map is not a member of (sbt.TaskKey[sbt.File], sbt.TaskKey[sbt.Keys.TaskStreams])
dalvikTask := (assembly, streams) map {(asm, s) =>
[error] Type error in expression
It seems to me that I'm missing something quite fundamental in the syntax. I'm a complete noob in the Scala language and I'm using it only for this particular project. Could someone kindly guide me in rectifying this error? I'd be happy to provide any further clarification if needed.
This seems to be about this project:
https://github.com/wuhx/setpropex
It seems like the build system was written with an old sbt version in mind. Try creating a project/build.properties file with the content sbt.version=0.13.18.
Or you can migrate to the newer sbt 1.0 syntax:
https://www.scala-sbt.org/1.x/docs/Migrating-from-sbt-013x.html#Migrating+from+the+tuple+enrichments. However, for this to work you'll probably also need to upgrade your sbt plugins, as the plugin API changed after sbt 0.13.
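For illustration, here is a rough, untested sketch of what that task could look like in sbt 1.x syntax. It assumes an sbt-1.x-compatible sbt-assembly, and it actually runs the dx command via scala.sys.process (the bare interpolated strings in the original build never execute anything); the remaining tar/chmod/adb steps would follow the same pattern:
// build.sbt (sbt 1.x) -- sketch only
import scala.sys.process._

val dalvikTask = taskKey[Unit]("build a command line program for Android")

dalvikTask := {
  val asm = assembly.value              // output jar from sbt-assembly
  val s   = streams.value
  val appName     = "setpropex"
  val dexFileName = s"$appName.dex"
  // run dx and fail the task on a non-zero exit code
  val exit = s"dx --dex --output=$dexFileName ${asm.getPath}".!
  if (exit != 0) sys.error(s"dx failed with exit code $exit")
  s.log.info(s"Android cmdline app: $appName build successful!")
}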
I am using Scala 2.12 and have the following dependencies in my build.sbt:
libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.10.1.0"
libraryDependencies += "io.confluent" % "kafka-avro-serializer" % "3.1.1"
libraryDependencies += "io.confluent" % "common-config" % "3.1.1"
libraryDependencies += "io.confluent" % "common-utils" % "3.1.1"
libraryDependencies += "io.confluent" % "kafka-schema-registry-client" % "3.1.1"
Thanks to this community, I am able to convert my raw data to the required Avro format.
We need to use the Confluent libraries to serialize and send the data to the Kafka topics.
I am using the following properties and avro record.
properties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, "org.apache.kafka.common.serialization.StringSerializer")
properties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, "io.confluent.kafka.serializers.KafkaAvroSerializer")
properties.put("schema.registry.url", "http://myschemahost:8081")
Just showing required snippet of code for brevity.
val producer = new KafkaProducer[String, GenericData.Record](properties)
val schema = new Schema.Parser().parse(new File(schemaFileName))
var avroRecord = new GenericData.Record(schema)
// code to populate record
// check output below to see the data
logger.info(s"${avroRecord.toString}\n")
producer.send(new ProducerRecord[String, GenericData.Record](topic, avroRecord), new ProducerCallback)
producer.flush()
producer.close()
Schema and data, as per the output:
{"name": "person","type": "record","fields": [{"name": "address","type": {"type" : "record","name" : "AddressUSRecord","fields" : [{"name": "streetaddress", "type": "string"},{"name": "city", "type":"string"}]}}]}
I am getting the following error while publishing to Kafka.
Error registering Avro schema:
org.apache.kafka.common.errors.SerializationException:
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:170)
at io.confluent.kafka.schemaregistry.client.rest.RestService.httpRequest(RestService.java:187)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:238)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:230)
at io.confluent.kafka.schemaregistry.client.rest.RestService.registerSchema(RestService.java:225)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.registerAndGetId(CachedSchemaRegistryClient.java:59)
at io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient.register(CachedSchemaRegistryClient.java:91)
at io.confluent.kafka.serializers.AbstractKafkaAvroSerializer.serializeImpl(AbstractKafkaAvroSerializer.java:72)
at io.confluent.kafka.serializers.KafkaAvroSerializer.serialize(KafkaAvroSerializer.java:54)
at org.apache.kafka.common.serialization.Serializer.serialize(Serializer.java:60)
at org.apache.kafka.clients.producer.KafkaProducer.doSend(KafkaProducer.java:877)
at org.apache.kafka.clients.producer.KafkaProducer.send(KafkaProducer.java:839)
Based on the schema and data, is anything missing? Is my record correct?
Also, I want to know how I should populate an Avro null from Scala. None doesn't work.
Any help will be appreciated. I am really stuck here.
UPDATE:
Thanks #cricket_007 for pointing out the issue. I do get the following error:
2019-03-20 13:26:09.660 [application-akka.actor.default-dispatcher-5] INFO i.c.k.s.KafkaAvroSerializerConfig.logAll(169) - KafkaAvroSerializerConfig values:
schema.registry.url = [http://myhost:8081]
max.schemas.per.subject = 1000
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
However, when I use the same URL (http://myhost:8081) in my browser it works well. I can see the subjects and other information.
But as soon as I use the client (the Scala program above), it fails with the above error.
I just checked with sample code like the below, and it gives the same issue.
import okhttp3.{OkHttpClient, Request}

val client = new OkHttpClient
val request = new Request.Builder().url("http://myhost:8081/subjects").build()
val output = client.newCall(request).execute().body().string()
logger.info(s"Subjects: ${output}\n")
I am getting connection refused for the schema registry URL.
Subjects: <HEAD><TITLE>Connection refused</TITLE></HEAD>
<BODY BGCOLOR="white" FGCOLOR="black"><H1>Connection refused</H1><HR>
<FONT FACE="Helvetica,Arial"><B>
Description: Connection refused</B></FONT>
<HR>
<!-- default "Connection refused" response (502) -->
</BODY>
So, I wanted to check if I am missing anything. The same thing works when I run it in the browser, but simple code like the above fails.
That's an HTTP response parsing error. It seems your schema registry is not returning a JSON response, but rather some HTML starting with a < open tag.
You should check if the registry is really running at http://myschemahost:8081, and you can manually post your schema to it using the REST API to do the same actions as the serializer would.
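For example, here is a quick, untested sketch of registering a schema by hand with the same OkHttp client used above; the host, the subject name (conventionally <topic>-value), and the schema payload are placeholders:
import okhttp3.{MediaType, OkHttpClient, Request, RequestBody}

val client = new OkHttpClient
// the registry expects the Avro schema as an escaped JSON string under the "schema" key
val payload = """{"schema": "{\"type\": \"string\"}"}"""
val body = RequestBody.create(MediaType.parse("application/vnd.schemaregistry.v1+json"), payload)
val request = new Request.Builder()
  .url("http://myschemahost:8081/subjects/mytopic-value/versions")  // placeholder subject
  .post(body)
  .build()
// a healthy registry answers with JSON such as {"id":1}; HTML here points at a proxy or the wrong host
println(client.newCall(request).execute().body().string())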
I'm trying to get around a CORS error for a simple "hello world" style REST API in Scala/Play 2.6.x and I have tried everything that I can think of at this point. As far as I can tell there is not a good solution or example to be found on the internet, so even if this should be an easy fix then anyone that has a good solution would really help me out by posting it in full. I am simply trying to send a post request from localhost:3000 (a react application using axios) to localhost:9000 where my Scala/Play framework lives.
THE ERRORS
The error that I am getting on the client-side is the following:
XMLHttpRequest cannot load http://localhost:9000/saveTest.
Response to preflight request doesn't pass access control check:
No 'Access-Control-Allow-Origin' header is present on the requested
resource. Origin 'http://localhost:3000' is therefore not allowed
access. The response had HTTP status code 403.
The error that I am getting on the server-side is
[success] Compiled in 1s
--- (RELOAD) ---
[info] p.a.h.EnabledFilters - Enabled Filters
(see <https://www.playframework.com/documentation/latest/Filters>):
play.filters.csrf.CSRFFilter
play.filters.headers.SecurityHeadersFilter
play.filters.hosts.AllowedHostsFilter
play.filters.cors.CORSFilter
[info] play.api.Play - Application started (Dev)
[warn] p.f.c.CORSFilter - Invalid CORS
request;Origin=Some(http://localhost:3000);
Method=OPTIONS;Access-Control-Request-Headers=Some(content-type)
MY CODE
I have the following in my application.conf file
# https://www.playframework.com/documentation/latest/Configuration
play.filters.enabled += "play.filters.cors.CORSFilter"
play.filters.cors {
pathPrefixes = ["/"]
allowedOrigins = ["http://localhost:3000", ...]
allowedHttpMethods = ["GET", "POST", "PUT", "DELETE"]
allowedHttpHeaders = ["Accept"]
preflightMaxAge = 3 days
}
I've tried changing pathPrefixes to /saveTest (my endpoint), and tried changing allowedOrigins to simply 'https://localhost'. I've tried changing allowedHttpHeaders="Allow-access-control-allow-origin". I've tried setting allowedOrigins, allowedHttpMethods, and allowedHttpHeaders all to null, which, according to the documentation (https://www.playframework.com/documentation/2.6.x/resources/confs/filters-helpers/reference.conf), should allow everything (as should pathPrefixes=["/"]).
My build.sbt is the following, so it should be adding the filter to the libraryDependencies:
name := """scalaREST"""
organization := "com.example"
version := "1.0-SNAPSHOT"
lazy val root = (project in file(".")).enablePlugins(PlayScala)
scalaVersion := "2.12.2"
libraryDependencies += guice
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "3.1.0" % Test
libraryDependencies += filters
According to documentation available here: https://www.playframework.com/documentation/2.6.x/Filters#default-filters you can set the default filters like this:
import javax.inject.Inject
import play.filters.cors.CORSFilter
import play.api.http.{ DefaultHttpFilters, EnabledFilters }
class Filters #Inject()(enabledFilters: EnabledFilters, corsFilter: CORSFilter)
extends DefaultHttpFilters(enabledFilters.filters :+ corsFilter: _*)
I'm not sure exactly where that should go in my project - it doesn't say, but from other Stack Overflow answers I kind of assume it should go in the root of my app directory (that is, /app). So that's where I put it.
Finally, there was one exotic Stack Overflow response that said to put this class in my controllers and add it as a function to my Ok responses:
implicit class RichResult(result: Result) {
  def enableCors = result.withHeaders(
    "Access-Control-Allow-Origin" -> "*",
    "Access-Control-Allow-Methods" -> "OPTIONS, GET, POST, PUT, DELETE, HEAD", // OPTIONS for pre-flight
    "Access-Control-Allow-Headers" -> "Accept, Content-Type, Origin, X-Json, X-Prototype-Version, X-Requested-With", //, "X-My-NonStd-Option"
    "Access-Control-Allow-Credentials" -> "true"
  )
}
Needless to say, this did not work.
WRAP UP
Here is the backend for my current scala project.
https://github.com/patientplatypus/scalaproject1/tree/master/scalarest
Please, if you can, show a full working example of a CORS implementation - I cannot get anything I can find online to work. I will probably be submitting this as a documentation request to the Play Framework organization - this should not be nearly this difficult. Thank you.
Your preflight request fails because you have a Content-Type header set.
Add content-type to allowedHttpHeaders in your application.conf, like so:
#application.conf
play.filters.cors {
#other cors configuration
allowedHttpHeaders = ["Accept", "Content-Type"]
}
I had this problem too, and I added this code in application.conf:
play.filters.enabled += "play.filters.cors.CORSFilter"
play.filters.cors {
allowedHttpMethods = ["GET", "HEAD", "POST"]
allowedHttpHeaders = ["Accept", "Content-Type"]"
}
and now everything is OK!
for more info
For Play Framework 2.8.x, we can wrap the Result in a function as below:
def addCorsHeader(response: Result): Result = {
  response.withHeaders(
    ("Access-Control-Allow-Origin", "*"),
    ("Access-Control-Allow-Methods", "GET,POST,OPTIONS,DELETE,PUT")
  )
}
Now, in the controller, wrap the Results using the above function:
val result = myService.swipeOut(inputParsed)
addCorsHeader(Ok(s"$result row successfully updated. Trip complete"))
} else {
  addCorsHeader(InternalServerError("POST body is mandatory"))
}
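For context, here is a rough sketch (not from the original post) of how a complete controller action wrapping its results this way could look; the controller wiring, MyService, and the JSON body handling are assumptions:
import javax.inject.Inject
import play.api.libs.json.JsValue
import play.api.mvc._

// hypothetical service standing in for myService.swipeOut from the snippet above
trait MyService { def swipeOut(input: JsValue): Int }

class TripController @Inject()(cc: ControllerComponents, myService: MyService)
    extends AbstractController(cc) {

  def addCorsHeader(response: Result): Result =
    response.withHeaders(
      ("Access-Control-Allow-Origin", "*"),
      ("Access-Control-Allow-Methods", "GET,POST,OPTIONS,DELETE,PUT")
    )

  def swipe: Action[AnyContent] = Action { request =>
    request.body.asJson match {
      case Some(inputParsed) =>
        val result = myService.swipeOut(inputParsed)
        addCorsHeader(Ok(s"$result row successfully updated. Trip complete"))
      case None =>
        addCorsHeader(InternalServerError("POST body is mandatory"))
    }
  }
}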
I'm trying to create a client for Elasticsearch in Scala (a school project),
but when I want to import Elasticsearch I get some problems.
I've written an sbt file:
libraryDependencies += "org.elasticsearch" %% "elasticsearch" % "1.4.2"
libraryDependencies += "org.apache.lucene" % "lucene-core" % "4.10.2"
along with other Lucene dependencies. And when I try to use it:
import org.elasticsearch.node.Nodebuilder.*
object Setup {
Node node = nodeBuilder().node();
Client client = node.client();
}
It does recognize org.elasticsearch.node but not .Nodebuilder.
Does anyone have an idea?
Solved:
import org.elasticsearch.node.NodeBuilder.nodeBuilder
val node = nodeBuilder().node()
val client = node.client()
I would suggest using the following library: https://github.com/sksamuel/elastic4s
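For reference, a rough sketch based on the elastic4s 1.x README of that era (the DSL has changed a lot in later versions, so treat these names as illustrative, not exact):
import com.sksamuel.elastic4s.ElasticClient
import com.sksamuel.elastic4s.ElasticDsl._

// start an embedded local node, index a document, then search for it
val client = ElasticClient.local
client.execute { index into "bands/artists" fields "name" -> "coldplay" }.await
val resp = client.execute { search in "bands/artists" query "coldplay" }.await
println(resp)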
I’m working my way through the Lift Application Development Cookbook by Gilberto T. Garcia Jr and have run up against a problem I can’t seem to resolve. I’ve copied the source code Chap06-map-table and I’m trying to modify it to work with my IBM i (iSeries, AS/400, i5) database. I was able to make it work with the first type of connection using Squeryl Record. However, I can’t seem to figure how to get this to work using a JNDI Datasource. I’ve spent a couple of days searching the internet for examples of setting this up and have not found a good example involving a DB/400 database connection. Below is the error I get when I attempt to start the container and the code I’ve modified in an effort to make it work. Any help would be appreciated.
There seem to be several choices for the data source class from jt400.jar (JTOpen), and I'm not sure which would be the best to use, or perhaps there's another. I've been trying this with each of the three and am assuming the first is the correct one.
com.ibm.as400.access.AS400JDBCManagedConnectionPoolDataSource
com.ibm.as400.access.AS400JDBCConnectionPoolDataSource
com.ibm.as400.access.AS400JDBCDataSource
Thanks. Bob
This is the start of the error:
> container:start
[info] jetty-8.0.4.v20111024
[info] No Transaction manager found - if your webapp requires one, please configure one.
[info] NO JSP Support for /, did not find org.apache.jasper.servlet.JspServlet
[info] started o.e.j.w.WebAppContext{/,[file:/C:/Users/Bob/Lift26Projects/scala_210/chap06-map-table/src/main/webapp/]}
[info] started o.e.j.w.WebAppContext{/,[file:/C:/Users/Bob/Lift26Projects/scala_210/chap06-map-table/src/main/webapp/]}
18:21:47.062 [pool-7-thread-1] ERROR n.liftweb.http.provider.HTTPProvider - Failed to Boot! Your application may not run properly
java.sql.SQLException: The application requester cannot establish the connection. ("jdbc:as400://www.busapp.com;libraries=PLAY2TEST";naming=system;errors=full;)
  at com.ibm.as400.access.JDError.throwSQLException(JDError.java:524) ~[jt400-6.7.jar:JTOpen 6.7]
  at com.ibm.as400.access.AS400JDBCConnection.setProperties(AS400JDBCConnection.java:3142) ~[jt400-6.7.jar:JTOpen 6.7]
  at com.ibm.as400.access.AS400JDBCManagedDataSource.createPhysicalConnect...
My build.sbt file:
name := "Lift 2.5 starter template"
version := "0.0.1"
organization := "net.liftweb"
scalaVersion := "2.10.0"
resolvers ++= Seq("snapshots" at "http://oss.sonatype.org/content/repositories/snapshots",
"staging" at "http://oss.sonatype.org/content/repositories/staging",
"releases" at "http://oss.sonatype.org/content/repositories/releases"
)
seq(com.github.siasia.WebPlugin.webSettings :_*)
unmanagedResourceDirectories in Test <+= (baseDirectory) { _ / "src/main/webapp" }
scalacOptions ++= Seq("-deprecation", "-unchecked")
env in Compile := Some(file("./src/main/webapp/WEB-INF/jetty-env.xml") asFile)
libraryDependencies ++= {
val liftVersion = "2.5"
Seq(
"net.liftweb" %% "lift-webkit" % liftVersion % "compile",
"net.liftmodules" %% "lift-jquery-module_2.5" % "2.3",
"org.eclipse.jetty" % "jetty-webapp" % "8.0.4.v20111024" % "container",
"org.eclipse.jetty" % "jetty-plus" % "8.0.4.v20111024" % "container",
"ch.qos.logback" % "logback-classic" % "1.0.6",
"org.specs2" %% "specs2" % "1.14" % "test",
"net.liftweb" %% "lift-squeryl-record" % liftVersion % "compile",
"net.sf.jt400" % "jt400" % "6.7",
"org.liquibase" % "liquibase-maven-plugin" % "3.0.2"
)
}
This is my boot.scala file:
package bootstrap.liftweb
import _root_.liquibase.database.DatabaseFactory
import _root_.liquibase.database.jvm.JdbcConnection
import _root_.liquibase.exception.DatabaseException
import _root_.liquibase.Liquibase
import _root_.liquibase.resource.FileSystemResourceAccessor
import net.liftweb._
import util._
import Helpers._
import common._
import http._
import sitemap._
import Loc._
import net.liftmodules.JQueryModule
import net.liftweb.http.js.jquery._
import net.liftweb.squerylrecord.SquerylRecord
import org.squeryl.Session
import java.sql.{SQLException, DriverManager}
import org.squeryl.adapters.DB2Adapter
import javax.naming.InitialContext
import javax.sql.DataSource
import code.model.LiftBookSchema
/**
* A class that's instantiated early and run. It allows the application
* to modify lift's environment
*/
class Boot {
def runChangeLog(ds: DataSource) {
val connection = ds.getConnection
try {
val database = DatabaseFactory.getInstance().
findCorrectDatabaseImplementation(new JdbcConnection(connection))
val liquibase = new Liquibase(
"database/changelog/db.changelog-master.xml",
new FileSystemResourceAccessor(),
database
)
liquibase.update(null)
} catch {
case e: SQLException => {
connection.rollback()
throw new DatabaseException(e)
}
}
}
def boot {
// where to search snippet
LiftRules.addToPackages("code")
prepareDb()
// Build SiteMap
val entries = List(
Menu.i("Home") / "index", // the simple way to declare a menu
// more complex because this menu allows anything in the
// /static path to be visible
Menu(Loc("Static", Link(List("static"), true, "/static/index"),
"Static Content")))
// set the sitemap. Note if you don't want access control for
// each page, just comment this line out.
LiftRules.setSiteMap(SiteMap(entries: _*))
//Show the spinny image when an Ajax call starts
LiftRules.ajaxStart =
Full(() => LiftRules.jsArtifacts.show("ajax-loader").cmd)
// Make the spinny image go away when it ends
LiftRules.ajaxEnd =
Full(() => LiftRules.jsArtifacts.hide("ajax-loader").cmd)
// Force the request to be UTF-8
LiftRules.early.append(_.setCharacterEncoding("UTF-8"))
// Use HTML5 for rendering
LiftRules.htmlProperties.default.set((r: Req) =>
new Html5Properties(r.userAgent))
//Init the jQuery module, see http://liftweb.net/jquery for more information.
LiftRules.jsArtifacts = JQueryArtifacts
JQueryModule.InitParam.JQuery = JQueryModule.JQuery172
JQueryModule.init()
}
def prepareDb() {
Class.forName("com.ibm.as400.access.AS400JDBCManagedConnectionPoolDataSource")
val ds = new InitialContext().lookup("java:/comp/env/jdbc/dsliftbook").asInstanceOf[DataSource]
runChangeLog(ds)
SquerylRecord.initWithSquerylSession(
Session.create(
ds.getConnection,
new DB2Adapter)
)
}
}
This is my jetty-env.xml file:
<!DOCTYPE Configure PUBLIC "-//Jetty//Configure//EN" "http://www.eclipse.org/jetty/configure.dtd">
<Configure class="org.eclipse.jetty.webapp.WebAppContext">
<New id="dsliftbook" class="org.eclipse.jetty.plus.jndi.Resource">
<Arg></Arg>
<Arg>jdbc/dsliftbook</Arg>
<Arg>
<New class="com.ibm.as400.access.AS400JDBCManagedConnectionPoolDataSource">
<Set name="serverName">"jdbc:as400://www.[server].com;libraries=PLAY2TEST";naming=system;errors=full;</Set>
<Set name="user">[user]</Set>
<Set name="password">[password]</Set>
</New>
</Arg>
</New>
</Configure>
Okay, I've managed to get connected. One problem was the quotation marks in the jetty-env.xml file. And the user name/password I was using apparently did not have the authority required to make this work; I'm not sure why, since this is the same ID/password I use for all my iSeries development. So for now, I'm using another user profile with security officer authority until I can figure out what's happening or what authorities are required.
Once I got signed on, I was not able to set a library list for the user and this was causing the SQL to fail. It was looking for a library name that was the same as the user ID. For the time being, I've gotten around this issue by creating a new library named the same as the user id.
One other problem here is that even though I'm supplying both the ID and password, I'm getting prompted to enter the ID/password before it will connect. The ID and URL are filled in, but the password always has to be re-keyed.
I've included the current source for the jetty-env.xml file and the boot.scala file. Hopefully this may help others.
Thanks to Dave and James for their help!
Bob
boot.scala:
package bootstrap.liftweb
// import _root_.liquibase.database.DatabaseFactory
// import _root_.liquibase.database.jvm.JdbcConnection
// import _root_.liquibase.exception.DatabaseException
// import _root_.liquibase.Liquibase
// import _root_.liquibase.resource.FileSystemResourceAccessor
import net.liftweb._
import util._
import Helpers._
import common._
import http._
import sitemap._
import Loc._
import net.liftmodules.JQueryModule
import net.liftweb.http.js.jquery._
import net.liftweb.squerylrecord.SquerylRecord
import org.squeryl.Session
import java.sql.{SQLException, DriverManager}
import org.squeryl.adapters.DB2Adapter
import javax.naming.InitialContext
import javax.sql.DataSource
import code.model.LiftBookSchema
import com.ibm.as400.access.AS400JDBCManagedConnectionPoolDataSource
/**
* A class that's instantiated early and run. It allows the application
* to modify lift's environment
*/
class Boot {
// def runChangeLog(ds: DataSource) {
// val connection = ds.getConnection
// try {
// val database = DatabaseFactory.getInstance().
// findCorrectDatabaseImplementation(new JdbcConnection(connection))
// val liquibase = new Liquibase(
// "database/changelog/db.changelog-master.xml",
// new FileSystemResourceAccessor(),
// database
// )
// liquibase.update(null)
// } catch {
// case e: SQLException => {
// connection.rollback()
// throw new DatabaseException(e)
// }
// }
// }
def boot {
// where to search snippet
LiftRules.addToPackages("code")
prepareDb()
// Build SiteMap
val entries = List(
Menu.i("Home") / "index", // the simple way to declare a menu
// more complex because this menu allows anything in the
// /static path to be visible
Menu(Loc("Static", Link(List("static"), true, "/static/index"),
"Static Content")))
// set the sitemap. Note if you don't want access control for
// each page, just comment this line out.
LiftRules.setSiteMap(SiteMap(entries: _*))
//Show the spinny image when an Ajax call starts
LiftRules.ajaxStart =
Full(() => LiftRules.jsArtifacts.show("ajax-loader").cmd)
// Make the spinny image go away when it ends
LiftRules.ajaxEnd =
Full(() => LiftRules.jsArtifacts.hide("ajax-loader").cmd)
// Force the request to be UTF-8
LiftRules.early.append(_.setCharacterEncoding("UTF-8"))
// Use HTML5 for rendering
LiftRules.htmlProperties.default.set((r: Req) =>
new Html5Properties(r.userAgent))
//Init the jQuery module, see http://liftweb.net/jquery for more information.
LiftRules.jsArtifacts = JQueryArtifacts
JQueryModule.InitParam.JQuery = JQueryModule.JQuery172
JQueryModule.init()
}
def prepareDb() {
Class.forName("com.ibm.as400.access.AS400JDBCManagedConnectionPoolDataSource")
val ds = new InitialContext().lookup("java:/comp/env/jdbc/dsliftbook").asInstanceOf[DataSource]
// runChangeLog(ds)
SquerylRecord.initWithSquerylSession(Session.create(ds.getConnection, new DB2Adapter)
)
}
}
jetty-env.xml:
<!DOCTYPE Configure PUBLIC "-//Jetty//Configure//EN" "http://www.eclipse.org/jetty/configure.dtd">
<Configure class="org.eclipse.jetty.webapp.WebAppContext">
<New id="dsliftbook" class="org.eclipse.jetty.plus.jndi.Resource">
<Arg></Arg>
<Arg>jdbc/dsliftbook</Arg>
<Arg>
<New class="com.ibm.as400.access.AS400JDBCManagedConnectionPoolDataSource">
<Set name="serverName">www.[server].com</Set>
<Set name="user">DBUSER</Set>
<Set name="password">DBUSER</Set>
</New>
</Arg>
</New>
</Configure>