Here is my problem: I can't get Gatling to run an aggregation query against Elasticsearch. The request fails with HTTP 401 and the error "invalid basic authentication header encoding".
Here is my simplified code:
package app

import io.gatling.core.Predef._
import io.gatling.http.Predef._
import scala.concurrent.duration._
import java.util.Base64

class Aggregation extends Simulation {

  val httpConf = http
    .baseUrl("http://localhost:9200")
    .acceptHeader("text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8")

  val body = StringBody("""{
    "aggs": {
      "genres": {
        "terms": { "field": "genre" }
      }
    }
  }""".stripMargin.replaceAll("\n", " "))

  // surely here there is an error but I don't know why
  val auth = Base64.getEncoder.encodeToString("elastic:changeme".getBytes())

  val scn = scenario("Test aggregation")
    .exec(
      http("Aggregation")
        .get("/_search")
        .header("Authorization", "Basic " + auth)
        .body(body).asJson
        .check(status.is(200)))

  val users: Int = 1
  val duration: Int = 1

  setUp(
    scn.inject(rampUsers(users) during (duration seconds))
  ).protocols(httpConf)
}
If your auth scheme really is Basic, use Gatling's built-in support instead of crafting the header yourself; see https://gatling.io/docs/gatling/reference/current/http/request/#authentication:
http("Aggregation")
.get("/_search")
.basicAuth("elastic", "changeme")
.body(body).asJson
.check(status.is(200)))
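If every request in the simulation needs the same credentials, Gatling also lets you set Basic auth once at the protocol level rather than per request. A minimal sketch, assuming the same elastic/changeme credentials and Elasticsearch endpoint as above:
val httpConf = http
  .baseUrl("http://localhost:9200")
  .basicAuth("elastic", "changeme")
Requests built against this protocol then carry the Authorization header automatically, so the per-request .basicAuth call can be dropped.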
I'm doing performance testing of an API using Gatling.
Scenario:
Login (an authToken is generated in the response)
For GET, POST, and PUT requests, that generated authToken needs to be passed in a header
Here's my code snippet:
package apitest

import scala.concurrent.duration.*
import io.gatling.core.Predef.*
import io.gatling.http.Predef.*
import io.gatling.jdbc.Predef.*
import scala.language.postfixOps

class TestEnv4trial extends Simulation {

  var e1: String = "https://testenv1-dev.net"
  var e2: String = "https://testenv2-dev.net"
  var BaseUrl: String = e1
  var pwd: String = "pass123"

  // Users
  var admin: String = "admin123"

  val httpProtocol = http
    .baseUrl(BaseUrl)
    .inferHtmlResources()

  val login_headers = Map(
    "Accept" -> """*/*""",
    "Connection" -> "keep-alive",
    "Content-Type" -> "application/x-www-form-urlencoded;charset=UTF-8"
  )

  val scn1 = scenario("Admin Login")
    .exec(http("Login Admin")
      .post(BaseUrl + "/api/user/login")
      .formParam("username", admin)
      .formParam("password", pwd)
      .check(jsonPath("$.authToken").saveAs("tokenId")))
    .exec { session => println(session("tokenId").as[String]); session } // authToken gets printed

  val common_headers = Map(
    "Accept" -> """*/*""",
    "Accept-Encoding" -> "gzip, deflate, br",
    "Accept-Language" -> "en-GB,en;q=0.9,en-US;q=0.8",
    "Authorization" -> "Bearer " + $tokenId, // With a hardcoded authToken this works. Need to pass the authToken generated in the previous scenario here.
    "Connection" -> "keep-alive",
  )

  val scn2 = scenario("All Employees")
    .exec(http("All Employees")
      .post("/api/employee/lists/")
      .headers(common_headers)
      .body(RawFileBody("test/TestEnv4trial/employees_request.json")).asJson)

  setUp(
    scn1.inject(atOnceUsers(1)).protocols(httpProtocol),
    scn2.inject(atOnceUsers(1)).protocols(httpProtocol))
}
When I hardcode the authToken generated in scn1 into common_headers, scn2 works.
But when I reference tokenId, it cannot be resolved.
How do I pass the saved tokenId key into common_headers?
Thanks.
"Authorization" -> "Bearer " + $tokenId
This doesn't compile.
Currently, you're using Session attributes, meaning tokenId is scoped to the single virtual user executing scn1.
There's no way for a user executing scn2 to reach it as is.
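One way to make this work is to keep the login and the authenticated requests in a single scenario, so the same virtual user that saved tokenId also uses it, and to reference the attribute with Gatling's Expression Language instead of string concatenation. A sketch along those lines, reusing the names from your simulation ("#{tokenId}" is the EL syntax in recent Gatling versions; older versions use "${tokenId}"):
val combined = scenario("Login then list employees")
  .exec(http("Login Admin")
    .post("/api/user/login")
    .formParam("username", admin)
    .formParam("password", pwd)
    .check(jsonPath("$.authToken").saveAs("tokenId")))
  .exec(http("All Employees")
    .post("/api/employee/lists/")
    .header("Authorization", "Bearer #{tokenId}") // resolved per virtual user at request time
    .body(RawFileBody("test/TestEnv4trial/employees_request.json")).asJson)

setUp(combined.inject(atOnceUsers(1))).protocols(httpProtocol)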
I am trying to connect to Neo4j from Spark using the neo4j-spark-connector. I am facing an authentication issue when I try to connect to Neo4j:
org.neo4j.driver.v1.exceptions.AuthenticationException: Unsupported authentication token, scheme='none' only allowed when auth is disabled: { scheme='none' }
I have checked, and the credentials I am passing are correct. Not sure why it is failing.
import org.neo4j.spark._
import org.apache.spark._
import org.graphframes._
import org.apache.spark.sql.SparkSession
import org.neo4j.driver.v1.GraphDatabase
import org.neo4j.driver.v1.AuthTokens
val config = new SparkConf()
config.set(Neo4jConfig.prefix + "url", "bolt://localhost")
config.set(Neo4jConfig.prefix + "user", "neo4j")
config.set(Neo4jConfig.prefix + "password", "root")
val sparkSession :SparkSession = SparkSession.builder.config(config).getOrCreate()
val neo = Neo4j(sparkSession.sparkContext)
val graphFrame = neo.pattern(("Person","id"),("KNOWS","null"), ("Employee","id")).partitions(3).rows(1000).loadGraphFrame
println("**********Graphframe Vertices Count************")
graphFrame.vertices.count
println("**********Graphframe Edges Count************")
graphFrame.edges.count
val pageRankFrame = graphFrame.pageRank.maxIter(5).run()
val ranked = pageRankFrame.vertices
ranked.printSchema()
val top3 = ranked.orderBy(ranked.col("pagerank").desc).take(3)
Can someone please take a look and let me know what is causing this?
It might be a configuration issue with your neo4j.conf file. Is this line commented out:
dbms.security.auth_enabled=false
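Independent of Spark, you can also rule out a credentials problem by opening a session directly with the Java driver classes the question already imports. A minimal sketch (the bolt port 7687 is the default and an assumption here; the neo4j/root credentials are the ones from the question):
import org.neo4j.driver.v1.{AuthTokens, GraphDatabase}

// Throws AuthenticationException right here if the credentials are actually wrong.
val driver = GraphDatabase.driver("bolt://localhost:7687", AuthTokens.basic("neo4j", "root"))
val session = driver.session()
try {
  println(session.run("RETURN 1 AS ok").single().get("ok").asInt()) // prints 1 when auth succeeds
} finally {
  session.close()
  driver.close()
}
If this succeeds but the connector still fails with scheme='none', the connector is not picking up the user/password settings from the SparkConf.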
I had a similar problem; creating the following Spring beans fixed the issue.
@Bean
public org.neo4j.ogm.config.Configuration getConfiguration() {
    return new org.neo4j.ogm.config.Configuration.Builder()
            .credentials("neo4j", "secret")
            .uri("bolt://localhost:7687").build();
}

@Bean
public SessionFactory sessionFactory(org.neo4j.ogm.config.Configuration configuration) {
    return new SessionFactory(configuration, "<your base package>");
}
Good day, I have an issue uploading jobs to the Flink API using Scala.
All GET requests seem to work:
import scalaj.http._
val url: String = "http://127.0.0.1:8081"
val response: HttpResponse[String] = Http(url+"/config").asString
return response
When I try uploading a JAR file through curl, it works:
curl -vvv -X POST -H "Expect:" -F "jarfile=@/home/Downloads/myJob.jar" http://127.0.0.1:8081/jars/upload
Now I would like to upload using Scala.
The documentation does not provide a working example, and I am fairly new to this type of POST: https://ci.apache.org/projects/flink/flink-docs-release-1.3/monitoring/rest_api.html#submitting-programs
Currently my code is as follows (it does not work).
Taken from https://github.com/Guru107/flinkjobuploadplugin/tree/master/src/main/java/com/github/guru107 and edited to my needs:
// Ideal case is to upload a JAR file as a multipart request in Scala
import java.io.{File, IOException}
import org.apache.http.HttpEntity
import org.apache.http.client.methods.{CloseableHttpResponse, HttpPost}
import org.apache.http.entity.mime.MultipartEntityBuilder
import org.apache.http.impl.client.{CloseableHttpClient, HttpClients, LaxRedirectStrategy}
import org.apache.http.message.BasicHeader
import org.apache.http.util.EntityUtils
import org.json.JSONObject

val requestUrl = "http://localhost:8081/jars/upload"
val jarPath = "@/home/Downloads/myJob.jar"

val httpClient: CloseableHttpClient = HttpClients.custom.setRedirectStrategy(new LaxRedirectStrategy).build
val fileToUpload: File = new File(jarPath)
val uploadFileUrl: HttpPost = new HttpPost(requestUrl)
val builder: MultipartEntityBuilder = MultipartEntityBuilder.create
builder.addBinaryBody("jarfile", fileToUpload)
val multipart: HttpEntity = builder.build
var jobUploadResponse: JSONObject = null
uploadFileUrl.setEntity(multipart)
var response: CloseableHttpResponse = null
try {
  response = httpClient.execute(uploadFileUrl)
  println("response: " + response)
  response.setHeader(new BasicHeader("Expect", ""))
  response.setHeader(new BasicHeader("content-type", "application/x-java-archive"))
  val bodyAsString = EntityUtils.toString(response.getEntity, "UTF-8")
  println("bodyAsString: " + bodyAsString)
  jobUploadResponse = new JSONObject(bodyAsString)
  println("jobUploadResponse: " + jobUploadResponse)
}
It fails to upload the file.
Please provide a working example, or a link to a Scala example, of uploading a job/JAR file to Flink.
Thanks in advance.
You can use the client code from com.github.mjreid.flinkwrapper and upload the JAR file with Scala code:
val apiEndpoint: String = as.settings.config.getString("flink.url") //http://<flink_web_host>:<flink_web_port>
val client = FlinkRestClient(apiEndpoint, as)
client.runProgram(<jarId>)
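If you prefer to stay with Apache HttpClient, the question's snippet can be reworked along these lines: headers belong on the request and must be set before executing it, and the file path has to be a plain filesystem path (the leading "@" is curl syntax only). A sketch under those assumptions, reusing the "jarfile" multipart field name from the curl command above:
import java.io.File
import org.apache.http.client.methods.HttpPost
import org.apache.http.entity.ContentType
import org.apache.http.entity.mime.MultipartEntityBuilder
import org.apache.http.impl.client.HttpClients
import org.apache.http.util.EntityUtils

val requestUrl = "http://localhost:8081/jars/upload"
val jarFile = new File("/home/Downloads/myJob.jar") // plain path, no "@"

val httpClient = HttpClients.createDefault()
try {
  val post = new HttpPost(requestUrl)
  // Flink's /jars/upload endpoint expects a multipart field named "jarfile"
  val entity = MultipartEntityBuilder.create()
    .addBinaryBody("jarfile", jarFile, ContentType.create("application/x-java-archive"), jarFile.getName)
    .build()
  post.setEntity(entity)

  val response = httpClient.execute(post)
  try {
    println("status: " + response.getStatusLine)
    println("body: " + EntityUtils.toString(response.getEntity, "UTF-8"))
  } finally response.close()
} finally httpClient.close()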
I send a POST to my Play Framework backend, and when I want to print the body I get the message AnyContentAsEmpty.
My controller looks like this:
def createProcess = Action(parse.multipartFormData) { implicit request =>
println(request.body)
Ok("s")
}
My routes look like this:
POST     /process     @controllers.ProcessesController.createProcess()
OPTIONS  /process     @controllers.ProcessesController.createProcess()
What's the problem?
The solution is below. (Most likely the AnyContentAsEmpty you were printing came from the browser's CORS preflight OPTIONS request, which carries no body; with the CORS filter configured, the preflight is answered by the filter and the actual POST reaches the controller.)
In application.conf:
# Global filters
play.http.filters=helpers.Filters
play.filters.cors {
pathPrefixes = ["/"]
allowedOrigins = null
allowedHttpMethods = ["GET", "POST", "PUT", "DELETE", "OPTIONS"]
allowedHttpHeaders = null
preflightMaxAge = 3 days
}
In Filters.scala:
package helpers
import javax.inject.Inject
import play.api.http.DefaultHttpFilters
import play.filters.cors.CORSFilter
class Filters #Inject()(corsFilter: CORSFilter)
extends DefaultHttpFilters(corsFilter)
How do I do HTTP PUT/POST requests from inside Groovy code without having to import any libraries (if at all possible)? I know there is a simple getText() method that Groovy adds to the java.net.URL class, which can be used without adding any dependencies. Is there a way to do a REST PUT in the same fashion?
You can do it with HttpURLConnection in a similar way to how you would do it in Java:
def url = new URL('http://your_rest_endpoint')
def http = url.openConnection()
http.setDoOutput(true)
http.setRequestMethod('PUT')
http.setRequestProperty('User-agent', 'groovy script')
def out = new OutputStreamWriter(http.outputStream)
out.write('data')
out.close()
http.inputStream // read server response from it
An alternative, if pulling in a library is acceptable, is the http-builder RESTClient (fetched here via @Grab):
@Grab(group = 'org.codehaus.groovy.modules.http-builder', module = 'http-builder', version = '0.5.0')
import groovyx.net.http.RESTClient
import static groovyx.net.http.ContentType.JSON
import groovy.json.JsonSlurper
import groovy.json.JsonOutput

def url = "http://restapi3.apiary.io"
def client = new RESTClient(url)
def jsonObj = new JsonSlurper().parseText('{ "title": "Pick-up posters from Post-Office" }')
def response = client.put(path: "/notes/id",
        contentType: JSON,
        body: jsonObj,
        headers: [Accept: 'application/json'])

println("Status: " + response.status)
if (response.data) {
    println("Content Type: " + response.contentType)
    println("Headers: " + response.getAllHeaders())
    println("Body:\n" + JsonOutput.prettyPrint(JsonOutput.toJson(response.data)))
}