Understanding log.info() - scala

Can you please explain what this log.info is doing here? Where can I see this log output for debugging purposes? In my existing project it has been used in many places inside for comprehensions. Thanks!
import java.io.{BufferedInputStream, FileInputStream}
import cats.instances.list._
import cats.syntax.contravariantSemigroupal._
import cats.syntax.flatMap._
import cats.syntax.parallel._
import com.google.api.client.util.{DateTime => GDateTime}
import com.google.api.services.storage.model.StorageObject
import com.google.cloud.storage.Storage
import com.leighperry.log4zio.Log.SafeLog
import zio.interop.catz._
import zio.{IO, Managed, ZIO}
def fileContentsToGcs(
  cfg: AppConfig,
  log: SafeLog[String],
  filepath: String
): List[String] = {
  val result: IO[FilesError, List[String]] =
    for {
      _ <- log.info(s"GCS file ${cfg.gcsBucketProject} ${cfg.gcsInputBucket} $filepath")
      // list of csv files
      contents <- expandedFileContents(log, cfg, filepath)
      _ <- log.info(s"GCS temp files $contents")
    } yield contents
  unsafeRun(result)
}
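No answer was posted for this one, so a short note: log.info(...) does not print anything at the point where it appears. It builds a ZIO effect value that describes "write this message", and the message is only emitted when the whole result effect is executed (here by unsafeRun(result)). Inside the for comprehension, each _ <- log.info(...) sequences that logging step between the other effects. Where the output lands depends on how the SafeLog was constructed: a console-backed logger typically writes to stdout, otherwise check whatever log appender your project configures. A minimal sketch of the same pattern with a hypothetical MiniLog in plain ZIO 2 (an illustration of the idea, not the actual log4zio implementation):

import zio._

// Hypothetical stand-in for SafeLog: `info` returns an effect that, when the
// program is run, writes the message to stdout. Nothing prints at call time.
trait MiniLog {
  def info(msg: String): UIO[Unit]
}

object MiniLog {
  val console: MiniLog = new MiniLog {
    def info(msg: String): UIO[Unit] = ZIO.succeed(println(s"INFO: $msg"))
  }
}

object LogDemo extends ZIOAppDefault {
  val log = MiniLog.console
  def run =
    for {
      _ <- log.info("before the work") // described here, printed only on run
      _ <- log.info("after the work")
    } yield ()
}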

Related

How to fix ZIO server Internal Server Error 500 generated by custom method

I have these 2 methods:
import org.h2.store.fs.FilePath
import zio.*
import zio.Console.printLine
import zio.http.Client
import zio.nio.file.*
import zio.nio.charset.Charset
import zio.stream.*
import java.io.IOException
object FileStorage:
  def saveToFile(data: String = "", filePath: String = "src/main/resources/data.json"): Unit =
    lazy val logic = for {
      encoded <- Charset.Standard.utf8.encodeString(data)
      path = Path(filePath.split("/").head, filePath.split("/").tail: _*)
      notExists <- Files.notExists(path)
      _ <- if (notExists) Files.createFile(path) else ZIO.attempt(())
      _ <- Files.writeBytes(path, encoded)
      _ <- Console.printLine(s"written to $path")
    } yield ()
    def unsafeF = (unsafeVal: Unsafe) => {
      implicit val unsafe: Unsafe = unsafeVal
      Runtime.default.unsafe.run(logic)
    }
    Unsafe.unsafe(unsafeF)
  def readFromFile: ZIO[Any, Throwable, String] = {
    val path = Path("src", "main", "resources", "data.json")
    val bool = for bool <- Files.isReadable(path) yield bool
    val zioStr = bool.flatMap(bool =>
      if (bool) Files.readAllLines(path, Charset.Standard.utf8).map(fileLines => fileLines.head)
      else {
        saveToFile()
        readFromFile
      })
    zioStr
  }
In def readFromFile I try to create an empty file if the file doesn't exist.
File generation works fine.
Then I'm trying to read that empty file and return it as a ZIO Response, like this:
import zio.http.{Client, *}
import zio.json.*
import zio.http.model.Method
import zio.{Scope, Task, ZIO, ZIOAppDefault}
import zio.http.Client
import zhttp.http.Status.NotFound
import zhttp.http.Status
import scala.language.postfixOps
import zio._
import scala.collection.immutable.List
import zio.{ExitCode, URIO, ZIO}
object ClientServer extends ZIOAppDefault {
  val app: Http[Client, Throwable, Request, Response] = Http.collectZIO[Request] {
    case Method.GET -> !! / "readLeagues" =>
      FileStorage.readFromFile.map(str => Response.json(str))
  }
BUT in this case I'm getting Internal Server Error 500 in Postman at http://localhost:8080/readLeagues.
If at first I feed a prefilled json file to def readFromFile, it works fine: Status 200, and I get a nice looking json as the body.
Maybe I should set a different default string for data in def saveToFile so the json can be parsed? Or something else?
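No answer was posted here, but a likely cause, inferred from the code above: saveToFile defaults data to "", so data.json is created empty; Files.readAllLines then returns an empty list and fileLines.head throws NoSuchElementException, which the server surfaces as a 500. Your own suggestion, defaulting to parsable empty json such as "[]" plus never calling .head on a possibly empty file, would address it. A sketch under those assumptions, using the same zio.nio API:

// saveToFile as above, but with a parsable default:
// def saveToFile(data: String = "[]", filePath: String = "src/main/resources/data.json"): Unit = ...

def readFromFile: ZIO[Any, Throwable, String] = {
  val path = Path("src", "main", "resources", "data.json")
  Files.isReadable(path).flatMap { readable =>
    if (readable)
      Files.readAllLines(path, Charset.Standard.utf8)
        .map(_.headOption.getOrElse("[]")) // empty file -> empty JSON array, no .head crash
    else
      ZIO.attempt(saveToFile()) *> readFromFile
  }
}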

How to resolve a ZIO response inside a custom method

I have this method:
import ClientServer.*
import zio.http.{Client, *}
import zio.json.*
import zio.http.model.Method
import zio.{ExitCode, URIO, ZIO}
import sttp.capabilities.*
import sttp.client3.Request
import zio.*
import zio.http.model.Headers.Header
import zio.http.model.Version.Http_1_0
import zio.stream.*
import java.net.InetAddress
import sttp.model.sse.ServerSentEvent
import sttp.client3._
object fillFileWithLeagues:
  def fill = for {
    openDotaResponse <- Client.request("https://api.opendota.com/api/leagues")
    bodyOfResponse <- openDotaResponse.body.asString
    listOfLeagues <- ZIO.fromEither(bodyOfResponse.fromJson[List[League]].left.map(error => new Exception(error)))
    save = FileStorage.saveToFile(listOfLeagues.toJson) //Ok
  } yield ()
  println("Im here fillFileWithLeagues.fill ")
and when I try to use fillFileWithLeagues.fill, nothing happens.
I'm trying to fill the file with data from the target api using fillFileWithLeagues.fill:
def readFromFileV8(path: Path = Path("src", "main", "resources", "data.json")): ZIO[Any, Throwable, String] =
  val zioStr = (for bool <- Files.isReadable(path) yield bool).flatMap(bool =>
    if (bool) Files.readAllLines(path, Charset.Standard.utf8).map(_.head)
    else {
      fillFileWithLeagues.fill
      wait(10000)
      println("im here readFromFileV8")
      readFromFileV8()
    })
  zioStr
I'm expecting the data.json file to be created from Client.request("https://api.opendota.com/api/leagues"), but nothing happens.
Maybe I should use sttp, or some other tools?
If we fix the indentation of the code, we'll find this:
object fillFileWithLeagues {
  def fill = {
    for {
      openDotaResponse <- Client.request("https://api.opendota.com/api/leagues")
      bodyOfResponse <- openDotaResponse.body.asString
      listOfLeagues <- ZIO.fromEither(bodyOfResponse.fromJson[List[League]].left.map(error => new Exception(error)))
      save = FileStorage.saveToFile(listOfLeagues.toJson) //Ok
    } yield ()
  }
  println("Im here fillFileWithLeagues.fill ")
}
As you can see, the println is part of fillFileWithLeagues, not of fill.
Another potential problem is that an expression like fillFileWithLeagues.fill only returns a ZIO instance; it is not yet evaluated. To evaluate it, it needs to be run, for example as follows:
import zio._

object MainApp extends ZIOAppDefault {
  def run = fillFileWithLeagues.fill
}
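A related detail, my observation rather than part of the original answer: inside readFromFileV8 the else branch evaluates fillFileWithLeagues.fill, wait(10000), and println as bare statements, so the fill effect is constructed and immediately discarded (and wait(10000) is java.lang.Object.wait, not a sleep). A sketch that sequences the effect instead, assuming fill needs a zio.http Client in the environment:

def readFromFileV8(path: Path = Path("src", "main", "resources", "data.json")): ZIO[Client, Throwable, String] =
  Files.isReadable(path).flatMap { readable =>
    if (readable)
      Files.readAllLines(path, Charset.Standard.utf8).map(_.head)
    else
      // actually run the fill effect, then retry the read; no sleep needed
      fillFileWithLeagues.fill *> readFromFileV8(path)
  }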

Why "missing parameter type error" when i run scala REPL in Flink with Java?

When I run a Scala script through the Flink Scala REPL from Java, it does not compile.
I tried this Java code to run the Flink Scala REPL for a test, but I always get an exception.
Settings settings = new Settings();
((MutableSettings.BooleanSetting) settings.usejavacp()).value_$eq(true);

IMain main = new IMain(settings, new PrintWriter(System.out));
// Thread.currentThread().setContextClassLoader(main.classLoader());

for (String imp : imports) {
    main.interpret(MessageFormat.format("import {0}", imp));
}

ExecutionEnvironment env = ExecutionEnvironment.createLocalEnvironment();
String script = FileUtils.readFileToString(new File("/opt/project/security-detection/sappo/src/sappo-interpreter/src/test/resources/demo.txt"), StandardCharsets.UTF_8);

main.bind(new NamedParamClass("env", ExecutionEnvironment.class.getName(), env));
main.interpret(script);
Scala text:
val text = env.fromElements("Who's there?", "I think I hear them. Stand, ho! Who's there?")
// result 1
val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } } map { (_, 1) } groupBy(0) sum(1)
counts.print()
// result 2
val counts = text.map((x:String) => 1)
counts.print()
// result 3
text.print()
result 1
import org.apache.flink.core.fs._
import org.apache.flink.core.fs.local._
import org.apache.flink.api.common.io._
import org.apache.flink.api.common.aggregators._
import org.apache.flink.api.common.accumulators._
import org.apache.flink.api.common.distributions._
import org.apache.flink.api.common.operators._
import org.apache.flink.api.common.operators.base.JoinOperatorBase.JoinHint
import org.apache.flink.api.common.functions._
import org.apache.flink.api.java.io._
import org.apache.flink.api.java.aggregation._
import org.apache.flink.api.java.functions._
import org.apache.flink.api.java.operators._
import org.apache.flink.api.java.sampling._
import org.apache.flink.api.scala._
import org.apache.flink.api.scala.utils._
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.time._
env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : ee335d29eefca69ee5fe7279414fc534
console:67: error: missing parameter type for expanded function ((x$1) => x$1.toLowerCase.split("\\W+").filter(((x$2) => x$2.nonEmpty)))
val counts = text.flatMap { _.toLowerCase.split("\\W+") filter { _.nonEmpty } } map { (_, 1) } groupBy(0) sum(1)
result 2
(same import list as in result 1)
env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : 5cbf8e476ebf32fd8fdf91766bd40af0
console:71: error: type mismatch;
found : String => Int
required: org.apache.flink.api.common.functions.MapFunction[String,?]
val counts = text.map((x:String) => 1)
result 3
(same import list as in result 1)
env: org.apache.flink.api.java.ExecutionEnvironment = Local Environment (parallelism = 8) : ee335d29eefca69ee5fe7279414fc534
Who's there?
I think I hear them. Stand, ho! Who's there?
text: org.apache.flink.api.java.operators.DataSource[String] = org.apache.flink.api.java.operators.DataSource#53e28097
PASSED: testIMain
PASSED: testIMainScript
Try using the Scala REPL that comes with Flink:
$ bin/start-scala-shell.sh local
I tried the three examples you shared (with Flink 1.7.0), and they all worked just fine.
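A plausible explanation for the difference, inferred from the logs above rather than stated in the original answer: in the IMain session, env is bound as the Java org.apache.flink.api.java.ExecutionEnvironment, so text is a Java DataSource whose map/flatMap expect MapFunction instances; Scala lambdas and placeholder syntax then fail exactly as shown in results 1 and 2. Flink's own shell binds the Scala API instead. A sketch of script text that works against the Scala API, assuming flink-scala is on the classpath:

// Use the Scala ExecutionEnvironment so map/flatMap accept Scala lambdas
// (and the implicit TypeInformation from the Scala API import is in scope).
import org.apache.flink.api.scala._

val env = ExecutionEnvironment.createLocalEnvironment()
val text = env.fromElements("Who's there?", "I think I hear them. Stand, ho! Who's there?")

val counts = text
  .flatMap(_.toLowerCase.split("\\W+").filter(_.nonEmpty))
  .map((_, 1))
  .groupBy(0)
  .sum(1)

counts.print()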

error getting files from gridfs

I'm trying to display images and files from gridfs. So I started with the save function and it's working well:
import javax.inject.Inject
import org.joda.time.DateTime
import scala.concurrent.Future
import play.api.Logger
import play.api.Play.current
import play.api.i18n.{ I18nSupport, MessagesApi }
import play.api.mvc.{ Action, Controller, Request }
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.api.libs.json.{ Json, JsObject, JsString }
import reactivemongo.api.gridfs.{ GridFS, ReadFile }
import play.modules.reactivemongo.{
MongoController, ReactiveMongoApi, ReactiveMongoComponents
}
import play.modules.reactivemongo.json._, ImplicitBSONHandlers._
import play.modules.reactivemongo.json.collection._
class griidfs @Inject() (
  val messagesApi: MessagesApi,
  val reactiveMongoApi: ReactiveMongoApi)
  extends Controller with MongoController with ReactiveMongoComponents {

  import java.util.UUID
  import MongoController.readFileReads

  type JSONReadFile = ReadFile[JSONSerializationPack.type, JsString]

  // get the collection 'articles'
  // a GridFS store named 'attachments'
  //val gridFS = GridFS(db, "attachments")
  private val gridFS = reactiveMongoApi.gridFS

  // let's build an index on our gridfs chunks collection if none
  gridFS.ensureIndex().onComplete {
    case index =>
      Logger.info(s"Checked index, result is $index")
  }

  def saveAttachment =
    Action.async(gridFSBodyParser(gridFS)) { request =>
      // here is the future file!
      val futureFile = request.body.files.head.ref
      futureFile.onFailure {
        case err => err.printStackTrace()
      }
      // when the upload is complete, we add the article id to the file entry
      // (in order to find the attachments of the article)
      val futureUpdate = for {
        file <- { println("_0"); futureFile }
        // here, the file is completely uploaded, so it is time to update the article
        updateResult <- { println("_1"); futureFile }
      } yield updateResult

      futureUpdate.map { _ =>
        Redirect(routes.Application.index())
      }.recover {
        case e => InternalServerError(e.getMessage())
      }
    }
but when I try to get files from gridfs to display them in my browser with this code:
import reactivemongo.api.gridfs.Implicits.DefaultReadFileReader

def getAttachment = Action.async { request =>
  // find the matching attachment, if any, and streams it to the client
  val file = gridFS.find[JsObject, JSONReadFile](Json.obj("_id" -> id))
  request.getQueryString("inline") match {
    case Some("true") =>
      serve[JsString, JSONReadFile](gridFS)(file, CONTENT_DISPOSITION_INLINE)
    case _ => serve[JsString, JSONReadFile](gridFS)(file)
  }
}
I get this error:
type arguments [play.api.libs.json.JsObject,griidfs.this.JSONReadFile]
do not conform to method find's type parameter bounds
[S,T <:
reactivemongo.api.gridfs.ReadFile[reactivemongo.play.json.JSONSerializationPack.type, _]]
in this line:
val file = gridFS.find[JsObject, JSONReadFile](Json.obj("_id" -> id))
Any help please?
I have been battling the same issue most of the day and finally have it up and running. I think it has something to do with the imports. I will check my work and then remove them one by one to find which one causes the issue. In the meantime, here is a list of my imports. Hope this helps:
import javax.inject.Inject
import forms._
import models._
import org.joda.time.DateTime
import play.modules.reactivemongo.json.JSONSerializationPack
import services._
import play.api.data._
import play.api.data.Forms._
import scala.async.Async._
import play.api.i18n.{ I18nSupport, MessagesApi }
import play.api.mvc.{ Action, Controller, Request }
import play.api.libs.json.{ Json, JsObject, JsString }
import reactivemongo.api.gridfs.{ GridFS, ReadFile }
import play.modules.reactivemongo.{
MongoController, ReactiveMongoApi, ReactiveMongoComponents
}
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import scala.concurrent.Future
import play.modules.reactivemongo.json._
import com.mohiva.play.silhouette.api.{ Environment, LogoutEvent, Silhouette }
import com.mohiva.play.silhouette.impl.authenticators.CookieAuthenticator
import com.mohiva.play.silhouette.impl.providers.SocialProviderRegistry
import MongoController._
Edit:
I have narrowed it down to this one: import MongoController._
Add it and you should be OK :)
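Presumably, though the answer doesn't spell it out, the wildcard import matters because MongoController exposes implicit JSON readers (such as readFileReads) that gridFS.find needs to satisfy its type parameter bounds. A minimal sketch of the call site with the import in place, adding the id parameter the original snippet leaves implicit:

import MongoController._ // brings the implicit readers into scope

def getAttachment(id: String) = Action.async { request =>
  // with the implicits visible, the type arguments conform to find's bounds
  val file = gridFS.find[JsObject, JSONReadFile](Json.obj("_id" -> id))
  request.getQueryString("inline") match {
    case Some("true") =>
      serve[JsString, JSONReadFile](gridFS)(file, CONTENT_DISPOSITION_INLINE)
    case _ =>
      serve[JsString, JSONReadFile](gridFS)(file)
  }
}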

foreach loop error reading from file

I am trying to read lines from a text file in a foreach loop, but I keep getting this error: value getLines is not a member of org.xml.sax.InputSource. Can someone explain what this error means, and how I can resolve it?
import scala.xml._
import collection.mutable.HashMap

val noDupFile = "nodup_steam_out.txt"
Source.fromFile(noDupFile).getLines().par.foreach((res: String) => {
  //....
})
import scala.xml._
import collection.mutable.HashMap

val noDupFile = "nodup_steam_out.txt"
io.Source.fromFile(noDupFile).getLines().foreach(res => {
  //....
})

Without the right import you are referring to org.xml.sax.InputSource: import scala.xml._ brings scala.xml.Source into scope, and its fromFile returns an InputSource, which has no getLines method.
io.Source.fromFile returns an io.BufferedSource, and the Iterator returned by getLines() does not have the par method (defined on Parallelizable).
You can write:
import scala.xml._
import collection.mutable.HashMap

val noDupFile = "nodup_steam_out.txt"
io.Source.fromFile(noDupFile).getLines().toStream.par.foreach(res => {
  //....
})
You seem to be trying to use the scala.xml library to iterate over your txt file. If you import scala.io._ instead, you should be able to do something like:
import scala.io._

val noDupFile = "nodup_steam_out.txt"
Source.fromFile(noDupFile).getLines().foreach { res =>
  //....
}
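One more note beyond the original answers: on Scala 2.13 and later, parallel collections moved to the separate scala-parallel-collections module, so .par needs both that dependency and an import. A sketch assuming the module is on the classpath:

import scala.collection.parallel.CollectionConverters._ // provides .par on 2.13+
import scala.io.Source

val noDupFile = "nodup_steam_out.txt"
Source.fromFile(noDupFile).getLines().toVector.par.foreach { res =>
  // process each line in parallel
}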