NativeScript and Angular 2 app breaks when making HTTP calls - angular2-http

After updating to tns-core-modules 2.2.0 and Angular RC4 (the version officially released by Telerik), my app can no longer make HTTP calls to a server. I keep getting this error:
com.tns.NativeScriptException:
Calling js method onClick failed
EXCEPTION: Error in /data/data/org.nativescript.EatSafe/files/app/pages/login/login.html:5:75
ORIGINAL EXCEPTION: Error: not implemented
ORIGINAL STACKTRACE:
Error: not implemented
at NativeScriptDomAdapter.Parse5DomAdapter.getCookie (/data/data/org.nativescript.EatSafe/files/app/tns_modules/@angular/platform-server/src/parse5_adapter.js:619:68)
at CookieXSRFStrategy.configureRequest (/data/data/org.nativescript.EatSafe/files/app/tns_modules/@angular/http/src/backends/xhr_backend.js:150:82)
at XHRBackend.createConnection (/data/data/org.nativescript.EatSafe/files/app/tns_modules/@angular/http/src/backends/xhr_backend.js:165:28)
at httpRequest (/data/data/org.nativescript.EatSafe/files/app/tns_modules/@angular/http/src/http.js:22:20)
at Http.post (/data/data/org.nativescript.EatSafe/files/app/tns_modules/@angular/http/src/http.js:78:16)
at UserService.signin (/data/data/org.nativescript.EatSafe/files/app/shared/services/user.service.js:13:27)
at LoginComponent.login (/data/data/org.nativescript.EatSafe/files/app/pages/login/login.component.js:31:27)
at DebugAppView._View_LoginComponent0._handle_tap_8_0 (LoginComponent.template.js:355:28)
at /data/data/org.nativescript.EatSafe/files/app/tns_modules/@angular/core/src/linker/view.js:375:24
at /data/data/org.nativescript.EatSafe/files/app/tns_modules/nativescript-angular/renderer.js:204:26
ERROR CONTEXT:
[object Object]
File: "/data/data/org.nativescript.EatSafe/files/app/tns_modules/#angular/core/src/linker/view.js, line: 365, column: 16
StackTrace:
Frame: function:'DebugAppView._rethrowWithContext', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/#angular/core/src/linker/view.js', line: 365, column: 17
Frame: function:'', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/#angular/core/src/linker/view.js', line: 378, column: 23
Frame: function:'', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/nativescript-angular/renderer.js', line: 204, column: 26
Frame: function:'ZoneDelegate.invoke', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/zone.js/dist/zone-node.js', line: 290, column: 29
Frame: function:'NgZoneImpl.inner.inner.fork.onInvoke', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/#angular/core/src/zone/ng_zone_impl.js', line: 53, column: 41
Frame: function:'ZoneDelegate.invoke', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/zone.js/dist/zone-node.js', line: 289, column: 35
Frame: function:'Zone.run', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/zone.js/dist/zone-node.js', line: 183, column: 44
Frame: function:'NgZoneImpl.runInner', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/#angular/core/src/zone/ng_zone_impl.js', line: 84, column: 71
Frame: function:'NgZone.run', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/#angular/core/src/zone/ng_zone.js', line: 235, column: 66
Frame: function:'zonedCallback', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/nativescript-angular/renderer.js', line: 203, column: 24
Frame: function:'Observable.notify', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/data/observable/observable.js', line: 174, column: 23
Frame: function:'Observable._emit', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/data/observable/observable.js', line: 193, column: 18
Frame: function:'_android.setOnClickListener.android.view.View.OnClickListener.onClick', file:'/data/data/org.nativescript.EatSafe/files/app/tns_modules/ui/button/button.js', line: 33, column: 32
at com.tns.Runtime.callJSMethodNative(Native Method)
at com.tns.Runtime.dispatchCallJSMethodNative(Runtime.java:862)
at com.tns.Runtime.callJSMethodImpl(Runtime.java:727)
at com.tns.Runtime.callJSMethod(Runtime.java:713)
at com.tns.Runtime.callJSMethod(Runtime.java:694)
at com.tns.Runtime.callJSMethod(Runtime.java:684)
at com.tns.gen.android.view.View_OnClickListener.onClick(View_OnClickListener.java:11)
at android.view.View.performClick(View.java:5233)
at android.view.View$PerformClick.run(View.java:21209)
at android.os.Handler.handleCallback(Handler.java:739)
at android.os.Handler.dispatchMessage(Handler.java:95)
at android.os.Looper.loop(Looper.java:152)
at android.app.ActivityThread.main(ActivityThread.java:5507)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:726)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:616)
I have been trying to find release notes for the update to see whether there are any breaking changes, but to no avail. Does anyone have any pointers on how to make HTTP calls with the new NativeScript Angular updates?
Thank you

I got this fixed by importing the following in app.module.ts:
import {NativeScriptHttpModule} from 'nativescript-angular/http';
And then
imports: [NativeScriptHttpModule]

You need this code in your main.ts file:
import {Parse5DomAdapter} from '@angular/platform-server/src/parse5_adapter';
(<any>Parse5DomAdapter).prototype.getCookie = function (name) { return null; };

Related

pyflink with kafka java.lang.RuntimeException: Failed to create stage bundle factory

・Python 3.8
・JDK 11
I've started learning PyFlink and wrote some code following the official introduction at https://nightlies.apache.org/flink/flink-docs-master/docs/dev/python/datastream/intro_to_datastream_api/
Here is my code:
from pyflink.common.serialization import JsonRowDeserializationSchema,JsonRowSerializationSchema
from pyflink.common import WatermarkStrategy, Row
from pyflink.common.serialization import Encoder
from pyflink.common.typeinfo import Types
from pyflink.datastream import StreamExecutionEnvironment
from pyflink.datastream.connectors import FlinkKafkaConsumer,FlinkKafkaProducer
def streaming():
    env = StreamExecutionEnvironment.get_execution_environment()
    deserialization_schema = JsonRowDeserializationSchema.builder().type_info(
        type_info=Types.ROW([Types.INT(), Types.STRING()])).build()
    kafka_consumer = FlinkKafkaConsumer(
        topics='test',
        deserialization_schema=deserialization_schema,
        properties={'bootstrap.servers': 'localhost:9092', 'group.id': 'test_group'})
    ds = env.add_source(kafka_consumer)
    ds = ds.map(lambda a: Row(a % 4, 1),
                output_type=Types.ROW([Types.LONG(), Types.LONG()])) \
        .key_by(lambda a: a[0]) \
        .reduce(lambda a, b: Row(a[0], a[1] + b[1]))
    serialization_schema = JsonRowSerializationSchema.builder().with_type_info(
        type_info=Types.ROW([Types.LONG(), Types.LONG()])).build()
    kafka_sink = FlinkKafkaProducer(
        topic='test_sink_topic',
        serialization_schema=serialization_schema,
        producer_config={'bootstrap.servers': 'localhost:9092',
                         'group.id': 'test_group'})
    ds.add_sink(kafka_sink)
    env.execute('datastream_api_demo')

if __name__ == '__main__':
    streaming()
At first it told me to specify a jar file, so I downloaded the flink-connector-kafka and kafka-clients jars from https://mvnrepository.com/artifact/org.apache.flink and put them into the pyflink/lib directory.
Now, at the next step, I'm getting this error:
(pyflink_demo) C:\work\pyflink_demo>python Kafka_stream_Kafka.py
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.flink.api.java.ClosureCleaner (file:/C:/work/pyflink_demo/Lib/site-packages/pyflink/lib/flink-dist_2.11-1.14.4.jar) to field java.util.Properties.serialVersionUID
WARNING: Please consider reporting this to the maintainers of org.apache.flink.api.java.ClosureCleaner
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Traceback (most recent call last):
File "Kafka_stream_Kafka.py", line 38, in <module>
streaming()
File "Kafka_stream_Kafka.py", line 33, in streaming
env.execute('datastream_api_demo')
File "C:\work\pyflink_demo\lib\site-packages\pyflink\datastream\stream_execution_environment.py", line 691, in execute
return JobExecutionResult(self._j_stream_execution_environment.execute(j_stream_graph))
File "C:\work\pyflink_demo\lib\site-packages\py4j\java_gateway.py", line 1285, in __call__
return_value = get_return_value(
File "C:\work\pyflink_demo\lib\site-packages\pyflink\util\exceptions.py", line 146, in deco
return f(*a, **kw)
File "C:\work\pyflink_demo\lib\site-packages\py4j\protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling o0.execute.
: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.runtime.minicluster.MiniClusterJobClient.lambda$getJobExecutionResult$3(MiniClusterJobClient.java:137)
at java.base/java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:642)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.rpc.akka.AkkaInvocationHandler.lambda$invokeRpc$1(AkkaInvocationHandler.java:258)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.util.concurrent.FutureUtils.doForward(FutureUtils.java:1389)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$null$1(ClassLoadingUtils.java:93)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.lambda$guardCompletionWithContextClassLoader$2(ClassLoadingUtils.java:92)
at java.base/java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.base/java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.base/java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.base/java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:2073)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$1.onComplete(AkkaFutureUtils.java:47)
at akka.dispatch.OnComplete.internal(Future.scala:300)
at akka.dispatch.OnComplete.internal(Future.scala:297)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:224)
at akka.dispatch.japi$CallbackBridge.apply(Future.scala:221)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at org.apache.flink.runtime.concurrent.akka.AkkaFutureUtils$DirectExecutionContext.execute(AkkaFutureUtils.java:65)
at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:68)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.$anonfun$tryComplete$1$adapted(Promise.scala:284)
at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:284)
at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:621)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:24)
at akka.pattern.PipeToSupport$PipeableFuture$$anonfun$pipeTo$1.applyOrElse(PipeToSupport.scala:23)
at scala.concurrent.Future.$anonfun$andThen$1(Future.scala:532)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:29)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:29)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:60)
at akka.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:63)
at akka.dispatch.BatchingExecutor$BlockableBatch.$anonfun$run$1(BatchingExecutor.scala:100)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:81)
at akka.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:100)
at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:49)
at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:48)
at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:252)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:242)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:684)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
... 5 more
Caused by: java.lang.RuntimeException: Failed to create stage bundle factory! INFO:root:Initializing Python harness: C:\work\pyflink_demo\lib\site-packages\pyflink\fn_execution\beam\beam_boot.py --id=4-1 --provision_endpoint=localhost:51794
INFO:root:Starting up Python harness in loopback mode.
at org.apache.flink.streaming.api.runners.python.beam.BeamPythonFunctionRunner.createStageBundleFactory(BeamPythonFunctionRunner.java:566)
at org.apache.flink.streaming.api.runners.python.beam.BeamPythonFunctionRunner.open(BeamPythonFunctionRunner.java:255)
at org.apache.flink.streaming.api.operators.python.AbstractPythonFunctionOperator.open(AbstractPythonFunctionOperator.java:131)
at org.apache.flink.streaming.api.operators.python.AbstractOneInputPythonFunctionOperator.open(AbstractOneInputPythonFunctionOperator.java:116)
at org.apache.flink.streaming.api.operators.python.PythonProcessOperator.open(PythonProcessOperator.java:59)
at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.initializeStateAndOpenOperators(RegularOperatorChain.java:110)
at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreGates(StreamTask.java:711)
at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.call(StreamTaskActionExecutor.java:100)
at org.apache.flink.streaming.runtime.tasks.StreamTask.restoreInternal(StreamTask.java:687)
at org.apache.flink.streaming.runtime.tasks.StreamTask.restore(StreamTask.java:654)
at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958)
at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:927)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:766)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:575)
at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalStateException: Process died with exit code 0
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4964)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:451)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory.<init>(DefaultJobBundleFactory.java:436)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory.forStage(DefaultJobBundleFactory.java:303)
at org.apache.flink.streaming.api.runners.python.beam.BeamPythonFunctionRunner.createStageBundleFactory(BeamPythonFunctionRunner.java:564)
... 14 more
Caused by: java.lang.IllegalStateException: Process died with exit code 0
at org.apache.beam.runners.fnexecution.environment.ProcessManager$RunningProcess.isAliveOrThrow(ProcessManager.java:75)
at org.apache.beam.runners.fnexecution.environment.ProcessEnvironmentFactory.createEnvironment(ProcessEnvironmentFactory.java:112)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:252)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$1.load(DefaultJobBundleFactory.java:231)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 22 more
I tried to figure out what's going on and found a very similar question: What's wrong with my Pyflink setup that Python UDFs throw py4j exceptions?
It says the problem there was caused by a network proxy: the JVM and Python communicate over a local socket, so local communication must be excluded from the proxy.
I set the environment variable "no_proxy", but it doesn't work.
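A sketch of what that workaround amounts to, assuming it means excluding local traffic from the proxy (the exact value is an assumption, and as noted it did not help here):
import os
# Keep the local JVM <-> Python socket communication away from any configured proxy.
os.environ['no_proxy'] = '*'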
Could anyone provide a solution for this?
There is no useful information in the exception stack to help identify the problem. This should be caused by a known issue (FLINK-26543, already fixed, but not yet released). The issue only occurs in loopback mode, which is enabled by default when executing the job locally.
For now, you could force the job to run in process mode instead of loopback mode by setting the environment variable _python_worker_execution_mode to process. After doing this, you should see the root cause of the failure.
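A minimal sketch of that workaround, assuming the variable is set in the driver before the job is built and executed:
import os
# Force Python workers into process mode until the FLINK-26543 fix is released.
os.environ['_python_worker_execution_mode'] = 'process'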
Besides, there is also a small issue in your code: I guess you meant ds.map(lambda a: Row(a[0] % 4, 1), output_type=Types.ROW([Types.LONG(), Types.LONG()])) instead of ds.map(lambda a: Row(a % 4, 1), output_type=Types.ROW([Types.LONG(), Types.LONG()])), since the % operation is not supported on a Row object.
I have tried the script but am not quite sure what caused the error. Try starting Kafka and creating the topics before running the script, or start Kafka and run the script a second time after the first failure.
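As an aside, instead of copying the connector jars into pyflink/lib, you can register them on the environment explicitly with add_jars; the file:/// paths below are placeholders for wherever the downloaded jars actually live:
# Register the Kafka connector jars explicitly instead of relying on pyflink/lib.
env.add_jars("file:///path/to/flink-connector-kafka_2.11-1.14.4.jar",
             "file:///path/to/kafka-clients-2.8.1.jar")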

pyspark issue while connecting

I am new to PySpark.
I was trying to initialize a PySpark session, but I am getting the error below. I am running the pyspark2 command on my local machine.
When I first tried Scala, the Spark session came up correctly; it is only when I invoke PySpark that I get the error. Please let me know how I can get past this.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/08 22:55:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/03/08 22:55:41 WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor). This may indicate an error, since only one SparkContext should be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:238)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
py4j.ClientServerConnection.run(ClientServerConnection.java:106)
java.base/java.lang.Thread.run(Thread.java:833)
C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\bin\..\python\pyspark\shell.py:42: UserWarning: Failed to initialize Spark session.
warnings.warn("Failed to initialize Spark session.")
Traceback (most recent call last):
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\bin\..\python\pyspark\shell.py", line 38, in <module>
spark = SparkSession._create_shell_session() # type: ignore
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\pyspark\sql\session.py", line 553, in _create_shell_session
return SparkSession.builder.getOrCreate()
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\pyspark\sql\session.py", line 228, in getOrCreate
sc = SparkContext.getOrCreate(sparkConf)
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\pyspark\context.py", line 392, in getOrCreate
SparkContext(conf=conf or SparkConf())
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\pyspark\context.py", line 146, in __init__
self._do_init(master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\pyspark\context.py", line 209, in _do_init
self._jsc = jsc or self._initialize_context(self._conf._jconf)
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\pyspark\context.py", line 329, in _initialize_context
return self._jvm.JavaSparkContext(jconf)
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\lib\py4j-0.10.9.3-src.zip\py4j\java_gateway.py", line 1585, in __call__
return_value = get_return_value(
File "C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\python\lib\py4j-0.10.9.3-src.zip\py4j\protocol.py", line 326, in get_return_value
raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:247)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:238)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
at java.base/java.lang.Thread.run(Thread.java:833)
C:\Spark\spark-3.2.1-bin-hadoop3.2\spark-3.2.1-bin-hadoop3.2\bin>SUCCESS: The process with PID 21928 (child process of PID 14900) has been terminated.
SUCCESS: The process with PID 14900 (child process of PID 31720) has been terminated.
SUCCESS: The process with PID 31720 (child process of PID 10468) has been terminated.

AWS Glue Job Method pyWriteDynamicFrame does not exist

My goal is to read a DataFrame from an existing catalog table, make some transformations, and create a new table out of it. So, following https://docs.aws.amazon.com/glue/latest/dg/update-from-job.html, I use the sink.writeFrame method:
datasource0 = glueContext.create_dynamic_frame.from_catalog(database = "my_db", table_name = "table1", transformation_ctx = "datasource0")
datasource1 = datasource0.toDF().withColumn("date", current_date().cast("string"))
datasource2 = DynamicFrame.fromDF(datasource1, glueContext, "datasource2")
sink = glueContext.getSink(connection_type="s3", path="s3://my_bucket/output", enableUpdateCatalog=True)
sink.setFormat("json")
sink.setCatalogInfo(catalogDatabase='my_db', catalogTableName='table2')
sink.writeFrame(datasource2)
job.commit()
But as a result I get a misleading error saying that the method pyWriteDynamicFrame doesn't exist:
Traceback (most recent call last):
File "/tmp/test", line 39, in <module>
sink.writeFrame(datasource1)
File "/opt/amazon/lib/python3.6/site-packages/awsglue/data_sink.py", line 31, in writeFrame
return DynamicFrame(self._jsink.pyWriteDynamicFrame(dynamic_frame._jdf, callsite(), info), dynamic_frame.glue_ctx, dynamic_frame.name + "_errors")
File "/opt/amazon/spark/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
answer, self.gateway_client, self.target_id, self.name)
File "/opt/amazon/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/opt/amazon/spark/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 332, in get_return_value
format(target_id, ".", name, value))
py4j.protocol.Py4JError: An error occurred while calling o75.pyWriteDynamicFrame. Trace:
py4j.Py4JException: Method pyWriteDynamicFrame([class org.apache.spark.sql.Dataset, class java.lang.String, class java.lang.String]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
at py4j.Gateway.invoke(Gateway.java:274)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Thread.java:748)
Versions:
Spark: 2.4, Python: 3, Glue: 2
You can use the Glue native transformation Map class, which builds a new DynamicFrame by applying a function to all records in the input DynamicFrame.
So in your case, to derive a column date, you can use the snippet below to achieve it.
from datetime import datetime
from awsglue.transforms import Map

def addDate(d):
    d["date"] = datetime.today()
    return d

datasource1 = Map.apply(frame = datasource0, f = addDate)
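If you want the new column to match the original withColumn("date", current_date().cast("string")), a small variant of the same Map function stores the date as a string instead (the exact format string here is an assumption):
def addDate(d):
    # Store the date as a string, mirroring the cast("string") in the question.
    d["date"] = datetime.today().strftime("%Y-%m-%d")
    return d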

Method col([class java.util.ArrayList]) does not exist

Getting this exception on the code below.
Code:
from pyspark.sql.functions import when
from pyspark.sql.types import DecimalType

# Fragment of a larger select on the INV DataFrame:
INV.select(
    when(INV['VFA_EXTRA_AM'].isNull(), None)
    .otherwise((INV['VFA_EXTRA_AM'].cast(DecimalType(12, 2)) / INV['EXTRAADULT_COUNT_CHECK']
                ).cast(DecimalType(12, 2))).alias("PER_PRSN_VFA_EXTRA_AM"),
    when(INV['VFC_EXTRA_AM'].isNull(), None)
    .otherwise((INV['VFC_EXTRA_AM'].cast(DecimalType(12, 2)) / INV['EXTRACHILD_COUNT_CHECK']
                ).cast(DecimalType(12, 2))).alias("PER_PRSN_VFC_EXTRA_AM"),
    when(INV['VFI_EXTRA_AM'].isNull(), None)
    .otherwise((INV['VFI_EXTRA_AM'].cast(DecimalType(12, 2)) / INV['EXTRAINFANT_COUNT_CHECK']
                ).cast(DecimalType(12, 2))).alias("PER_PRSN_VFI_EXTRA_AM"))
INV is the DataFrame name.
Error log:
File "/mnt/dclrms-cogs/resbaseline/Integration.py", line 52, in execueIntegration
).cast(DecimalType(12, 2))).alias("PER_PRSN_VFI_EXTRA_AM"))\
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/dataframe.py", line 1040, in select
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/dataframe.py", line 895, in _jcols
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/dataframe.py", line 882, in _jseq
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/column.py", line 60, in _to_seq
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/column.py", line 48, in _to_java_column
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/column.py", line 41, in _create_column_from_name
File "/usr/lib/spark/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
File "/usr/lib/spark/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
File "/usr/lib/spark/python/lib/py4j-0.10.4-src.zip/py4j/protocol.py", line 323, in get_return_value
py4j.protocol.Py4JError: An error occurred while calling z:org.apache.spark.sql.functions.col. Trace:
py4j.Py4JException: Method col([class java.util.ArrayList]) does not exist
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:339)
at py4j.Gateway.invoke(Gateway.java:274)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.lang.Thread.run(Thread.java:748)

jxbrowser on macosx using SWT deadlock

I have finished evaluating JxBrowser integration with Eclipse for Windows and Linux. My last roadblock is the Mac: I get a deadlock while instantiating the browser. I am new to Cocoa programming, so any pointers will be greatly appreciated.
I took the example project and modified it to work with SWT.
import java.awt.BorderLayout;
import java.awt.Frame;
import org.eclipse.swt.SWT;
import org.eclipse.swt.awt.SWT_AWT;
import org.eclipse.swt.widgets.Composite;
import org.eclipse.swt.widgets.Display;
import org.eclipse.swt.widgets.Shell;
import com.teamdev.jxbrowser.chromium.Browser;
import com.teamdev.jxbrowser.chromium.swing.BrowserView;
public class JxBrowserApp {
    public static void main(String[] args) {
        final Display display = new Display();
        final Shell shell = new Shell(display);
        Composite composite = new Composite(shell, SWT.EMBEDDED | SWT.NO_BACKGROUND);
        Frame frame = SWT_AWT.new_Frame(composite);
        Browser browser = new Browser(); // deadlock here
        BrowserView browserView = new BrowserView(browser);
        frame.add(browserView, BorderLayout.CENTER);
        frame.setLocationRelativeTo(null);
        frame.setFocusable(true);
        browser.loadURL("http://www.google.com");
        shell.pack();
        shell.open();
        while (!shell.isDisposed()) {
            if (!display.readAndDispatch()) display.sleep();
        }
        display.dispose();
    }
}
Here is the call stack:
JxBrowserApp at localhost:57970
Thread [main] (Suspended)
owns: IPC (id=72)
Unsafe.park(boolean, long) line: not available [native method]
LockSupport.parkNanos(Object, long) line: 215
CountDownLatch$Sync(AbstractQueuedSynchronizer).doAcquireSharedNanos(int, long) line: 1037
CountDownLatch$Sync(AbstractQueuedSynchronizer).tryAcquireSharedNanos(int, long) line: 1328
CountDownLatch.await(long, TimeUnit) line: 277
LatchUtil.await(CountDownLatch, RuntimeException, int) line: 25
LatchUtil.await(CountDownLatch, RuntimeException) line: 20
IPC.a(boolean) line: 159
IPC.start() line: 128
Browser.<init>(BrowserType, BrowserContext, Channel) line: 195
Browser.<init>(BrowserType, BrowserContext) line: 172
Browser.<init>(BrowserContext) line: 139
Browser.<init>() line: 125
JxBrowserApp.main(String[]) line: 27
Thread [AWT-EventQueue-0] (Suspended)
CPlatformView.nativeCreateView(int, int, int, int, long) line: not available [native method]
CPlatformView.initialize(LWWindowPeer, CPlatformResponder) line: 61
CViewPlatformEmbeddedFrame.initialize(Window, LWWindowPeer, PlatformWindow) line: 55
LWWindowPeer.<init>(Window, PlatformComponent, PlatformWindow, LWWindowPeer$PeerType) line: 156
LWCToolkit(LWToolkit).createDelegatedPeer(Window, PlatformComponent, PlatformWindow, LWWindowPeer$PeerType) line: 210
LWCToolkit.createEmbeddedFrame(CViewEmbeddedFrame) line: 204
CViewEmbeddedFrame.addNotify() line: 55
SWT_AWT$1.run() line: not available
InvocationEvent.dispatch() line: 311
EventQueue.dispatchEventImpl(AWTEvent, Object) line: 756
EventQueue.access$500(EventQueue, AWTEvent, Object) line: 97
EventQueue$3.run() line: 709
EventQueue$3.run() line: 703
AccessController.doPrivileged(PrivilegedAction<T>, AccessControlContext) line: not available [native method]
ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(PrivilegedAction<T>, AccessControlContext, AccessControlContext) line: 76
EventQueue.dispatchEvent(AWTEvent) line: 726
EventDispatchThread.pumpOneEventForFilters(int) line: 201
EventDispatchThread.pumpEventsForFilter(int, Conditional, EventFilter) line: 116
EventDispatchThread.pumpEventsForHierarchy(int, Conditional, Component) line: 105
EventDispatchThread.pumpEvents(int, Conditional) line: 101
EventDispatchThread.pumpEvents(Conditional) line: 93
EventDispatchThread.run() line: 82
Daemon Thread [Timer-0] (Suspended)
waiting for: TaskQueue (id=87)
Object.wait(long) line: not available [native method]
TimerThread.mainLoop() line: 552
TimerThread.run() line: 505
Daemon Thread [IPC Server Thread] (Suspended)
PlainSocketImpl.socketAccept(SocketImpl) line: not available [native method]
SocksSocketImpl(AbstractPlainSocketImpl).accept(SocketImpl) line: 409
ServerSocket.implAccept(Socket) line: 545
ServerSocket.accept() line: 513
Server.start(int) line: 81
e.run() line: 1237
Thread.run() line: 745
Daemon Thread [IPC Process Thread] (Suspended)
CGraphicsDevice.nativeGetScreenInsets(int) line: not available [native method]
CGraphicsDevice.getScreenInsets() line: 128
LWCToolkit.getScreenInsets(GraphicsConfiguration) line: 407
SwingUtilities$SharedOwnerFrame(Window).init(GraphicsConfiguration) line: 506
SwingUtilities$SharedOwnerFrame(Window).<init>() line: 537
SwingUtilities$SharedOwnerFrame(Frame).<init>(String) line: 420
SwingUtilities$SharedOwnerFrame(Frame).<init>() line: 385
SwingUtilities$SharedOwnerFrame.<init>() line: 1758
SwingUtilities.getSharedOwnerFrame() line: 1833
JWindow.<init>(Frame) line: 187
JWindow.<init>() line: 139
InternalChromiumProcess.doStart(List<String>) line: 1078
InternalChromiumProcess(ChromiumProcess).start(int) line: 123
d.run() line: 184
Thread.run() line: 745