Import Leaflet.awesome-marker in angular/cli component - leaflet

I want to use the Leaflet.awesome-marker plugin in my angular project.
I installed the package through yarn and imported it in my component using
import * as awesome from 'leaflet.awesome-marker';
But I receive the following error:
Cannot find module 'leaflet.awesome-marker'
Doing the same thing with the geojson module works fine; why not with this one?

Try importing leaflet.awesome-markers, not leaflet.awesome-marker (note the trailing "s" in the package name).
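For reference, here is a minimal sketch of how the corrected import is typically used in an Angular component; it assumes the plugin's CSS and an icon font such as Font Awesome are loaded separately (for example via the styles array in angular.json), and the element id, icon name, color, and coordinates are placeholders:
import * as L from 'leaflet';
import 'leaflet.awesome-markers'; // note the trailing "s"; imported for its side effect of extending L
// The plugin registers itself on the L namespace rather than exposing named exports.
// Cast through any if your Leaflet typings don't know about AwesomeMarkers.
const awesomeIcon = (L as any).AwesomeMarkers.icon({
  icon: 'coffee',      // placeholder Font Awesome icon name
  prefix: 'fa',
  markerColor: 'red'
});
const map = L.map('map').setView([51.5, -0.09], 13); // assumes a <div id="map"> in the template
L.marker([51.5, -0.09], { icon: awesomeIcon }).addTo(map);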

Related

QGIS plugin - getting ModuleNotFoundError when importing specifically python-ternary

I am building a QGIS plugin and I'd like to import a module named python-ternary. The issue is specific to this module (other modules import fine) and to QGIS (it imports fine outside of QGIS).
I installed it using pip and tried importing it outside of QGIS; that works fine.
Now, with this in my plugin's main file:
from qgis.PyQt.QtCore import QSettings, QTranslator, QCoreApplication, Qt
from qgis.PyQt.QtGui import QIcon, QColor
from qgis.PyQt.QtWidgets import QAction, QApplication
from .resources import *
import psycopg2
import ternary
I get this when loading my plugin with the QGIS plugin manager:
QGIS error when loading the plugin
Now, the python-ternary module is indeed installed in lib\site-packages:
ternary file in lib\site-packages
And lib\site-packages is on the path, as shown in the QGIS error.
I have no problem importing other modules in QGIS; psycopg2, for example, imports without issue. I force-reinstalled python-ternary with pip, to no avail.
Is there a compatibility issue with this particular module that I'm not aware of?

ModuleNotFoundError: No module named 'tflite_support.task'

The tflite_support task library is missing. I've installed tflite_support with pip install tflite-support. I've tried using the help() function to get the package contents with help(tflite_support) and got this output:
PACKAGE CONTENTS
_pywrap_codegen
_pywrap_flatbuffers
codegen
flatbuffers (package)
metadata
metadata_schema_py_generated
schema_py_generated
There is no task library inside, unlike what the TFLite website shows at https://www.tensorflow.org/lite/inference_with_metadata/task_library/object_detector#run_inference_in_python. I get the same result doing it on my Windows PC. Am I doing anything wrong, or is the task library just missing?
I'm using tflite-support 0.4.1 and it looks like the task module is not supported on Windows:
import flatbuffers
import platform
from tensorflow_lite_support.metadata import metadata_schema_py_generated
from tensorflow_lite_support.metadata import schema_py_generated
from tensorflow_lite_support.metadata.python import metadata
from tflite_support import metadata_writers
if platform.system() != 'Windows':
    # Task Library is not supported on Windows yet.
    from tflite_support import task
There's also a note about it in the task_library docs.

Can not find ionic-tooltips module

When I import TooltipsModule from ionic-tooltips, I get the error "cannot find module 'ionic-tooltips'".
I installed the library with npm i ionic-tooltips.
This is the link for the library
https://www.npmjs.com/package/ionic-tooltips
Here is an image of the import statement.
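The image isn't included above, but judging from the ionic-tooltips README the import presumably looks something like this sketch (the app module shown is a generic placeholder, not the asker's actual code):
import { NgModule } from '@angular/core';
import { TooltipsModule } from 'ionic-tooltips'; // the import that fails with "cannot find module"
@NgModule({
  imports: [
    TooltipsModule // registered in the NgModule imports array
  ]
})
export class AppModule {}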

Can't import the Svg library in elm?

Trying to use Svg and Svg.Attributes. Getting the error message
I cannot find module 'Svg'.
Module 'Main' is trying to import it.
Potential problems could be:
* Mispelled the module name
* Need to add a source directory or new dependency to elm-package.json
I'm certain that there aren't any spelling errors because I copied and pasted the imports from a tutorial. Where do I install this library?
The tutorial I'm going through is the one on elm-lang.org, specifically the section on time.
You need the elm-lang/svg package as a dependency in your elm-package.json. Run elm package install elm-lang/svg in the project directory.

How to import libraries in Spark Notebook

I'm having trouble importing magellan-1.0.4-s_2.11 in Spark Notebook. I've downloaded the jar from https://spark-packages.org/package/harsha2010/magellan and have tried placing SPARK_HOME/bin/spark-shell --packages harsha2010:magellan:1.0.4-s_2.11 in the "Start of Customized Settings" section of the spark-notebook file in the bin folder.
Here are my imports
import magellan.{Point, Polygon, PolyLine}
import magellan.coord.NAD83
import org.apache.spark.sql.magellan.MagellanContext
import org.apache.spark.sql.magellan.dsl.expressions._
import org.apache.spark.sql.Row
import org.apache.spark.sql.types._
And my errors...
<console>:71: error: object Point is not a member of package org.apache.spark.sql.magellan
import magellan.{Point, Polygon, PolyLine}
^
<console>:72: error: object coord is not a member of package org.apache.spark.sql.magellan
import magellan.coord.NAD83
^
<console>:73: error: object MagellanContext is not a member of package org.apache.spark.sql.magellan
import org.apache.spark.sql.magellan.MagellanContext
I then tried to import the new library like any other library by placing it into the main script like so:
$lib_dir/magellan-1.0.4-s_2.11.jar"
This didn't work and I'm left scratching my head wondering what I've done wrong. How do I import libraries such as magellan into spark notebook?
Try evaluating something like
:dp "harsha2010" % "magellan" % "1.0.4-s_2.11"
It will load the library into Spark, allowing it to be imported, assuming it can be obtained through the Maven repo. In my case it failed with a message:
failed to load 'harsha2010:magellan:jar:1.0.4-s_2.11 (runtime)' from ["Maven2 local (file:/home/dev/.m2/repository/, releases+snapshots) without authentication", "maven-central (http://repo1.maven.org/maven2/, releases+snapshots) without authentication", "spark-packages (http://dl.bintray.com/spark-packages/maven/, releases+snapshots) without authentication", "oss-sonatype (https://oss.sonatype.org/content/repositories/releases/, releases+snapshots) without authentication"] into /tmp/spark-notebook/aether/b2c7d8c5-1f56-4460-ad39-24c4e93a9786
I think the file was too big and the connection was interrupted before the whole file could be downloaded.
Workaround
So I downloaded the JAR manually from:
http://dl.bintray.com/spark-packages/maven/harsha2010/magellan/1.0.4-s_2.11/
and copied it into:
/tmp/spark-notebook/aether/b2c7d8c5-1f56-4460-ad39-24c4e93a9786/harsha2010/magellan/1.0.4-s_2.11
And then the :dp command worked. Try calling it first, and if it fails, copy the JAR into the right path to make things work.
Better solution
I should investigate why the download failed and fix that in the first place... or put that library in my local M2 repo. But that should get you going.
I would suggest checking this:
https://github.com/spark-notebook/spark-notebook/blob/master/docs/metadata.md#import-download-dependencies
and
https://github.com/spark-notebook/spark-notebook/blob/master/docs/metadata.md#add-spark-packages
I think the :dp magic command is deprecated; instead, you should add your custom dependencies in the notebook metadata. Go to the menu Edit > Edit notebook metadata and add something like:
"customDeps": [
"harsha2010 % magellan % 1.0.4-s_2.11"
]
Once done, you will need to restart the kernel. You can then check in the browser console whether the package is being downloaded properly.
The easy way: set or add the EXTRA_CLASSPATH environment variable so that it points to the .jar file you downloaded:
export EXTRA_CLASSPATH=</link/to/your.jar>
or, on Windows:
set EXTRA_CLASSPATH=</link/to/your.jar>
You can find the detailed solution here.