Fetching .kmz files from local disk and importing them into Google Earth via API

I need to import 3D models of each building in New York City into the Google Earth API. I fetched their .kmz files using google.earth.fetchKml. Since this function takes the URL of each .kmz file, and there are a great many files to import, the process is very slow.
Is there any way I can fetch these files from my local disk?
Are there other formats I can use instead of .kmz? For example .dae files?

You cannot use a local file (non-HTTP) URL to fetch your KML data.
But you could run a local webserver and use that.
For example, if you have Python installed, you can go to the directory containing your KML files and run "python -m SimpleHTTPServer 8000" (or "python -m http.server 8000" on Python 3); pointing at http://localhost:8000/myfile.kml will then load them.
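For reference, a minimal sketch of loading one of those locally served files with the plugin API. The file name building1.kmz is a made-up example, and ge is assumed to be the plugin instance from your earth.createInstance init callback:

// "ge" is the GEPlugin instance created in google.earth.createInstance's
// init callback; building1.kmz is a hypothetical file served by the
// local Python server started above.
var url = 'http://localhost:8000/building1.kmz';
google.earth.fetchKml(ge, url, function(kmlObject) {
  if (kmlObject) {
    // Add the fetched model to the Earth scene.
    ge.getFeatures().appendChild(kmlObject);
  } else {
    console.log('Failed to load ' + url);
  }
});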
That said, you should also note that the terms of use for the plugin require your site to be publicly available, amongst other things - so hopefully you are only using this setup for local testing :)

Related

Where can I find the sap-ui-cachebuster-info.json file

I'm new to OpenUI5 and I'm trying to understand how the OpenUI5 cache buster works by reading through the documentation. I don't understand where I can find the generated sap-ui-cachebuster-info.json. Am I supposed to be able to find it on my server (after the build), or can I access it and read its content in some way? Am I supposed to see it in the list of files my browser receives in the network tab? Can I read it at all?
The sap-ui-cachebuster-info.json file is generated on the server and is usually located in the server root directory.
It contains information about the version of the OpenUI5 library in use and the corresponding file names.
It is not a source file you will find in your project; the server produces it and reads it when serving the library files. You can, however, inspect the request and its content using your browser's development tools (e.g., in the network tab).
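For illustration only, the file is a plain JSON map from resource paths to cache-busting timestamps; a hypothetical excerpt (the paths and values below are made up) might look like:

{
  "resources/sap-ui-core.js": "1634556245000",
  "resources/sap/m/library.js": "1634556245000"
}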

How to import remote python files using pyscript

PyScript allows one to run Python inside a web browser. I have two Python scripts I wrote that I'd like to use. One way to do this is to copy and paste the Python code held in these files directly into the index.html file, where the index file is part of a GitHub.io page. If possible, however, I would rather load/import them from a remote location. Currently, they reside in the gh-pages branch on GitHub alongside the index.html file.
My question is whether this is possible? Most tutorials show how to load and import a local python file which I don’t want to do.
Update: This is my current attempt which I add to the index.html file:
<py-config>
[[fetch]]
from = "https://github.com/etc/blob/gh-pages/"
files = ["myadd.py"]
</py-config>
When I try this I get the error message:
(PY0001): PyScript: Access to local files (using "Paths:" in <py-config>) is not available when directly opening a HTML file; you must use a webserver to serve the additional files. See this reference on starting a simple webserver with Python.
I want to avoid starting a server because this is meant to be client-side only approach with only a dumb file repo at the other end.
There is a solution, and it's very simple: use the syntax
<py-script src="mypythonscript.py"> </py-script>
And it will pick up the file from the GitHub directory.
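Note that the src URL must return the raw Python source: a relative path resolves against your published GitHub Pages site, whereas a github.com/.../blob/... URL (as in the [[fetch]] attempt above) returns an HTML page rather than the file's contents. A minimal sketch, assuming myadd.py sits next to index.html on the gh-pages branch and the standard PyScript includes of the time:

<html>
  <head>
    <link rel="stylesheet" href="https://pyscript.net/latest/pyscript.css" />
    <script defer src="https://pyscript.net/latest/pyscript.js"></script>
  </head>
  <body>
    <!-- Loads myadd.py from the same GitHub Pages site that serves index.html -->
    <py-script src="./myadd.py"></py-script>
  </body>
</html>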

Where is JupyterLite notebook located locally on Windows?

I am using JupyterLite, which is a JupyterLab distribution that runs entirely in the browser.
However, after clearing the browser history, the files are no more visible.
Please let me know how I can retrieve the *.ipynb files from my Windows machine.
I have already checked %AppData% and I don't see any *.ipynb files.
The files are stored in, well... the browser. Specifically, in IndexedDB or localStorage. This means that the physical location on disk depends entirely on the browser that you use, rather than on the operating system, and will likely be inaccessible (for an average user) without decoding binary blobs.
For example, in Chrome you can check the path to the application data using chrome://version/ (under Profile Path), and in that directory there should be an IndexedDB folder. Then you need to find a sub-folder depending on the domain in which you accessed JupyterLite, for example https_jupyterlite.readthedocs.io_0.indexeddb.leveldb, and there you will find a LevelDB database file with the .ldb extension and a MANIFEST file (with the pointer to the current version in the CURRENT file). The details of how to extract the blobs are outside the scope of this answer, but have a look at How to access Google Chrome's IndexedDB/LevelDB files?.
However, you can use files from your file system directly in JupyterLite, without worrying about in-browser technologies, via the jupyterlab-filesystem-access extension, which uses the File System Access API; note that this API is not available in Firefox yet.
As noted by @Wayne, all of this is still quite experimental (both as in "using the newest browser APIs" and "the team of developers is still figuring out the way forward, please help by providing kind feedback and contributing").

Google Compute Startup Script PHP Files From Bucket

I'd like to automatically load a folder full of PHP files from a bucket when an instance starts up. My PHP files are normally located at /var/www/html.
How do I write a startup script for this?
I think this would be enormously useful for people such as myself who are trying to deploy autoscaling, but don't want to have to create a new image with their PHP files every time they want to deploy changes. It would also be useful as a way of keeping a live backup on cloud storage.
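A minimal sketch of such a startup script, assuming a hypothetical bucket name my-bucket with a folder html inside it (gsutil is preinstalled on the standard Compute Engine images):

#!/bin/bash
# Sync the PHP files from Cloud Storage into the web root at boot.
# gs://my-bucket/html is a placeholder; substitute your own bucket/path.
mkdir -p /var/www/html
gsutil -m rsync -r gs://my-bucket/html /var/www/html

You would attach it to the instance or instance template via the startup-script metadata key, for example with gcloud compute instances create ... --metadata-from-file startup-script=startup.sh. Because it runs on every boot, newly autoscaled instances pick up the current bucket contents automatically.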

Can Google Package App use external directories during packing?

I am writing a number of Google Packaged Apps which run independently, but share lots of code. For example, they all use "library.js". I would like to have only one copy of library.js so any changes to it will be used by all newly packed apps.
To package my apps, it seems they all must have a copy of library.js in their own directory structure, whereas it would be nice to have a single master copy in some other directory that is accessible to all. I currently do a manual check to make sure all files are up-to-date before packing, and I am writing some code to do the check automatically, but it seems like a work-around.
Can a Google Packaged App use JS code in external library directories, or must all code be under the root directory of the app (i.e., requiring copying from external directory) when packing?
Have you tried providing a URL? That is, host the JavaScript file as a .js file at a location accessible to all your apps, then reference that URL in each app's code. The next time you want to make a change, all you have to do is update that one .js file.
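A sketch of what that reference would look like in each app's HTML, with a hypothetical host (and assuming the app's Content Security Policy permits loading scripts from that origin):

<!-- Shared copy of library.js, hosted once and referenced by every app -->
<script src="https://example.com/shared/library.js"></script>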