Trying to attach a GCS bucket to Datalore - google-cloud-storage

(I asked this also on Datalore's forum. There doesn't seem to be much going on there -- so I'm asking here in the hope of a better/quicker response.)
I'm having trouble attaching GCS buckets. The documentation is sparse. All that I could find is here, which simply says:
In the New datasource dialog, fill in the fields and click Save and close.
Here's that dialog; I'm not sure what information to put in each field.
What I tried
GCS datasource name
I believe this is just a label used within Datalore, correct? So can I put anything here? I wrote "patant-data-ors".
bucket
Options I tried:
1. patent-data-ors (this is the name of the bucket)
2. storage.googleapis.com/patent-data-ors
3. patent-data-ors.storage.googleapis.com
I also tried options 2 and 3 with an https:// prefix.
key_file_content
I left it blank. I'm guessing it's for private buckets? Mine is publicly accessible.
What am I doing wrong?
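In case it helps to see how the forms I tried relate to each other: the bare bucket name is what object-storage tools generally expect, and every URL variant above reduces to it. A quick sketch (purely illustrative, not Datalore's actual parsing):

```python
from urllib.parse import urlparse

def bare_bucket_name(value: str) -> str:
    """Reduce any of the tried inputs to the plain bucket name."""
    value = value.strip()
    if "://" not in value:
        value = "//" + value          # let urlparse treat it as a netloc
    parsed = urlparse(value, scheme="https")
    host = parsed.netloc
    if host == "storage.googleapis.com":
        # path-style URL: storage.googleapis.com/<bucket>/...
        return parsed.path.lstrip("/").split("/")[0]
    if host.endswith(".storage.googleapis.com"):
        # virtual-hosted style: <bucket>.storage.googleapis.com
        return host[: -len(".storage.googleapis.com")]
    return host or value

for attempt in [
    "patent-data-ors",
    "storage.googleapis.com/patent-data-ors",
    "patent-data-ors.storage.googleapis.com",
    "https://storage.googleapis.com/patent-data-ors",
]:
    print(bare_bucket_name(attempt))  # prints patent-data-ors each time
```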

Related

How do I set up a postgres connection on airflow for the postgres operator

For the rtfm crowd, let me document my suffering.
I went here:
https://betterdatascience.com/apache-airflow-postgres-database/
But my UI shows UNAUTHORIZED in pink after I add the info.
I also went here:
https://airflow.apache.org/docs/apache-airflow-providers-postgres/stable/connections/postgres.html
But obvious questions remain: which file? What is the format of the default data? Why can't I just make a connection string and put it somewhere?
I also read this, which doesn't tell us where to put this information; it only tells us how to programmatically override it. It did give me this golden nugget:
Which would have been another stack overflow question.
Is there a file I should type my connection information or connection string into that has examples already?
I solved it. There's no secret file like airflow.cfg; connections are hidden away in a metadata database I set up long ago, familiar to nobody who doesn't do this full time. To update or add connections, you either use the UI, which doesn't work for me, or you use the CLI and type airflow connections add --conn-uri followed by the string we all know and love.
Since I wasn't born knowing all the commands available in the CLI, I googled and landed here:
https://airflow.apache.org/docs/apache-airflow/stable/howto/connection.html
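To save the next person some typing: the --conn-uri string is just a URL whose scheme is the Airflow connection type. A minimal sketch of building one (host, credentials, and database below are placeholders; Airflow also reads the same URI from an AIRFLOW_CONN_<CONN_ID> environment variable):

```python
from urllib.parse import quote

user, password = "airflow_user", "p@ss/word"   # placeholder credentials
host, port, db = "localhost", 5432, "mydb"     # placeholder host/db

# Special characters in the password must be percent-encoded,
# or the URI will not parse the way you expect:
conn_uri = f"postgres://{user}:{quote(password, safe='')}@{host}:{port}/{db}"
print(conn_uri)

# Then, on the command line:
#   airflow connections add my_postgres --conn-uri "<the string above>"
# or, as an environment variable Airflow also picks up:
#   AIRFLOW_CONN_MY_POSTGRES="<the string above>"
```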

Setting up Dynamic Links in Firebase with Wordpress site

I am really struggling here... All I actually want is to get the Generate Strong Password feature working inside my app, but that is harder than I thought.
I learned that I should go with Firebase Dynamic Links because I have a Wordpress-Website from All-Inkl.com.
I followed this tutorial, and there is now an apple-app-site-association file in place. But I can't access my website anymore; it looks like this:
Inside my Firebase project I am getting this error, which says that not all the necessary A records are present for my website:
My DNS-Settings:
I've been struggling for weeks now to get this done so if anyone has any idea how I can fix it I would be extremely grateful!! (btw, I am a total newbie when it comes to websites; I know my way around Swift though)
It seems that different domain providers accept different values for DNS entries ('A records' = 'A-Datensätze', in this case).
Try editing the entries for the Host field (which currently hold your website's URL) to one of the 'common inputs' listed here: https://firebase.google.com/docs/hosting/custom-domain?hl=de#domain-key
As the URL to your site doesn't seem to be what your provider accepts, I would suggest you try replacing it with the next option, i.e. replacing it with # .
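For orientation, the A records Firebase's hosting docs commonly list for a custom domain look like the zone entries below (use the exact values your Firebase console shows for your project; the Host/name column is where All-Inkl may want '@' or an empty field instead of the full URL):

```
@    A    151.101.1.195
@    A    151.101.65.195
```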
Hope this helps solving your issue!

Unity Examples with AWS Cognito and S3 for posting files

I am trying to run the example Unity packages provided by Amazon to post a file to my bucket, which is in the us-west-1 region. The message back says it's successful, but there's no file. When I put the HTTP response in Debug.Log, it says Moved (which I assume is a 301). My research suggests this could be a region error, but I know it's the right region (us-west-1, Northern California).
Here are my Inspector, IAM policy and bucket policies.
Any help would be greatly appreciated!
It seems there may be a bug in the code provided by Amazon. When I created a bucket in the US Standard region and specified us-east-1, everything worked fine.
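For anyone hitting the same 301: S3 answers "Moved" when a request goes to the wrong regional endpoint, which is why the bucket/region pairing matters. A sketch of how the endpoint differs by region (the bucket name is a placeholder; the AWS SDK builds this for you when the region is configured correctly):

```python
def s3_endpoint(bucket: str, region: str) -> str:
    """Virtual-hosted-style endpoint for a bucket in a given region."""
    if region == "us-east-1":
        # US Standard historically answers at the global endpoint
        return f"https://{bucket}.s3.amazonaws.com"
    return f"https://{bucket}.s3.{region}.amazonaws.com"

print(s3_endpoint("my-unity-bucket", "us-west-1"))
# https://my-unity-bucket.s3.us-west-1.amazonaws.com
```

A request sent to the global endpoint for a us-west-1 bucket can come back as a redirect rather than an error, which matches the "successful but no file" symptom.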

When I try to add a schema in stackmob it is not getting saved

I was trying to follow a StackMob tutorial, but when I save a schema and come back to it, I can't find the changes I made. Any help?
Assuming you mean you are in the StackMob Dashboard trying to create and save a schema, there are two snags that can get you (and me). If you didn't hit Save Schema, nothing will be saved. The other snag is not setting the Create, Read, Update, and Delete permissions at the bottom of the page, although I think it flags this as something you must complete. If you've done both of these and your data is still not being retained, provide a bit more info and I'll try my best to help.

Working with Facebook Connect

I have two files with identical code (it is the code they mention here: http://developers.facebook.com/blog/post/198). I have one of these files here: http://gnucom.cc/test.html and another one of these files here: http://blog.gnucom.cc/test.html. I have the main URL set to gnucom.cc and the Connect URL set to http://blog.gnucom.cc.
For the life of me, I cannot figure out why the version accessed from the subdomain doesn't work. I see a loading icon and nothing else; afterwards it disappears.
Can anyone suggest what I may be doing wrong?
http://wiki.developers.facebook.com/index.php/Supporting_Subdomains_In_Facebook_Connect
You should enter http://gnucom.cc instead of http://www.gnucom.cc (if it isn't already set that way).