Problem with the file system on GitHub and Heroku - Discord.js

I'm building a Discord bot with discord.js. I host the code on GitHub and deploy it to Heroku. When a user sends a command, the bot writes some information to a txt file. This works locally, but once hosted on GitHub and Heroku the bot doesn't write correctly: nothing is written to the files on GitHub, so they stay empty, and restarting the bot resets everything.
Thanks

The Heroku filesystem is ephemeral: any changes made to it while the dyno is running last only until that dyno is shut down or restarted, so it is not suitable for persistent storage of data. Note also that the running dyno never writes back to your GitHub repository; GitHub only supplies the source code that Heroku deploys, which is why the files in the repo stay empty.
A solution to this issue is to use Heroku Postgres, which offers a free tier.
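As a minimal sketch of that approach, assuming a Node.js bot with the pg package installed and a table you have created yourself (the table and column names below are placeholders), the write can be moved from the filesystem into Postgres:

    // Persist command data in Heroku Postgres instead of a txt file.
    // DATABASE_URL is the config var Heroku Postgres sets on the app.
    const { Pool } = require('pg');

    const pool = new Pool({
      connectionString: process.env.DATABASE_URL,
      ssl: { rejectUnauthorized: false }, // Heroku Postgres requires SSL
    });

    // Hypothetical helper that replaces fs.appendFile in the command handler.
    async function saveEntry(userId, content) {
      await pool.query(
        'INSERT INTO entries (user_id, content) VALUES ($1, $2)',
        [userId, content]
      );
    }

Rows inserted this way survive dyno restarts, unlike files written to the dyno's disk.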

Related

Webhooks: GCP and GitHub

Since GitHub changed its authentication methods to only accept personal access tokens, I've had some trouble getting my GCP Build Trigger to run when I push to the main branch of my repo.
Does anyone know how I can re-authenticate, or change the password that's being used to connect GCP to GitHub?
On GCP I have tried reconnecting to the repo, and 'forgetting' the repo and then reconnecting. I'm not incredibly clued up on this platform; I've only been using it for a few weeks.
"token" would be used for HTTPS URL.
The official GCP documentation uses SSH URLs, which does not need tokens (but SSH keys): that would be one alternative.
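For illustration (the username, owner, repo, and token below are placeholders), the two remote URL forms look like this:

    # HTTPS remote, authenticating with a personal access token instead of a password
    git remote set-url origin https://<username>:<token>@github.com/<owner>/<repo>.git

    # SSH remote, authenticating with an SSH key instead of a token
    git remote set-url origin git@github.com:<owner>/<repo>.git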

Working with Google Cloud Storage without gsutil

I have developed a piece of software that saves files into directories specified in a config file. I run it on Linux.
I would like to use Compute Engine nodes because I need to increase its performance, and I would like Google Cloud Storage to act as the durable repository for these files.
[1] shows how to mount a bucket as a file system. I tried it, but with no success: I get an authentication error.
Can anyone help me get access to my bucket from the Compute Engine nodes?
[1] https://cloud.google.com/compute/docs/disks/gcs-buckets
Best regards,
It sounds like you did not start your GCE instance with a service account.
According to the docs you linked, you need to configure a service account, or run gcloud auth login, to set up your credentials for accessing Cloud Storage.
If you are trying to set up gcsfuse without running on GCE, you will need the gcloud auth login approach.
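As a rough sketch (the bucket name and mount point are placeholders; depending on the gcsfuse version, gcloud auth application-default login may be required instead of gcloud auth login), the credential setup and mount look like:

    # Authenticate as your user account (needed when no service account is attached)
    gcloud auth login

    # Mount the bucket at a local directory so the software can keep using normal paths
    gcsfuse my-bucket /mnt/gcs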

Migrating files from Parse.com to our hosted Parse Server

I have migrated files from Parse.com to my hosted Parse Server using the https://github.com/parse-server-modules/parse-files-utils tool, applying "Option 2".
My problem now is that when I click on an image in my hosted Parse Server dashboard, it shows the message "File not found.", and my URL looks like:
http://ip of my server:1337/parse/files/OE9gP1wrd2OT9avp3RBmt8zysmM25wRTMtDOxsfe/tfss-6ca44378-72fb-4ddf-aef2-11af0485b11b-profile-pic
If I upload a new image from the mobile app, it works fine.
I have installed MongoDB and migrated the Parse.com data to a newly created database in MongoDB.
I am not using any FileAdapter in my newly created Parse Server.
Thanks in advance; please look into this issue and help me figure out how to display the migrated images in our hosted Parse Server.
What HTTP error are you seeing (404?), and are you sure the error is "file not found"? Maybe your server folder permissions are not set to public, so you can't publicly access the files (that should be a 403 error).
You usually store image files in a container (storage) that can be accessed via APIs, such as Amazon (AWS) S3 or Microsoft Azure Storage. It's usually more efficient to keep your local server's file storage small and to have fast access speeds to your images.
You can find out how to set up an Amazon S3 bucket or Google Cloud Storage here.
You can find out how to set up Azure Storage here and connect it to your parse-server using this adapter.
I'm not sure about AWS, but Google and Azure give you free credits when you sign up, and (at least for Azure) the storage isn't too expensive, so the free credits can last you a while.
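As a hedged sketch of wiring a file adapter into a Parse Server config (all keys, names, and URLs below are placeholders, and it assumes the @parse/s3-files-adapter package), it looks roughly like this:

    // Parse Server config with an S3 file adapter instead of the default file storage.
    const { ParseServer } = require('parse-server');
    const S3Adapter = require('@parse/s3-files-adapter');

    const api = new ParseServer({
      databaseURI: 'mongodb://localhost:27017/dev',
      appId: 'myAppId',
      masterKey: 'myMasterKey',
      serverURL: 'http://localhost:1337/parse',
      filesAdapter: new S3Adapter(
        'AWS_ACCESS_KEY',       // access key id
        'AWS_SECRET_KEY',       // secret access key
        'my-bucket',            // bucket name
        { directAccess: true }  // serve files straight from S3
      ),
    });
    // Mounted on an Express app as usual, e.g. app.use('/parse', api).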

How can I test a Heroku database on Travis securely?

I'm using Ring (Clojure) on Heroku and I'm implementing some tests for the app. Some of the tests are GET requests to the app, and the app has a PostgreSQL database that fills some pages, so to serve those pages the app has to connect to the original Heroku database.
How can I test whether some content is present in a GET response without putting the database connection details (URL, user, password) in the .travis.yml file? Am I even supposed to do that? Or should I just set up a test database, fill it with test data, and test the contents over a localhost connection?
Thanks.
I think you could put the credentials in Travis encrypted environment variables. The only problem is that pull requests from forks cannot use the decrypted values (for security reasons).
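For illustration (the variable name and connection string are placeholders), the Travis CLI can encrypt a value and append it to .travis.yml as a secure variable:

    # Encrypt the connection string and add it to .travis.yml
    travis encrypt DATABASE_URL="postgres://user:pass@host:5432/db" --add env.global

    # .travis.yml then contains something like:
    # env:
    #   global:
    #     - secure: "<encrypted blob>"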

Git pulling onto a VM without an SSH key

I'm trying to pull an existing GitHub repo, created on my local machine, onto a VM running on EC2 that will be used by multiple people. I have some concerns about using an SSH key without a passphrase, so I was wondering whether there is any way to pull directly onto the VM, either anonymously or by providing the username and password of the account that originally pushed the repo. That way my personal information wouldn't have to be stored on the VM, and there would be no security risk in someone getting hold of a passphrase-less SSH key for the VM. Is this possible?
Currently running Ubuntu 12.04
I recommend generating a new key and adding it as a deploy key to your specific repo.
These keys are linked to a specific repo, not to your account.
A lot of other options are also covered here:
https://help.github.com/articles/managing-deploy-keys
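As a sketch under those assumptions (the key file name, owner, and repo are placeholders; RSA is used since Ubuntu 12.04 ships an older OpenSSH), setting up a dedicated deploy key on the VM looks like:

    # Generate a key pair used only for this repo (passphrase-less by design)
    ssh-keygen -t rsa -b 4096 -f ~/.ssh/deploy_key -N ""

    # Print the public key, then paste it under the repo's Settings -> Deploy keys
    cat ~/.ssh/deploy_key.pub

    # Tell SSH to use that key for github.com (append to ~/.ssh/config)
    printf 'Host github.com\n  IdentityFile ~/.ssh/deploy_key\n' >> ~/.ssh/config

    # Clone over SSH; the deploy key only grants access to this one repo
    git clone git@github.com:<owner>/<repo>.git

Because a deploy key is scoped to a single repository, a leaked passphrase-less key exposes only that repo, not your whole account.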