yaml language server and nvim configuration

I would like to use nvim lsp to validate an OpenAPI file.
Here are the steps I've been following:
I installed yaml-language-server, and made sure it was available in the PATH.
I downloaded the OpenAPI schema from here, and stored it in my filesystem.
I modified my existing init.vim to include the following:
lspconfig.yamlls.setup {
  on_attach = on_attach,
  flags = {
    debounce_text_changes = 150,
  },
  settings = {
    yaml = {
      schemas = {
        {
          fileMatch = { ".openapi.yaml" },
          url = "file:///[...]/openapi.schema.yaml"
        }
      },
      format = {
        enable = true,
        singleQuote = false,
        bracketSpacing = true
      },
      validate = true,
      completion = true
    }
  }
}
I wrote a simple OpenAPI spec file, and opened it with nvim.
It seems that my nvim correctly hits yaml-language-server to validate the YAML syntax, but it does not seem to validate against the schema.
One of the problems I have is that I don't have access to the logs of nvim or yaml-language-server, to get some insight into what's going on.

It's been a while, but I figured I'd give this a shot in case it helps anyone.
First, about your lack of access to the nvim log: you don't explain why that is, but if your nvim is running, and it seems that it is, you should be able to open the log by running this command:
:lua vim.cmd('e'..vim.lsp.get_log_path())
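If the log doesn't help, nvim-lspconfig's :LspInfo command is also handy to confirm that yamlls actually attached to the buffer.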
I think the way you have the settings.yaml.schemas node is not correct: as far as I can tell, yaml-language-server expects schemas to be a map from a schema location (URL or file path) to the glob pattern of the files it applies to, not a list of { fileMatch = ..., url = ... } tables.
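In other words, just that node would look something like this (a minimal sketch; the glob pattern is illustrative, and the schema path is the elided one from your snippet):
settings = {
  yaml = {
    schemas = {
      -- map: schema location -> glob pattern of the files it applies to
      ["file:///[...]/openapi.schema.yaml"] = "*.openapi.yaml",
    },
  },
},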
It might also be a problem that you are using the YAML version of the schema. I highly recommend using JSON for OpenAPI schema validation. If you want to try this, just substitute json for yaml everywhere (s/yaml/json) and replace your local YAML schema file with the JSON one.
See if the example below works for you.
-- configure yamlls ls:
require('lspconfig')['yamlls'].setup {
  on_attach = on_attach,
  capabilities = capabilities,
  settings = {
    yaml = {
      schemas = {
        ["https://raw.githubusercontent.com/OAI/OpenAPI-Specification/main/schemas/v3.0/schema.yaml"] = "/*"
      }
    }
  }
}
If you really need to load a local file, either of these formats should work:
["file:///home/$user/openapi.schema.yaml"] = "/*"
["../relative/path/openapi.schema.yaml"] = "/*"

Terraform: update resources only when Vault secret data has changed

This should be fairly easy, or I might be doing something wrong, but after digging into it for a while I couldn't find a solution.
I have a Terraform configuration that contains a Kubernetes Secret resource whose data comes from Vault. The resource configuration looks like this:
resource "kubernetes_secret" "external-api-token" {
metadata {
name = "external-api-token"
namespace = local.platform_namespace
annotations = {
"vault.security.banzaicloud.io/vault-addr" = var.vault_addr
"vault.security.banzaicloud.io/vault-path" = "kubernetes/${var.name}"
"vault.security.banzaicloud.io/vault-role" = "reader"
}
}
data = {
"EXTERNAL_API_TOKEN" = "vault:secret/gcp/${var.env}/micro-service#EXTERNAL_API_TOKEN"
}
}
Everything is working fine so far, but every time I do terraform plan or terraform apply, it marks that resource as "changed" and updates it, even when I didn't touch the resource or other resources related to it. E.g.:
... (other actions to be applied, unrelated to the offending resource) ...

  # kubernetes_secret.external-api-token will be updated in-place
  ~ resource "kubernetes_secret" "external-api-token" {
      ~ data = (sensitive value)
        id   = "platform/external-api-token"
        type = "Opaque"

        metadata {
            annotations = {
                "vault.security.banzaicloud.io/vault-addr" = "https://vault.infra.megacorp.io:8200"
                "vault.security.banzaicloud.io/vault-path" = "kubernetes/gke-pipe-stg-2"
                "vault.security.banzaicloud.io/vault-role" = "reader"
            }
            generation       = 0
            labels           = {}
            name             = "external-api-token"
            namespace        = "platform"
            resource_version = "160541784"
            self_link        = "/api/v1/namespaces/platform/secrets/external-api-token"
            uid              = "40e93d16-e8ef-47f5-92ac-6d859dfee123"
        }
    }

Plan: 3 to add, 1 to change, 0 to destroy.
It says that the data for this resource has changed. However, the data in Vault remains the same; nothing has been modified there. This update now happens every single time.
I was thinking of using the ignore_changes lifecycle feature, but I assume that would make Terraform ignore any changes made to the Vault secret as well, which I also don't want. I would like the resource to be updated only when the secret in Vault has changed.
Is there a way to do this? What am I missing or doing wrong?
You need to add the Terraform lifecycle ignore_changes meta-argument to your code. For data containing API token values, but also for annotations, Terraform for some reason seems to assume that the data changes every time a plan, apply, or even destroy is run. I had a similar issue with Azure Key Vault.
Here is the code with the lifecycle ignore_changes meta-argument included:
resource "kubernetes_secret" "external-api-token" {
metadata {
name = "external-api-token"
namespace = local.platform_namespace
annotations = {
"vault.security.banzaicloud.io/vault-addr" = var.vault_addr
"vault.security.banzaicloud.io/vault-path" = "kubernetes/${var.name}"
"vault.security.banzaicloud.io/vault-role" = "reader"
}
}
data = {
"EXTERNAL_API_TOKEN" = "vault:secret/gcp/${var.env}/micro-service#EXTERNAL_API_TOKEN"
}
lifecycle {
ignore_changes = [
# Ignore changes to data, and annotations e.g. because a management agent
# updates these based on some ruleset managed elsewhere.
data,annotations,
]
}
}
Link to the lifecycle meta-arguments documentation:
https://www.terraform.io/language/meta-arguments/lifecycle
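As for the follow-up wish ("updated only when the secret in Vault was changed"): one possible pattern, sketched below under the assumption that Terraform itself is allowed to read the secret through the Vault provider, is to keep ignoring data but track a non-ignored annotation derived from a hash of the Vault contents, so the resource is touched exactly when the secret changes. The annotation key here is hypothetical; vault_generic_secret and sha256 are standard Vault-provider/Terraform features.
# Sketch only: assumes the Vault provider is configured and this path is readable.
data "vault_generic_secret" "external_api_token" {
  path = "secret/gcp/${var.env}/micro-service"
}

resource "kubernetes_secret" "external-api-token" {
  metadata {
    name      = "external-api-token"
    namespace = local.platform_namespace
    annotations = {
      # Hypothetical annotation: its value only changes when the Vault data
      # changes, which forces an in-place update exactly then.
      "example.com/vault-secret-hash" = sha256(data.vault_generic_secret.external_api_token.data_json)
    }
  }

  data = {
    "EXTERNAL_API_TOKEN" = "vault:secret/gcp/${var.env}/micro-service#EXTERNAL_API_TOKEN"
  }

  lifecycle {
    # Keep ignoring the injected data, but NOT the annotations.
    ignore_changes = [data]
  }
}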

Firebase Storage upload permissions issue using Admin SDK [duplicate]

I've started working with Firebase Storage and Firebase Functions recently. Right now I'm developing file upload from Functions to Storage.
I've got it working (the upload completes and the file appears in the Storage section), yet the image stays like this forever (loading forever on the right side):
I thought it was an error in my code. Yet, if I open Google Cloud Platform - Storage, the image appears and I can open it and preview it.
In Firebase Storage, if I open the image (select it and click open), it returns the following URL: https://console.firebase.google.com/u/0/undefined
What may I be doing wrong? Here's the code I'm using:
function uploadImage() {
const newImageData = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOEAAADhCAMAAAAJbSJIAAAAgVBMVEX///8AAAAEBASAgIDr6+vw8PBYWFjU1NTGxsbz8/P29vb8/Py1tbVhYWHd3d1ra2vk5OS/v78pKSlTU1NOTk6Tk5OpqanNzc13d3dKSkplZWWbm5s5OTkfHx+NjY2GhoYcHBw9PT0TExOioqJ7e3soKCiurq5CQkI6OjoXFxcwMDAuPQWoAAAIJ0lEQVR4nO2daXuqPBCGVfYtIbKLgorLKf//B75ga2sPAdmS8F5n7m/VXgyPISGZzExWKwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAOAVy6wwlCZG/YUl+vamYLoy0lAkORWbdZNN/YUUVf8ju6bomx2Kq+mEBL5K0UVD9QNCdM0Vfds9QSQMnYTWaN1sEicMCRJ9+29QSCqp28HiftiqUkoU0TLaMDLsl8PbrtGWpY8zQ7QYCoE/pe3+ZusHogW9Yln2nxnVPfljW8t4l8go2zPQV7PPkCxa3kqOvOldr52NF4nVKNuYobxPsC1OoxVilu33ZINDQf1Rw2cO+mrOWBMhMD5w0ldziLnrQyeO+mpOfOdzZspZX03KcQGC2I+gNDC3ZkQ7IQLX6x0niW7fZd83ief7Pg7tH0JcfeIlQy+kcllCyk7/O/LUWIsiDbkVv9/bcv0Rqr+MVa//BR0Ob/++AstdKLuK8n4ZZCiKK4e7cikSez2iTjxm2DPjPj8e8wf1zWviXPqpPuHyeuqXb2ZK6WxaqJCuZe7m7qTTp1da6ty7prtbMoOOVro64VktonlmyFZUqB0NybQrkna7l1ndZIhc2k0xbESjdTEf63M7yBQ9bjO2Z+enamvCXcTCphG1zZ3YNSJ9OroJmRkM6UMOZmXPoppzWE4WEX1oY7Xmv1Fs7Zn9nl9gWt+/MTImNU2dprze+6GfmmYlRrYoHZ9dF/whbJrdMTLVdMzEPDZRlOZr48DIVMMQ2wnUN5SpIiNLjaF7z74X1uiNwWbDyNI/qHCd8XBGW1nDLj+F64K9RKtomuWocMvq3fvDjbIk5ahwvc0YGXuS0dbcPBWu1zlLx4mbU23yVbi+s3srkjvdJGeF1WqGTTO6rZsH3BVWjyqae1C1EP0BFaVwvQ60OR1EshZ0GROisJqG23Pt1Gp2q49GqML1OsmK6aMOKbK3OzbCFFZsLwEZ75YySHDpE1olUmFl/XrP02jE5aM0v1972phd2ydDQkvOnjRk4aFL3pDAjiUo/LoTHOuKLMtGzet+lPn4pPpG0eMRQTnLUfi8oTreWc0j7UmUq22x0f9PhbMDCkEhKBQPKJxNYTkgEmYCXiMShZvCXEkHTUXGcPZSpbFS5KZQWq3Qu4XORGIbUTa9eCqsFqt6OGemxSvbUH8sqwUrXK0sl/gM9PnE/XKNCFdYYZozB7VvsPkzU1+Cwhrrph7mGHfOB/X226+1FIU1duztr+Nlnq97L7YbV12SworITqXdvmcw5QvlfielNtU5sDCFNYpO0iJTk2MvbcdEzYqUtIdVLVDhAwvVubIk8D3Po421j88DUufHvvEnL1XhF2Yd7+xqTR6f9wq1XbjCGQCFs/EPKsR8sq6Vxi4bvxVwzCOJBTVXLxzX+BxSrgPKm4arFwM3J1lzQk/D5eunOTjsIqN0h57Gyd0TlbBJYtVa9xFF+NpUedZSM5Ypd6UfifEm4tts5QFkdOvO4RTlLz3GoTZDVpAWxu/WJAI9wn/uRaiPj65x9bC496ixIdjnvXek7Db8HWLfMsnpWWFjAV79w9ZxqlVtzwtXq2XH2Q7I71+Aws/72FyvVz/IsrzlgnmWBX71P5vBV16IQhr3uCUYbxBLVjgPoBAUgkLxgEJQCArFAwpBISgUDzeF1+E7vWMor8IUZi5mFUrzwxa7jRRLnntP0YCqUaNwItG7a3LBUqNT1B5K0fuHboj7RSQM5YjDT7edaIV1HlbeGAwmc82/c8XEK1ytTDkcXKeuEzWUlxf1JRdzOJlq7sXv/YGlKKxwi91pQtBXHfZ12hUNn/mCFNZEaYxPY6YC5QnH9JSwhSl8qCRpFu9Pfb3Yh9M+zlLSmvAmTmGbL/uBoUfkFhY48dvDa30/wUV4I5HembMoLs7b6xGKISOE9Ce5p6qql3//XX3XY88RNVICllvbpM7Hk+WhOab8aps0nzIeRZSoZZQYWaIkyLMNNfnEbprtHAAmQCtlyL6OEq3Q34WRLcqPyb4V+RqlGVuHLMP3FEofXDPrhqsVPYIHM6xtQg86SZgZ1Kj21iWjEjVu3jL5Y3eOgEw3WK3l4vnrRVlx65qTXRFaWlGqJ/Fq1qivVUdKHMvyW3JXMJZPBs9W6BhyZ6YYZlrSW++wXHVIiaCpIg1EpO61F9syePK7fMokt8noekMWInb+rnRLzLgqu/ved3jw8tQePrq6dpp771eUDvPDA+TT25uo2Ho4v4V9n1gjvOXY6+U+P3E4OqC7K75w/DglSZymQftNyUGaxkly+ujtcOVSi7K3xF987KRXdh+jrsKn2OZK57Pl1KTkJLCaD7PejqHjcDwYsXXGyBBWs98WrBtvieWN89FkJucTkTDif9ysgTgKnDwZHMfwY5FGwucQJAqdaS2zKhR1AiIoBIWgEBSCQlAICkEhKASFi1DIokobDV/Y2ePjvG7D4eaA+gcVRnyi9zdj6p/Pg8vncOedqCX+amXxOYA85exle8Xg4XDDYrxQX2g9iyBMYM8uLqEXiHVKyZbbkfFt5GwLCZ9ZhbANIJwrfp3GnU/04xsidpM3X9yb8BcoYJQzEwjvg09M++3ZNyNIbP6bMa1YCjVIchK2IvBFT8PaDaig85bDbmHyHljhaZ7+eDyFS9RXo9jx9K3hMrY57tcPh6SX/fiWPO4vKZ8TeKdg6SRTx4g8qhnRl/p4/oWJIiKpQx7YUpVIJGCrfgqG7CIb9zjSaeNgG7kzhaZyx7JM07Ljy4XWnuXlEtv1P7B9Mv8DltyUV+hIpoIAAAAASUVORK5CYII="
var mimeTypes = require('mimetypes');
var image = newImageData,
mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)![1],
fileName = 'test.' + mimeTypes.detectExtension(mimeType),
base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
imageBuffer = new Buffer(base64EncodedImageString, 'base64');
// Instantiate the GCP Storage instance
const { Storage } = require('@google-cloud/storage');
const googleCloudStorage = new Storage(firebaseSettings);
const bucket = googleCloudStorage.bucket('projectID.appspot.com');
var file = bucket.file(fileName);
return file.save(imageBuffer, {
metadata: { contentType: mimeType, cacheControl: "public, max-age=300" },
public: true,
validation: 'md5'
}, function (error: any) {
if (error) {
throw 'error';
}
return "https://storage.googleapis.com/share-expanses-dcc9f.appspot.com/" + fileName;
});
}
Thanks for the help
Haven't been able to test the solution given by Firebase, but here's the transcript of the response:
The problem that you are facing could be because of two reasons. The first one is how you are uploading the files: via the Firebase Console, using any Admin SDK, or via the gsutil command. If using the Admin SDK option, the problem is a known issue where the required metadata doesn't exist; fortunately there is a workaround, you can try this script to solve this issue.
Now, the second one is related to the network: if you are using Comcast, please try on a different network to see if this issue is related to that.
When you save an image to Firebase Storage, you need to provide an access token in the metadata: firebaseStorageDownloadTokens. It has to be a UUID.
More info can be found here: https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens
const { v4: uuid } = require("uuid")
function uploadImage() {
const newImageData = "data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAOEAAADhCAMAAAAJbSJIAAAAgVBMVEX///8AAAAEBASAgIDr6+vw8PBYWFjU1NTGxsbz8/P29vb8/Py1tbVhYWHd3d1ra2vk5OS/v78pKSlTU1NOTk6Tk5OpqanNzc13d3dKSkplZWWbm5s5OTkfHx+NjY2GhoYcHBw9PT0TExOioqJ7e3soKCiurq5CQkI6OjoXFxcwMDAuPQWoAAAIJ0lEQVR4nO2daXuqPBCGVfYtIbKLgorLKf//B75ga2sPAdmS8F5n7m/VXgyPISGZzExWKwAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAOAVy6wwlCZG/YUl+vamYLoy0lAkORWbdZNN/YUUVf8ju6bomx2Kq+mEBL5K0UVD9QNCdM0Vfds9QSQMnYTWaN1sEicMCRJ9+29QSCqp28HiftiqUkoU0TLaMDLsl8PbrtGWpY8zQ7QYCoE/pe3+ZusHogW9Yln2nxnVPfljW8t4l8go2zPQV7PPkCxa3kqOvOldr52NF4nVKNuYobxPsC1OoxVilu33ZINDQf1Rw2cO+mrOWBMhMD5w0ldziLnrQyeO+mpOfOdzZspZX03KcQGC2I+gNDC3ZkQ7IQLX6x0niW7fZd83ief7Pg7tH0JcfeIlQy+kcllCyk7/O/LUWIsiDbkVv9/bcv0Rqr+MVa//BR0Ob/++AstdKLuK8n4ZZCiKK4e7cikSez2iTjxm2DPjPj8e8wf1zWviXPqpPuHyeuqXb2ZK6WxaqJCuZe7m7qTTp1da6ty7prtbMoOOVro64VktonlmyFZUqB0NybQrkna7l1ndZIhc2k0xbESjdTEf63M7yBQ9bjO2Z+enamvCXcTCphG1zZ3YNSJ9OroJmRkM6UMOZmXPoppzWE4WEX1oY7Xmv1Fs7Zn9nl9gWt+/MTImNU2dprze+6GfmmYlRrYoHZ9dF/whbJrdMTLVdMzEPDZRlOZr48DIVMMQ2wnUN5SpIiNLjaF7z74X1uiNwWbDyNI/qHCd8XBGW1nDLj+F64K9RKtomuWocMvq3fvDjbIk5ahwvc0YGXuS0dbcPBWu1zlLx4mbU23yVbi+s3srkjvdJGeF1WqGTTO6rZsH3BVWjyqae1C1EP0BFaVwvQ60OR1EshZ0GROisJqG23Pt1Gp2q49GqML1OsmK6aMOKbK3OzbCFFZsLwEZ75YySHDpE1olUmFl/XrP02jE5aM0v1972phd2ydDQkvOnjRk4aFL3pDAjiUo/LoTHOuKLMtGzet+lPn4pPpG0eMRQTnLUfi8oTreWc0j7UmUq22x0f9PhbMDCkEhKBQPKJxNYTkgEmYCXiMShZvCXEkHTUXGcPZSpbFS5KZQWq3Qu4XORGIbUTa9eCqsFqt6OGemxSvbUH8sqwUrXK0sl/gM9PnE/XKNCFdYYZozB7VvsPkzU1+Cwhrrph7mGHfOB/X226+1FIU1duztr+Nlnq97L7YbV12SworITqXdvmcw5QvlfielNtU5sDCFNYpO0iJTk2MvbcdEzYqUtIdVLVDhAwvVubIk8D3Po421j88DUufHvvEnL1XhF2Yd7+xqTR6f9wq1XbjCGQCFs/EPKsR8sq6Vxi4bvxVwzCOJBTVXLxzX+BxSrgPKm4arFwM3J1lzQk/D5eunOTjsIqN0h57Gyd0TlbBJYtVa9xFF+NpUedZSM5Ypd6UfifEm4tts5QFkdOvO4RTlLz3GoTZDVpAWxu/WJAI9wn/uRaiPj65x9bC496ixIdjnvXek7Db8HWLfMsnpWWFjAV79w9ZxqlVtzwtXq2XH2Q7I71+Aws/72FyvVz/IsrzlgnmWBX71P5vBV16IQhr3uCUYbxBLVjgPoBAUgkLxgEJQCArFAwpBISgUDzeF1+E7vWMor8IUZi5mFUrzwxa7jRRLnntP0YCqUaNwItG7a3LBUqNT1B5K0fuHboj7RSQM5YjDT7edaIV1HlbeGAwmc82/c8XEK1ytTDkcXKeuEzWUlxf1JRdzOJlq7sXv/YGlKKxwi91pQtBXHfZ12hUNn/mCFNZEaYxPY6YC5QnH9JSwhSl8qCRpFu9Pfb3Yh9M+zlLSmvAmTmGbL/uBoUfkFhY48dvDa30/wUV4I5HembMoLs7b6xGKISOE9Ce5p6qql3//XX3XY88RNVICllvbpM7Hk+WhOab8aps0nzIeRZSoZZQYWaIkyLMNNfnEbprtHAAmQCtlyL6OEq3Q34WRLcqPyb4V+RqlGVuHLMP3FEofXDPrhqsVPYIHM6xtQg86SZgZ1Kj21iWjEjVu3jL5Y3eOgEw3WK3l4vnrRVlx65qTXRFaWlGqJ/Fq1qivVUdKHMvyW3JXMJZPBs9W6BhyZ6YYZlrSW++wXHVIiaCpIg1EpO61F9syePK7fMokt8noekMWInb+rnRLzLgqu/ved3jw8tQePrq6dpp771eUDvPDA+TT25uo2Ho4v4V9n1gjvOXY6+U+P3E4OqC7K75w/DglSZymQftNyUGaxkly+ujtcOVSi7K3xF987KRXdh+jrsKn2OZK57Pl1KTkJLCaD7PejqHjcDwYsXXGyBBWs98WrBtvieWN89FkJucTkTDif9ysgTgKnDwZHMfwY5FGwucQJAqdaS2zKhR1AiIoBIWgEBSCQlAICkEhKASFi1DIokobDV/Y2ePjvG7D4eaA+gcVRnyi9zdj6p/Pg8vncOedqCX+amXxOYA85exle8Xg4XDDYrxQX2g9iyBMYM8uLqEXiHVKyZbbkfFt5GwLCZ9ZhbANIJwrfp3GnU/04xsidpM3X9yb8BcoYJQzEwjvg09M++3ZNyNIbP6bMa1YCjVIchK2IvBFT8PaDaig85bDbmHyHljhaZ7+eDyFS9RXo9jx9K3hMrY57tcPh6SX/fiWPO4vKZ8TeKdg6SRTx4g8qhnRl/p4/oWJIiKpQx7YUpVIJGCrfgqG7CIb9zjSaeNgG7kzhaZyx7JM07Ljy4XWnuXlEtv1P7B9Mv8DltyUV+hIpoIAAAAASUVORK5CYII="
var mimeTypes = require('mimetypes');
var image = newImageData,
mimeType = image.match(/data:([a-zA-Z0-9]+\/[a-zA-Z0-9-.+]+).*,.*/)![1],
fileName = 'test.' + mimeTypes.detectExtension(mimeType),
base64EncodedImageString = image.replace(/^data:image\/\w+;base64,/, ''),
imageBuffer = new Buffer(base64EncodedImageString, 'base64');
// Instantiate the GCP Storage instance
const { Storage } = require('@google-cloud/storage');
const googleCloudStorage = new Storage(firebaseSettings);
const bucket = googleCloudStorage.bucket('projectID.appspot.com');
var file = bucket.file(fileName);
return file.save(imageBuffer, {
metadata: {
contentType: mimeType,
cacheControl: "public,
max-age=300",
// THIS IS THE LINE YOU NEED TO ADD
firebaseStorageDownloadTokens: uuid(),
},
public: true,
validation: 'md5'
}, function (error: any) {
if (error) {
throw 'error';
}
return "https://storage.googleapis.com/share-expanses-dcc9f.appspot.com/" + fileName;
});
}
After that you'll need to click on "Create access token"
@jean-smaug's answer is almost complete. Based on the page he linked (https://www.sentinelstand.com/article/guide-to-firebase-storage-download-urls-tokens), the only missing thing is to wrap the firebaseStorageDownloadTokens property inside a metadata object. I've just tested it and it's working fine 👌 No need to create an access token afterwards.
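For clarity, here is a minimal sketch of just the save() call with that nesting applied (the surrounding variables are as in the snippets above):
return file.save(imageBuffer, {
  metadata: {
    contentType: mimeType,
    cacheControl: "public, max-age=300",
    // custom key/value pairs belong in this nested metadata object
    metadata: {
      firebaseStorageDownloadTokens: uuid(),
    },
  },
  public: true,
  validation: 'md5'
});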
In my case I added the metadata while uploading, and the image showed as loading just as in the screenshot, but when I refreshed the page after about 3 minutes I found that it had uploaded correctly. So, as Cafn explains, if it's not a matter of metadata, you may just need to wait until it has loaded.
$uploadedObject = $bucket->upload($imageFile, [
    'name' => 'Image_Name',
    'metadata' => ['contentType' => 'image/png'],
]);

Alpakka Scala S3 connector hangs when trying to put data

I'm trying to do an AWS S3 put into a bucket with a simple string. I couldn't get this to work with Alpakka (Scala), but I can perform the same request using the AWS Java SDK.
With Alpakka, my thread just hangs without processing anything; Future.onComplete never triggers.
I've tried to specify the Alpakka conf file like this ('*' covers sensitive data):
alpakka.s3 {
  aws {
    credentials {
      provider = static
      access-key-id = "********"
      secret-access-key = "********"
    }
    region {
      provider = static
      default-region = "*****"
    }
  }
}
I do have a correct ~/.aws/credentials file on my machine; I can connect both with the AWS SDK and the AWS CLI.
As I understand it, ideally I shouldn't need to specify any alpakka.s3 credentials at all, just like with the AWS Java SDK.
I've already checked this article, https://discuss.lightbend.com/t/alpakka-s3-connection-issue/6551/2, and nothing worked.
My example is straightforward Scala code from the docs:
val file: Source[ByteString, NotUsed] =
  Source.single(ByteString(body))

val s3Sink: Sink[ByteString, Future[MultipartUploadResult]] =
  S3.multipartUpload(bucket, bucketKey)

val result: Future[MultipartUploadResult] =
  file.runWith(s3Sink)
but actually I also need my source to be an InputStream:
val source: Source[ByteString, Future[IOResult]] = StreamConverters.fromInputStream(() => is, 4096)
PS: I don't actually get why I need to specify some host like this:
endpoint-url = "http://localhost:9000"
If you leave alpakka.s3.aws empty, it will use the default AWS configuration methods, as in the CLI (e.g. you can use the AWS_REGION environment variable to set the region, and the standard AWS credentials file). You can also leave just alpakka.s3.aws.credentials empty to use the default AWS credential methods and set the AWS region via:
alpakka.s3 {
  aws {
    region {
      provider = static
      default-region = "us-east-1"
    }
  }
}
endpoint-url is only for use with alternative (non-AWS) implementations of the S3 API (e.g. minio). If you're setting it, you will not be able to connect to AWS S3.
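As for the InputStream requirement from the question, here is a minimal sketch of wiring StreamConverters.fromInputStream into the multipart upload sink. It assumes a recent Alpakka version (where MultipartUploadResult lives in akka.stream.alpakka.s3) and an implicit ActorSystem in scope to provide the materializer:
import java.io.InputStream
import scala.concurrent.Future
import akka.actor.ActorSystem
import akka.stream.IOResult
import akka.stream.alpakka.s3.MultipartUploadResult
import akka.stream.alpakka.s3.scaladsl.S3
import akka.stream.scaladsl.{Sink, Source, StreamConverters}
import akka.util.ByteString

implicit val system: ActorSystem = ActorSystem("s3-upload")

def upload(is: InputStream, bucket: String, bucketKey: String): Future[MultipartUploadResult] = {
  // Read the InputStream in 4096-byte chunks, as in the question.
  val source: Source[ByteString, Future[IOResult]] =
    StreamConverters.fromInputStream(() => is, chunkSize = 4096)
  val s3Sink: Sink[ByteString, Future[MultipartUploadResult]] =
    S3.multipartUpload(bucket, bucketKey)
  source.runWith(s3Sink)
}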

How do I access environment variables in config SailsJS

I'm trying to access my environment variables inside a config file. Can I use such a variable inside a config?
For example
// config/env/development.js
module.exports = {
  appUrl: 'http://MY_DEV_PLACE/',
}
//config/passport.js
var appUrl = appUrl || sails.config.appUrl || 'localhost:1337'; //<-- sails is not defined
I also tried in local.js:
// config/local.js
module.exports = {
  gAPI: { secret: 'aaa' }
}
//config/passport.js
var appUrl = gAPI || sails.config.gAPI || 'some pass'; //<-- sails is not defined
EDIT:
For appURL I'm using env like: APP_URL=http://example.com/api sails lift
For password I'm using:
var locals;
try {
  locals = require('./local');
} catch (e) {
  // not local so just ignore
}

module.exports.passport = {
  'GoogleAPI.Password': locals ? locals.gAPI.secret : 'some key'
};
You can use the local.js file for environment variables. This file is discussed in-depth here. This is pretty much the go-to for storing environment variables in Sails.
Important caveats: make sure this file is included in your .gitignore file, lest you risk exposing important information to the world. This file will need to be configured for each environment (e.g. local, staging, production). You can access your environment variables via sails.config.variable_name. This file takes priority over the development.js and production.js files in the /env/ sub-directory.
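As a sketch of that, a local.js can pull values from the environment (the GOOGLE_API_SECRET name and fallback values here are illustrative; appUrl and gAPI match the question):
// config/local.js
module.exports = {
  // matches the question's `APP_URL=http://example.com/api sails lift` usage
  appUrl: process.env.APP_URL || 'http://localhost:1337',
  gAPI: { secret: process.env.GOOGLE_API_SECRET || 'some pass' },
};
After lift, these are then available as sails.config.appUrl and sails.config.gAPI.secret.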
I accessed the sails object from the routes.js like this:
module.exports.routes = {
  '/': (req, res) => {
    res.redirect(sails.config.frontendUrl)
  }
}
In config/globals.js, add the following line:
sails: true,
This makes the sails instance global.

How do I use jest with coffeescript and ES6/ES2015 (e.g. via Babel)?

My team has been using both coffeescript and ES6/ES2015 (via Babel) in our project. Since the files are ultimately compatible due to Babel, it's been working great. But I can't figure out how we can write a jest test that allows both.
I've found examples of how to use jest with coffee, or ES6 features, but not both:
Example with coffeescript
Another example with coffeescript
Example with Babel
Somewhere, someone suggested setting the preprocessor to babel-jest, which works fine for me if I only use it with ES6.
These all work. But again, I can't figure out how to combine them.
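For reference, that babel-jest suggestion amounts to something like this in package.json (a sketch; it uses the same Jest 0.x scriptPreprocessor key as the configs below):
"jest": {
  "scriptPreprocessor": "<rootDir>/node_modules/babel-jest"
}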
What I've tried
I tried my own solution:
In package.json I have:
"jest": {
"scriptPreprocessor": "<rootDir>/jest-script-preprocessor",
"unmockedModulePathPatterns": [
"<rootDir>/node_modules/react",
"<rootDir>/node_modules/react-dom",
"<rootDir>/node_modules/react-addons-test-utils",
"<rootDir>/node_modules/fbjs"
],
"testFileExtensions": ["coffee", "cjsx", "js", "jsx"],
"moduleFileExtensions": ["coffee", "cjsx", "js", "jsx"]
}
In jest-script-preprocessor.js, I have:
var coffee = require('coffee-react');
var transform = require('coffee-react-transform');
var babelJest = require("babel-jest");

module.exports = {
  process: function(src, path) {
    if (coffee.helpers.isCoffee(path)) {
      console.log(path);
      return coffee.compile(transform(src), {
        'bare': true
      });
    } else {
      console.log(path);
      return babelJest.process(src, {
        filename: path,
        stage: 0
      });
    }
  }
};
If I run a test like npm test __tests__/sometest.jsx, it loads the ES6 test file fine. That test file will import the module under test, which is also ES6, and THAT'S where it blows up. It simply says Unexpected reserved word as soon as it hits an ES6-only syntax, like import, export, etc. There is no additional line information, but I know it's ES6 that causes the problem because if I change the module under test to be ONLY export default 'mystring', it blows up, and if I change it to non-ES6 syntax like var q = 5 + 5; module.exports = q;, it imports the module fine. (Of course, that's not really a testable module, but it doesn't need to be for this proof-of-concept.)
Note the console.log() lines in there. I never see them. So one reason this has been so tricky to track down is I can't put any of my own debug logic in. I'm sure these lines run, because if I throw in some random syntax on those lines, it'll choke. But no console output.
I've tried jest-webpack-alias, which is officially recommended by jest, and it sounds great in theory: you use jest with webpack, which then allows you to use whatever preprocessors you've already set up. It gives me the same error of Unexpected reserved word. I wrote up an issue on their github repo.
Notes
I found jestpack, but I don't want to use it as it requires Node >= 5, and I want to use Node 4.2.3. It also doesn't work with Jest >= 0.8, and I want to use Jest 0.8, as it's currently the latest and I assume has the best chance of being in sync with the live docs and react version (0.14.6).
Here's what I'm doing that's working for me.
npm install --save-dev babel-jest
npm install --save-dev coffee-react
In package.json:
"jest": {
"scriptPreprocessor": "<rootDir>/jest-script-preprocessor",
"unmockedModulePathPatterns": [
"<rootDir>/node_modules/."
],
"testFileExtensions": ["coffee", "cjsx", "js", "jsx"],
"moduleFileExtensions": ["coffee", "cjsx", "js", "jsx"]
}
Take note of the unmockedModulePathPatterns. I set it to match everything in node_modules. You might not want that, but if you start getting errors like Error: Failed to get mock metadata:..., consider it, as recommended here.
In jest-script-preprocessor.js:
var babelJest = require('babel-jest');
var coffee = require('coffee-react');

module.exports = {
  process: function(src, filename) {
    if (filename.indexOf('node_modules') === -1) {
      if (filename.match(/\.coffee|\.cjsx/)) {
        src = coffee.compile(src, {bare: true});
      } else {
        src = babelJest.process(src, filename);
      }
    }
    return src;
  }
};