How can I run SQL on PostgreSQL RDS from Lambda function in Node.js? - postgresql

I have this code in a Lambda function:
sql="SELECT ...";
var pg = require('pg');
var connectionString = 'postgres://...';
var client = new pg.Client(connectionString);
client.connect();
var query = client.query(sql);
query.on('end', function() { client.end(); });
When I run from EC2, it works fine. When I run from Lambda I get Error: Cannot find module 'pg'

I am a super noob in Node.js, but I really wanted to try AWS Lambda. Here are the steps I took, on Ubuntu 14.04.
sudo apt-get update
sudo apt-get install nodejs
sudo apt-get install npm
sudo ln -s /usr/bin/nodejs /usr/bin/node
mkdir the_function && cd the_function
mkdir node_modules
npm install pg
******Now create a file index.js and paste the content of your function.
console.log('Loading S2 Function');
var pg = require("pg");
exports.handler = function(event, context) {
    var conn = "pg://user:password@host:5432/bd_name";
    var client = new pg.Client(conn);
    client.connect();
    var query = client.query("SELECT * FROM BLA WHERE ID = 1");
    query.on("row", function (row, result) {
        result.addRow(row);
    });
    query.on("end", function (result) {
        var jsonString = JSON.stringify(result.rows);
        var jsonObj = JSON.parse(jsonString);
        console.log(jsonString);
        client.end();
        context.succeed(jsonObj);
    });
};
******Now zip the contents of the_function folder (not the_function folder itself)
You can check the official sample from this AWS link: http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser-create-test-function-create-function.html
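As a side note, on newer Node.js runtimes the same handler can be written with the promise API of pg. This is only a sketch, assuming pg (v7 or later) is packaged into the deployment zip and the connection string is replaced with your real one:
var pg = require('pg');

// Create the pool outside the handler so warm Lambda containers reuse connections.
var pool = new pg.Pool({ connectionString: 'postgres://user:password@host:5432/bd_name' });

exports.handler = async function (event) {
    var result = await pool.query('SELECT * FROM BLA WHERE ID = 1');
    console.log(JSON.stringify(result.rows));
    return result.rows; // Lambda resolves the returned promise value
};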

Only the predefined libraries can be imported into your Lambda easily. For example, for Python you can use just boto3 and the core libraries, and for Java just the core. http://docs.aws.amazon.com/lambda/latest/dg/current-supported-versions.html
You cannot import any additional libraries in a simple way.
You can try the "hard way":
save all the necessary libraries to S3 (or some other place you can access from Lambda), then copy them into the Lambda environment (/tmp) at runtime and import them with the help of reflection.
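For a Node.js function, that "hard way" could look roughly like the sketch below: a pre-built node_modules tarball is pulled from S3 into /tmp and required from there. The bucket name, key and archive layout are made-up placeholders, and packaging the module into the deployment zip (as in the answer above) is usually the simpler route:
var AWS = require('aws-sdk'); // bundled with the Node.js Lambda runtime
var fs = require('fs');
var execSync = require('child_process').execSync;

var s3 = new AWS.S3();

exports.handler = async function (event) {
    // Hypothetical bucket/key holding a tarball of node_modules built on a matching Linux box.
    var obj = await s3.getObject({ Bucket: 'my-lib-bucket', Key: 'node_modules.tar.gz' }).promise();
    fs.writeFileSync('/tmp/node_modules.tar.gz', obj.Body);

    // /tmp is the only writable path in Lambda; tar is assumed to be present in the runtime image.
    execSync('tar -xzf /tmp/node_modules.tar.gz -C /tmp');

    // Require the library from /tmp instead of the deployment package.
    var pg = require('/tmp/node_modules/pg');
    return 'pg loaded: ' + typeof pg.Client;
};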

Error: Cannot find module 'pg'
In my case I was just uploading index.js. We need to package 3rd party node modules as well.
create index.js (name may vary based on your handler name)
run npm install <package> for each third-party module you need
It is better to create a package.json with all the dependencies you need and run npm install.
Confirm that the node_modules folder is created in the same directory.
Zip these contents (index.js and node_modules folder) and upload the zip.
You can upload directly or use S3.
For more details read their official doc - Creating a Deployment Package (Node.js)
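For example, a minimal package.json for the function above could look like this sketch (the pg version is a placeholder; pin whatever your function actually needs), so that a plain npm install recreates node_modules before you zip:
{
  "name": "the-function",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "pg": "^8.0.0"
  }
}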
Now I get: Unable to import module 'index': Error. Does my function file have to be called index.js?
In my case I was zipping the entire directory instead of its contents. So you really need to do
zip -r deploymentpkg.zip ./*
instead of
zip -r deploymentpkg.zip folder

Related

ModuleNotFoundError: No module named 'gspread' on python anywhere

What I am trying to achieve.
Run a Python script saved on the PythonAnywhere host from Google Sheets on a button press.
Check the answer by Dustin Michels
Task of Each File?
app.py: contains the code of the REST API made using Flask.
runMe.py: contains code that gets values from the Google Sheet (cells A1:A2), sums both values, and sends the sum back to A3.
main.py: contains code for a GET request with the file name (runMe.py) as an argument. The file name may change if the user wants to run another file.
I made the API using Flask. It works online and offline perfectly, but feel free to recommend anything related to app.py. Code of app.py:
from flask import Flask, jsonify
from flask_restful import Api, Resource
import os

app = Flask(__name__)
api = Api(app)

class callApi(Resource):
    def get(self, file_name):
        my_dir = os.path.dirname(__file__)
        file_path = os.path.join(my_dir, file_name)
        file = open(file_path)
        getvalues = {}
        exec(file.read(), getvalues)
        return jsonify({'data': getvalues['total']})

api.add_resource(callApi, "/callApi/<string:file_name>")

if __name__ == '__main__':
    app.run()
Here is the Code of runMe2.py
import gspread
from oauth2client.service_account import ServiceAccountCredentials
# use creds to create a client to interact with the Google Drive API
scopes =['https://www.googleapis.com/auth/spreadsheets',"https://www.googleapis.com/auth/drive.file","https://www.googleapis.com/auth/drive"]
creds = ServiceAccountCredentials.from_json_keyfile_name('service_account.json', scopes)
client = gspread.authorize(creds)
# Find a workbook by name and open the first sheet
# Make sure you use the right name here.
sheet = client.open("Demosheet").sheet1
# Extract and print all of the values
list_of_hashes = sheet.get_all_records()
print(list_of_hashes)
below is the main.py code
import requests
BASE = 'https://username.pythonanywhere.com/callApi/test.py'
response = requests.get(BASE)
print(response.json())
main.py output
{'data': 54}
Test.py code
a = 20
b = 34
total = a+b
print(total)
PROBLEM IS
If I request runMe2.py instead, I get this error (check the runMe2.py code above).
app.py is hosted on https://www.pythonanywhere.com/
ModuleNotFoundError: No module named 'gspread'
However, I installed gspread on PythonAnywhere using the command line, but it's not working.
You either haven't installed the gspread package in your current Python environment, or it is installed somewhere else (e.g. in a different virtual env) and your script can't find it.
Try installing the package inside the environment you're running your script in, using pip3:
pip3 install gspread
You can try something like this on Windows
pip install gspread
or on Mac
pip3 install gspread
If you're running on Docker, or building with a requirements.txt, you can try adding this line to your requirements.txt file:
gspread==3.7.0
Any other instructions for this package can be found here => https://github.com/burnash/gspread
Download gspread here
Download the tar file: gspread-3.7.0.tar.gz from the above link
Extract the file, convert the folder to a zip, then upload it back to the server.
Open bash console and use command as
$ unzip gspread-3.7.0
$ cd gspread-3.7.0
$ python3.7 setup.py install --user

How to import data from cloud firestore to the local emulator?

I want to be able to run cloud functions locally and debug against a copy from the production data.
Is there a way to copy the data that is online to the local firestore emulator?
This can be accomplished through a set of commands in terminal on the existing project:
1. Login to firebase and Gcloud:
firebase login
gcloud auth login
2. See a list of your projects and connect to one:
firebase projects:list
firebase use your-project-name
gcloud projects list
gcloud config set project your-project-name
3. Export your production data to a gcloud bucket, with a folder name of your choosing:
gcloud firestore export gs://your-project-name.appspot.com/your-chosen-folder-name
4. Now copy this folder to your local machine; I do that directly in the functions folder:
Note: don't miss the dot ( . ) at the end of the command below.
cd functions
gsutil -m cp -r gs://your-project-name.appspot.com/your-chosen-folder-name .
5. Now we just want to import this folder. This should work with the basic command, thanks to a recent update from the Firebase team https://github.com/firebase/firebase-tools/pull/2519.
firebase emulators:start --import ./your-chosen-folder-name
Check out my article on Medium about it and a shorthanded script to do the job for you https://medium.com/firebase-developers/how-to-import-production-data-from-cloud-firestore-to-the-local-emulator-e82ae1c6ed8
Note: it's better to use a different bucket for this, as copying into your project bucket will result in the folder being created in your Firebase Storage.
If you are interested in gsutil arguments like -m, you can see them described by executing gsutil --help.
My method is somewhat manual but it does the trick. I've shared it in this useful Github thread but I'll list the steps I did here if you find them useful:
Go to my local Firebase project path.
Start the emulators using: firebase emulators:start
Create manually some mockup data using the GUI at http://localhost:4000/firestore using the buttons provided: + Start Collection and + Add Document.
Export this data locally using: firebase emulators:export ./mydirectory
About the project data located at Firebase Database / Cloud Firestore, I exported a single collection like this:
gcloud firestore export gs://my-project-bucket-id.appspot.com --collection-ids=myCollection
The export is now located under Firebase Storage in a folder with a timestamp as its name (I didn't use a prefix for my test).
Download this folder to the local drive with:
gsutil cp -r gs://my-project-bucket-id.appspot.com/myCollection ./production_data_export
NOTE: I did this in a Windows environment... gsutil will throw this error: "OSError: The filename, directory name, or volume label syntax is incorrect" if the folder has invalid characters for a folder name in Windows (i.e. colons), or this error: "OSError: Invalid argument.9.0 B]" if an inner file in the folder has invalid characters too. To be able to download the export locally, rename these with a valid Windows name (i.e. removing the colons) like this:
gsutil mv gs://my-project-bucket-id.appspot.com/2020-05-22T02:01:06_86152 gs://my-project-bucket-id.appspot.com/myCollection
Once downloaded, imitate the local export structure renaming the folder to firestore_export and copying the firebase-export-metadata.json file from the local export folder. Just to be visual, here's the structure I got:
$ tree .
.
├── local_data_export
│   ├── firebase-export-metadata.json
│   └── firestore_export
│       ├── all_namespaces
│       │   └── all_kinds
│       │       ├── all_namespaces_all_kinds.export_metadata
│       │       └── output-0
│       └── firestore_export.overall_export_metadata
└── production_data_export
    ├── firebase-export-metadata.json
    └── firestore_export
        ├── all_namespaces
        │   └── kind_myCollection
        │       ├── all_namespaces_kind_myCollection.export_metadata
        │       ├── output-0
        │       └── output-1
        └── firestore_export.overall_export_metadata

8 directories, 9 files
Finally, start the local emulator pointing to this production data to be imported: firebase emulators:start --import=./mock_up_data/production_data_export/
You should see the imported data at: http://localhost:4000/firestore/
This should assist readers for now, while we await a more robust solution from the Firebase folks.
You can use the firestore-backup-restore to export and import your production data as JSON files.
I wrote a quick hack to allow for importing these JSON in the Firebase Simulator Firestore instance.
I proposed a pull request and made this npm module in the meantime.
You can use it this way:
const firestoreService = require('@crapougnax/firestore-export-import')
const path = require('path')
// list of JSON files generated with the export service
// Must be in the same folder as this script
const collections = ['languages', 'roles']
// Start your firestore emulator for (at least) firestore
// firebase emulators:start --only firestore
// Initiate Firebase Test App
const db = firestoreService.initializeTestApp('test', {
  uid: 'john',
  email: 'john@doe.com',
})
// Start importing your data
let promises = []
try {
  collections.map(collection =>
    promises.push(
      firestoreService.fixtures(
        path.resolve(__dirname, `./${collection}.json`),
        [],
        [],
        db,
      ),
    ),
  )
  Promise.all(promises).then(process.exit)
} catch (err) {
  console.error(err)
}
Obviously, since this data won't persist in the emulator, you'll typically inject them in the before() function of your test suite or even before every test.
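For instance, a hedged sketch of wiring that into a Mocha-style suite, reusing the names from the snippet above (the collection names and JSON file layout are assumptions taken from that example):
// test/setup.js — assumes the Firestore emulator is already running
const firestoreService = require('@crapougnax/firestore-export-import')
const path = require('path')

const collections = ['languages', 'roles']

before(async () => {
  const db = firestoreService.initializeTestApp('test', { uid: 'john' })
  // Load every fixture file into the emulator before the tests run.
  await Promise.all(
    collections.map(collection =>
      firestoreService.fixtures(path.resolve(__dirname, `./${collection}.json`), [], [], db),
    ),
  )
})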
There is no built-in way to copy data from a cloud project to the local emulator. Since the emulator doesn't persist any data, you will have to re-generate the initial data set on every run.
I was able to make some npm scripts to import from remote to local emulator and vice-versa.
"serve": "yarn build && firebase emulators:start --only functions,firestore --import=./firestore_export",
"db:update-local-from-remote": "yarn db:backup-remote && gsutil -m cp -r gs://my-firebase-bucket.appspot.com/firestore_export .",
"db:update-remote-from-local": "yarn db:backup-local && yarn db:backup-remote && gsutil -m cp -r ./firestore_export gs://my-firebase-bucket.appspot.com && yarn run db:import-remote",
"db:import-remote": "gcloud firestore import gs://my-firebase-bucket.appspot.com/firestore_export",
"db:backup-local": "firebase emulators:export --force .",
"db:rename-remote-backup-folder": "gsutil mv gs://my-firebase-bucket.appspot.com/firestore_export gs://my-firebase-bucket.appspot.com/firestore_export_$(date +%d-%m-%Y-%H-%M)",
"db:backup-remote": "yarn db:rename-remote-backup-folder && gcloud firestore export gs://my-firebase-bucket.appspot.com/firestore_export"
So you can export the local Firestore data to remote with:
npm run db:update-remote-from-local
Or to update your local Firestore data with remote one, do:
npm run db:update-local-from-remote
These operations will back up the remote Firestore data, making a copy of it and storing it in Firebase Storage.
I was about to add a CLI option to firebase-tools, but I'm pretty happy with the node-firestore-import-export package.
yarn add -D node-firestore-import-export
"scripts": {
"db:export": "firestore-export -a ./serviceAccountKey.json -b ./data/firestore.json",
"db:import": "firestore-import -a ./serviceAccountKey.json -b ./data/firestore.json",
"db:emulator:export": "export FIRESTORE_EMULATOR_HOST=localhost:8080 && yarn db:export",
"db:emulator:import": "export FIRESTORE_EMULATOR_HOST=localhost:8080 && yarn db:import",
"db:backup": "cp ./data/firestore.json ./data/firestore-$(date +%d-%m-%Y-%H-%M).json",
"dev": "firebase emulators:start --import=./data --export-on-exit=./data",
},
You will need to create a service account in the firebase console.
You can replace the GCLOUD_PROJECT environment variable with hard coded values.
open https://console.firebase.google.com/project/$GCLOUD_PROJECT/settings/serviceaccounts/adminsdk
mv ~/Downloads/myProjectHecticKeyName.json ./serviceAccountKey.json
That being said the gcloud tools are definitely the way to go in production, as you will need s3 backups anyway.
You can use the fire-import npm package for importing both Firestore and Firebase Storage.
There is also a way to import data to local storage from Google Cloud Storage without any commands:
Export Firestore to a Google Cloud Storage bucket by clicking More in the Google Cloud console.
Choose your desired file in the Google Cloud Storage bucket.
Open the terminal (the Google terminal shell near the search bar).
In the terminal, click Open editor.
Right-click on the desired file in the online VS Code and click Download.
You should start downloading a .tar file, which is in fact your exported data from Firestore.
Create a folder in your root (as an example you may call it 'firestore-local-data').
Copy-paste (or unarchive) the data from the .tar archive into this folder.
run firebase emulators:start --import ./firestore-local-data
This should do the trick
I wrote a little script to be able to do that:
import * as admin from 'firebase-admin';
import * as fs from 'fs';

// Point FIRESTORE_EMULATOR_HOST at the emulator before running, then initialize the admin SDK.
admin.initializeApp();
const db = admin.firestore();
const collections = ['albums', 'artists'];

let rawData: any;
for (const i in collections) {
  rawData = fs.readFileSync(`./${collections[i]}.json`);
  const arr = JSON.parse(rawData);
  for (const j in arr) {
    db.collection(collections[i]).add(arr[j])
      .then(val => console.log(val))
      .catch(err => console.log('ERRO: ', err));
  }
}

Using gulp with a team on a Local Server. Error: EPERM: operation not permitted, chmod

I am working with a web team and we keep all our files on a local shared server in the office. (We are slowly moving everything over to git, so please no comments about how dumb we are for not using git. Thanks!)
We are using gulp to compile our Sass to CSS. When one of us compiles it's fine, but once someone else tries to run a node process and compile with gulp, we get the following error:
[10:12:53] Starting 'sass'...
[10:12:53] Starting 'watch'...
[10:12:54] Finished 'watch' after 173 ms
[10:12:54] 'sass' errored after 442 ms
EPERM: operation not permitted, chmod '/the file path/'
I have tried using chmod to change the file permissions but I don't think that is the issue. I use atom as my editor and some of the other developers on the team use sublime.
I have read that some editors can lock files. Not sure if this is the cause but if it is I don't know how to fix this. Is the only solution to this problem to use git and have local copies on our own personal computers?
Thanks in advance!
gulpfile.js
// Include gulp
var gulp = require('gulp');
// Include Our Plugins
var sass = require('gulp-sass');
var plumber = require('gulp-plumber');
var cleanCSS = require('gulp-clean-css');
var sourcemaps = require('gulp-sourcemaps');
var sassOptions = {
  errLogToConsole: true,
  outputStyle: 'nested' // Styles: nested, compact, expanded, compressed
};

// Compile Sass file to CSS, and reload browser(s).
gulp.task('sass', function() {
  return gulp.src('includes/scss/*.scss')
    .pipe(plumber())
    .pipe(sourcemaps.init())
    .pipe(sass.sync(sassOptions))
    .pipe(sass.sync().on('error', sass.logError))
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('includes/css'));
});

gulp.task('minify-css', function() {
  return gulp.src('includes/css/*.css')
    .pipe(sourcemaps.init({loadMaps: true}))
    .pipe(cleanCSS({compatibility: 'ie8'}))
    .pipe(sourcemaps.write())
    .pipe(gulp.dest('includes/css'));
});

// Watch Files For Changes
gulp.task('watch', function() {
  gulp.watch('includes/scss/**/*.scss', ['sass']);
});

// Default Task
//gulp.task('serve', ['sass', 'minify-css', 'watch']);
gulp.task('serve', ['sass', 'watch']);
This happens because you need to run your gulpfile as admin.
So run sudo -i, insert your admin password, and then just run again.
I had the same problem and this worked for me.
Sometimes this is caused by watch. But more often it is because gulp preserves the file permission flags in the gulp.dest command. If you have a source file with read-only flags, you have to overwrite the flags each time the file goes through gulp.dest.
This way:
.pipe(gulp.dest('includes/css', { mode: 0o777 }));
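Applied to the 'sass' task from the question, that would look roughly like the sketch below (same requires and sassOptions as in the question's gulpfile; 0o777 is just an example mode, use whatever permissions your team actually wants on the shared drive):
gulp.task('sass', function() {
  return gulp.src('includes/scss/*.scss')
    .pipe(plumber())
    .pipe(sourcemaps.init())
    .pipe(sass.sync(sassOptions))
    .pipe(sass.sync().on('error', sass.logError))
    .pipe(sourcemaps.write())
    // Overwrite the mode on every write so read-only flags left by other users don't stick.
    .pipe(gulp.dest('includes/css', { mode: 0o777 }));
});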
That problem has also happened to me. What I did was start a terminal as root and just run gulp, and it worked for me.
Just uninstall your gulp :
npm uninstall gulp -g
then
npm install gulp -g
Set the path in the environment variables in Windows.
Restart your system or the cmd prompt.
I was getting the error on compile-sass. I had to delete the destination .css file, then all was well.

Unable to clone Git repository - "Object function ... has no method 'hasMagic'"

I am trying to clone a Git repository, which contains an Ember-CLI project (https://github.com/tgfischer/StockMarketApp). When I do that, I get the following error:
tom#tom-fischer:~/Desktop/StockMarketApp$ ember server
version: 0.2.0-beta.1
Could not find watchman, falling back to NodeWatcher for file system events
Livereload server on port 35729
Serving on http://0.0.0.0:4200/
Object function glob(pattern, options, cb) {
if (typeof options === "function") cb = options, options = {}
if (!options) options = {}
if (typeof options === "number") {
deprecated()
return
}
var g = new Glob(pattern, options, cb)
return g.sync ? g.found : g
} has no method 'hasMagic'
TypeError: Object function glob(pattern, options, cb) {
if (typeof options === "function") cb = options, options = {}
if (!options) options = {}
if (typeof options === "number") {
deprecated()
return
}
var g = new Glob(pattern, options, cb)
return g.sync ? g.found : g
} has no method 'hasMagic'
at rimraf (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/broccoli-caching-writer/node_modules/rimraf/rimraf.js:57:13)
at lib$rsvp$node$$tryApply (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/broccoli-caching-writer/node_modules/rsvp/dist/rsvp.js:1467:11)
at lib$rsvp$node$$handleValueInput (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/broccoli-caching-writer/node_modules/rsvp/dist/rsvp.js:1567:20)
at fn (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/broccoli-caching-writer/node_modules/rsvp/dist/rsvp.js:1555:18)
at /home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/broccoli-caching-writer/index.js:100:14
at lib$rsvp$$internal$$tryCatch (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/promise-map-series/node_modules/rsvp/dist/rsvp.js:489:16)
at lib$rsvp$$internal$$invokeCallback (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/promise-map-series/node_modules/rsvp/dist/rsvp.js:501:17)
at lib$rsvp$$internal$$publish (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/promise-map-series/node_modules/rsvp/dist/rsvp.js:472:11)
at Object.lib$rsvp$asap$$flush [as _onImmediate] (/home/tom/Desktop/StockMarketApp/node_modules/ember-cli/node_modules/promise-map-series/node_modules/rsvp/dist/rsvp.js:1290:9)
at processImmediate [as _immediateCallback] (timers.js:330:15)
Here are the steps I am following:
git clone https://github.com/tgfischer/StockMarketApp
cd StockMarketApp
bower install
npm install
ember server
ember server runs the project, and generates the error above.
When I run the version of the project that is locally on my computer (the project that is pushing to this repository), it works correctly. I've tried uninstalling/reinstalling Bower, Ember-CLI, PhantomJS. I've also tried cloning this project on my Windows partition without success.
Does anyone know what might be going wrong? Thanks for the help.
Looks like a glob#4.5.1 issue. I just fixed it by changing the package.json
to use a version that was working for me.
Instead of "glob": "^4.0.5", use "glob": "4.4.0".
Then reinstall your packages:
npm cache clean
rm -rf node_modules
npm install
(ember-cli#0.2.0 works for me with the default glob version; you are using 0.2.0-beta.1, so maybe if you update ember-cli it will work.)
Just to add to this solution, which did work for me, there is now a recommended solution available on GitHub for Ember-cli:
https://github.com/ember-cli/ember-cli/issues/3486
Upgrading to Ember-cli 0.2.0 should fix the problem, but if that's not an option you can add "rimraf": "2.2.8" to your package.json and freeze glob at 4.0.5.
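For reference, the relevant fragment of package.json would then look something like this (a sketch; keep your other dependencies as they are, and put the entries under devDependencies if that is where your project lists build tooling):
"dependencies": {
  "glob": "4.0.5",
  "rimraf": "2.2.8"
}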
Tried juan's answer but it didn't work in my case. This solution did though:
npm explore ember-cli -- npm i glob#latest -S
npm explore ember-cli -- npm i bower
Many thanks to https://stackoverflow.com/users/175117/thock for the help!

CoffeeScript: No output after installation

I'm running Ubuntu 13.04, after installing using:
$ sudo npm install -g coffee-script
..with output..
npm http GET https://registry.npmjs.org/coffee-script
npm http 304 https://registry.npmjs.org/coffee-script
/usr/local/bin/coffee -> /usr/local/lib/node_modules/coffee-script/bin/coffee
/usr/local/bin/cake -> /usr/local/lib/node_modules/coffee-script/bin/cake
coffee-script#1.6.3 /usr/local/lib/node_modules/coffee-script
No command yields any result, whatsoever:
$ coffee js.coffee
$ coffee -v
$ coffee GiveMeSomeCoffeePlease
I verified that it exists:
$ which coffee
/usr/local/bin/coffee
And the file has some contents:
$ cat `which coffee`
#!/usr/bin/env node
var path = require('path');
var fs = require('fs');
var lib = path.join(path.dirname(fs.realpathSync(__filename)), '../lib');
require(lib + '/coffee-script/command').run();
Also tried version 1.6.1 which works on my laptop. No difference on this computer though. Any ideas?
I finally found the solution. I had installed the package node on Ubuntu, which is something entirely different:
Amateur Packet Radio Node program (transitional package) The
existing node package has been renamed to ax25-node. This transitional
package exists to ease the upgrade path for existing users.
I went ahead and installed the nodejs package. But it seems it didn't quite create the right binding anyway; I could run nodejs but not node. So I made a symlink for it and now CoffeeScript is running just fine!
cd /usr/bin; sudo ln -s nodejs node
Same here. In my Express.js app, instead of running via
node app
now it seems I have to run it via
nodejs app
I'll either create an alias or a symlink like Mika did. I am using Ubuntu 13.10, FYI.