I'm working on a sails app that I'll only use locally, so I'm using the sails-disk default for persistence. I would like to be able to backup and restore the data I put in there, though. Is this possible?
I didn't see anything in the waterline docs about working with the store other than via the API. I can write some sails code to export/import a dump, but if there's already something available or a standard way of doing this I'd prefer not to reinvent the wheel.
sails-disk, by default, stores data in the .tmp/localDiskDb.db file inside your Sails project.
One way to back up and restore is to make a copy of that file.
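For example, here is a minimal Node sketch (the paths assume the sails-disk defaults and the backup filename is arbitrary; copy the file while the app is stopped so you don't catch a half-written write):
var fs = require('fs');

// Back up: copy the live data file somewhere safe.
fs.writeFileSync('localDiskDb.backup.db', fs.readFileSync('.tmp/localDiskDb.db'));

// Restore: copy the backup over the live file before lifting Sails again.
// fs.writeFileSync('.tmp/localDiskDb.db', fs.readFileSync('localDiskDb.backup.db'));
A plain cp of the file works just as well.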
We are working on an application that will be offered both as a web-based and as a cross-platform desktop solution by means of Electron.
Due to customer requirements, the desktop client cannot make use of "the cloud" to store data; all data should be stored on the local machine or, even better, the user should have the option to keep the database/data file on an external HDD so that another user on the same local network can use the same data file.
We've been looking at NeDB, PouchDB, etc., but all of these use either Web SQL or IndexedDB in the browser itself to store the data.
NeDB can theoretically use the file system but that seems only possible for Node Webkit apps.
Another option is of course MongoDB, but it requires setting up a database server. Seeing as our users will set that up on their own machines, it would work for a single user but would make it very hard for them to share the data (note: assume users with little technical know-how).
Is there a way to force NeDB to persist data in a file instead of the in-browser database?
Alternatively, does any one know of a file-based, compact database that plays well with electron/node?
We'd preferably like to use a NoSQL database, but options of file-based SQL databases will be considered as well.
I have some experience with NeDB in an Electron app and I can say it will definitely work on the filesystem.
How are you initializing NeDB (or whatever your database choice is)? Also, are you initializing it in the main or renderer process? If you can share that, I think we can trace this down to a configuration issue.
This is how you start NeDB with a persistent data-store that saves to disk.
// Create (or open) a datastore backed by a file on disk;
// autoload reads the file into memory as soon as the datastore is created.
var Datastore = require('nedb');
var db = new Datastore({ filename: 'path/to/datafile', autoload: true });
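If the app is Electron, one common pattern (sketched here; 'mydata.db' is just an illustrative name) is to build the filename from the app's userData path in the main process, so the store lands in a proper per-user location:
// main process -- sketch only
var path = require('path');
var app = require('electron').app;
var Datastore = require('nedb');

var db = new Datastore({
  filename: path.join(app.getPath('userData'), 'mydata.db'), // illustrative name
  autoload: true
});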
I think MongoDB is going to be overkill for an Electron app (it's really meant to be a high-performance, distributed database running on a server or in the cloud).
Another option you could consider is LevelDB (a key/value store that can persist to the filesystem), which is popular in the node community. (EDIT 4/17/17: IndexedDB uses LevelDB under the hood, so if you go that route, you may as well just use that.)
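Here's a minimal sketch of using Level for that, assuming the classic callback API of the level package (newer versions expose a promise-based API instead):
var level = require('level');

// Data is persisted to the ./mydb directory on disk.
var db = level('./mydb');

db.put('user:1', JSON.stringify({ name: 'Ada' }), function (err) {
  if (err) return console.error(err);
  db.get('user:1', function (err, value) {
    if (err) return console.error(err);
    console.log(JSON.parse(value)); // { name: 'Ada' }
  });
});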
One aspect I would definitely evaluate carefully is: How difficult is this database going to be to package and distribute? How do I integrate it into my build system? Level and NeDB can be included simply via npm install and any native code compiling is handled seamlessly with node-gyp, which is as simple as it gets. However, bundling Mongo, for example, will require some work to get a working build for each different platform.
Alongside my GoodData project, I maintain a small PostgreSQL database that contains a few tables.
I would like to be able to integrate both of my ETL processes in the same tool, and it seems to me that CloudConnect would be the easiest way, since I already have my whole GoodData ETL in it.
Here are the ways I tried to do it without success:
I had a look at the documentation, and it seems to me that the CloverETL components that would enable this (DBOutput, PostgreSQLDataWriter) are not available in CloudConnect.
I managed to connect to the Agile Datawarehouse Service (the database attached to GoodData), but it seems that only the ADS database is able to understand a request like:
COPY MyDataBaseTable (field1,field2) FROM LOCAL '${DATA_TMP_DIR}/CIforADS.csv'
Even when I adapt the syntax to PostgreSQL it fails, because the dynamic addressing I use here (${DATA_TMP_DIR}) does not seem to work.
Is there any way to proceed that I'm missing? Can anyone think of a workaround?
In general this could be achieved by using the "DBExecute" component, but
I'm not sure if I understand it well - do you want to load data into your own Postgres instance using CloudConnect?
I am writing a Chrome Packaged App that uses the IndexedDB for data storage. Chrome allows me to view the contents of the database, but I can't find any way to manually change the data. I need to update this data from time to time because, you know, I'm still writing the app. Any idea how to manually change the data in the database?
Any changes to an IndexedDB database have to be performed via the IndexedDB API. There are no utilities, data editors, query apps, loaders, importers, or any other kind of external tool, such as exist for MySQL, SQLite, Oracle, or other such databases.
Furthermore, it's not even theoretically possible to write such a utility, because an IndexedDB database is sandboxed inside a single app, and no other app can access it.
What I do is incorporate the needed update forms and commands (delete database, create database, count rows, etc.) as modules inside the app, perhaps accessible from a Maintenance or Admin menu item. Obviously, this is a lot of work, but there is no other way if you're using IndexedDB.
In addition, I have a "load database" menu item that loads it from JSON in an external file. I do that from time to time when I want the app to have some initial data, or test data. But, this is just an example of what I said in the first sentence, above.
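As a rough sketch of what such a loader can look like (the database and store names below are placeholders, and records is the array parsed from the JSON file):
function loadDatabase(records) {
  var open = indexedDB.open('myapp-db', 1); // placeholder name/version

  open.onupgradeneeded = function () {
    // Create the object store on first run (or on a version bump).
    open.result.createObjectStore('items', { keyPath: 'id' });
  };

  open.onsuccess = function () {
    var db = open.result;
    var tx = db.transaction('items', 'readwrite');
    var store = tx.objectStore('items');
    records.forEach(function (record) {
      store.put(record); // insert or overwrite by key
    });
    tx.oncomplete = function () { db.close(); };
  };
}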
The folks behind HTML5 Storage Manager All in One promise they'll have IndexedDB support soon.
They use some tricks to open the extension window inside the same domain as the debugged page, thus making IndexedDB accessible.
It doesn't work successfully at the time of this writing, though.
I am using sails.js with postgres as the database. Although most of the actions can be easily handled via waterline ORM, there are certain cases I prefer to use native queries and sometimes even postgres' native stored functions. However, the challenge with stored functions is that they come with an overhead of code maintenance.
In my project repo, I have created a directory sql which contains all the SQL functions. Currently, whenever I make changes to a function, I have to manually recompile it on the database.
I need to configure things so that these are compiled whenever I restart the server, just like all the models are re-created. Is it possible to do this, and how?
Sails doesn't have any built-in support for compiling Postgres stored procedures, but you could make a Grunt task for this. Take a look at the documentation for tasks. These are run every time you lift Sails (and in some cases, when files are changed).
A quick Google found the grunt-pg-utils package, which might help you on your way.
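As a very rough sketch of what such a task might look like without any plugin (assuming a sql/ directory of .sql files and the pg module; all connection settings below are placeholders), which you would then add to the task list in tasks/register/default.js so it runs on every sails lift:
// tasks/register/compileSql.js -- sketch only
var fs = require('fs');
var path = require('path');
var pg = require('pg');

module.exports = function (grunt) {
  grunt.registerTask('compileSql', 'Recompile stored functions from ./sql', function () {
    var done = this.async();
    var client = new pg.Client({
      host: 'localhost',   // placeholder connection settings
      database: 'mydb',
      user: 'myuser',
      password: 'secret'
    });

    var sqlDir = path.join(process.cwd(), 'sql');
    var files = fs.readdirSync(sqlDir).filter(function (f) {
      return /\.sql$/.test(f);
    });

    client.connect(function (err) {
      if (err) { return done(err); }
      // Execute the files one after another.
      (function next(i) {
        if (i >= files.length) { client.end(); return done(); }
        client.query(fs.readFileSync(path.join(sqlDir, files[i]), 'utf8'), function (err) {
          if (err) { client.end(); return done(err); }
          next(i + 1);
        });
      })(0);
    });
  });
};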
I have used Core Data extensively in my applications. Generally I need to submit the applications with a pre-populated database.
I generally write zillions of lines of code to populate the database, then extract it from the application's directory and include it in the bundle. Is there an easy way to do that? Is there any way to populate a Core Data database created in Xcode using, for example, a CSV file and an external app?
Thanks.
The usual way is to create the entries at the first launch.
If you want to import data from a CSV file, then you need a parser.
I can recommend this one: https://github.com/davedelong/CHCSVParser
You don't have any easy way to do that with Xcode.
Basically, what I do in this kind of case: I build a second target dedicated to populating the database, which creates the objects with a fixed persistent store URL relative to the workspace in the filesystem.