I'm writing a simple full-text search library, and need case folding to check whether two words are equal. For this use case, the existing .to_lowercase() and .to_uppercase() methods are not enough.
From a quick search of crates.io, I can find libraries for normalization and word splitting but not case folding. regex-syntax does have case folding code, but it's not exposed in its API.
For my use case, I've found the caseless crate to be most useful.
As far as I know, this is the only case-folding library that also supports normalization. This is important when you want e.g. "㎒" (U+3392 SQUARE MHZ) and "mhz" to match. See Chapter 3, Default Caseless Matching, in the Unicode Standard for details on how this works.
Here's some example code that matches a string case-insensitively:
extern crate caseless;

fn main() {
    let a = "100 ㎒";
    let b = "100 mhz";
    // These strings don't match with just case folding,
    // but do match after compatibility (NFKD) normalization
    assert!(!caseless::default_caseless_match_str(a, b));
    assert!(caseless::compatibility_caseless_match_str(a, b));
}
To get the case folded string directly, you can use the default_case_fold_str function:
let s = "Twilight Sparkle ちゃん";
assert_eq!(caseless::default_case_fold_str(s), "twilight sparkle ちゃん");
Caseless doesn't expose a corresponding function that normalizes as well, but you can write one using the unicode-normalization crate:
extern crate caseless;
extern crate unicode_normalization;

use caseless::Caseless;
use unicode_normalization::UnicodeNormalization;

fn compatibility_case_fold(s: &str) -> String {
    s.nfd().default_case_fold().nfkd().default_case_fold().nfkd().collect()
}

fn main() {
    let a = "100 ㎒";
    assert_eq!(compatibility_case_fold(a), "100 mhz");
}
Note that multiple rounds of normalization and case folding are needed for a correct result.
(Thanks to BurntSushi5 for pointing me to this library.)
The unicase crate doesn't expose case folding directly, but it provides a generic wrapper type that implements Eq, Ord and Hash in a case-insensitive manner. The master branch (unreleased) supports both ASCII case folding (as an optimization) and Unicode case folding (though only invariant case folding is supported).
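For illustration, a minimal sketch of the wrapper type in use (based on the UniCase::new constructor in the released crate; check the version you pull in for the exact feature set):

extern crate unicase;

use unicase::UniCase;

fn main() {
    // UniCase compares (and orders/hashes) case-insensitively,
    // so it also works as a HashMap or BTreeMap key.
    assert_eq!(UniCase::new("Twilight"), UniCase::new("TWILIGHT"));
}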
For anyone who does want to stick to the standard library, I wanted some actual data on this. I pulled the full list of two-byte character pairs that fail with to_lowercase or to_uppercase, and then ran this test:
fn lowercase(left: char, right: char) -> bool {
    for c in left.to_lowercase() {
        for d in right.to_lowercase() {
            if c == d { return true }
        }
    }
    false
}

fn uppercase(left: char, right: char) -> bool {
    for c in left.to_uppercase() {
        for d in right.to_uppercase() {
            if c == d { return true }
        }
    }
    false
}
fn main() {
    let pairs = &[
        &['\u{00E5}','\u{212B}'],&['\u{00C5}','\u{212B}'],&['\u{0399}','\u{1FBE}'],
        &['\u{03B9}','\u{1FBE}'],&['\u{03B2}','\u{03D0}'],&['\u{03B5}','\u{03F5}'],
        &['\u{03B8}','\u{03D1}'],&['\u{03B8}','\u{03F4}'],&['\u{03D1}','\u{03F4}'],
        &['\u{03B9}','\u{1FBE}'],&['\u{0345}','\u{03B9}'],&['\u{0345}','\u{1FBE}'],
        &['\u{03BA}','\u{03F0}'],&['\u{00B5}','\u{03BC}'],&['\u{03C0}','\u{03D6}'],
        &['\u{03C1}','\u{03F1}'],&['\u{03C2}','\u{03C3}'],&['\u{03C6}','\u{03D5}'],
        &['\u{03C9}','\u{2126}'],&['\u{0392}','\u{03D0}'],&['\u{0395}','\u{03F5}'],
        &['\u{03D1}','\u{03F4}'],&['\u{0398}','\u{03D1}'],&['\u{0398}','\u{03F4}'],
        &['\u{0345}','\u{1FBE}'],&['\u{0345}','\u{0399}'],&['\u{0399}','\u{1FBE}'],
        &['\u{039A}','\u{03F0}'],&['\u{00B5}','\u{039C}'],&['\u{03A0}','\u{03D6}'],
        &['\u{03A1}','\u{03F1}'],&['\u{03A3}','\u{03C2}'],&['\u{03A6}','\u{03D5}'],
        &['\u{03A9}','\u{2126}'],&['\u{0398}','\u{03F4}'],&['\u{03B8}','\u{03F4}'],
        &['\u{03B8}','\u{03D1}'],&['\u{0398}','\u{03D1}'],&['\u{0432}','\u{1C80}'],
        &['\u{0434}','\u{1C81}'],&['\u{043E}','\u{1C82}'],&['\u{0441}','\u{1C83}'],
        &['\u{0442}','\u{1C84}'],&['\u{0442}','\u{1C85}'],&['\u{1C84}','\u{1C85}'],
        &['\u{044A}','\u{1C86}'],&['\u{0412}','\u{1C80}'],&['\u{0414}','\u{1C81}'],
        &['\u{041E}','\u{1C82}'],&['\u{0421}','\u{1C83}'],&['\u{1C84}','\u{1C85}'],
        &['\u{0422}','\u{1C84}'],&['\u{0422}','\u{1C85}'],&['\u{042A}','\u{1C86}'],
        &['\u{0463}','\u{1C87}'],&['\u{0462}','\u{1C87}']
    ];
    let (mut upper, mut lower) = (0, 0);
    for pair in pairs.iter() {
        print!("U+{:04X} ", pair[0] as u32);
        print!("U+{:04X} pass: ", pair[1] as u32);
        if uppercase(pair[0], pair[1]) {
            print!("to_uppercase ");
            upper += 1;
        } else {
            print!(" ");
        }
        if lowercase(pair[0], pair[1]) {
            print!("to_lowercase");
            lower += 1;
        }
        println!();
    }
    println!("upper pass: {}, lower pass: {}", upper, lower);
}
Results below. Interestingly, one pair (U+03D1 and U+03F4) fails with both. But based on this, to_uppercase is the best option.
U+00E5 U+212B pass: to_lowercase
U+00C5 U+212B pass: to_lowercase
U+0399 U+1FBE pass: to_uppercase
U+03B9 U+1FBE pass: to_uppercase
U+03B2 U+03D0 pass: to_uppercase
U+03B5 U+03F5 pass: to_uppercase
U+03B8 U+03D1 pass: to_uppercase
U+03B8 U+03F4 pass: to_lowercase
U+03D1 U+03F4 pass:
U+03B9 U+1FBE pass: to_uppercase
U+0345 U+03B9 pass: to_uppercase
U+0345 U+1FBE pass: to_uppercase
U+03BA U+03F0 pass: to_uppercase
U+00B5 U+03BC pass: to_uppercase
U+03C0 U+03D6 pass: to_uppercase
U+03C1 U+03F1 pass: to_uppercase
U+03C2 U+03C3 pass: to_uppercase
U+03C6 U+03D5 pass: to_uppercase
U+03C9 U+2126 pass: to_lowercase
U+0392 U+03D0 pass: to_uppercase
U+0395 U+03F5 pass: to_uppercase
U+03D1 U+03F4 pass:
U+0398 U+03D1 pass: to_uppercase
U+0398 U+03F4 pass: to_lowercase
U+0345 U+1FBE pass: to_uppercase
U+0345 U+0399 pass: to_uppercase
U+0399 U+1FBE pass: to_uppercase
U+039A U+03F0 pass: to_uppercase
U+00B5 U+039C pass: to_uppercase
U+03A0 U+03D6 pass: to_uppercase
U+03A1 U+03F1 pass: to_uppercase
U+03A3 U+03C2 pass: to_uppercase
U+03A6 U+03D5 pass: to_uppercase
U+03A9 U+2126 pass: to_lowercase
U+0398 U+03F4 pass: to_lowercase
U+03B8 U+03F4 pass: to_lowercase
U+03B8 U+03D1 pass: to_uppercase
U+0398 U+03D1 pass: to_uppercase
U+0432 U+1C80 pass: to_uppercase
U+0434 U+1C81 pass: to_uppercase
U+043E U+1C82 pass: to_uppercase
U+0441 U+1C83 pass: to_uppercase
U+0442 U+1C84 pass: to_uppercase
U+0442 U+1C85 pass: to_uppercase
U+1C84 U+1C85 pass: to_uppercase
U+044A U+1C86 pass: to_uppercase
U+0412 U+1C80 pass: to_uppercase
U+0414 U+1C81 pass: to_uppercase
U+041E U+1C82 pass: to_uppercase
U+0421 U+1C83 pass: to_uppercase
U+1C84 U+1C85 pass: to_uppercase
U+0422 U+1C84 pass: to_uppercase
U+0422 U+1C85 pass: to_uppercase
U+042A U+1C86 pass: to_uppercase
U+0463 U+1C87 pass: to_uppercase
U+0462 U+1C87 pass: to_uppercase
upper pass: 46, lower pass: 8
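If you do settle for the standard library, a minimal string-level helper along these lines (my own sketch; the failure cases above remain a known limitation) would be:

fn eq_ignore_case(a: &str, b: &str) -> bool {
    // Full Unicode uppercase mappings handle multi-character
    // expansions such as ß -> SS, which eq_ignore_ascii_case misses.
    a.to_uppercase() == b.to_uppercase()
}

fn main() {
    assert!(eq_ignore_case("Straße", "STRASSE"));
}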
As of today (2023) the caseless crate looks unmaintained, while the ICU4X project seems to be the way to go. To compare strings according to language-dependent conventions, see the icu_collator crate. For a good introduction on how to correctly sort words in Rust, see here.
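A rough sketch of a caseless comparison with icu_collator, based on my reading of the 1.x API (treat the constructor and option names as assumptions to verify against the current docs):

use core::cmp::Ordering;
use icu_collator::{Collator, CollatorOptions, Strength};
use icu_locid::locale;

fn main() {
    // Secondary strength ignores case (a tertiary difference)
    // but still distinguishes accents.
    let mut options = CollatorOptions::new();
    options.strength = Some(Strength::Secondary);
    let collator = Collator::try_new(&locale!("en").into(), options).unwrap();
    assert_eq!(collator.compare("MHz", "mhz"), Ordering::Equal);
}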
Related
I am creating a sandbox app as API Platform practice, and I have the following problem to address:
Let's consider the following REST endpoints for the user entity:
DISCLAIMER: in the code examples there are a few more attributes, but the whole concept applies regardless.
Collection GET (aka /api/users) - only available for admin users (all attributes available; maybe we exclude the hashed password)
POST - everyone should have access to the following attributes: username, email, plainPassword (not persisted, just in case someone asks)
PATCH/PUT - here it becomes quite tricky: I want those with ROLE_ADMIN to have access to the username, email, and plainPassword fields, and owners to only be able to alter plainPassword
DELETE - only ROLE_ADMIN and owners can delete
I will start with the resource config:
resources:
    App\Entity\User:
        # attributes:
        #     normalization_context:
        #         groups: ['read', 'put', 'patch', 'post', 'get', 'collection:get']
        #     denormalization_context:
        #         groups: ['read', 'put', 'patch', 'post', 'get', 'collection:get']
        collectionOperations:
            get:
                security: 'is_granted("ROLE_ADMIN")'
                normalization_context: { groups: ['collection:get'] }
            post:
                normalization_context: { groups: ['admin:post', 'post'] }
        itemOperations:
            get:
                normalization_context: { groups: ['admin:get', 'get'] }
                security: 'is_granted("ROLE_ADMIN") or object == user'
            put:
                normalization_context: { groups: ['admin:put', 'put'] }
                security: 'is_granted("ROLE_ADMIN") or object == user'
            patch:
                normalization_context: { groups: ['admin:patch', 'patch'] }
                security: 'is_granted("ROLE_ADMIN") or object == user'
            delete:
                security: 'is_granted("ROLE_ADMIN") or object == user'
Here is the serializer config:
App\Entity\User:
    attributes:
        username:
            groups: ['post', 'admin:put', 'admin:patch', 'collection:get', 'get']
        email:
            groups: ['post', 'admin:put', 'admin:patch', 'collection:get', 'get']
        firstName:
            groups: ['post', 'admin:put', 'admin:patch', 'collection:get', 'get']
        lastName:
            groups: ['post', 'admin:put', 'admin:patch', 'collection:get', 'get']
        plainPassword:
            groups: ['post', 'patch']
        createdAt:
            groups: ['get', 'collection:get']
        lastLoginDate:
            groups: ['get', 'collection:get']
        updatedAt:
            groups: ['collection:get']
Here is the context group builder (registered as a service, as stated in the API Platform docs):
<?php

namespace App\Serializer;

use Symfony\Component\HttpFoundation\Request;
use ApiPlatform\Core\Serializer\SerializerContextBuilderInterface;
use Symfony\Component\Security\Core\Authorization\AuthorizationCheckerInterface;

final class AdminContextBuilder implements SerializerContextBuilderInterface
{
    private $decorated;
    private $authorizationChecker;

    public function __construct(SerializerContextBuilderInterface $decorated, AuthorizationCheckerInterface $authorizationChecker)
    {
        $this->decorated = $decorated;
        $this->authorizationChecker = $authorizationChecker;
    }

    public function createFromRequest(Request $request, bool $normalization, ?array $extractedAttributes = null): array
    {
        $context = $this->decorated->createFromRequest($request, $normalization, $extractedAttributes);
        if ($this->authorizationChecker->isGranted('ROLE_ADMIN')) {
            switch ($request->getMethod()) {
                case 'GET':
                    $context['groups'][] = 'admin:get';
                    break;
                case 'POST':
                    $context['groups'][] = 'admin:post';
                    break;
                case 'PUT':
                    $context['groups'][] = 'admin:put';
                    break;
                case 'PATCH':
                    $context['groups'][] = 'admin:patch';
                    break;
            }
        }
        return $context;
    }
}
The issue is that even if I'm logged in as a user with only ROLE_USER, I am still able to alter the username field, which should be locked according to the admin:patch normalization group. I am pretty new to API Platform and I can't quite understand why this doesn't work, but I guess there is an issue with the context builder. Thanks for your help; I'll keep the question updated if I come up with something in the meantime.
After investigating the docs, browsing YouTube, and most of all experimenting with the aforementioned user resource, I came up with the solution. Let's start with the configuration again:
resources:
    App\Entity\User:
        collectionOperations:
            get:
                security: 'is_granted("ROLE_ADMIN")'
                normalization_context: { groups: ['collection:get'] }
                denormalization_context: { groups: ['collection:get'] }
            post:
                normalization_context: { groups: ['post'] }
                denormalization_context: { groups: ['post'] }
        itemOperations:
            get:
                normalization_context: { groups: ['get'] }
                security: 'is_granted("ROLE_ADMIN") or object == user'
            patch:
                normalization_context: { groups: ['patch'] }
                denormalization_context: { groups: ['patch'] }
                security: 'is_granted("ROLE_ADMIN") or object == user'
            delete:
                security: 'is_granted("ROLE_ADMIN") or object == user'
The main difference from the starting point is that the admin groups should never be stated in the operation groups, because the context builder adds them to the context by default.
Next, the property groups, where we define all the operations available on each property:
App\Entity\User:
    attributes:
        id:
            groups: ['get', 'collection:get']
        username:
            groups: ['post', 'admin:patch', 'get', 'collection:get']
        email:
            groups: ['post', 'admin:patch', 'get', 'collection:get']
        plainPassword:
            groups: ['post', 'patch', 'collection:get']
        firstName:
            groups: ['post', 'patch', 'get', 'collection:get']
        lastName:
            groups: ['post', 'get', 'collection:get']
        createdAt:
            groups: ['get', 'collection:get']
        lastLoginDate:
            groups: ['get', 'collection:get']
        updatedAt:
            groups: ['collection:get']
This is much the same as in the question; the only thing we need to configure is which actions require admin rights. This can be adapted to your needs, whether you are programming a blog, a library, a store, or anything else that needs custom per-role actions in your API.
Last is the custom context builder:
<?php

namespace App\Serializer;

use Symfony\Component\HttpFoundation\Request;
use ApiPlatform\Core\Serializer\SerializerContextBuilderInterface;
use Symfony\Component\Security\Core\Authorization\AuthorizationCheckerInterface;

final class AdminContextBuilder implements SerializerContextBuilderInterface
{
    private $decorated;
    private $authorizationChecker;

    public function __construct(SerializerContextBuilderInterface $decorated, AuthorizationCheckerInterface $authorizationChecker)
    {
        $this->decorated = $decorated;
        $this->authorizationChecker = $authorizationChecker;
    }

    public function createFromRequest(Request $request, bool $normalization, ?array $extractedAttributes = null): array
    {
        $context = $this->decorated->createFromRequest($request, $normalization, $extractedAttributes);
        if ($this->authorizationChecker->isGranted('ROLE_ADMIN')) {
            $context['groups'][] = 'admin:patch';
            $context['groups'][] = 'admin:post';
            $context['groups'][] = 'admin:get';
        }
        return $context;
    }
}
This is fairly simple and can be extended to your personal needs: basically, we check whether the current user is an admin and, if so, give them the properties from the admin groups. This can also be specified per entity, as sketched below (more on that in the API Platform docs; the BookContextBuilder example is fairly simple).
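Here is a minimal per-entity variant of the same createFromRequest method, modeled on the BookContextBuilder example from the docs (the group name is just an illustration; adapt class and group names to your app):

public function createFromRequest(Request $request, bool $normalization, ?array $extractedAttributes = null): array
{
    $context = $this->decorated->createFromRequest($request, $normalization, $extractedAttributes);
    // Only touch the groups for the User resource; API Platform exposes
    // the resource class in the context under 'resource_class'.
    $resourceClass = $context['resource_class'] ?? null;
    if (\App\Entity\User::class === $resourceClass && $this->authorizationChecker->isGranted('ROLE_ADMIN')) {
        $context['groups'][] = 'admin:patch';
    }
    return $context;
}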
I am pretty sure this is the bread and butter anyone will ever need when building a simple or even complex API where roles determine who can do what. If this answer helped you, please upvote it. Thanks a lot and happy coding!
I'm trying to separate my TypeORM project over multiple databases as it grows in size, since its components are quite discrete (yet interlinked, so I need to be able to have relations across databases).
I am trying to do that using the database setting on the @Entity decorator, as described here: https://typeorm.io/#multiple-connections/using-multiple-databases-in-a-single-connection
I made a minimal reproducible example for this, with two entities that should in theory be put in different databases:
@Entity({ database: 'test' })
export default class Entity1 {
    @PrimaryGeneratedColumn()
    id?: number

    @Column()
    name?: string

    @Column()
    address?: string
}
and
@Entity({ database: 'database2' })
export default class Entity2 {
    @PrimaryGeneratedColumn()
    id?: number

    @Column()
    name?: string

    @Column()
    address?: string
}
Connection code:
import { createConnections } from "typeorm";

async function doDbExample() {
    const connections = await createConnections([{
        name: "db1Connection",
        type: "postgres",
        host: "db",
        port: 5432,
        username: "test",
        password: "testPassword",
        database: "test",
        entities: [__dirname + "/entity/*{.js,.ts}"],
        synchronize: true
    }]);
    console.log("Created connections")
}

doDbExample()
However, what happens is that both entities' tables are put in the database of the connection. Am I doing something wrong, or is this a bug in TypeORM? It looks to me like it is not respecting the database setting anymore.
I am running the code using ts-node-dev.
I made a full minimal reproducible example, complete with a dockerized setup of the database environment, on GitHub: https://github.com/petterroea/TypeOrmBug-MRE
This is a setup issue. I solved it like this:
Modify the entities array so each connection/database has its own folder with entity files, and name the connection you use the most 'default':
// src/index.ts
await createConnections([
    {
        name: 'default',
        host: 'SERVER1',
        username: 'bob',
        password: 'kiwi',
        type: 'mssql',
        database: 'db1',
        ...
        synchronize: true,
        entities: ['src/db1/entity/**/*.ts'],
    },
    {
        name: 'connection2',
        host: 'SERVER2',
        username: 'Mike',
        password: 'carrot',
        type: 'mssql',
        database: 'db2',
        ...
        synchronize: true,
        entities: ['src/db2/entity/**/*.ts'],
    },
])
Create entity files for each database in its respective folder:
src/db1/entity/Fruit.ts > table in db1
src/db2/entity/Vegetables.ts > table in db2
With "synchronize": true each table will be created automatically in the correct database
Accessing data in the tables:
For the default connection:

import { Fruit } from 'src/db1/entity/Fruit'

fruits() {
    return Fruit.find()
}
For the non-default connection:

import { getRepository } from 'typeorm'
import { Vegetable } from 'src/db2/entity/Vegetable'

vegetables() {
    return getRepository(Vegetable, 'connection2').find()
}

or

async vegetables() {
    return await getRepository(Vegetable, 'connection2').find()
}
I hope this helps someone else struggling with the same issues as you and me.
Hi, I followed this Serverless + AWS REST API tutorial and it went great; I got it to work.
Now I'm trying to modify it, but I have hit a wall while trying to submit data into the DynamoDB table.
Using Postman to submit a valid JSON object, I get a 502 response. If I test the function in Lambda, I get the following error:
{
    "errorType": "SyntaxError",
    "errorMessage": "Unexpected token o in JSON at position 1",
    "trace": [
        "SyntaxError: Unexpected token o in JSON at position 1",
        " at JSON.parse (<anonymous>)",
        " at Runtime.module.exports.submit [as handler] (/var/task/api/interview.js:11:28)",
        " at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)",
        " at process._tickCallback (internal/process/next_tick.js:68:7)"
    ]
}
After searching for solutions, what I found out is that it seems the event being passed to JSON.parse(event) is undefined.
Here's the serverless.yml:
service: interview
frameworkVersion: ">=1.1.0 <2.0.0"

provider:
  name: aws
  runtime: nodejs10.x
  stage: dev
  region: us-east-1
  environment:
    INTERVIEW_TABLE: ${self:service}-${opt:stage, self:provider.stage}
    INTERVIEW_EMAIL_TABLE: "interview-email-${opt:stage, self:provider.stage}"
  iamRoleStatements:
    - Effect: Allow
      Action:
        - dynamodb:Query
        - dynamodb:Scan
        - dynamodb:GetItem
        - dynamodb:PutItem
      Resource: "*"

resources:
  Resources:
    CandidatesDynamoDbTable:
      Type: 'AWS::DynamoDB::Table'
      DeletionPolicy: Retain
      Properties:
        AttributeDefinitions:
          - AttributeName: "id"
            AttributeType: "S"
        KeySchema:
          - AttributeName: "id"
            KeyType: "HASH"
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1
        StreamSpecification:
          StreamViewType: "NEW_AND_OLD_IMAGES"
        TableName: ${self:provider.environment.INTERVIEW_TABLE}

functions:
  interviewSubmission:
    handler: api/interview.submit
    memorySize: 128
    description: Submit interview information and starts interview process.
    events:
      - http:
          path: interviews
          method: post
and the interview.js
'use strict';

const uuid = require('uuid');
const AWS = require('aws-sdk');

AWS.config.setPromisesDependency(require('bluebird'));

const dynamoDb = new AWS.DynamoDB.DocumentClient();

module.exports.submit = (event, context, callback) => {
    const requestBody = JSON.parse(event);
    const fullname = requestBody.fullname;
    const email = requestBody.email;
    const test = requestBody.test;
    const experience = requestBody.experience;

    if (typeof fullname !== 'string' || typeof email !== 'string' || typeof experience !== 'number') {
        console.error('Validation Failed');
        callback(new Error('Couldn\'t submit interview because of validation errors.'));
        return;
    }

    submitInterviewP(interviewInfo(fullname, email, experience, test))
        .then(res => {
            callback(null, {
                statusCode: 200,
                body: JSON.stringify({
                    message: `Successfully submitted interview with email ${email}`,
                    interviewId: res.id
                })
            });
        })
        .catch(err => {
            console.log(err);
            callback(null, {
                statusCode: 500,
                body: JSON.stringify({
                    message: `Unable to submit interview with email ${email}`
                })
            })
        });
};

const submitInterviewP = interview => {
    console.log('Submitting interview');
    const interviewInfo = {
        TableName: process.env.INTERVIEW_TABLE,
        Item: interview,
    };
    return dynamoDb.put(interviewInfo).promise()
        .then(res => interview);
};

const interviewInfo = (fullname, email, experience, test) => {
    const timestamp = new Date().getTime();
    return {
        id: uuid.v1(),
        fullname: fullname,
        email: email,
        experience: experience,
        test: test,
        submittedAt: timestamp,
        updatedAt: timestamp,
    };
};
If I replace the event param with a valid JSON object and deploy again, I'm able to successfully insert the object into DynamoDB.
Any clues? Please let me know if there's anything I'm missing that could help.
Thanks!
API Gateway passes the request body as a string in the event's body property.
Currently you are trying to parse the whole event object (const requestBody = JSON.parse(event);), which is wrong. You need to parse the event.body property:
const requestBody = JSON.parse(event.body);
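In context, the top of the handler would then look like this (a minimal sketch; the rest of the function stays unchanged):

module.exports.submit = (event, context, callback) => {
    // event is a plain object; the JSON payload arrives as a string in event.body.
    const requestBody = JSON.parse(event.body);
    const fullname = requestBody.fullname;
    // ...
};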
Per the docs, StrongLoop doesn't support running custom SQL statements.
https://docs.strongloop.com/display/public/LB/Executing+native+SQL
How anyone thinks you can build an enterprise app with just simple joins is beyond me, but I did find this post, which says you can do it:
Execute raw query on MySQL Loopback Connector
But this is for MySQL. When I try it with Postgres I get the error: "Invalid value for argument 'byId' of type 'object': 0. Received type was converted to number." And it returns no data. Here is my code:
module.exports = function(account) {
    account.byId = function(byId, cb) {
        var ds = account.dataSource;
        var sql = "SELECT * FROM account where id > ?";
        ds.connector.execute(sql, [Number(byId)], function(err, accounts) {
            if (err) console.error(err);
            console.info(accounts);
            cb(err, accounts);
        });
    };

    account.remoteMethod(
        'byId',
        {
            http: {verb: 'get'},
            description: "Get accounts greater than id",
            accepts: {arg: 'byId', type: 'integer'},
            returns: {arg: 'data', type: ['account'], root: true}
        }
    );
};
For the part [Number(byId)], I've also tried [byId] and just byId. Nothing works.
Any ideas? So far I really like StrongLoop, but it looks like the Postgresql connector is not ready for production. I'll be doing a prototype with Sails next if this doesn't work. :-(
Here's the thing: arg is of type 'integer', which is not a valid LoopBack type. Use 'Number' instead. Check the corrected code below:
module.exports = function(account) {
    account.byId = function(byId, cb) {
        var ds = account.dataSource;
        var sql = "SELECT * FROM account WHERE id > $1";
        ds.connector.execute(sql, byId, function(err, accounts) {
            if (err) console.error(err);
            console.info(accounts);
            cb(err, accounts);
        });
    };

    account.remoteMethod(
        'byId',
        {
            http: {verb: 'get'},
            description: "Get accounts greater than id",
            accepts: {arg: 'byId', type: 'Number'},
            returns: {arg: 'data', type: ['account'], root: true} // here 'account' will be treated as 'Object'
        }
    );
};
Note: MySQL's prepared statements use ? as the parameter placeholder, but PostgreSQL uses $1, $2, etc.
Hope this works for you. Else try with [byId] instead of byId as per the docs.
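Once deployed, and assuming the default REST base path and model plural (both assumptions; adjust to your server config), the remote method should be reachable like:

curl "http://localhost:3000/api/accounts/byId?byId=10"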
I've been all over SO and Sailsjs.org trying to figure out what's going wrong, to no avail. I'm just trying to learn the basics of SailsJS. I have a UserController, whose create() method gets called when a POST request is sent to /user.
create: function (req, res) {
    var params = req.params.all();

    User.create({
        name: params.FirstName + ' ' + params.LastName,
        email: params.Email,
        password: params.Password,
        jobTitle: params.JobTitle
    }).exec(function createCB(err, created) {
        created.save(function(err) {
            // No error . . . still nothing in db
        });

        return res.json({name: created.name, jobTitle: created.jobTitle, email: created.email, password: created.password});
    });
}
No errors here. All the request params are coming in fine and going back to the client without trouble. But nothing is actually being written to the database.
In development.js:
connections: {
    mongo: {
        adapter: 'sails-mongo',
        host: 'localhost',
        port: 27017,
        // user: 'username',
        // password: 'password',
        database: 'sails_test'
    }
},

models: {
    connection: 'mongo'
}
I've tried this with the above in development.js, as well as separately in connections.js and models.js, respectively. No difference.
In User.js:
attributes: {
    FirstName : { type: 'string' },
    LastName : { type: 'string' },
    Email : { type: 'string' },
    Password : { type: 'string' },
    JobTitle : { type: 'string' }
}
My front end request:
$.ajax({
    method: 'post',
    url: '/user',
    data: {
        FirstName: 'Yo',
        LastName: 'Momma',
        Email: 'yourmom@yourdadshouse.com',
        Password: 'YouWish123',
        JobTitle: 'Home Maker Extraordinaire'
    },
    success: function (sailsResponse) {
        $('#result').html(sailsResponse).fadeIn();
    },
    error: function () {
        console.log('error');
    }
});
Again, none of this is producing an explicit error. There is just nothing being inserted into the database. Or if there is, I don't know how to find it. I've confirmed the existence of this db in the mongo shell, thusly:
show dbs
My db, sails_test shows up in the list. And I've confirmed that there isn't anything in it like so:
db.sails_test.find()
I would very much appreciate some guidance here :)
Update:
Turns out the data is being written just fine. I'm just unable to query the database from the command line. I confirmed this by first creating a sample user, and then using Waterline's findOne() method:
User.findOne({FirstName: params.FirstName}).exec(function (err, user) {
    if (err) {
        res.send(400);
    } else if (user) {
        return res.json({firstName: user.FirstName, lastName: user.LastName, jobTitle: user.JobTitle, email: user.Email, password: user.Password});
    } else {
        return res.send('no users match those criteria');
    }
});
The above works as expected. So my problem now is simply that I cannot interact with the database from the command line. db.<collectionName>.find({}) produces nothing.
This was simply a failure to understand the MongoDB docs. I read db.collection.find({}) as DatabaseName.CollectionName.find({}), when you literally need to type db. So if my database is Test and my collection is Users, the query is use Test, then db.Users.find({}).
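For this question's setup, a shell session would look something like this (assuming Waterline created the collection under the lowercased model name user; run show collections to see the actual name):

use sails_test
show collections
db.user.find({})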
Also of note: 3T MongoChef is a pretty rockin' GUI (graphical user interface) for NoSQL databases, and it's free for non-commercial use.