Firestore method snapshotChanges() for collection - google-cloud-firestore

The following is the code provided in the Collections section of the AngularFirestore documentation:
export class AppComponent {
  private shirtCollection: AngularFirestoreCollection<Shirt>;
  shirts: Observable<ShirtId[]>;

  constructor(private readonly afs: AngularFirestore) {
    this.shirtCollection = afs.collection<Shirt>('shirts');
    // .snapshotChanges() returns a DocumentChangeAction[], which contains
    // a lot of information about "what happened" with each change. If you want to
    // get the data and the id use the map operator.
    this.shirts = this.shirtCollection.snapshotChanges().map(actions => {
      return actions.map(a => {
        const data = a.payload.doc.data() as Shirt;
        const id = a.payload.doc.id;
        return { id, ...data };
      });
    });
  }
}
Here the snapshotChanges() method returns an Observable of DocumentChangeAction[]. So why is a map used to read it, when there is only one array and it will be looped over only once?
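For clarity, the outer map is the RxJS operator applied to each emission of the Observable, while the inner map is Array.prototype.map over the DocumentChangeAction[] array contained in that emission. A minimal sketch of the same transform with RxJS pipeable operators, assuming the same Shirt type and shirtCollection as above:
import { map } from 'rxjs/operators';

// Outer map: runs once per snapshot emission of the Observable.
// Inner map: loops over the DocumentChangeAction[] in that emission.
this.shirts = this.shirtCollection.snapshotChanges().pipe(
  map(actions => actions.map(a => {
    const data = a.payload.doc.data() as Shirt;
    const id = a.payload.doc.id;
    return { id, ...data };
  }))
);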

Related

How to get data from the react-query "useQuery" hook in a specific type

When I get data from the useQuery hook, I need to parse the data into a specific type before it is returned to the user. I want the data returned from the useQuery hook to be of type "MyType", using the parsing function I created below. I am unable to find a way to use my parsing function. Is there any way to do it? I don't want to rely on the schema structure for the data type.
type MyType = {
  id: number;
  //some more properties
}

function parseData(arr: any[]): MyType[] {
  return arr.map((obj, index) => {
    return {
      id: obj.id,
      //some more properties
    }
  })
}

const { data } = await useQuery('fetchMyData', async () => {
  return await axios.get('https://fake-domain.com')
})
I would take the response from the api and transform it inside the queryFn, before you return it to react-query. Whatever you return winds up in the query cache, so:
const { data } = useQuery('fetchMyData', async () => {
  const response = await axios.get('https://fake-domain.com')
  return parseData(response.data)
})
data returned from useQuery should then be of type MyType[] | undefined
There are a bunch of other options to do data transformation as well, and I've written about them here:
https://tkdodo.eu/blog/react-query-data-transformations
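One of those options is react-query's select, shown here as a minimal sketch only (assuming react-query v3's useQuery(key, fn, options) signature and the parseData/MyType from the question):
const { data } = useQuery('fetchMyData', async () => {
  const response = await axios.get('https://fake-domain.com')
  return response.data
}, {
  // select transforms the cached data on the way out of the hook,
  // so data here is MyType[] | undefined while the raw response stays in the cache
  select: (raw) => parseData(raw),
})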
I think you should create your own hook and perform normalisation there:
const useParseData = () => {
  const { data } = useQuery('fetchMyData', async () => {
    return await axios.get('https://fake-domain.com')
  })
  return parseData(data)
}
And where you need this data you could just call const parsedData = useParseData()

Cannot read property forEach of undefined

The title of this question is just the error I am currently receiving, but what I really need help with is understanding observables and API calls. For whatever reason, I just haven't been able to get a good grasp of this concept, and I am hoping that someone might have an explanation that will finally click.
I am trying to create a new Angular service that retrieves JSON from an API. I then need to map the response to a model. Due to weird naming conventions, job descriptions and job requirements are used interchangeably here. Here is my service class.
import { Injectable } from '@angular/core';
import { CommunicationService } from './communication.service';
import { AiDescription } from '../models/ai-description.model';
import { Observable } from 'rxjs/Observable';
import { BehaviorSubject } from 'rxjs/BehaviorSubject';

@Injectable()
export class AiDescriptionService {
  requirements: Observable<AiDescription[]>;

  private aiDescriptionUrl: string = '/api/core/company/jobdescriptions';
  private dataStore: {
    requirements: AiDescription[]
  };
  private _requirements: BehaviorSubject<AiDescription[]>;
  private emptyRequestParams = {
    "company_id": "",
    "carotene_id": "",
    "carotene_version": "",
    "city": "",
    "state": "",
    "country": ""
  };
  readonly caroteneVersion: string = "caroteneV3";

  constructor(
    private communicationService: CommunicationService
  ) {
    this.dataStore = { requirements: [] };
    this._requirements = new BehaviorSubject<AiDescription[]>([]);
    this.requirements = this._requirements.asObservable();
  }

  LoadRequirements(params: Object) {
    this.communicationService.postData(this.aiDescriptionUrl, params)
      .subscribe(res => {
        let jobDescriptions = [];
        jobDescriptions = res.jobdescriptions;
        jobDescriptions.forEach((desc: { id: string; description: string; }) => {
          let aiDescription = new AiDescription();
          aiDescription.id = desc.id;
          aiDescription.description = desc.description;
        });
        this.dataStore.requirements = res;
        this._requirements.next(Object.assign({}, this.dataStore).requirements);
      });
  }

  CreateRequest(
    companyID: string,
    caroteneID: string,
    city: string,
    state: string,
    country: string
  ): Object {
    let newRequestParams = this.emptyRequestParams;
    newRequestParams.company_id = companyID;
    newRequestParams.carotene_id = caroteneID;
    newRequestParams.carotene_version = this.caroteneVersion;
    newRequestParams.city = city;
    newRequestParams.state = state;
    newRequestParams.country = country;
    this.LoadRequirements(newRequestParams);
    return this.dataStore;
  }
}
The postData() function being called by this.communicationService is here:
postData(url: string, jobInformation: any): Observable<any> {
  const start = new Date();
  const headers = new HttpHeaders({ 'Content-Type': 'application/json' });
  const body = JSON.stringify(jobInformation);
  const options = { headers };
  return this.http.post(url, body, options)
    .catch(err => Observable.throw(err))
    .do(() => {
      this.analyticsLoggingService.TrackTiming('JobPostingService', 'PostSuccess', new Date().getTime() - start.getTime());
    }, () => {
      this.analyticsLoggingService.TrackError('JobPostingService', 'PostFailure');
    });
}
I didn't write the postData function, and I would not be able to modify it. When running a unit test, I am getting this error: "TypeError: Cannot read property 'forEach' of undefined".
But more than simply fixing the error, I am really trying to get a better understanding of Observables, which is something I haven't been able to grasp from other sources.
In your example, I recommend replacing any and Object with explicitly defined models.
Here's an example for Angular 8 for Subscription, Promise, and Observable API calls. You can get more info here: https://angular.io/tutorial/toh-pt6.
import { Injectable } from '@angular/core';
import { HttpClient, HttpHeaders, HttpErrorResponse } from '@angular/common/http';
import { Observable } from 'rxjs';
import { User } from './user.model';

@Injectable({ providedIn: 'root' })
export class UserService {
  users: User[];
  authHeaders = new HttpHeaders()
    .set('Content-Type', 'application/json');

  constructor(
    private readonly http: HttpClient
  ) { }

  getUsers() {
    this.http.get(`https://myApi/users`, { headers: this.authHeaders })
      .subscribe(
        (data: User[]) => {
          this.users = data;
        }, (error: HttpErrorResponse) => { /* handle error */ });
  }

  async getUserPromise(userID: number): Promise<User> {
    const url = `https://myApi/users/${userID}`;
    return this.http.get<User>(url, { headers: this.authHeaders })
      .toPromise();
  }

  getUserObservable(userID: number): Observable<User> {
    const url = `https://myApi/users/${userID}`;
    return this.http.get<User>(url, { headers: this.authHeaders });
  }
}
I like to keep my class models in separate files. This example would have user.model.ts with content like:
export class User {
constructor(
public id: number,
public username: string,
public displayName: string,
public email: string
) { }
}
I've not included authentication headers or error handling for brevity; however, you might want to add those as needed.
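Applied to the service in the question, a hedged sketch of LoadRequirements along the same lines (assuming the API responds with { jobdescriptions: [...] }; the guard also points at the original error, since forEach throws exactly when res.jobdescriptions is undefined, for example when a test mock returns a bare object):
LoadRequirements(params: Object) {
  this.communicationService.postData(this.aiDescriptionUrl, params)
    .subscribe(res => {
      // Guard against a missing jobdescriptions property (the source of
      // "Cannot read property 'forEach' of undefined" in the unit test).
      const jobDescriptions = (res && res.jobdescriptions) || [];
      // Map the raw objects into AiDescription models and keep the result.
      this.dataStore.requirements = jobDescriptions.map((desc: { id: string; description: string; }) => {
        const aiDescription = new AiDescription();
        aiDescription.id = desc.id;
        aiDescription.description = desc.description;
        return aiDescription;
      });
      this._requirements.next(Object.assign({}, this.dataStore).requirements);
    });
}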

Firestore get array from collection with IDs to use in Polymer dom-repeat

I'm trying to use Firestore with Polymer. I obtain an array to send to Polymer in a dom-repeat like this:
var query = db.collection("operaciones");
db.collection("operaciones")
  .onSnapshot((querySnapshot) => {
    querySnapshot.forEach(function(doc) {
    });
    that.operacionesPorCliente = Array.from(querySnapshot.docs.map(doc => doc.data()));
  });
console.log(that.operacionesPorCliente); // this works but the ID doesn't exist here....
}
That works, but the array doesn't contain the id from Firestore. The problem is that I need that ID to update the data, but it isn't in the array.
I hope I explained myself. Any help?
I made a Polymer element (Polymer 3) that keeps Firebase Firestore data synchronized. The component has a dom-repeat template element to show the always-fresh collection data.
I think this will answer your question.
import { html, PolymerElement } from '@polymer/polymer/polymer-element.js';
import {} from '@polymer/polymer/lib/elements/dom-repeat.js';

/**
 * @customElement
 * @polymer
 */
class FirebaseFirestoreApp extends PolymerElement {
  static get template() {
    return html`
      <style>
        :host {
          display: block;
        }
      </style>
      <h1>Firestore test</h1>
      <template is="dom-repeat" items="[[elems]]">
        <p>[[item.$id]] - [[item.name]]</p>
      </template>
    `;
  }

  static get properties() {
    return {
      elems: {
        type: Array,
        value: function() {
          return [];
        }
      }
    };
  }

  ready() {
    super.ready();
    var db = firebase.firestore();
    const settings = { timestampsInSnapshots: true };
    db.settings(settings);
    db.collection("operaciones").onSnapshot((querySnapshot) => {
      querySnapshot.docChanges().forEach((change) => {
        if (change.type === "added") {
          let newElem = this.makeElem(change);
          this.push('elems', newElem);
        }
        if (change.type === "modified") {
          let modifiedElement = this.makeElem(change);
          let index = this.getElemIndex(change.doc.id);
          this.set(`elems.${index}`, modifiedElement);
        }
        if (change.type === "removed") {
          let deletedElement = this.getElemIndex(change.doc.id);
          this.splice('elems', deletedElement, 1);
        }
      });
    });
  }

  makeElem(change) {
    let data = change.doc.data();
    data.$id = change.doc.id;
    return data;
  }

  getElemIndex(id) {
    let index = this.elems.findIndex((elem) => {
      if (elem.$id == id) {
        return true;
      }
    });
    return index;
  }
}

window.customElements.define('firebase-firestore-app', FirebaseFirestoreApp);
The sync system should work with any kind of Firebase Firestore collection, but the dom-repeat template assumes there is a property called "name" on the objects inside the collection.
So the collection in the Firebase console looks like this:
Firebase console screenshot
I suggest you use Polymerfire, which provides Polymer components for Firebase, but if you want to do it in plain JavaScript you can get the id directly from the doc: doc.id.
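Staying with plain JavaScript, a minimal sketch of the original snippet that keeps the document id next to the data (doc.id is a property, doc.data() is a method):
db.collection("operaciones").onSnapshot((querySnapshot) => {
  // Spread each document's data and attach its id so it can be used for later updates.
  that.operacionesPorCliente = querySnapshot.docs.map((doc) => {
    return Object.assign({ id: doc.id }, doc.data());
  });
  console.log(that.operacionesPorCliente); // each item now carries its Firestore id
});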

Why doesn't TextDocumentContentProvider call provideTextDocumentContent on update when the query params change?

As the title says, when I want to update the TextDocumentContentProvider with different query params by calling the update method, provideTextDocumentContent is not called...
The only way I managed to get it working was with the same URI as in the call
vscode.commands.executeCommand('vscode.previewHtml', URI, 2, 'Storybook');
The relevant part of the code:
// calculates uri based on editor state - depends on actual caret position
// all uris will start with 'storybook://preview'
function getPreviewUri(editor: vscode.TextEditor): vscode.Uri;

// transforms uri, so web server will understand
// ex: 'storybook://preview?name=fred' -> 'http://localhost:12345/preview/fred?full=1'
function transformUri(uri: vscode.Uri): vscode.Uri;

class StorybookContentProvider implements vscode.TextDocumentContentProvider {
  provideTextDocumentContent(uri: vscode.Uri): string {
    var httpUri = transformUri(uri);
    return `<iframe src="${httpUri}" />`;
  }

  onDidChange = new vscode.EventEmitter<vscode.Uri>();

  update(uri: vscode.Uri) {
    this.onDidChange(uri);
  }
}

export function activate(context: vscode.ExtensionContext) {
  vscode.workspace.onDidChangeTextDocument(
    (e: vscode.TextDocumentChangeEvent) => {
      if (e.document === vscode.window.activeTextEditor.document) {
        const previewUri = getPreviewUri(vscode.window.activeTextEditor);
        provider.update(previewUri);
      }
    }
  );

  vscode.window.onDidChangeTextEditorSelection(
    (e: vscode.TextEditorSelectionChangeEvent) => {
      if (e.textEditor === vscode.window.activeTextEditor) {
        const previewUri = getPreviewUri(vscode.window.activeTextEditor);
        provider.update(previewUri);
      }
    }
  );

  const provider = new StorybookContentProvider();
  context.subscriptions.push(
    vscode.commands.registerCommand('extension.showStorybook', () => {
      vscode.commands.executeCommand('vscode.previewHtml', vscode.Uri.parse('storybook://preview'), 2, 'Storybook')
    }),
    vscode.workspace.registerTextDocumentContentProvider('storybook', provider)
  );
}
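For reference, and as an assumption about the intended usage rather than part of the original post, the usual shape of a provider that refreshes the previewHtml document is sketched below: expose the emitter's .event as onDidChange, fire the same URI that was passed to vscode.previewHtml, and recompute the dynamic part inside provideTextDocumentContent (getPreviewUri and transformUri as declared above).
class StorybookContentProvider implements vscode.TextDocumentContentProvider {
  private _onDidChange = new vscode.EventEmitter<vscode.Uri>();

  // VS Code listens to this event and re-requests content for the fired URI.
  get onDidChange(): vscode.Event<vscode.Uri> {
    return this._onDidChange.event;
  }

  provideTextDocumentContent(uri: vscode.Uri): string {
    // Derive the preview target from the current editor state here,
    // instead of encoding it in the URI that gets fired.
    const httpUri = transformUri(getPreviewUri(vscode.window.activeTextEditor));
    return `<iframe src="${httpUri}" />`;
  }

  update() {
    // Fire the URI the preview was opened with ('storybook://preview');
    // firing a different URI never invalidates the open preview document.
    this._onDidChange.fire(vscode.Uri.parse('storybook://preview'));
  }
}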

apollostack/graphql-server - how to get the fields requested in a query from resolver

I am trying to figure out a clean way to work with queries and MongoDB projections so I don't have to retrieve excessive information from the database.
So assuming I have:
// the query
type Query {
  getUserByEmail(email: String!): User
}
And I have a User with an email and a username, to keep things simple. If I send a query and I only want to retrieve the email, I can do the following:
query { getUserByEmail(email: "test@test.com") { email } }
But in the resolver, my DB query still retrieves both username and email, but only one of those is passed back by apollo server as the query result.
I only want the DB to retrieve what the query asks for:
// the resolver
getUserByEmail(root, args, context, info) {
  // check what fields the query requested
  // create a projection to only request those fields
  return db.collection('users').findOne({ email: args.email }, { /* projection */ });
}
Of course the problem is, getting information on what the client is requesting isn't so straightforward.
Assuming I pass in the request as context, I considered using context.payload (hapi.js), which has the query string, and searching it with various .split()s, but that feels kind of dirty. As far as I can tell, info.fieldASTs[0].selectionSet.selections has the list of fields, and I could check for a field's existence in there. I'm not sure how reliable this is, especially once I start using more complex queries.
Is there a simpler way?
In case you don't use MongoDB: a projection is an additional argument you pass in, telling it explicitly what to retrieve:
// telling mongoDB to not retrieve _id
db.collection('users').findOne({ email: 'test@test.com' }, { _id: 0 })
As always, thanks to the amazing community.
2020-Jan answer
The current answer to getting the fields requested in a GraphQL query, is to use the graphql-parse-resolve-info library for parsing the info parameter.
The library is "a pretty complete solution and is actually used under the hood by postgraphile", and is recommended going forward by the author of the other top library for parsing the info field, graphql-fields.
Use graphql-fields
Apollo server example
// assumes the npm packages `graphql-fields` and `graphql-tools` are installed
const graphqlFields = require('graphql-fields');
const { makeExecutableSchema } = require('graphql-tools');

const rootSchema = [`
  type Person {
    id: String!
    name: String!
    email: String!
    picture: String!
    type: Int!
    status: Int!
    createdAt: Float
    updatedAt: Float
  }
  schema {
    query: Query
    mutation: Mutation
  }
`];

const rootResolvers = {
  Query: {
    users(root, args, context, info) {
      const topLevelFields = Object.keys(graphqlFields(info));
      return fetch(`/api/user?fields=${topLevelFields.join(',')}`);
    }
  }
};

const schema = [...rootSchema];
const resolvers = Object.assign({}, rootResolvers);

// Create schema
const executableSchema = makeExecutableSchema({
  typeDefs: schema,
  resolvers,
});
Sure you can. This is actually the same functionality that is implemented in the join-monster package for SQL-based DBs. There's a talk by its creator: https://www.youtube.com/watch?v=Y7AdMIuXOgs
Take a look at their info-analysing code to get you started: https://github.com/stems/join-monster/blob/master/src/queryASTToSqlAST.js#L6-L30
Would love to see a projection-monster package for us Mongo users :)
UPDATE:
There is a package that creates a projection object from info on npm: https://www.npmjs.com/package/graphql-mongodb-projection
You can generate a MongoDB projection from the info argument. Here is sample code you can follow:
/**
 * @description - Gets a MongoDB projection from the graphql query
 *
 * @return { object }
 * @param { object } info
 * @param { model } model - MongoDB model for referencing
 */
function getDBProjection(info, model) {
  const {
    schema: { obj }
  } = model;
  const keys = Object.keys(obj);
  const projection = {};
  const { selections } = info.fieldNodes[0].selectionSet;
  for (let i = 0; i < keys.length; i++) {
    const key = keys[i];
    const isSelected = selections.some(
      selection => selection.name.value === key
    );
    // only include selected fields: MongoDB projections cannot mix
    // inclusion and exclusion (except for _id)
    if (isSelected) {
      projection[key] = 1;
    }
  }
  console.log(projection);
  return projection;
}

module.exports = getDBProjection;
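A hedged usage sketch for that helper in a resolver like the one from the question (the Mongoose User model and its path are assumptions, and getDBProjection is taken to return the projection as its JSDoc states):
const getDBProjection = require('./getDBProjection');
const User = require('../models/user'); // hypothetical Mongoose model

const resolvers = {
  Query: {
    getUserByEmail(root, args, context, info) {
      // e.g. { email: 1 } when the query only asks for the email field
      const projection = getDBProjection(info, User);
      return User.findOne({ email: args.email }, projection);
    }
  }
};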
With a few helper functions you can use it like this (typescript version):
import { parceGqlInfo, query } from "@backend";
import { GraphQLResolveInfo } from "graphql";

export const user = async (parent: unknown, args: unknown, ctx: unknown, info: GraphQLResolveInfo): Promise<User | null> => {
  const { dbQueryStr } = parceGqlInfo(info, userFields, "id");
  const [user] = await query(`SELECT ${dbQueryStr} FROM users WHERE id=$1;`, [1]);
  return user;
};
Helper functions. A few points:
- gql_uid is used as the ID! string type derived from the primary key, so the DB types don't change
- the required option is used for dataloaders (in case a field was not requested by the user)
- allowedFields is used to filter out additional fields from info, like '__typename'
- queryPrefix is used if you need to prefix the selected fields, e.g. select u.id from users u
const userFields = [
  "gql_uid",
  "id",
  "email"
]

// merge arrays and delete duplicates
export const mergeDedupe = <T>(arr: any[][]): T => {
  // @ts-ignore
  return ([...new Set([].concat(...arr))] as unknown) as T;
};
import { parse, simplify, ResolveTree } from "graphql-parse-resolve-info";
import { GraphQLResolveInfo } from "graphql";

export const getQueryFieldsFromInfo = <Required = string>(info: GraphQLResolveInfo, options: { required?: Required[] } = {}): string[] => {
  const { fields } = simplify(parse(info) as ResolveTree, info.returnType) as { fields: { [key: string]: { name: string } } };
  let astFields = Object.entries(fields).map(([, v]) => v.name);

  if (options.required) {
    astFields = mergeDedupe([astFields, options.required]);
  }

  return astFields;
};

export const onlyAllowedFields = <T extends string | number>(raw: T[] | readonly T[], allowed: T[] | readonly T[]): T[] => {
  return allowed.filter((f) => raw.includes(f));
};

export const parceGqlInfo = (
  info: GraphQLResolveInfo,
  allowedFields: string[] | readonly string[],
  gqlUidDbAlliasField: string,
  options: { required?: string[]; queryPrefix?: string } = {}
): { pureDbFields: string[]; gqlUidRequested: boolean; dbQueryStr: string } => {
  const fieldsWithGqlUid = onlyAllowedFields(getQueryFieldsFromInfo(info, options), allowedFields);

  return {
    pureDbFields: fieldsWithGqlUid.filter((i) => i !== "gql_uid"),
    gqlUidRequested: fieldsWithGqlUid.includes("gql_uid"),
    dbQueryStr: fieldsWithGqlUid
      .map((f) => {
        const dbQueryStrField = f === "gql_uid" ? `${gqlUidDbAlliasField}::Text AS gql_uid` : f;
        return options.queryPrefix ? `${options.queryPrefix}.${dbQueryStrField}` : dbQueryStrField;
      })
      .join(),
  };
};