I have a scenario in which I download parent entities from an api and save them to a database. I then want, once all of the parents have been saved, to download and save their children.
I've seen (or perhaps misunderstood) some comments suggesting that this is a side effect, since I won't be passing the result of the parent save operation to the child save operation. I simply want to begin it once the parents are saved.
Could someone explain to me the best way of doing this?
Perhaps try something like this:
Observable
    .Create<int>(o =>
    {
        var parentIds = new int?[] { null };
        return
            Observable
                .While(
                    () => parentIds.Any(),
                    parentIds
                        .ToObservable()
                        .Select(parentId => Save(parentId)))
                .Finally(() => { /* update `parentIds` here with next level */ })
                .Subscribe(o);
    })
    .Subscribe(x => { });
This is effectively doing a breadth-first traversal of all of the entities, saving them as it goes, but outputting a single observable that you can subscribe to.
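The parents-then-children sequencing can also be sketched without Rx at all. The following is a minimal plain-JavaScript illustration of the same breadth-first idea (not the answer's code); `save` and `fetchChildren` are hypothetical stand-ins for your database and API calls:

```javascript
// Save each level of entities, and only start on the children once
// every entity in the current level has been saved.
const order = [];

const save = async (entity) => {
  order.push(entity.id); // pretend this writes to the database
};

const fetchChildren = async (entity) => entity.children || [];

async function saveBreadthFirst(roots) {
  let level = roots;
  while (level.length > 0) {
    // Save the whole level before descending.
    await Promise.all(level.map(save));
    const childLists = await Promise.all(level.map(fetchChildren));
    level = childLists.flat();
  }
}

const done = saveBreadthFirst([
  { id: 'p1', children: [{ id: 'c1' }, { id: 'c2' }] },
  { id: 'p2', children: [{ id: 'c3' }] },
]);
done.then(() => console.log(order.join(','))); // p1,p2,c1,c2,c3
```

All parents are saved before any child is touched, which is exactly the ordering the question asks for.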
I'm using react-query 4 to get some data from my server via JSON:API and create some objects:
export type QueryReturnQueue = QueueObject[] | false;
const getQueryQueue = async (query: string): Promise<QueryReturnQueue> => {
  const data = await fetchAuth(query);
  const returnData = [] as QueueObject[];
  if (data) {
    data.forEach((queueItem) => returnData.push(new QueueObject(queueItem)));
    return returnData;
  }
  return false;
};

function useMyQueue(
  queueType: QueueType,
): UseQueryResult<QueryReturnQueue, Error> {
  const queryKey = ['getQueue', queueType];
  return useQuery<QueryReturnQueue, Error>(
    queryKey,
    async () => {
      const query = getUrl(queueType);
      return getQueryQueue(query);
    },
  );
}
Then I have a component that displays the objects one at a time, and the user is asked to make a choice (for example, "swipe left" or "swipe right"). The queue only moves in one direction: the user sees a QueueObject, processes it, and then moves on to the next one. The user cannot go back to a previous object, and cannot skip ahead.
So far, I've been using useContext() to track the index in the queue as state. However, I've been running into several bugs with this when the queue gets refreshed, which happens a lot, so I thought it would be easier to directly manipulate the data returned by useQuery().
How can I remove items as they are processed from the locally cached query results?
My current flow:
Fetch the queue data and generate the objects with useQuery().
Display the queue objects one at a time using useContext().
Mutate the displayed object with useMutation() to modify useContext() and then show the next object in the cached data from useQuery().
My desired flow:
Fetch the queue data and generate the objects with useQuery().
Mutate the displayed object with useMutation(), somehow removing the mutated item from the cached data from useQuery() (like what shift() does for arrays).
Sources I consulted
Best practices for editing data after useQuery call (couldn't find an answer relevant to my case)
Optimistic updates (don't know how to apply it to my case)
My desired flow:
Fetch the queue data and generate the objects with useQuery().
Mutate the displayed object with useMutation(), somehow removing the mutated item from the cached data from useQuery() (like what shift() does for arrays).
This is the correct way to think about the data flow. But mutations shouldn't be updating the cache with data; they should be invalidating existing cache data.
You have defined your query correctly. Now you simply have to instruct your mutation function (which should be making an API call that updates the records queue) to invalidate all existing queries for the data in the onSuccess handler.
e.g.
function useMyMutation(recordId, queueType) {
  const queryClient = useQueryClient();
  return useMutation({
    mutationFn: ({ id, swipeDirection }) =>
      asyncAPICall(`/swipes/${id}`, { swipeDirection }),
    onSuccess: () => queryClient.invalidateQueries(['getQueue', queueType]),
  });
}
As suggested by @Jakub Kotrs:
shift the first item from the list + only ever display the first
I was able to implement this in my useMutation() hook:
onMutate: async (queueObjectRemoved) => {
  const queryKey = ['getQueue', queueType];
  // Cancel any outgoing refetches
  // (so they don't overwrite our optimistic update).
  await queryClient.cancelQueries({
    queryKey,
  });
  if (data?.[0]?.id === queueObjectRemoved.data.id) {
    // Optimistically update the data by removing the first item.
    data.shift();
    queryClient.setQueryData(queryKey, () => data);
  } else {
    throw new Error('Unable to set queue!');
  }
},
onError: () => {
  const queryKey = ['getQueue', queueType];
  setShowErrorToast(true);
  queryClient.invalidateQueries(
    queryKey,
  );
},
This way users can process all the items in the current queue before needing to refetch.
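The shift-only queue discipline can be sketched independently of react-query. Below is a minimal illustration, where `cache` stands in for the array react-query holds under ['getQueue', queueType] and `removeProcessedItem` is a hypothetical helper; returning a new array (rather than calling shift() on the cached one) avoids mutating data the library owns:

```javascript
// Only ever remove the first item, and only when its id matches the
// item the mutation just processed.
function removeProcessedItem(cache, processedId) {
  if (cache?.[0]?.id !== processedId) {
    throw new Error('Unable to set queue!');
  }
  // Return a new array rather than mutating the cached one in place.
  return cache.slice(1);
}

const cache = [{ id: 1 }, { id: 2 }, { id: 3 }];
const updated = removeProcessedItem(cache, 1);
console.log(updated.map((o) => o.id)); // [ 2, 3 ]
```

The mismatch branch throwing mirrors the `onMutate` above: if the head of the queue isn't the item being processed, the optimistic update is abandoned.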
I'm iterating over some JSON that contains rules to build my page. The loop looks like this:
flux.forEach(element => {
  this.navCtrl.push(element.pageName);
});
My issue is that I need to wait for each page to finish its action before pushing the next one; as written, the loop just pushes the whole stack at once. How can I use a promise to wait for each page to finish its work before the loop continues?
Thank you all!
To solve promises in sequence, you can use reduce() as explained here.
flux.reduce((promise, item) => {
  return promise.then(() => {
    return new Promise((resolve, reject) => {
      // navCtrl.push returns a promise, so resolve once the push completes
      this.navCtrl.push(item.pageName).then(() => resolve());
    });
  });
}, Promise.resolve());
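To see why the reduce() chain runs the items strictly one after another, here is a self-contained sketch with a mock `pushPage` standing in for this.navCtrl.push (an assumption for illustration; each push "takes time" via setTimeout):

```javascript
// Each iteration's promise only starts after the previous one resolves,
// so the pages are handled strictly in order.
const log = [];

const pushPage = (name) =>
  new Promise((resolve) =>
    setTimeout(() => {
      log.push(name);
      resolve();
    }, 10),
  );

const flux = [{ pageName: 'A' }, { pageName: 'B' }, { pageName: 'C' }];

const done = flux.reduce(
  (promise, item) => promise.then(() => pushPage(item.pageName)),
  Promise.resolve(),
);

done.then(() => console.log(log.join(''))); // ABC
```

Had the array been mapped to promises directly (e.g. with Promise.all), all three pushes would start at once; the reduce chain is what serializes them.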
I'm wrapping an API that emits events in Observables and currently my datasource code looks something like this, with db.getEventEmitter() returning an EventEmitter.
const Datasource = {
  getSomeData() {
    return Observable.fromEvent(db.getEventEmitter(), 'value');
  }
};
However, to actually use this, I need to both memoize the function and have it return a ReplaySubject; otherwise, each subsequent call to getSomeData() would reinitialize the entire sequence, recreating more event emitters or producing no data until the next update. That's undesirable, so my code ends up looking a lot more like this for every function:
let someDataCache = null;
const Datasource = {
  getSomeData() {
    if (someDataCache) { return someDataCache; }
    const subject = new ReplaySubject(1);
    Observable.fromEvent(db.getEventEmitter(), 'value').subscribe(subject);
    someDataCache = subject;
    return subject;
  }
};
which ends up being quite a lot of boilerplate for just one function, and becomes more of an issue when there are more parameters.
Is there a better/more elegant design pattern to accomplish this? Basically, I'd like to ensure that:
Only one event emitter is created.
Callers who call the datasource later get the most recent result.
The event emitters are created when they're needed.
but right now I feel like this pattern is fighting the Observable pattern, resulting in a bunch of boilerplate.
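For comparison, here is a dependency-free sketch of the memoize-and-replay-latest behavior the boilerplate above implements by hand. `shareLatest` is a hypothetical helper (not part of any Rx API): it creates the underlying source once, remembers the latest value, and replays it to late subscribers:

```javascript
// shareLatest wraps a source-creating function so that:
//  - the source is only created on the first subscribe (lazy),
//  - it is created exactly once (memoized),
//  - late subscribers immediately receive the most recent value.
function shareLatest(subscribeToSource) {
  let started = false;
  let hasValue = false;
  let latest;
  const listeners = new Set();
  return (listener) => {
    if (!started) {
      started = true; // the underlying source is only created once
      subscribeToSource((value) => {
        hasValue = true;
        latest = value;
        listeners.forEach((l) => l(value));
      });
    }
    listeners.add(listener);
    if (hasValue) listener(latest); // replay the most recent result
    return () => listeners.delete(listener);
  };
}

// usage: mock "emitter" so the behavior is visible
let creations = 0;
let emit;
const getSomeData = shareLatest((next) => {
  creations += 1; // counts how often the underlying source was created
  emit = next;
});

const seenA = [];
getSomeData((v) => seenA.push(v));
emit(1);
emit(2);

const seenB = [];
getSomeData((v) => seenB.push(v)); // late subscriber gets 2 immediately
```

This is essentially what publishReplay(1).refCount() (mentioned in the follow-up below as the Rx solution) gives you for free.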
As a followup to this question, I ended up commonizing the logic to leverage Observables in this way. publishReplay as cartant mentioned does get me most of the way to what I needed. I've documented what I've learned in this post, with the following tl;dr code:
let first = true
Rx.Observable.create(
  observer => {
    const callback = data => {
      first = false
      observer.next(data)
    }
    const event = first ? 'value' : 'child_changed'
    db.ref(path).on(event, callback, error => observer.error(error))
    return {event, callback}
  },
  (handler, {event, callback}) => {
    db.ref(path).off(event, callback)
  },
)
  .map(snapshot => snapshot.val())
  .publishReplay(1)
  .refCount()
I'm building a "Preview" action so that users can preview their changes on edits before saving them. Effectively, I'm calling the update function (not the action) without saving.
Update function code-bits:
// Updates existing schedules to new values; Working
UpdateSchedules(...);

// Removes existing schedules no longer in date-range; Not working
RemoveSchedules(...);
{
    ...
    foreach (var schedule in schedulesToRemove) { db.Entry(schedule).State = EntityState.Deleted; }
    db.Schedules.RemoveRange(...);
    ...
}

// Adds schedules new to date-range; Working
AddSchedules(...);
{
    ...
    db.Schedules.Add(...);
    ...
}
Retrieval Code:
// The results have the modified and added entities, but they also have the removed entities.
viewModel.Results = db.PaymentRecurringSchedules.Where(s => s.PaymentSetupHeaderID == headerID).OrderBy(s => s.PaymentDate).ToList();
My intention with this code is to not call db.SaveChanges() since it's just a preview, but I still want only the schedules that weren't removed (saved or unsaved).
What I've tried
I tried changing the state of the removed items as you can see above.
I tried doing a .Where(s => db.Entry(s).State != EntityState.Deleted), but EF didn't like that at all (it threw errors at execution, since db.Entry() can't be translated into a store query).
How can I get the results where the items I removed but are unsaved will still be filtered out of the results list?
Version: 6.0 (In case it matters)
I don't have VS at hand, but I think this should work as you requested:
List<object> deletedEntities = context.ChangeTracker.Entries()
    .Where(x => x.State == System.Data.Entity.EntityState.Deleted)
    .Select(x => x.Entity)
    .ToList();
Note that you may also want to filter down to specific entity types.
I have paged interface. Given a starting point a request will produce a list of results and a continuation indicator.
I've created an observable that is built by constructing and flat-mapping an observable that reads a page. The result of this observable contains both the data for the page and a value to continue with. I pluck the data and flat-map it to the subscriber, producing a stream of values.
To handle the paging I've created a subject for the next page values. It's seeded with an initial value then each time I receive a response with a valid next page I push to the pages subject and trigger another read until such time as there is no more to read.
Is there a more idiomatic way of doing this?
function records(start = 'LATEST', limit = 1000) {
  let pages = new rx.Subject();
  this.connect(start)
    .subscribe(page => pages.onNext(page));
  let records = pages
    .flatMap(page => {
      return this.read(page, limit)
        .doOnNext(result => {
          let next = result.next;
          if (next === undefined) {
            pages.onCompleted();
          } else {
            pages.onNext(next);
          }
        });
    })
    .pluck('data')
    .flatMap(data => data);
  return records;
}
That's a reasonable way to do it. It has a couple of potential flaws in it (that may or may not impact you depending upon your use case):
You provide no way to observe any errors that occur in this.connect(start)
Your observable is effectively hot. If the caller does not immediately subscribe to the observable (perhaps they store it and subscribe later), then they'll miss the completion of this.connect(start) and the observable will appear to never produce anything.
You provide no way to unsubscribe from the initial connect call if the caller changes its mind and unsubscribes early. Not a huge deal, but usually when one constructs an observable, one should try to chain the disposables together so it all cleans up properly if the caller unsubscribes.
Here's a modified version:
It passes errors from this.connect to the observer.
It uses Observable.create to create a cold observable that only starts is business when the caller actually subscribes so there is no chance of missing the initial page value and stalling the stream.
It combines the this.connect subscription disposable with the overall subscription disposable
Code:
function records(start = 'LATEST', limit = 1000) {
  return Rx.Observable.create(observer => {
    let pages = new Rx.Subject();
    let connectSub = new Rx.SingleAssignmentDisposable();
    let resultsSub = new Rx.SingleAssignmentDisposable();
    let sub = new Rx.CompositeDisposable(connectSub, resultsSub);
    // Make sure we subscribe to pages before we issue this.connect()
    // just in case this.connect() finishes synchronously (possible if it caches values or something?)
    let results = pages
      .flatMap(page => this.read(page, limit))
      .doOnNext(r => r.next !== undefined ? pages.onNext(r.next) : pages.onCompleted())
      .flatMap(r => r.data);
    resultsSub.setDisposable(results.subscribe(observer));
    // now query the first page
    connectSub.setDisposable(this.connect(start)
      .subscribe(p => pages.onNext(p), e => observer.onError(e)));
    return sub;
  });
}
Note: I've not used the ES6 syntax before, so hopefully I didn't mess anything up here.
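For what it's worth, the same read-until-no-next loop can also be expressed without Rx using an async generator. This is a minimal sketch, with `connect` and `read` mocked in place of the paged API from the question:

```javascript
// Mock paged API: each page yields some data and (optionally) a
// continuation token for the next page.
const pagesData = {
  p1: { data: [1, 2], next: 'p2' },
  p2: { data: [3], next: undefined },
};

const connect = async (start) => 'p1'; // resolve start to the first page id
const read = async (page, limit) => pagesData[page];

// Keep reading pages until there is no continuation value, yielding
// each record as it arrives.
async function* records(start = 'LATEST', limit = 1000) {
  let page = await connect(start);
  while (page !== undefined) {
    const result = await read(page, limit);
    yield* result.data;
    page = result.next;
  }
}

const collected = (async () => {
  const out = [];
  for await (const r of records()) out.push(r);
  return out;
})();
collected.then((out) => console.log(out.join(','))); // 1,2,3
```

Because the generator only requests the next page after the consumer has pulled the current records, cancellation falls out naturally: stop iterating and no further reads happen.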