How to get Papa.parse results into an array - papaparse

Forewarning, I am new to JS, and the code below was adapted from
Retrieve parsed data from CSV in Javascript object (using Papa Parse).
My goal: parse a CSV file into an array, then use that array in a few other operations. I can see that the file is getting parsed properly via the console.log("Row:", row.data); call, but I cannot figure out how to get that entire array/dataset into a separate variable, much less into the doStuff function.
function parseData(url, callBack) {
    Papa.parse(url, {
        download: true,
        header: true,
        dynamicTyping: true,
        comments: "*=",
        step: function(row) {
            console.log("Row:", row.data);
        },
        complete: function(results) {
            callBack(results.data);
        }
    });
}
function doStuff(data) {
    // Data should be usable here, but is empty
    console.log("doStuff - console log '" + data + "' ?");
}
parseData(inputFile, doStuff);
I think I want to do something like...
var csvArray = [];
csvArray = Papa.parse(url, {
    download: true,
    header: true,
    dynamicTyping: true,
    comments: "*=",
    ...
});
<some other stuff with csvArray>
but I'm a bit wrapped around the axle at the moment.

I know that if you remove the header: true setting then Papa.parse will give you each row as an Array instead of an Object.
Alternatively, you could convert each object that is returned into an array using something like:
function _arrayify(obj) { return Object.keys(obj).map(function (k) { return obj[k]; }) }
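(On modern engines, Object.values(obj) does the same thing in one call.)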

const url = "https://www.papaparse.com/resources/files/normal.csv";
let results;
Papa.parse(url, {
    dynamicTyping: true,
    download: true,
    header: true,
    comments: "*=",
    complete: function(data) {
        results = data.data;
    }
});
setTimeout(() => {
    console.log(results[0].ISSN);
}, 1500);
<script src="https://cdnjs.cloudflare.com/ajax/libs/PapaParse/4.5.0/papaparse.min.js">
</script>
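Rather than guessing a delay with setTimeout, you can wrap the parse in a Promise and consume the rows once complete fires. A minimal sketch, reusing the URL above:

function parseCsv(url) {
    return new Promise(function (resolve, reject) {
        Papa.parse(url, {
            download: true,
            header: true,
            dynamicTyping: true,
            comments: "*=",
            error: reject, // fires if the download or parse fails
            complete: function (results) {
                resolve(results.data); // array of row objects
            }
        });
    });
}

parseCsv("https://www.papaparse.com/resources/files/normal.csv")
    .then(function (rows) {
        console.log(rows[0].ISSN);
    })
    .catch(console.error);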

Related

InstantSearch.js custom widget increase returned results

As you can see in the code below, I made two custom widgets, one for searching and one for displaying results, but I cannot increase the limit of the returned results. When the page loads, it automatically makes a request and retrieves the default results with no query.
We have more than 500 results which all have to be displayed on the UI, and I cannot modify the limit of the results anywhere. The limit is always 200. As a backend I am not using Algolia but rather Meilisearch, but I don't think this impacts it in any way.
As you can see in the images, the searchParameters are modified and set to a limit of 1000, but when the request is made, it is sent as 200.
const search = instantsearch({
    indexName: "store",
    searchClient: instantMeiliSearch(m_host, m_key),
});

search.addWidgets([
    configure({
        hitsPerPage: 1000,
        limit: 1000,
        filters: `image!=null AND is_active=1 AND (countries_ids=${country_id} OR countries_ids=0) AND goto_link != null`,
        attributesToHighlight: [],
        paginationLimitedTo: 1000,
    }),
    {
        init: function (opts) {
            const helper = opts.helper;
            const input = document.querySelector('#searchbox');
            input.addEventListener('input', function (e) {
                helper.setQuery(e.currentTarget.value) // update the parameters
                    .search(); // launch the query
            });
        }
    },
    {
        render: function (opts) {
            let results = opts.results;
            // read the hits from the results and transform them into HTML
            let res = toArray(groupBy(filter_stores(results.hits), 'category_id'));
            $(`.stores-list`).empty();
            $(`.categories-list`).hide();
            res = res.map(it => {
                let copy = it;
                copy[1] = _.orderBy(it[1].map(store => {
                    store.default_rank = store.hasOwnProperty(`rank_country_${country_id}`) ? store[`rank_country_${country_id}`] : store.default_rank;
                    return store;
                }), 'default_rank', 'asc');
                return copy;
            });
            res.map(pairs => {
                let [category_id, stores] = pairs;
                $(`#category_${category_id}`).show();
                let html = stores.map(h => create_store(h)).join('');
                $(`#store_list_${category_id}`).html(html);
            });
        },
    }
]);
search.start();
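One way to confirm what is actually being sent (and whether the 200 cap is applied client-side or by the backend) is to log the final parameters through InstantSearch's searchFunction hook. A small debugging sketch:

const search = instantsearch({
    indexName: "store",
    searchClient: instantMeiliSearch(m_host, m_key),
    searchFunction(helper) {
        // Inspect the parameters the helper is about to send
        console.log(JSON.stringify(helper.state));
        helper.search();
    },
});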

Upsert multiple documents with MongoDB/Mongoose

Say I have a list of models:
const documents = [{}, {}, {}];
And I want to insert these into the DB, or update them all, but only if a condition is met:
Model.update({ isSubscribed: { $ne: false } }, documents, { upsert: true }, (err, result) => {
});
The above signature is surely wrong - what I want to do is insert/update the documents where the condition is met.
There is this Bulk API:
https://docs.mongodb.com/manual/reference/method/Bulk.find.upsert/
but I can't tell if it will work when inserting multiple documents.
Imagine this scenario: we have a list of employees and a form of sorts to give them all a penalty at once, not one by one :)
On the backend side, you would have, e.g., an addBulk function. Something like this:
Penalty controller
module.exports = {
    addBulk: (req, res) => {
        const body = req.body;
        for (const item of body) {
            Penalty.create(item).exec((err, response) => {
                if (err) {
                    res.serverError(err);
                    return;
                }
            });
        }
        // Respond once, after the loop, not on every iteration
        res.ok('Penalties added successfully');
    }
};
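As a side note, recent Waterline versions (Sails 1.x) can insert the whole array in one call with createEach, which also avoids responding before the creates have finished. A sketch under that assumption:

// Sketch: bulk insert with Waterline's createEach (Sails 1.x assumed)
addBulk: async (req, res) => {
    try {
        const created = await Penalty.createEach(req.body).fetch();
        return res.ok(`Added ${created.length} penalties`);
    } catch (err) {
        return res.serverError(err);
    }
}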
Then you'll probably have an API on your frontend that directs to that route and specific function (endpoint):
penaltyApi
import axios from 'axios';
import { baseApiUrl } from '../config';

const penaltyApi = baseApiUrl + 'penalty';

class PenaltyApi {
    static addBulk(penalties) {
        return axios({
            method: 'post',
            url: penaltyApi + '/addBulk',
            data: penalties
        });
    }
}

export default PenaltyApi;
...and now let's make a form and some helper functions. I'll be using React for demonstration, but it's all JS at the end of the day, right :)
// Let's first add penalties to our local state:
addPenalty = (event) => {
    event.preventDefault();
    let penalty = {
        amount: this.state.penaltyForm.amount,
        unit: this.state.penaltyForm.unit,
        date: new Date(),
        description: this.state.penaltyForm.description,
        employee: this.state.penaltyForm.employee.value
    };
    this.setState(prevState => ({
        penalties: [...prevState.penalties, penalty]
    }));
}
Here we map over the penalties in our state, swap each employee object for its id, and pass the result to our saveBulkEmployees() function:
save = () => {
    let penaltiesData = Object.assign([], this.state.penalties);
    penaltiesData = penaltiesData.map(penal => {
        penal.employeeId = penal.employee.id;
        delete penal.employee;
        return penal; // return the transformed item, not the whole array
    });
    this.saveBulkEmployees(penaltiesData);
}
...and finally, let's save all of them at once to our database using the Bulk API
saveBulkEmployees = (data) => {
    PenaltyApi.addBulk(data).then(response => {
        this.success();
        console.log(response.config.data);
        this.resetFormAndPenaltiesList();
    }).catch(error => {
        console.log('error while adding multiple penalties', error);
        throw(error);
    });
}
So, the short answer is YES, you can absolutely do that. The longer answer is above :) I hope this was helpful to you. If you have any questions, please let me know and I'll try to answer them as soon as I can.
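On the conditional-upsert part of the question: Mongoose's Model.bulkWrite() maps onto MongoDB's bulk API and supports an upsert flag per operation, so each document can be inserted or updated in a single round trip. A minimal sketch (the externalId match field is an assumption for illustration):

// Build one conditional upsert operation per document
const ops = documents.map(doc => ({
    updateOne: {
        filter: { externalId: doc.externalId, isSubscribed: { $ne: false } },
        update: { $set: doc },
        upsert: true
    }
}));

Model.bulkWrite(ops)
    .then(result => console.log('upserted:', result.upsertedCount))
    .catch(err => console.error(err));

One caveat: with upsert: true, a document whose filter does not match gets inserted as new, so check whether that matches the semantics you want for the non-matching case.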

Why is my filter not working in v2.ODataModel "read"?

I am using the OData model to read data. But it doesn't work. Check the code below:
getGuid: function(pernr) {
    var self = this;
    var url = "/PersonalDetailSet?$filter=Pernr eq '00000001'";
    self.setBusy(true);
    this.oModel.read(url, {
        success: function(res) {
            // ...
        },
        error: function() {
            // ...
        }
    });
}
I don't know why the filter in the URL is not working.
Check if your OData service supports the $filter query in the first place.
Use the read method correctly:
myV2ODataModel.read("/PersonalDetailSet"/* No $filter queries here! */, {
    filters: [ // <-- Should be an array, not a Filter instance!
        new Filter({ // required from "sap/ui/model/Filter"
            path: "myField",
            operator: FilterOperator.EQ, // required from "sap/ui/model/FilterOperator"
            value1: "..."
        })
    ],
    // ...
});
API reference: sap.ui.model.odata.v2.ODataModel#read
API reference: sap.ui.model.Filter
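For reference, a minimal sketch of how those two modules are typically declared in a controller (the controller name here is assumed):

sap.ui.define([
    "sap/ui/core/mvc/Controller",
    "sap/ui/model/Filter",
    "sap/ui/model/FilterOperator"
], function (Controller, Filter, FilterOperator) {
    "use strict";
    return Controller.extend("my.app.controller.Personal", {
        getGuid: function (pernr) {
            this.getView().getModel().read("/PersonalDetailSet", {
                filters: [
                    new Filter({
                        path: "Pernr",
                        operator: FilterOperator.EQ,
                        value1: pernr
                    })
                ],
                success: function (res) { /* ... */ },
                error: function () { /* ... */ }
            });
        }
    });
});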
First, check whether you are actually getting the model in scope. this.oModel is not the proper way of getting the model; better to use this.getModel() or this.getView().getModel() and then check the call. Passing the filter in the URL is not the right way, but it should still work.
If you want to apply additional URL parameters in the read function, you have to do this via the "urlParameters" parameter:
getGuid: function(pernr) {
    var self = this;
    var url = "/PersonalDetailSet";
    self.setBusy(true);
    this.oModel.read(url, {
        urlParameters: {
            "$filter": "Pernr eq '00000001'"
        },
        success: function(res) {
            self.setBusy(false);
            self.guid = res.results[0].Guid;
        },
        error: function() {
            self.setBusy(false);
        }
    });
}

Take advantage of blueprints / waterline findWhere inside custom controller

I have a bear model and I'm using it with blueprint REST.
// api/models/Bear.js
module.exports = {
    attributes: {
        name: {
            type: 'string',
            required: true
        }
    }
};
I'd like to perform some calculations on bears based on exactly the same criteria as the standard findWhere. In other words, I'd like to be able to request
GET /bear/details
exactly just like I request
GET /bear
So I could find bear details with:
complex query like ?where={}
fields like ?name=
but also sending JSON in the body like {name: ''}
or maybe even using ?limit= etc.
The controller looks like this:
// api/controllers/BearController.js
module.exports = {
    getDetails: function (req, res) {
        Bear.find().exec(function (err, bears) {
            if (err) return res.serverError(err);
            var bearsDetails = _.map(bears, function(bear) {
                return {
                    id: bear.id,
                    nameLength: bear.name.length,
                    reversedName: bear.name.split('').reverse().join('')
                };
            });
            return res.json(bearsDetails);
        });
    }
};
And I have a custom route that looks like this:
// config/routes.js
module.exports.routes = {
    'get /bear/details': 'BearController.getDetails'
};
=> How do I automatically filter models exactly like in a findWhere request, in a custom controller, without reinventing the wheel?
I figured it out myself by digging into Sails' find() source code. One can use actionUtil's parseCriteria(req). I personally wrapped it in a service for cleanliness.
Roughly:
api/services/ActionUtilService.js
module.exports = require('../../node_modules/sails/lib/hooks/blueprints/actionUtil');
api/controllers/BearController.js
module.exports = {
    getDetails: function (req, res) {
        let criteria = ActionUtilService.parseCriteria(req);
        Bear.find(criteria).exec(function (err, bears) {
            if (err) return res.serverError(err);
            var bearsDetails = _.map(bears, function(bear) {
                return {
                    id: bear.id,
                    nameLength: bear.name.length,
                    reversedName: bear.name.split('').reverse().join('')
                };
            });
            return res.json(bearsDetails);
        });
    }
};
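With that in place, the custom action honors the same query syntax as the blueprint find, for example:

GET /bear/details?name=Paddington
GET /bear/details?where={"name":{"contains":"griz"}}&limit=5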

Why won't my insert callback run?

I'm running the following from the mongodb console (it's in a js file called via load()), but the callback to the insert never runs even though all the database insertions happen as expected. Can anyone see why?
db.oldsets.find().forEach(function (set) {
    var newSet = {
        tunes: [],
        keys: []
    };
    set.tunes.forEach(function (arr) {
        if (newSet) {
            var tune = db.oldtunes.findOne({
                _id: arr.tune
            });
            var newTune = db.getSiblingDB('jnr_local').tunes.findOne({
                sessionId: tune.sessionId
            });
            if (newTune) {
                newSet.tunes.push(newTune._id);
                newSet.keys.push(arr.root + tune.mode);
            } else {
                newSet = null;
            }
        }
    });
    print('out'); // outputs for every iteration
    if (newSet) {
        db.sets.insert(newSet, function (err, doc) {
            print('in'); // never outputs
            db.practices.insert({
                type: 'set',
                // srcId: doc[0]._id,
                stickyness: 0
            });
        });
    } else {
        print('else'); // never outputs
        db.dodgySets.insert(set);
    }
});
It looks like you are mixing Node.js and the mongodb shell. In the mongodb shell all code is synchronous and runs line by line, so db.sets.insert takes no callback; it returns a WriteResult, and the shell assigns the generated _id to the document object you passed in.
So, try to rewrite it as follows:
if (newSet) {
    db.sets.insert(newSet); // synchronous in the shell; newSet._id is filled in
    print('in');
    db.practices.insert({
        type: 'set',
        srcId: newSet._id,
        stickyness: 0
    });
}
Hope this helps!
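If you'd rather not rely on the shell mutating newSet, you can generate the _id up front with the shell's built-in ObjectId (a small sketch):

if (newSet) {
    newSet._id = new ObjectId(); // generate the id client-side
    db.sets.insert(newSet);
    db.practices.insert({
        type: 'set',
        srcId: newSet._id,
        stickyness: 0
    });
}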