Is it possible to read a CSV file and convert it to JSON on the API side, not the client side? - papaparse

I have uploaded the CSV files to the server because of their large size.
I want to know if it's possible to read the CSV and convert it to JSON on the API side, not the client side:
const csvFilePath = require('../../../public/uploads/abc.csv');
Papa.parse(csvFilePath, {
    header: true,
    download: true,
    skipEmptyLines: true,
    step: function (row) {
        console.log("Row:", row.data);
    },
    complete: function (results) {
        console.log("bbbbbbbbbbbbbbbbbbbbbbbb");
        console.log(results);
    }
});
But the result is unexpected: the response contains something like the raw CSV string, and some of the data is missing.
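For what it's worth, Papa Parse itself runs fine under Node: the problem is that require() cannot load a .csv file, and download: true tells Papa Parse to treat its input as a URL to fetch. A minimal server-side sketch, assuming the file really lives at public/uploads/abc.csv and the papaparse package is installed:

const fs = require('fs');
const path = require('path');
const Papa = require('papaparse');

// Hypothetical path; adjust to wherever the upload actually lives.
const csvFilePath = path.join(__dirname, '../../../public/uploads/abc.csv');

// For big files, hand Papa Parse a readable stream instead of a string
// so the whole CSV never has to sit in memory at once.
Papa.parse(fs.createReadStream(csvFilePath, 'utf8'), {
    header: true,
    skipEmptyLines: true,
    step: function (row) {
        console.log('Row:', row.data); // one plain object per CSV row
    },
    complete: function () {
        console.log('Parsing complete');
    }
});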

Related

Document AI Contract Processor - batchProcessDocuments ignores fieldMask

My aim is to reduce the JSON file size, which by default contains the base64-encoded image sections of the documents.
I am using the Document AI Contract Processor in the US region with the Node.js SDK.
It is my understanding that setting the fieldMask attribute in the batchProcessDocuments request filters which properties end up in the resulting JSON.
I want to keep only the entities property.
Here are my call parameters:
const documentai = require('@google-cloud/documentai').v1;
const client = new documentai.DocumentProcessorServiceClient(options);

let params = {
    "name": "projects/XXX/locations/us/processors/3e85a4841d13ce5",
    "region": "us",
    "inputDocuments": {
        "gcsDocuments": {
            "documents": [{
                "mimeType": "application/pdf",
                "gcsUri": "gs://bubble-bucket-XXX/files/CymbalContract.pdf"
            }]
        }
    },
    "documentOutputConfig": {
        "gcsOutputConfig": {
            "gcsUri": "gs://bubble-bucket-XXXX/ocr/"
        },
        "fieldMask": {
            "paths": [
                "entities"
            ]
        }
    }
};

client.batchProcessDocuments(params, function(error, operation) {
    if (error) {
        return reject(error);
    }
    return resolve({
        "operationName": operation.name
    });
});
However, the resulting JSON still contains the full set of data.
Am I missing something here?
The auto-generated documentation for the Node.js client library is a little hard to follow, but it looks like the fieldMask should be a member of gcsOutputConfig instead of documentOutputConfig. (I'm surprised the API didn't throw an error.)
https://cloud.google.com/nodejs/docs/reference/documentai/latest/documentai/protos.google.cloud.documentai.v1.documentoutputconfig.gcsoutputconfig
The REST docs are a little clearer:
https://cloud.google.com/document-ai/docs/reference/rest/v1/DocumentOutputConfig#gcsoutputconfig
Note: For a REST API call and for other client libraries, the fieldMask is structured as a string (e.g. text,entities,pages.pageNumber)
I haven't tried this with the Node Client libraries before, but I'd recommend trying this as well if moving the parameter doesn't work on its own.
https://cloud.google.com/document-ai/docs/send-request#async-processor
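Here is an untested sketch of the reshaped request, with fieldMask moved inside gcsOutputConfig and everything else left as in the question (the processor and bucket names remain the question's placeholders):

let params = {
    "name": "projects/XXX/locations/us/processors/3e85a4841d13ce5",
    "region": "us",
    "inputDocuments": {
        "gcsDocuments": {
            "documents": [{
                "mimeType": "application/pdf",
                "gcsUri": "gs://bubble-bucket-XXX/files/CymbalContract.pdf"
            }]
        }
    },
    "documentOutputConfig": {
        "gcsOutputConfig": {
            "gcsUri": "gs://bubble-bucket-XXXX/ocr/",
            // fieldMask moved here, inside gcsOutputConfig
            "fieldMask": {
                "paths": ["entities"]
            }
            // If the object form still doesn't filter, try the string form
            // mentioned above: "fieldMask": "entities"
        }
    }
};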

POST collection of objects in json-server

I am using json-server to mock the API for the front-end team.
We would like a feature to create multiple objects (e.g. products) in one call.
In WebApi2 or real REST APIs, it can be done like the following:
POST api/products //For Single Creation
POST api/productCollections //For Multiple Creation
I don't know how I can achieve this with json-server. I tried to POST the following data to api/products using Postman, but it does not split the array and create the items individually.
[
    {
        "id": "ff00feb6-b1f7-4bb0-b09c-7b88d984625d",
        "code": "MM",
        "name": "Product 2"
    },
    {
        "id": "1f4492ab-85eb-4b2f-897a-a2a2b69b43a5",
        "code": "MK",
        "name": "Product 3"
    }
]
It treats the whole array as a single item and appends it to the existing JSON.
Could you please suggest how I could mock bulk insert in json-server? Or should a RESTful API always manipulate one object at a time?
This is not something that json-server supports natively, as far as I know, but it can be accomplished through a workaround.
I am assuming that you have some prior knowledge of Node.js.
You will have to create a server.js file, which you then run with Node.js. The server.js file makes use of the json-server module.
I have included the code for the server.js file in the snippet below.
I made use of lodash for my duplicate check, so you will need to install lodash. You can replace it with your own code if you do not want to use lodash, but it worked pretty well in my opinion.
The server.js file includes a custom POST handler that accesses the lowdb instance used by the json-server instance. The data from the POST request is checked for duplicates, and only new records whose id does not already exist are added to the DB. lowdb's write() function persists the data to the db.json file, so the data in memory and in the file will always match.
Please note that the API endpoints generated by json-server (or the rewritten endpoints) will still exist. You can thus use the custom function in conjunction with the default endpoints.
Feel free to add error handling where needed.
const jsonServer = require('json-server');
const server = jsonServer.create();
const _ = require('lodash');
const router = jsonServer.router('./db.json');
const middlewares = jsonServer.defaults();
const port = process.env.PORT || 3000;

server.use(middlewares);
server.use(jsonServer.bodyParser);
server.use(jsonServer.rewriter({
    '/api/products': '/products'
}));

server.post('/api/productcollection', (req, res) => {
    const db = router.db; // Assign the lowdb instance
    if (Array.isArray(req.body)) {
        req.body.forEach(element => {
            insert(db, 'products', element); // Add a post
        });
    } else {
        insert(db, 'products', req.body); // Add a post
    }
    res.sendStatus(200);

    /**
     * Checks whether the id of the new data already exists in the DB
     * @param {*} db - DB object
     * @param {String} collection - Name of the array / collection in the DB / JSON file
     * @param {*} data - New record
     */
    function insert(db, collection, data) {
        const table = db.get(collection);
        if (_.isEmpty(table.find(data).value())) {
            table.push(data).write();
        }
    }
});

server.use(router);
server.listen(port);
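As a quick sanity check, POSTing the array from the question to the custom endpoint should now create both products individually (the host and port here are hypothetical; any HTTP client works the same way):

const axios = require('axios');

axios.post('http://localhost:3000/api/productcollection', [
    { id: 'ff00feb6-b1f7-4bb0-b09c-7b88d984625d', code: 'MM', name: 'Product 2' },
    { id: '1f4492ab-85eb-4b2f-897a-a2a2b69b43a5', code: 'MK', name: 'Product 3' }
]).then(() => console.log('Bulk insert done'));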
If you have any questions, feel free to ask.
The answer marked as correct didn't actually work for me. Due to the way its insert function is written, it always creates new documents instead of updating existing ones. The rewriting didn't work for me either (maybe I did something wrong), but creating an entirely separate endpoint helped.
This is my code, in case it helps others trying to do bulk inserts (and updating existing data where it exists).
const jsonServer = require('json-server');
const server = jsonServer.create();
const _ = require('lodash');
const router = jsonServer.router('./db.json');
const middlewares = jsonServer.defaults();

server.use(middlewares);
server.use(jsonServer.bodyParser);

server.post('/addtasks', (req, res) => {
    const db = router.db; // Assign the lowdb instance
    if (Array.isArray(req.body)) {
        req.body.forEach(element => {
            insert(db, 'tasks', element);
        });
    } else {
        insert(db, 'tasks', req.body);
    }
    res.sendStatus(200);

    function insert(db, collection, data) {
        const table = db.get(collection);
        // Create a new doc if this ID does not exist
        if (_.isEmpty(table.find({ _id: data._id }).value())) {
            table.push(data).write();
        } else {
            // Update the existing data
            table.find({ _id: data._id })
                .assign(_.omit(data, ['_id']))
                .write();
        }
    }
});

server.use(router);
server.listen(3100, () => {
    console.log('JSON Server is running');
});
On the frontend, the call will look something like this:
axios.post('http://localhost:3100/addtasks', tasks)
It probably didn't work at the time this question was posted, but it does now: call the /products endpoint with an array for a bulk insert.
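Taking that comment at face value (I have not verified which json-server version added this), the bulk call would then be as simple as:

axios.post('http://localhost:3000/products', [
    { code: 'MM', name: 'Product 2' },
    { code: 'MK', name: 'Product 3' }
]);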

How to split up Postgres data into JSON/object, or an easily usable format for Express

I'm using pg-promise to facilitate requests between Express and a Postgres database.
Using pg-promise's any:
db.any('select * from blog')
    .then(function (data) {
        console.log(data);
    })
    .catch(function (error) {
        console.log('ERROR:', error);
    });
There is only one row in the Postgres table, so the query above gets a block of data from Postgres that, when output to the terminal, looks like this:
[ { id: 1,
    date: '2018-12-17',
    title: 'Here is a Title',
    article: "Okay, there's not much here yet.",
    img1: null,
    img2: null,
    img3: null,
    keywords: 'news' } ]
In an effort to break this data up into usable pieces that I could then assign to values in Express, I attempted to use JSON.parse() on it. I really didn't know what to expect from doing this.
db.any('select * from blog')
    .then(function (data) {
        data = JSON.parse(data); // attempting to parse the Postgres data
        console.log(data);
    })
An error occurred:
ERROR: SyntaxError: Unexpected token o in JSON at position 1
I also tried to call on this data as if it were an object.
db.any('select * from blog')
    .then(function (data) {
        console.log(data);
        console.log(data.id); // attempting to get just the id from the data
    })
The terminal output for data.id was:
undefined
How can I use this data within a js environment like Express? Not sure if it makes a difference, but I'm trying to use Pug to template everything to the front-end.
The select query returns an array of objects, with each element in the array corresponding to a row in your table (blog) and each key in an object corresponding to a table column.
There is no need to JSON-parse it: the data is already a parsed JavaScript array, not a JSON string. JSON.parse() expects a string, so passing it an array coerces the array to something like "[object Object]", which is exactly why you get the "Unexpected token o" error.
To read a field, index into the array first, e.g. data[0].id. (JSON.parse(JSON.stringify(data)) would also run without error, but it is just an expensive round trip that hands you back a copy of the array you already have.)
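For example, here is a minimal sketch of handing the row to a Pug template from an Express route (the route path and template name are made up for illustration):

app.get('/blog', (req, res, next) => {
    db.any('select * from blog')
        .then((data) => {
            const post = data[0]; // the first (and only) row
            // blog.pug can now reference post.title, post.article, etc.
            res.render('blog', { post });
        })
        .catch(next);
});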

vue js 2 - for loop in multiple rest calls fetchData

I am trying to get the WP REST API and Vue.js 2 to work together. So far things are coming along nicely, apart from this one REST call that requires another request for the design to be complete. Essentially, I want to be able to iterate over the first request's results and dynamically build the second request.
My second question is about performance: overall, the REST calls are taking a bit long to load. Is there something I can do to optimize?
Context:
The first request's data gives me an id, slug, and title for all the posts I want to display on the homepage as featured. Through that id or slug, I want to feed the second request, so I can pull in more information about those posts, like the featured image and other meta field data.
export default {
    name: 'work',
    data () {
        return {
            loading: false,
            page: null,
            pagesingle: null,
            error: null
        }
    },
    created() {
        this.fetchData()
    },
    methods: {
        fetchData() {
            this.$http.get('/cms/wp-json/wp/v2/pages/?slug=work&_embed')
                .then(result => {
                    this.page = result.data
                    this.$http.get('/cms/wp-json/wp/v2/cases-studes/?slug=case-study-name').then(
                        result => this.pagesingle = result.data
                    );
                })
        }
    }
}
I think you want to look at Promise.all. It will take an array of promises, wait for them all to complete, and then resolve with an array of results.
You would build your array of promises based on the array of slugs and ids in your first request. Maybe something like
const promises = result.data.articles.map((article) =>
    this.$http.get(`/cms/wp-json/wp/v2/cases-studies/?slug=${encodeURIComponent(article.slug)}`)
);
Getting the results is as easy as
Promise.all(promises).then((results) => {
    this.arrayOfSinglePages = results.map((result) => result.data);
});
Now your this.page has the array of ids (and other fields), and this.arrayOfSinglePages has the page details for each of them, in the same order.
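Putting it together inside the component's fetchData, a rough sketch (I am assuming here that the first response's data is the array to map over, and I am reusing the question's pagesingle property for the collected results):

fetchData() {
    this.loading = true;
    this.$http.get('/cms/wp-json/wp/v2/pages/?slug=work&_embed')
        .then((result) => {
            this.page = result.data;
            // one detail request per featured post, fired in parallel
            const promises = result.data.map((article) =>
                this.$http.get(`/cms/wp-json/wp/v2/cases-studies/?slug=${encodeURIComponent(article.slug)}`)
            );
            return Promise.all(promises);
        })
        .then((results) => {
            this.pagesingle = results.map((result) => result.data);
            this.loading = false;
        })
        .catch((error) => {
            this.error = error;
            this.loading = false;
        });
}

Because the detail requests run in parallel rather than one after another, this should also help with the load-time concern from the question.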

Store contents with rest proxy giving incorrect count

ExtJS 5.1.x, with several stores using rest proxy.
Here is an example:
Ext.define('cardioCatalogQT.store.TestResults', {
    extend: 'Ext.data.Store',
    alias: 'store.TestResults',
    config: {
        fields: [
            {name: 'attribute', type: 'string'},
            {name: 'sid', type: 'string'},
            {name: 'value_s', type: 'string'},
            {name: 'value_d', type: 'string'}
        ],
        model: 'cardioCatalogQT.model.TestResult',
        storeId: 'TestResults',
        autoLoad: true,
        pageSize: undefined,
        proxy: {
            url: 'http://127.0.0.1:5000/remote_results_get',
            type: 'rest',
            reader: {
                type: 'json',
                rootProperty: 'results'
            }
        }
    }
});
This store gets populated when certain things happen in the API. After the store is populated, I need to do some basic things, like count the number of distinct instances of an attribute, say sid, which I do as follows:
test_store = Ext.getStore('TestResults');
n = test_store.collect('sid').length;
The problem is that I have to refresh the browser to get the correct value of n; otherwise, the count is not right. I am calling test_store.load(), and indeed the request is being sent to the server after the .load() is issued.
I am directly querying the backend database to see what data are in the table and to get a count to compare against the value given by test_store.collect('sid').length. The strange thing is that when I print the store object in the debugger, the expected records (compared with the content of the database table) are displayed under the data.items array, yet the value given by test_store.collect('sid').length is not right.
This is all done sequentially in a success callback. I am wondering if there is some sort of asynchronous behavior giving me inconsistent results between what is in the store and the count of the store's contents?
I tested this with another store that uses the rest proxy, and it has the same behavior. On the other hand, using the localStorage proxy gives the correct count, consistent with the store's records/model instances.
Here is the relevant code in question. An Ajax request fires off, does its thing correctly, and hits this success callback. There really isn't very much interesting going on... the problem section is after console.log('TEST STORE HERE'), where I get the store, print the contents of the store, load/sync, then print the store (which works just fine), and finally print the number of items uniquely grouped by the sid attribute (which is what is not working):
success: function(response) {
    json = Ext.decode(response.responseText);
    if (json !== null && typeof (json) !== 'undefined') {
        for (i = 0, max = json.items.length; i < max; i += 1) {
            if (print_all) {
                records.push({
                    sid: json.items[i].sid,
                    attribute: json.items[i].attribute,
                    string: json.items[i].value_s,
                    number: json.items[i].value_d
                });
            } else {
                records.push({
                    sid: json.items[i].sid
                });
            }
        }
        // update store with data
        store.add(records);
        store.sync();
        // only add to store if adding to search grid
        if (!print_all) {
            source.add({
                key: payload.key,
                type: payload.type,
                description: payload.description,
                criteria: payload.criteria,
                atom: payload.atom,
                n: store.collect('sid').length // get length of array for unique sids
            });
            source.sync();
        }
        console.log('TEST STORE HERE');
        test_store = Ext.getStore('TestResults');
        test_store.load();
        test_store.sync();
        console.log(test_store);
        console.log(test_store.collect('sid').length);
    }
    // update grid store content
    Ext.StoreMgr.get('Payload').load();
    Ext.ComponentQuery.query('#searchGrid')[0].getStore().load();
}
For completeness, here is the data.items array output: items: Array[2886], which matches the count of unique items grouped by the sid attribute. And finally, the output of console.log(test_store.collect('sid').length) gives the value from the PREVIOUS run of this: 3114...
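Since Ext.data.Store's load() is asynchronous, the line right after test_store.load() runs before the fresh records have arrived, so collect('sid') still sees the previous run's data. A minimal sketch of deferring the count until the load completes, using load()'s standard options-object callback (the variable names are from the question):

test_store = Ext.getStore('TestResults');
test_store.load({
    callback: function (records, operation, success) {
        // runs only after the server response has been loaded into the store
        if (success) {
            console.log(test_store.collect('sid').length); // now reflects fresh data
        }
    }
});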