GCP IoT Error on sending new device configuration: "Error 413 (Request Entity Too Large)!!"

I'm trying to send my GCP IoT device a new configuration. The Base64-encoded binary string is approximately 15 KB in length, and the GCP IoT device config size limit is 64 KB according to the docs, but I'm still getting a 413 (Request Entity Too Large) error. What am I doing wrong? 15 KB seems incredibly tiny to be producing such an error. Thanks for any help.
Here's the JavaScript code that sends the config data:
sendDeviceConfig(deviceId, configPayload) {
  const parentName = `projects/${this.projectId}/locations/${this.cloudRegion}`;
  const registryName = `${parentName}/registries/${this.registryId}`;
  const binaryData = Buffer.from(configPayload).toString('base64');
  const request = {
    name: `${registryName}/devices/${deviceId}`,
    versionToUpdate: 0,
    binaryData: binaryData,
  };
  return new Promise((resolve, reject) => {
    this.client.projects.locations.registries.devices.modifyCloudToDeviceConfig(
      request,
      (err) => {
        if (err) {
          this.logger.error('Could not update config:', deviceId);
          reject(err);
        } else {
          resolve();
        }
      }
    );
  });
}
...and part of the HTML-formatted (wtf?) error response:
<html lang=en>
<meta charset=utf-8>
<meta name=viewport content="initial-scale=1, minimum-scale=1, width=device-width">
<title>Error 413 (Request Entity Too Large)!!1</title>
...
</style>
<a href=//www.google.com/><span id=logo aria-label=Google></span></a>
<p><b>413.</b> <ins>That’s an error.</ins>
<p>Your client issued a request that was too large.

I think the payload will be ~22.4 KB with Base64 encoding.
However, if the config is over 16 KB and ends up in the request header rather than the body, Google will return a 413. It should be sent in the body of a POST.
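A rough sanity check on those numbers (a sketch of the arithmetic only; the exact overhead depends on how the legacy client serializes the request):

const rawBytes = 15 * 1024;                      // ~15 KB of binary config
const base64Chars = Math.ceil(rawBytes / 3) * 4; // Base64 turns 3 bytes into 4 chars: about 20,480 chars
console.log(base64Chars);
// If that string then ends up in the URL/query string instead of the POST body,
// percent-encoding of '+', '/' and '=' inflates it further, pushing it past the
// ~16 KB limit for non-body transport mentioned above.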

It looks like the legacy client library may be doing something weird. The following code, used as a drop-in replacement for the sample code, works for me with larger configuration payloads:
const iot = require('@google-cloud/iot');

const newclient = new iot.v1.DeviceManagerClient({
  // optional auth parameters.
});

const parentName = `projects/${projectId}/locations/${cloudRegion}`;
const registryName = `${parentName}/registries/${registryId}`;
const binaryData = Buffer.from(data).toString('base64');
const request = {
  name: `${registryName}/devices/${deviceId}`,
  binaryData: binaryData,
};

newclient.modifyCloudToDeviceConfig(request)
  .then(responses => {
    const response = responses[0];
    // doThingsWith(response)
  })
  .catch(err => {
    console.error(err);
  });

Related

busboy-bodyparser changes my request so that GridFsStorage doesn't register the request-data in mongodb

I am a frontend developer trying to broaden my horizons, and I'm making what will become a MERN application. I'm struggling with image uploads to MongoDB.
First I used the Express body parser:
app.use(express.urlencoded({ extended: true }));
and app.use(express.json());
When used like this, I managed to upload the file fine, and the uploaded file showed up in MongoDB Compass.
I found out that this doesn't support multipart/form-data, so I've changed the body parser to busboy-bodyparser so that I can access both the form data and the file that is being uploaded. So I changed the body parser to:
app.use(busboyBodyParser());
and now it won't upload the request data to MongoDB.
My upload control looks like this:
const upload = require("../middleware/upload");
const uploadFile = async (req, res) => {
try {
req.file = req.files.file;
await upload(req, res);
if (req.file == undefined) {
return res.send(`You must select a file.`);
}
return res.send(`File has been uploaded.`);
} catch (error) {
console.log(error);
return res.send(`Error when trying upload image: ${error}`);
}
};
module.exports = {
uploadFile: uploadFile
};
The reason I've set req.file equal to req.files.file is that busboy-bodyparser exposes the file on req.files.file and not req.file. I thought that this change would make the request function properly; it did not.
My upload-middleware looks like this:
// Assumed imports (not shown in the original snippet):
const util = require("util");
const crypto = require("crypto");
const path = require("path");
const mongoose = require("mongoose");
const multer = require("multer");
const Grid = require("gridfs-stream");
const { GridFsStorage } = require("multer-gridfs-storage"); // older versions export the class directly

const promise = mongoose.connect(mongoURI, { useNewUrlParser: true, useUnifiedTopology: true }); // mongoURI defined elsewhere
const conn = mongoose.connection;

let gfs;
conn.once('open', () => {
  gfs = Grid(conn, mongoose.mongo);
  gfs.collection('uploads');
});

// create storage object
const storage = new GridFsStorage({
  db: promise,
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      crypto.randomBytes(16, (err, buf) => {
        if (err) {
          return reject(err);
        }
        const filename = buf.toString('hex') + path.extname(file.originalname);
        const fileInfo = {
          filename: filename,
          bucketName: 'uploads',
          metadata: {
            title: req.body.title,
            orientation: req.body.orientation
          }
        };
        resolve(fileInfo);
      });
    });
  }
});

const uploadFile = multer({ storage }).single("file");
const uploadFilesMiddleware = util.promisify(uploadFile);
module.exports = uploadFilesMiddleware;
I believe this is the code that logs (node:15124) DeprecationWarning: Listening to events on the Db class has been deprecated and will be removed in the next major version.
(Use node --trace-deprecation ... to show where the warning was created)
which is another problem I'm unsure how to solve, but that's another problem for another day.
My end goal is to be able to send the file to mongodb, with the attached metadata (title and orientation).
With this code I'm able to get the "File has been uploaded" message from the upload control, but in MongoDB Compass no file/chunks have been uploaded. The uploads worked great (without the metadata) with the express body parser, so when I changed to busboy-bodyparser I get both the file and the metadata as intended, but it is not loaded into the db. This leads me to believe that the new body parser changes the request so that GridFsStorage no longer recognizes it and doesn't put the data into the db. But frankly I'm just speculating here, and I generally have very limited knowledge of backend.
I use the correct enctype on the form I believe:
<form
action="/upload"
method="POST"
enctype="multipart/form-data">
Any tips or explanations are very much appreciated!
I am a complete beginner in backend, so don't be afraid to spell it out for me :)
I managed to fix it!
I'm unsure what caused it, but I believe that the req.body-fields hadn't been populated yet or something of that nature. I therefore switched out
metadata: {
title: req.body.title,
orientation: req.body.orientation
}
with: metadata: req.body and it just works.
For any other backend newbie who might stumble upon this: also remember to name your inputs in HTML like this: <input name="title" type="text" />. It is the name attribute that gets submitted with the HTML form and provides the key on req.body, so that you can access, for example, req.body.title (which didn't work here, but is still worth knowing).
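For reference, a minimal sketch of the storage object with that change applied (same variables and setup as the upload middleware above):

const storage = new GridFsStorage({
  db: promise,
  file: (req, file) => {
    return new Promise((resolve, reject) => {
      crypto.randomBytes(16, (err, buf) => {
        if (err) {
          return reject(err);
        }
        const filename = buf.toString('hex') + path.extname(file.originalname);
        resolve({
          filename: filename,
          bucketName: 'uploads',
          metadata: req.body, // pass the whole parsed body instead of individual fields
        });
      });
    });
  }
});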

why can't I send http requests outside of a web app?

When I try to make requests to a server running on localhost:8080, it sometimes gives a 404 or 405 error depending on the type of request. Most recently this has happened when I try unit testing via Jest.
I know for a fact there's nothing wrong with the actual code, because it works perfectly fine when other people run it on their PCs.
The exact same fetch requests that fail in unit tests work perfectly fine within a web application.
I first noticed this problem in Postman. The problem seems to occur when I make requests just before or after the server has started listening, and then I have to wait for a bit before requests work again. Of course it makes sense that it wouldn't work when my server isn't listening yet, but the problem persists for a while after it does. This problem usually doesn't occur in Postman anymore now that I make sure to wait for the server to start listening.
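(For reference, a minimal sketch of what waiting for the server could look like in the Jest setup, assuming server.js exports the HTTP server instance; this is not code from the original post:)

const server = require('../src/server'); // assumes server.js ends with `module.exports = server`

beforeAll((done) => {
  if (server.listening) {
    return done(); // already accepting connections
  }
  server.once('listening', done); // otherwise wait for the 'listening' event
});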
here's the response with a GET or POST request:
<!DOCTYPE html>
<html><head><title>Not Found</title></head>
<body>
<h2>Access Error: 404 -- Not Found</h2>
<pre>Cannot open document for: /myroute/mypath</pre>
</body>
</html>
with a PUT request:
<!DOCTYPE html>
<html><head><title>Method Not Allowed</title></head>
<body>
<h2>Access Error: 405 -- Method Not Allowed</h2>
<pre>The "PUT" method is not supported by file handler</pre>
</body>
</html>
Here are simplified versions of the code I'm working with (again, I know for a fact this code works on a PC other than mine):
const mongoose = require('mongoose');
require('../src/model/logboekTemplate.js');
const fetch = require('node-fetch');

const SERVERIP = "http://localhost:8080/logboeken/";
const dbName = 'rekenlogboek';
const Logboek = mongoose.model('Logboek');

describe('Logboeken route Tests', () => {
  beforeAll(async () => {
    mongoose.connect(`mongodb://localhost:27017/${dbName}`, { useUnifiedTopology: true, useNewUrlParser: true, retryWrites: false }).then().catch(err => {
      console.log(err);
    });
    await Logboek.deleteMany({});
  });

  beforeEach(async () => {
    await Logboek.create([
      {
        naam: "testlogboek",
      },
    ]);
  });

  afterEach(async () => {
    await Logboek.deleteMany({});
  });

  // shows the whole list of logboeken
  test('get /logboeklijst', async () => {
    let response = await fetch(SERVERIP + "/logboeklijst", {
      method: 'get', // *GET, POST, PUT, DELETE, etc.
      mode: 'cors', // no-cors, *cors, same-origin
    });
    expect(response.status).toEqual(201);
    let body = await response.json();
    expect(body[0].naam).toEqual("testlogboek");
  });
});
This server works perfectly fine normally; it's just the unit tests that fail, and even then only on my laptop.
const express = require('express');
const http = require('http');
const mongoose = require('mongoose');
const cors = require('cors');
const bodyParser = require('body-parser');

const app = express();
const server = http.createServer(app);

app.use(bodyParser.text());
app.use(bodyParser.json());
app.use(cors());

const { logboekRouter } = require('./routes/logboeken');
app.use('/logboeken', logboekRouter);

//-------------------------------------//
// Mongoose
//-------------------------------------//
let dbName;
dbName = "rekenlogboek";
mongoose.connect(`mongodb://localhost:27017/${dbName}`, { useUnifiedTopology: true, useNewUrlParser: true, retryWrites: false }).then().catch(err => {
  console.log(err);
});

server.listen(8080,
  function () {
    console.log("The Server is listening on port 8080.");
  });
and the command line
> project-tests@1.0.0 test C:\Users\Wouter\Documents\GitHub\{my project}
> jest --runInBand --verbose
(node:23248) DeprecationWarning: collection.ensureIndex is deprecated. Use createIndexes instead.
console.warn
Mongoose: looks like you're trying to test a Mongoose app with Jest's default jsdom test environment. Please make sure you read Mongoose's docs on configuring Jest to test Node.js apps: http://mongoosejs.com/docs/jest.html
at Object.<anonymous> (deployment/node_modules/mongoose/lib/helpers/printJestWarning.js:4:11)
at Object.<anonymous> (deployment/node_modules/mongoose/lib/index.js:46:1)
FAIL deployment/routes/logboeken.test.js
Logboeken route Tests
× get /logboeklijst (42 ms)
● Logboeken route Tests › get /logboeklijst
expect(received).toEqual(expected) // deep equality
Expected: 201
Received: 404
212 | mode: 'cors', // no-cors, *cors, same-origin
213 | });
> 214 | expect(response.status).toEqual(201);
| ^
215 | let body = await response.json()
216 | expect(body[0].naam).toEqual("testlogboek");
217 | })
at Object.<anonymous> (deployment/routes/logboeken.test.js:214:33)
at runMicrotasks (<anonymous>)

Image returned from REST API always displays broken

I am building a content management system for an art portfolio app, with React. The client will POST to the API which uses Mongoose to insert into a MongoDB. The API then queries the DB for the newly inserted image, and returns it to the client.
Here's my code to connect to MongoDB using Mongoose:
mongoose.connect('mongodb://localhost/test').then(() =>
console.log('connected to db')).catch(err => console.log(err))
mongoose.Promise = global.Promise
const db = mongoose.connection
db.on('error', console.error.bind(console, 'MongoDB connection error:'))
const Schema = mongoose.Schema;
const ImgSchema = new Schema({
img: { data: Buffer, contentType: String }
})
const Img = mongoose.model('Img', ImgSchema)
I am using multer and fs to handle the image file. My POST endpoint looks like this:
router.post('/', upload.single('image'), (req, res) => {
  if (!req.file) {
    res.send('no file')
  } else {
    const imgItem = new Img()
    imgItem.img.data = fs.readFileSync(req.file.path)
    imgItem.contentType = 'image/png'
    imgItem
      .save()
      .then(data =>
        Img.findById(data, (err, findImg) => {
          console.log(findImg.img)
          fs.writeFileSync('api/uploads/image.png', findImg.img.data)
          res.sendFile(__dirname + '/uploads/image.png')
        }))
  }
})
I can see in the file structure that writeFileSync is writing the image to the disk. res.sendFile grabs it and sends it down to the client.
Client side code looks like this:
handleSubmit = e => {
  e.preventDefault()
  const img = new FormData()
  img.append('image', this.state.file, this.state.file.name)
  axios
    .post('http://localhost:8000/api/gallery', img, {
      onUploadProgress: progressEvent => {
        console.log(progressEvent.loaded / progressEvent.total)
      }
    })
    .then(res => {
      console.log('responsed')
      console.log(res)
      const returnedFile = new File([res.data], 'image.png', { type: 'image/png' })
      const reader = new FileReader()
      reader.onloadend = () => {
        this.setState({ returnedFile, returned: reader.result })
      }
      reader.readAsDataURL(returnedFile)
    })
    .catch(err => console.log(err))
}
This does successfully place both the returned file and the img data url on state. However, in my application, the image always displays broken.
How to fix this?
Avoid sending back base64-encoded images (multiple images + large files + large encoded strings = very slow performance). I'd highly recommend creating a microservice that only handles image uploads and any other image-related GET/POST/PUT/DELETE requests. Separate it from your main application.
For example:
I use multer to create an image buffer
Then use sharp or fs to save the image (depending upon file type)
Then I send the filepath to my controller to be saved to my DB
Then, the front-end does a GET request when it tries to access: http://localhost:4000/uploads/timestamp-randomstring-originalname.fileext
In simple terms, my microservice acts like a CDN solely for images.
For example, a user sends a post request to http://localhost:4000/api/avatar/create with some FormData:
It first passes through some Express middlewares:
libs/middlewares.js
...
app.use(cors({ credentials: true, origin: "http://localhost:3000" })) // allows receiving of cookies from front-end
app.use(morgan(`tiny`)); // logging framework
app.use(multer({
  limits: {
    fileSize: 10240000,
    files: 1,
    fields: 1
  },
  fileFilter: (req, file, next) => {
    if (!/\.(jpe?g|png|gif|bmp)$/i.test(file.originalname)) {
      req.err = `That file extension is not accepted!`
      return next(null, false) // return here so the file isn't also accepted below
    }
    next(null, true);
  }
}).single(`file`))
app.use(bodyParser.json()); // parses header requests (req.body)
app.use(bodyParser.urlencoded({ limit: `10mb`, extended: true })); // allows objects and arrays to be URL-encoded
...etc
Then, hits the avatars route:
routes/avatars.js
app.post(`/api/avatar/create`, requireAuth, saveImage, create);
It then passes through some user authentication, then goes through my saveImage middleware:
services/saveImage.js
const createRandomString = require('../shared/helpers');
const fs = require("fs");
const sharp = require("sharp");

// Express middleware: saves the uploaded image buffer to disk and sets req.file.path
module.exports = (req, res, next) => {
  const randomString = createRandomString();

  if (req.err || !req.file) {
    return res.status(500).json({ err: req.err || `Unable to locate the requested file to be saved` });
  }

  const filename = `${Date.now()}-${randomString}-${req.file.originalname}`;
  const filepath = `uploads/${filename}`;
  const setFilePath = () => { req.file.path = filepath; return next(); };

  (/\.(gif|bmp)$/i.test(req.file.originalname))
    ? fs.writeFile(filepath, req.file.buffer, (err) => {
        if (err) {
          return res.status(500).json({ err: `There was a problem saving the image.` });
        }
        setFilePath();
      })
    : sharp(req.file.buffer).resize(256, 256).max().withoutEnlargement().toFile(filepath).then(() => setFilePath());
};
If the file is saved, it then sends a req.file.path to my create controller. This gets saved to my DB as a file path and as an image path (the avatarFilePath or /uploads/imagefile.ext is saved for removal purposes and the avatarURL or [http://localhost:4000]/uploads/imagefile.ext is saved and used for the front-end GET request):
controllers/avatars.js (I'm using Postgres, but you can substitute for Mongo)
create: async (req, res, done) => {
  try {
    const avatarurl = `${apiURL}/${req.file.path}`;
    await db.result("INSERT INTO avatars(userid, avatarURL, avatarFilePath) VALUES ($1, $2, $3)", [req.session.id, avatarurl, req.file.path]);
    res.status(201).json({ avatarurl });
  } catch (err) {
    return res.status(500).json({ err: err.toString() });
  }
},
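Since the answer notes that Postgres can be swapped for Mongo, a hedged Mongoose sketch of the same controller might look like this (the Avatar model and its field names are assumptions, not part of the original answer):

// controllers/avatars.js (hypothetical Mongoose variant of the Postgres version above)
const Avatar = require("../models/avatar"); // assumed model with userid / avatarURL / avatarFilePath fields

module.exports = {
  create: async (req, res) => {
    try {
      const avatarurl = `${apiURL}/${req.file.path}`;
      await Avatar.create({
        userid: req.session.id,
        avatarURL: avatarurl,
        avatarFilePath: req.file.path,
      });
      res.status(201).json({ avatarurl });
    } catch (err) {
      res.status(500).json({ err: err.toString() });
    }
  },
};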
Then when the front-end tries to access the uploads folder via <img src={avatarURL} alt="image" /> or <img src="[http://localhost:4000]/uploads/imagefile.ext" alt="image" />, it gets served up by the microservice:
libs/server.js
const express = require("express");
const path = app.get("path");
const PORT = 4000;
//============================================================//
// EXPRESS SERVE AVATAR IMAGES
//============================================================//
app.use(`/uploads`, express.static(`uploads`));
//============================================================//
/* CREATE EXPRESS SERVER */
//============================================================//
app.listen(PORT);
What it looks like when logging requests:
19:17:54 INSERT INTO avatars(userid, avatarURL, avatarFilePath) VALUES ('08861626-b6d0-11e8-9047-672b670fe126', 'http://localhost:4000/uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png', 'uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png')
POST /api/avatar/create 201 109 - 61.614 ms
GET /uploads/1536891474536-k9c7OdimjEWYXbjTIs9J4S3lh2ldrzV8-android.png 200 3027 - 3.877 ms

How to download files using axios

I am using axios for basic http requests like GET and POST, and it works well. Now I need to be able to download Excel files too. Is this possible with axios? If so does anyone have some sample code? If not, what else can I use in a React application to do the same?
Download the file with Axios as a responseType: 'blob'
Create a file link using the blob in the response from Axios/Server
Create an <a> HTML element with the href linked to the file link created in step 2 & click the link
Clean up the dynamically created file link and HTML element
axios({
  url: 'http://api.dev/file-download', // your url
  method: 'GET',
  responseType: 'blob', // important
}).then((response) => {
  // create file link in browser's memory
  const href = URL.createObjectURL(response.data);

  // create "a" HTML element with href to file & click
  const link = document.createElement('a');
  link.href = href;
  link.setAttribute('download', 'file.pdf'); // or any other extension
  document.body.appendChild(link);
  link.click();

  // clean up "a" element & remove ObjectURL
  document.body.removeChild(link);
  URL.revokeObjectURL(href);
});
Check out the quirks at https://gist.github.com/javilobo8/097c30a233786be52070986d8cdb1743
Full credits to: https://gist.github.com/javilobo8
More documentation for URL.createObjectURL is available on MDN. It's critical to release the object with URL.revokeObjectURL to prevent a memory leak. In the function above, since we've already downloaded the file, we can immediately revoke the object.
Each time you call createObjectURL(), a new object URL is created, even if you've already created one for the same object. Each of these must be released by calling URL.revokeObjectURL() when you no longer need them.
Browsers will release object URLs automatically when the document is unloaded; however, for optimal performance and memory usage, if there are safe times when you can explicitly unload them, you should do so.
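A tiny illustration of that point (someBlob is just a placeholder):

const someBlob = new Blob(['hello'], { type: 'text/plain' });

const urlA = URL.createObjectURL(someBlob);
const urlB = URL.createObjectURL(someBlob); // a second, distinct object URL for the same blob

URL.revokeObjectURL(urlA); // each object URL must be revoked separately
URL.revokeObjectURL(urlB);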
When the response comes with a downloadable file, the response headers will be something like:
Content-Disposition: "attachment;filename=report.xls"
Content-Type: "application/octet-stream" // or Content-Type: "application/vnd.ms-excel"
What you can do is create a separate component, which will contain a hidden iframe.
import * as React from 'react';

var MyIframe = React.createClass({
  render: function() {
    return (
      <div style={{display: 'none'}}>
        <iframe src={this.props.iframeSrc} />
      </div>
    );
  }
});
Now you can pass the URL of the downloadable file as a prop to this component. When the component receives the prop, it will re-render and the file will be downloaded.
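For example, a hypothetical parent component (the names and URL here are illustrative, not from the original answer) could look like this:

// Setting iframeSrc re-renders MyIframe and the browser starts the download
var DownloadButton = React.createClass({
  getInitialState: function() {
    return { iframeSrc: null };
  },
  handleDownload: function() {
    this.setState({ iframeSrc: 'http://localhost/downloadFile' });
  },
  render: function() {
    return (
      <div>
        <button onClick={this.handleDownload}>Download</button>
        {this.state.iframeSrc ? <MyIframe iframeSrc={this.state.iframeSrc} /> : null}
      </div>
    );
  }
});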
Edit: You can also use js-file-download module. Link to Github repo
const FileDownload = require('js-file-download');

Axios({
  url: 'http://localhost/downloadFile',
  method: 'GET',
  responseType: 'blob', // Important
}).then((response) => {
  FileDownload(response.data, 'report.csv');
});
Downloading Files (using Axios and Security)
This is actually even more complex when you want to download files using Axios and some means of security. To prevent anyone else from spending too much time in figuring this out, let me walk you through this.
You need to do 3 things:
Configure your server to permit the browser to see the required HTTP headers
Implement the server-side service, making it advertise the correct file type for the downloaded file
Implement an Axios handler to trigger a file-download dialog within the browser
These steps are mostly doable - but are complicated considerably by the browser's relation to CORS. One step at a time:
1. Configure your (HTTP) server
When employing transport security, JavaScript executing within a browser can [by design] access only 6 of the HTTP headers actually sent by the HTTP server. If we would like the server to suggest a filename for the download, we must inform the browser that it is "OK" for JavaScript to be granted access to other headers where the suggested filename would be transported.
Let us assume - for the sake of discussion - that we want the server to transmit the suggested filename within an HTTP header called X-Suggested-Filename. The HTTP server tells the browser that it is OK to expose this received custom header to the JavaScript/Axios with the following header:
Access-Control-Expose-Headers: X-Suggested-Filename
The exact way to configure your HTTP server to set this header varies from product to product.
See https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Expose-Headers for a full explanation and detailed description of these standard headers.
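For instance, with an Express back end this could be a one-liner via the cors middleware (a sketch under that assumption; the original answer does not prescribe a particular server):

const express = require('express');
const cors = require('cors');

const app = express();

// Sends: Access-Control-Expose-Headers: X-Suggested-Filename
// so that browser-side JavaScript is allowed to read the custom header
app.use(cors({
  exposedHeaders: ['X-Suggested-Filename'],
}));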
2. Implement the server-side service
Your server-side service implementation must now perform 2 things:
Create the (binary) document and assign the correct ContentType to the response
Assign the custom header (X-Suggested-Filename) containing the suggested file name for the client
This is done in different ways depending on your chosen technology stack. I will sketch an example using the JavaEE 7 standard which should emit an Excel report:
@GET
@Path("/report/excel")
@Produces("application/vnd.ms-excel")
public Response getAllergyAndPreferencesReport() {

    // Create the document which should be downloaded
    final byte[] theDocumentData = ....

    // Define a suggested filename
    final String fileName = ...

    // Create the JAXRS response
    // Don't forget to include the filename in 2 HTTP headers:
    //
    // a) The standard 'Content-Disposition' one, and
    // b) The custom 'X-Suggested-Filename'
    //
    final Response.ResponseBuilder builder = Response.ok(
        theDocumentData, "application/vnd.ms-excel")
        .header("X-Suggested-Filename", fileName);
    builder.header("Content-Disposition", "attachment; filename=" + fileName);

    // All Done.
    return builder.build();
}
The service now emits the binary document (an Excel report, in this case), sets the correct content type - and also sends a custom HTTP header containing the suggested filename to use when saving the document.
3. Implement an Axios handler for the Received document
There are a few pitfalls here, so let's ensure all details are correctly configured:
The service responds to @GET (i.e. HTTP GET), so the Axios call must be 'axios.get(...)'.
The document is transmitted as a stream of bytes, so you must tell Axios to treat the response as an HTML5 Blob. (I.e. responseType: 'blob').
In this case, the file-saver JavaScript library is used to pop the browser dialog open. However, you could choose another.
The skeleton Axios implementation would then be something along the lines of:
// Fetch the dynamically generated excel document from the server.
axios.get(resource, { responseType: 'blob' }).then((response) => {

  // Log somewhat to show that the browser actually exposes the custom HTTP header
  const fileNameHeader = "x-suggested-filename";
  const suggestedFileName = response.headers[fileNameHeader];
  const effectiveFileName = (suggestedFileName === undefined
    ? "allergierOchPreferenser.xls"
    : suggestedFileName);
  console.log(`Received header [${fileNameHeader}]: ${suggestedFileName}, effective fileName: ${effectiveFileName}`);

  // Let the user save the file.
  FileSaver.saveAs(response.data, effectiveFileName);

}).catch((response) => {
  console.error("Could not Download the Excel report from the backend.", response);
});
Axios.post solution with IE and other browsers
I've found some incredible solutions here, but they frequently don't take into account problems with the IE browser. Maybe this will save somebody else some time.
axios.post("/yourUrl",
data,
{ responseType: 'blob' }
).then(function (response) {
let fileName = response.headers["content-disposition"].split("filename=")[1];
if (window.navigator && window.navigator.msSaveOrOpenBlob) { // IE variant
window.navigator.msSaveOrOpenBlob(new Blob([response.data],
{ type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' }
),
fileName
);
} else {
const url = window.URL.createObjectURL(new Blob([response.data],
{ type: 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet' }));
const link = document.createElement('a');
link.href = url;
link.setAttribute('download',
response.headers["content-disposition"].split("filename=")[1]);
document.body.appendChild(link);
link.click();
}
}
);
The example above is for Excel files, but with small changes it can be applied to any format.
And on the server I've done this to send an Excel file:
response.contentType = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet"
response.addHeader(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=exceptions.xlsx")
The function to make the API call with axios:
function getFileToDownload (apiUrl) {
  return axios.get(apiUrl, {
    responseType: 'arraybuffer',
    headers: {
      'Content-Type': 'application/json'
    }
  })
}
Call the function and then download the excel file you get:
getFileToDownload('putApiUrlHere')
  .then(response => {
    const type = response.headers['content-type']
    const blob = new Blob([response.data], { type: type, encoding: 'UTF-8' })
    const link = document.createElement('a')
    link.href = window.URL.createObjectURL(blob)
    link.download = 'file.xlsx'
    link.click()
  })
It's very simple JavaScript code to trigger a download for the user:
window.open("<insert URL here>")
You don't want/need axios for this operation; it should be standard to just let the browser do its thing.
Note: If you need authorisation for the download then this might not work. I'm pretty sure you can use cookies to authorise a request like this, provided it's within the same domain, but regardless, this might not work immediately in such a case.
As for whether it's possible... not with the built-in file downloading mechanism, no.
axios.get('/app/export').then(response => {
  const url = window.URL.createObjectURL(new Blob([response.data])); // note: response.data, not the whole response object
  const link = document.createElement('a');
  link.href = url;
  const fileName = `${+ new Date()}.csv`; // whatever your file name
  link.setAttribute('download', fileName);
  document.body.appendChild(link);
  link.click();
  link.remove(); // remove the element that was created above
})
The trick is to make an invisible anchor tag in render() and add a React ref, allowing us to trigger a click once we have the axios response:
class Example extends Component {
  state = {
    ref: React.createRef()
  }

  exportCSV = () => {
    axios.get('/app/export').then(response => {
      let blob = new Blob([response.data], {type: 'application/octet-stream'})
      let ref = this.state.ref
      ref.current.href = URL.createObjectURL(blob)
      ref.current.download = 'data.csv'
      ref.current.click()
    })
  }

  render() {
    return (
      <div>
        <a style={{display: 'none'}} href='empty' ref={this.state.ref}>ref</a>
        <button onClick={this.exportCSV}>Export CSV</button>
      </div>
    )
  }
}
Here is the documentation: https://reactjs.org/docs/refs-and-the-dom.html. You can find a similar idea here: https://thewebtier.com/snippets/download-files-with-axios/.
There are a couple of critical points most of the answers are missing.
I will try to explain in much depth here.
TLDR;
If you are creating an <a> tag link and initiating a download through a browser request, then:
Always call window.URL.revokeObjectURL(url); otherwise there can be unnecessary memory spikes.
There is NO need to append the created link to the document body using document.body.appendChild(link), which avoids having to remove the child later.
For Component code and a deeper analysis, read further
First is to figure out if the API endpoint from which you are trying to download the data is public or private. Do you have control over the server or not?
If the server responds with
Content-Disposition: attachment; filename=dummy.pdf
Content-Type: application/pdf
The browser will always try to download the file with the name 'dummy.pdf'.
If the server responds with
Content-Disposition: inline; filename=dummy.pdf
Content-Type: application/pdf
The browser will first try to open a native file viewer, if available, with the name 'dummy.pdf'; otherwise it will start a file download.
If the server responds with neither of the above 2 headers:
The browser (at least Chrome) will try to open the file if the download attribute is not set. If it is set, it will download the file. The name of the file will be the value of the last path param in cases where the URL is not a blob.
Apart from that, keep in mind to use Transfer-Encoding: chunked from the server when transferring large volumes of data. This ensures the client knows when to stop reading from the current request in the absence of a Content-Length header.
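A minimal Express sketch of the two Content-Disposition variants described above (the route paths and file locations are illustrative assumptions, not from the original answer):

const express = require('express');
const path = require('path');

const app = express();

// Forces a download named dummy.pdf
app.get('/download/dummy.pdf', (req, res) => {
  res.setHeader('Content-Type', 'application/pdf');
  res.setHeader('Content-Disposition', 'attachment; filename=dummy.pdf');
  res.sendFile(path.join(__dirname, 'files', 'dummy.pdf'));
});

// Lets the browser try to render the PDF in-page first
app.get('/view/dummy.pdf', (req, res) => {
  res.setHeader('Content-Type', 'application/pdf');
  res.setHeader('Content-Disposition', 'inline; filename=dummy.pdf');
  res.sendFile(path.join(__dirname, 'files', 'dummy.pdf'));
});

app.listen(9000);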
For Private Files
import { useState, useEffect } from "react";
import axios from "axios";

export default function DownloadPrivateFile(props) {
  const [download, setDownload] = useState(false);

  useEffect(() => {
    async function downloadApi() {
      try {
        // It doesn't matter whether this api responds with the Content-Disposition header or not
        const response = await axios.get(
          "http://localhost:9000/api/v1/service/email/attachment/1mbdoc.docx",
          {
            responseType: "blob", // this is important!
            headers: { Authorization: "sometoken" },
          }
        );
        const url = window.URL.createObjectURL(new Blob([response.data])); // you can mention a type if you wish
        const link = document.createElement("a");
        link.href = url;
        link.setAttribute("download", "dummy.docx"); // this is the name with which the file will be downloaded
        link.click();
        // no need to append link as child to body.
        setTimeout(() => window.URL.revokeObjectURL(url), 0); // this is important too, otherwise we will be unnecessarily spiking memory!
        setDownload(false);
      } catch (e) {
        // error handling
      }
    }

    if (download) {
      downloadApi();
    }
  }, [download]);

  return <button onClick={() => setDownload(true)}>Download Private</button>;
}
For Public Files
import { useState, useEffect } from "react";

export default function DownloadPublicFile(props) {
  const [download, setDownload] = useState(false);

  useEffect(() => {
    if (download) {
      const link = document.createElement("a");
      link.href =
        "http://localhost:9000/api/v1/service/email/attachment/dummy.pdf";
      link.setAttribute("download", "dummy.pdf");
      link.click();
      setDownload(false);
    }
  }, [download]);

  return <button onClick={() => setDownload(true)}>Download Public</button>;
}
Good to know:
Always control file downloads from the server.
Axios in the browser uses XHR under the hood, in which streaming of responses is not supported.
Use the onDownloadProgress method from Axios to implement a progress bar (see the sketch after this list).
Chunked responses from the server do not (cannot) indicate Content-Length. Hence you need some way of knowing the response size if you are using them while building a progress bar.
<a> tag links can only make GET HTTP requests, without any ability to send headers or cookies to the server (ideal for downloading from public endpoints).
Browser requests are slightly different from XHR requests made in code.
Ref: Difference between AJAX request and a regular browser request
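As a sketch of the progress-bar point above (reusing the private-file endpoint from earlier; the fallback branch for chunked responses is an assumption, not from the original answer):

axios.get("http://localhost:9000/api/v1/service/email/attachment/1mbdoc.docx", {
  responseType: "blob",
  onDownloadProgress: (progressEvent) => {
    if (progressEvent.total) {
      // Content-Length is known, so a real percentage can be shown
      const percent = Math.round((progressEvent.loaded * 100) / progressEvent.total);
      console.log(`Downloaded ${percent}%`);
    } else {
      // Chunked response: only the bytes received so far are known
      console.log(`Downloaded ${progressEvent.loaded} bytes so far`);
    }
  },
});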
File download with a custom request header. This example shows how to send a file download request with a bearer token, which is good for downloadable content that requires authorization.
download(urlHere) {
  axios.get(urlHere, {
    headers: {
      "Access-Control-Allow-Origin": "*",
      Authorization: `Bearer ${sessionStorage.getItem("auth-token")}`,
    }
  }).then((response) => {
    const temp = window.URL.createObjectURL(new Blob([response.data]));
    const link = document.createElement('a');
    link.href = temp;
    link.setAttribute('download', 'file.csv'); // or any other extension
    document.body.appendChild(link);
    link.click();
  });
}
You need to return File({file_to_download}, "application/vnd.ms-excel") from your backend to the frontend, and in your JS file you need to update the code as written below:
function exportToExcel() {
  axios.post({path to call your controller}, null,
    {
      headers:
      {
        'Content-Disposition': "attachment; filename=XYZ.xlsx",
        'Content-Type': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
      },
      responseType: 'arraybuffer',
    }
  ).then((r) => {
    const path = window.URL.createObjectURL(new Blob([r.data]));
    const link = document.createElement('a');
    link.href = path;
    link.setAttribute('download', 'XYZ.xlsx');
    document.body.appendChild(link);
    link.click();
  }).catch((error) => console.log(error));
}
For those who'd like to implement an authenticated native download.
I'm currently developing a SPA with Axios.
Unfortunately Axios doesn't allow a stream response type in this case.
From documentation:
// `responseType` indicates the type of data that the server will respond with
// options are: 'arraybuffer', 'document', 'json', 'text', 'stream'
// browser only: 'blob'
But I figured out a workaround as mentioned in this topic.
The trick is to send a basic Form POST containing your token and the targeted file.
"That targets a new window. Once the browser reads the attachment header on the server response, it will close the new tab and begin the download."
Here's a sample:
let form = document.createElement('form');
form.method = 'post';
form.target = '_blank';
form.action = `${API_URL}/${targetedResource}`;
form.innerHTML = `<input type="hidden" name="jwtToken" value="${jwtToken}">`;
document.body.appendChild(form);
form.submit();
document.body.removeChild(form);
"You may need to mark your handler as unauthenticated/anonymous so that you can manually validate the JWT to ensure proper authorization."
Which results for my ASP.NET implementation in:
[AllowAnonymous]
[HttpPost("{targetedResource}")]
public async Task<IActionResult> GetFile(string targetedResource, [FromForm] string jwtToken)
{
    var jsonWebTokenHandler = new JsonWebTokenHandler();
    var validationParameters = new TokenValidationParameters()
    {
        // Your token validation parameters here
    };

    var tokenValidationResult = jsonWebTokenHandler.ValidateToken(jwtToken, validationParameters);
    if (!tokenValidationResult.IsValid)
    {
        return Unauthorized();
    }

    // Your file upload implementation here
}
This worked for me. I implemented this solution in ReactJS:
const requestOptions = {
  method: 'GET',
  headers: { 'Content-Type': 'application/json' }
};
fetch(`${url}`, requestOptions)
  .then((res) => {
    return res.blob();
  })
  .then((blob) => {
    const href = window.URL.createObjectURL(blob);
    const link = document.createElement('a');
    link.href = href;
    link.setAttribute('download', 'config.json'); // or any other extension
    document.body.appendChild(link);
    link.click();
  })
  .catch((err) => {
    return Promise.reject({ Error: 'Something Went Wrong', err });
  })
I had an issue where transferring a file I downloaded with axios (const axiosResponse = await axios.get(pdf.url)) to Google Drive (googleDrive.files.create({ media: { body: axiosResponse.data, mimeType }, requestBody: { name: fileName, parents: [parentFolder], mimeType }, auth: jwtClient })) uploaded a corrupted file.
The reason the file was corrupted was that axios transformed axiosResponse.data into a string. To solve the issue, I had to ask axios to return a stream: axios.get(pdf.url, { responseType: 'stream' }).
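Put together, the fix described there looks roughly like this (googleDrive, pdf, fileName, parentFolder, mimeType and jwtClient are the same variables assumed in the text above):

// Ask axios for a readable stream instead of a transformed (string) body
const axiosResponse = await axios.get(pdf.url, { responseType: 'stream' });

// Pass the stream straight through to the Drive upload
await googleDrive.files.create({
  media: { body: axiosResponse.data, mimeType },
  requestBody: { name: fileName, parents: [parentFolder], mimeType },
  auth: jwtClient,
});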
Implement an Axios handler for the received document. The data arrives in octet-stream format, so the raw body might look like gibberish ("PK" followed by random characters), and if you create a file from it directly the file might be corrupt. Setting {responseType: 'blob'} makes the data usable:
axios.get("URL", {responseType: 'blob'})
.then((r) => {
let fileName = r.headers['content-disposition'].split('filename=')[1];
let blob = new Blob([r.data]);
window.saveAs(blob, fileName);
}).catch(err => {
console.log(err);
});
You might have tried a solution that fails like the following, where window.saveAs(blob, 'file.zip') tries to save the file as a zip but it won't work:
const downloadFile = (fileData) => {
  axios.get(baseUrl + "/file/download/" + fileData.id)
    .then((response) => {
      console.log(response.data);
      const blob = new Blob([response.data], { type: response.headers['content-type'], encoding: 'UTF-8' });
      const link = document.createElement('a');
      link.href = window.URL.createObjectURL(blob);
      link.download = 'file.zip';
      link.click();
    })
    .catch((err) => console.log(err))
}
const downloadFile = (fileData) => {
  axios.get(baseUrl + "/file/download/" + fileData.id)
    .then((response) => {
      console.log(response);
      //const binaryString = window.atob(response.data)
      //const bytes = new Uint8Array(response.data)
      //const arrBuff = bytes.map((byte, i) => response.data.charCodeAt(i));
      //var base64 = btoa(String.fromCharCode.apply(null, new Uint8Array(response.data)));
      const blob = new Blob([response.data], { type: "application/octet-stream" });
      window.saveAs(blob, 'file.zip')
      // const link = document.createElement('a');
      // link.href = window.URL.createObjectURL(blob);
      // link.download = 'file.zip';
      // link.click();
    })
    .catch((err) => console.log(err))
}
function base64ToArrayBuffer(base64) {
  var binaryString = window.atob(base64);
  var binaryLen = binaryString.length;
  var bytes = new Uint8Array(binaryLen);
  for (var i = 0; i < binaryLen; i++) {
    var ascii = binaryString.charCodeAt(i);
    bytes[i] = ascii;
  }
  return bytes;
}
Another short solution is:
window.open("URL")
However, it will keep opening new tabs unnecessarily, and the user might have to allow popups for this code to work. And what if the user wants to download multiple files at the same time? So go with the first solution, or, if that doesn't work, try the other solutions as well.
This function will help you download a ready-made xlsx, csv, etc. file. I just send a ready xlsx static file from the backend and download it in React.
const downloadFabricFormat = async () => {
  try {
    await axios({
      url: '/api/fabric/fabric_excel_format/',
      method: 'GET',
      responseType: 'blob',
    }).then((response) => {
      const url = window.URL.createObjectURL(new Blob([response.data]));
      const link = document.createElement('a');
      link.href = url;
      link.setAttribute('download', 'Fabric Excel Format.xlsx');
      document.body.appendChild(link);
      link.click();
    });
  } catch (error) {
    console.log(error);
  }
};
Basically, I solved the problem of the filename by reading it, if present, from the 'content-disposition' header:
const generateFile = async ({ api, url, payload }) => {
  return await api({
    url: url,
    method: 'POST',
    data: payload, // payload
    responseType: 'blob'
  }).catch((e) => {
    throw e;
  });
};

const getFileName = (fileBlob, defaultFileName) => {
  const contentDisposition = fileBlob.headers.get('content-disposition');
  if (contentDisposition) {
    const fileNameIdentifier = 'filename=';
    const filenamePosition = contentDisposition.indexOf(fileNameIdentifier);
    if (~filenamePosition) {
      return contentDisposition.slice(filenamePosition + fileNameIdentifier.length, contentDisposition.length).replace(/"/g, '');
    }
  }
  return defaultFileName;
};

const downloadFile = (fileBlob, fileName) => {
  const url = window.URL.createObjectURL(new Blob([fileBlob]));
  const link = document.createElement('a');
  link.href = url;
  link.style.display = 'none';
  link.setAttribute('download', `${fileName}`);
  document.body.appendChild(link);
  link.click();
  link.remove();
  window.URL.revokeObjectURL(url);
};

// "api" is an instance of Axios (axios.create)
// "payload" is the payload you submit to the server
const fileBlob = await generateFile({ api, url: '/url/to/download', payload });
const fileName = getFileName(fileBlob, "MyDownload.xls");
downloadFile(fileBlob.data, fileName);
For an axios POST request, the request should be something like this:
The key here is that the responseType and headers fields must be in the 3rd parameter of post; the 2nd parameter is the request payload.
export const requestDownloadReport = (requestParams) => async dispatch => {
  let response = null;
  try {
    response = await frontEndApi.post('createPdf', {
      requestParams: requestParams,
    },
    {
      responseType: 'arraybuffer', // important...because we need to convert it to a blob. If we don't specify this, response.data will be the raw data. It cannot be converted to blob directly.
      headers: {
        'Content-Type': 'application/json',
        'Accept': 'application/pdf'
      }
    });
  }
  catch (err) {
    console.log('[requestDownloadReport][ERROR]', err);
    return err;
  }
  return response;
}
The answers using URL.createObjectURL() have worked well for me.
I still want to point out the option of using HTTP headers.
Using HTTP headers has these advantages:
very widespread browser support
does not require creating a blob object in the browser's memory
does not require waiting for the full response from the server before giving the user feedback
no size limitations
Using HTTP headers requires you to have access to the back-end server where the files are downloaded from (which seems to be the case for the OP's Excel files).
HttpHeaders solution:
FRONT-END:
//...
// the download link
<a href="download/destination?parameter1=foo&param2=bar">
click me to download!
</a>
BACK-END
(C# in this example, but could be any language. Adapt as required)
...
var fs = new FileStream(filepath, FileMode.OpenOrCreate, FileAccess.Read);
Response.Headers["Content-Disposition"] = "attachment; filename=someName.txt";
return File(fs, "application/octet-stream");
...
This solution assumes you have control of the back-end server that responds.
https://github.com/eligrey/FileSaver.js/wiki/Saving-a-remote-file#using-http-header
My answer is a total hack: I just created a link that looks like a button and added the URL to it.
<a class="el-button"
style="color: white; background-color: #58B7FF;"
:href="<YOUR URL ENDPOINT HERE>"
:download="<FILE NAME NERE>">
<i class="fa fa-file-excel-o"></i> Excel
</a>
I'm using the excellent VueJS, hence the odd annotations; however, this solution is framework agnostic. The idea would work for any HTML-based design.

react-router - server side rendering match

I have this on my server
app.get('*', function(req, res) {
  match({ routes, location: req.url }, (error, redirectLocation, renderProps) => {
    const body = renderToString(<RouterContext {...renderProps} />)
    res.send(`
      <!DOCTYPE html>
      <html>
        <head>
          <link href="//cdn.muicss.com/mui-0.6.5/css/mui.min.css" rel="stylesheet" type="text/css" />
        </head>
        <body>
          <div id="root">${body}</div>
          <script defer src="assets/app.js"></script>
        </body>
      </html>
    `)
  })
})
And this on the client side
import { Router, hashHistory, browserHistory, match } from 'react-router'

let history = browserHistory

// client side, will become app.js
match({ routes, location, history }, (error, redirectLocation, renderProps) => {
  render(<Router {...renderProps} />, document.getElementById('root'))
})
The problem
It works only when I remove the let history = browserHistory line, but then it adds the /#/ hash prefix to my URL (which I don't want to happen).
When I leave the let history = browserHistory there, it throws an error:
Warning: React attempted to reuse markup in a container but the checksum was invalid. This generally means that you are using server rendering and the markup generated on the server was not what the client was expecting. React injected new markup to compensate which works but you have lost many of the benefits of server rendering. Instead, figure out why the markup being generated is different on the client or server:
(client) <!-- react-empty: 1 -
(server) <section data-reactro
The error message is pretty clear; however, I don't understand why it works with hashHistory but fails with browserHistory.
This is a version incompatibility issue.
Solution:
{
  "history": "^2.1.2",
  "react-router": "~2.5.2"
}
links:
https://github.com/reactjs/react-router/issues/3003