How to get uploaded file size in Slim Framework?

I'm unable to get the uploaded file size in Slim Framework. I want to do some validation on the size. Here is my small code:
public function uploadFile(Request $request, Response $response){
    $files = $request->getUploadedFiles();
}
$files['image'] is my file object.

I got it; this was actually very easy:
$files['image']->getSize()
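For context, here is a minimal sketch of how that fits into the handler above (assuming Slim 3's PSR-7 uploaded files; the "image" field name and the 2 MB limit are just example values):

public function uploadFile(Request $request, Response $response){
    $files = $request->getUploadedFiles();

    // Assumes the form field is named "image"; adjust to match your form.
    $image = isset($files['image']) ? $files['image'] : null;

    if ($image === null || $image->getError() !== UPLOAD_ERR_OK) {
        return $response->withStatus(400)->write('No valid file uploaded');
    }

    // getSize() returns the size in bytes (or null if it cannot be determined).
    if ($image->getSize() > 2 * 1024 * 1024) { // example limit: 2 MB
        return $response->withStatus(413)->write('File too large');
    }

    // ... move/process the file, e.g. $image->moveTo($targetPath);
    return $response->write('Upload accepted');
}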

Related

Sitemap for dynamic website

I have a dynamic website built with CodeIgniter 3 and I am working on the sitemap part as a newbie.
I found the sitemap-php library (evert/sitemap-php) but I can't get it to run.
So far, this is what I have done: I put the Sitemap.php file into my libraries folder.
Controller:
<?php
defined('BASEPATH') OR exit('No direct script access allowed');

class Deals extends CI_Controller {

    public function __construct(){
        parent::__construct();
        $this->load->helper('url', 'form', 'security');
        $this->load->library('form_validation');
        $this->load->library('session');
        $this->load->library('email');
        $this->load->model('deal_model');
        $this->load->helper(array('cookie','custom','text'));
    }

    public function Sitemap(){
        $this->load->library('Sitemap');
        $sitemap = new Sitemap('https://www.mywebsite.com');
        $sitemap->setPath('/public_html/Sitemap/'); // I created a folder Sitemap into my public folder
        $sitemap->setFilename('sitemap');
        $sitemap->addItem('/', '1.0', 'daily', 'Today');
        $sitemap->createSitemapIndex('https://www.mywebsite.com/sitemap/', 'Today');
    }
}
Then, when I go to https://www.mywebsite.com/sitemap/, I get a 404 error.
Could you guide me in solving my issue?
Thanks
The docs for that library describe that it generates a static XML file. The code you've shown will do that, but your code is in a Library and you have not run it yet. You need to run it; then it will generate an XML file, as you've specified, in /public_html/Sitemap/. From your description, you are looking for the XML before doing anything to generate it, and it does not (yet) exist.
From your updated code, you now have the code to generate the static XML available as a Controller method. According to standard CodeIgniter routing conventions, the method you have created is accessible at:
http://your-site/deals/Sitemap
(Maybe you've also set up some routes so it is accessible at other URIs also.)
Visit that URL, once, to generate the static XML file at /public_html/Sitemap/sitemap.xml. Assuming your code works, you should then be able to browse the XML at
http://your-site/Sitemap/sitemap.xml
Side note: AFAIK the CodeIgniter convention is capitalised controller file and class names (Deals.php and Deals), but all lower-case method names (sitemap() instead of Sitemap()). You can see examples of this in the CodeIgniter Controller docs. I am not sure if it matters, just pointing it out.
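To make that concrete, here is a minimal sketch of the conventional setup, reusing the same library calls as in your snippet (the route name, paths and URLs are assumptions; adjust them to your site):

// application/controllers/Deals.php
public function sitemap(){
    $this->load->library('Sitemap');

    $sitemap = new Sitemap('https://www.mywebsite.com');
    $sitemap->setPath('/public_html/Sitemap/'); // must exist and be writable by the web server
    $sitemap->setFilename('sitemap');
    $sitemap->addItem('/', '1.0', 'daily', 'Today');
    $sitemap->createSitemapIndex('https://www.mywebsite.com/Sitemap/', 'Today');

    echo 'Sitemap generated.';
}

// application/config/routes.php (optional, only if you want a nicer URL for the generator)
$route['generate-sitemap'] = 'deals/sitemap';

Visiting http://your-site/deals/sitemap (or http://your-site/generate-sitemap with the route above) runs the generator once; after that the static file is served directly from /Sitemap/sitemap.xml.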

Retrieve file POST to Laravel 5

I'm trying to post a file to my upload function in Laravel. To keep things simple, I'm using Postman to test it. According to the documentation, the way to access an uploaded file from the Request is:
$request->file('image')
so my controller function begins with:
if ($request->file('image')->isValid()) {
    $file = $request->file('image');
}
And here is my POST request; as you can see, the parameter is NULL.
So what is the correct way to access a file post request in the controller?
Thanks.
if ($request->hasFile('image')) {
    $image = $request->file('image');
    // perform operations on $image
}
The above should be what you need to check whether there is a file in the request and then perform the operations you want on the file.
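A slightly fuller sketch for Laravel 5, combining the existence and validity checks (the uploads directory and the "image" field name are assumptions; in Postman, send the request as form-data with the key image set to type File):

public function upload(Request $request)
{
    if ($request->hasFile('image') && $request->file('image')->isValid()) {
        $file = $request->file('image');

        // Keep the original client file name; rename it if you prefer.
        $file->move(public_path('uploads'), $file->getClientOriginalName());

        return response()->json(['status' => 'uploaded']);
    }

    return response()->json(['error' => 'no valid file'], 422);
}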

Downloading from google cloud storage with php

Is there a way to achieve downloading via the google-php-api? I have tried the following:
using the medialink and trying to curl the object (Returns "Login Required")
reading the guzzle response stream (comes back empty even though all the headers have the correct data)
I am able to see everything but the body of the file via the API.
Edit:
I am, of course, able to download the file via the mediaLink, provided it is set to public; however, that will not work for this situation.
The solution is as follows: you must make an authorized HTTP request. To do this:
// $client is an authenticated Google_Client and $service a Google_Service_Storage
$object = $service->objects->get(BUCKET, OBJECT); // fetch the object's metadata
$http = $client->authorize(); // Guzzle client that adds the auth headers
$request = new GuzzleHttp\Psr7\Request('GET', $object->getMediaLink());
$response = $http->send($request);
$body = $response->getBody()->read($object->getSize());
The above is a small snippet, but it is the gist of what you need to get the contents of a file.
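If you want the download written to disk rather than held in memory, a small follow-up sketch (the destination path is just an example):

// Stream the response body to a local file in chunks instead of reading it all at once.
$local = fopen('/tmp/downloaded-object', 'w'); // example destination path
$stream = $response->getBody();
while (!$stream->eof()) {
    fwrite($local, $stream->read(8192)); // 8 KB chunks
}
fclose($local);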

How can REST API pass large JSON?

I am building a REST API and facing this issue: how can a REST API pass very large JSON?
Basically, I want to connect to the database and return the training data. The problem is that the database holds 400,000 records. If I wrap them all into one JSON payload and return it through a GET method, the server throws a heap overflow exception.
What methods can we use to solve this problem?
DBTraining trainingdata = new DBTraining();

@GET
@Produces("application/json")
@Path("/{cat_id}")
public Response getAllDataById(@PathParam("cat_id") String cat_id) {
    List<TrainingData> list = new ArrayList<TrainingData>();
    try {
        list = trainingdata.getAllDataById(cat_id);
        Gson gson = new Gson();
        Type dataListType = new TypeToken<List<TrainingData>>() {}.getType();
        String jsonString = gson.toJson(list, dataListType);
        return Response.ok().entity(jsonString)
                .header("Access-Control-Allow-Origin", "*")
                .header("Access-Control-Allow-Methods", "GET")
                .build();
    } catch (SQLException e) {
        logger.warn(e.getMessage());
    }
    return null;
}
The RESTful way of doing this is to create a paginated API. First, add query parameters to set page size, page number, and maximum number of items per page. Use sensible defaults if any of these are not provided or unrealistic values are provided. Second, modify the database query to retrieve only a subset of the data. Convert that to JSON and use that as the payload of your response. Finally, in following HATEOAS principles, provide links to the next page (provided you're not on the last page) and previous page (provided you're not on the first page). For bonus points, provide links to the first page and last page as well.
By designing your endpoint this way, you get very consistent performance characteristics and can handle data sets that continue to grow.
The GitHub API provides a good example of this.
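To illustrate the idea (a minimal PHP/PDO sketch, since the principle is language-independent; the table name, column names and the 500-item cap are assumptions):

// Clamp the requested page size, fetch a single page, and emit next/prev links.
$page    = max(1, (int) (isset($_GET['page']) ? $_GET['page'] : 1));
$perPage = min(500, max(1, (int) (isset($_GET['per_page']) ? $_GET['per_page'] : 100)));

$stmt = $pdo->prepare('SELECT * FROM training_data WHERE cat_id = :cat ORDER BY id LIMIT :limit OFFSET :offset');
$stmt->bindValue(':cat', $catId);
$stmt->bindValue(':limit', $perPage, PDO::PARAM_INT);
$stmt->bindValue(':offset', ($page - 1) * $perPage, PDO::PARAM_INT);
$stmt->execute();
$rows = $stmt->fetchAll(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode(array(
    'data'  => $rows,
    'links' => array(
        'next' => count($rows) === $perPage ? '?page=' . ($page + 1) . '&per_page=' . $perPage : null,
        'prev' => $page > 1 ? '?page=' . ($page - 1) . '&per_page=' . $perPage : null,
    ),
));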
My suggestion is not to pass the data as JSON but as a file, using multipart/form-data. In your file, each line could be a JSON document representing a data record. Then, it is easy to use a FileOutputStream to receive the file, and you can then process it line by line to avoid memory problems.
A Grails example:
if (params.myFile) {
    if (params.myFile instanceof org.springframework.web.multipart.commons.CommonsMultipartFile) {
        def fileName = "/tmp/myReceivedFile.txt"
        new FileOutputStream(fileName).leftShift(params.myFile.getInputStream())
    }
    else {
        // print or signal an error
    }
}
You can use curl to pass your file:
curl -F "myFile=@/mySendigFile.txt" http://acme.com/my-service
More details on a similar solution on https://stackoverflow.com/a/13076550/2476435
HTTP has the notion of chunked encoding, which allows you to send an HTTP response body in smaller pieces so the server does not have to hold the entire response in memory. You need to find out how your server framework supports chunked encoding.
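To illustrate the streaming idea (a PHP sketch, since the principle is framework-independent; the query and row-by-row fetch are assumptions):

// Emit a large JSON array one record at a time and flush as you go,
// so the whole payload is never built in memory; the web server can
// deliver it with chunked transfer encoding.
header('Content-Type: application/json');
echo '[';
$first = true;
$stmt = $pdo->query('SELECT * FROM training_data');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    if (!$first) {
        echo ',';
    }
    echo json_encode($row);
    $first = false;
    flush(); // push this chunk to the client
}
echo ']';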

How to see the request variables or files sent from PhoneGap to an ASP.NET MVC controller

I have written this code:
public JsonResult media(HttpPostedFileBase file)
{
    // ... done with some code ...
}
Here I am always getting the file as null.
Note:
file is the file submitted from PhoneGap's JSON method.
My question is:
Is there any mechanism for decoding the encoded multipart file before reading it?
Got it :)
Use the request parameters to get the file input: change the method signature to
public JsonResult media()
then read Request.Files (an HttpFileCollectionBase) inside the action; it contains the same file information posted from the PhoneGap API.