In my Yii2 project, I have image compression code. I want to compress images of any size down to about 200KB. I am using the Yii2 Imagine extension for compressing the image. My code is:
Image::thumbnail($uploadPath . '/' . $file->name, $newwidth, $newheight)
    ->save($uploadPath . '/' . $file->name, ['quality' => 100]);
$newwidth and $newheight are the original width and height of the uploaded image.
The compression works, but it compresses far too much. If I upload a 1MB picture, the output image comes out at around 30KB, which is much too small. What I need is to compress to roughly 200KB: whatever the input size, the output should be about 200KB.
Is there any way to do that? If there is an option in core PHP as well, please let me know.
Try this extension: https://github.com/yiisoft/yii2-imagine
I am using the following code:
Image::getImagine()->open($originFile)
    ->thumbnail(new Box(800, 800))
    ->save($filesPath . '/' . $newImageName . '.' . $originFile->file->extension, ['quality' => 100]);
You must include the following in your controller:
use yii\imagine\Image;
use Imagine\Gd;
use Imagine\Image\Box;
use Imagine\Image\BoxInterface;
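Note that neither snippet targets a specific output size: 'quality' => 100 is the only knob being touched, and quality is the main lever on JPEG file size. One approach is to re-save at decreasing quality until the file fits under the target. A minimal sketch, assuming a JPEG source (the function name, quality steps, and 200KB default are illustrative; for PNGs, GD's "quality" is a compression level and behaves differently):

use yii\imagine\Image;

// Re-save the thumbnail at decreasing JPEG quality until the result fits
// under $targetBytes, or the quality floor is reached. The output lands
// near the target, not at exactly 200KB.
function compressToTarget($srcPath, $dstPath, $width, $height, $targetBytes = 204800)
{
    for ($quality = 90; $quality >= 10; $quality -= 5) {
        Image::thumbnail($srcPath, $width, $height)
            ->save($dstPath, ['quality' => $quality]);
        clearstatcache(true, $dstPath);
        if (filesize($dstPath) <= $targetBytes) {
            break;
        }
    }
}

The same loop works in core PHP with GD alone, since imagejpeg($img, $path, $quality) takes the quality as its third argument; and a binary search over quality converges in fewer re-encodes if the linear loop proves too slow.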
I'm attempting to convert a multi-page TIFF file into a PDF using PIL, but for some reason the DPI is always 300 despite me setting it otherwise. If the TIFF file is only one page it works, but I can't seem to get it working for multiple pages. Any idea where I am going wrong?
from PIL import Image, ImageSequence

image = Image.open("test.tiff")
images = []
for i, page in enumerate(ImageSequence.Iterator(image)):
    page = page.convert("RGB")
    images.append(page)

if len(images) == 1:
    images[0].save("test.pdf", resolution=600.0, optimize=True, quality=100)
else:
    images[0].save("test.pdf", resolution=600.0, optimize=True, quality=100,
                   save_all=True, append_images=images[1:])
I'm trying to find out the dimensions (width and height) of an asset file but without loading it into memory, or with as little memory usage as possible.
I'm currently doing:
Image _mainMenuTable = Image.asset("images/main_menu_table.png");
but won't that create a widget in memory and also load the image into it, even if I don't display it anywhere?
I've also tried creating an AssetImage, which I'm already doing to set as the background image of a Container, but I can't retrieve the dimensions of the image from it.
Any ideas if there's a better way than Image.asset? This way I do get my width and height dynamically, but doesn't it load the image into memory twice, once as an Image and once as an AssetImage?
Disclaimer: it's my second week into Flutter, so please be gentle in case the answer is obvious.
Cheers!
You just need to use the File class from the dart:io package.
Example:

import 'dart:io';

void main() {
  var file = File('file_path');
  file.length().then((len) => print(len)); // prints the size in bytes
}
Is there any way, with code, to export a .png or .tga file to the local disk in Unity?
I need to write a converter that loads asset bundles and converts them back to the original source image files. I need to create those files in a way that anyone could open them with Photoshop, for instance. Any idea about how to do it?
Thanks.
David
using System.IO;
using UnityEngine;
using UnityEditor; // EditorUtility is editor-only

void Save(Texture2D texture)
{
    var bytes = texture.EncodeToPNG();
    // SaveFilePanel asks the user for a destination path.
    File.WriteAllBytes(EditorUtility.SaveFilePanel("Save PNG", Application.dataPath + "/../", "Font", "png"), bytes);
}
You can save an image in TGA format using the Encode To TGA plugin: http://u3d.as/rWt
I'm trying to grab a screenshot with renderer.domElement.toDataURL("image/png"), and save it to a file.
The image is the right size, but it's black.
I have preserveDrawingBuffer turned on.
I think I'm decoding and saving the file correctly, because when I hexdump it I can see the correct initial characters for the PNG format, as well as the IHDR and IDAT chunk headers. However the closing IEND is missing.
Any known issues here? Hints? Windows 7/Firefox up to date if it matters.
Thanks... (Sorry if this is dumb, I'm very new to three.js)
I had somewhat similar problems with Windows 7/Firefox. PNG data URLs would be randomly truncated, coming out much shorter than a successful PNG export, and trying to set that data URL as an image src resulted in an "Image corrupt" exception in FF. As little sense as it makes, setting a small window.setTimeout (10 ms) between rendering and getting the data URL helped in my case. Maybe Firefox needs a rest from the JS engine before it refreshes some internal canvas state. Weird.
I switched to JPG format (smaller files, so truncation is less of an issue?) and still saw it not working. Then I tried this tip, which I found here:
If you want to save data that is derived from a Javascript
canvas.toDataURL() function, you have to convert blanks into plusses.
If you do not do that, the decoded data is corrupted:
<?php
$encodedData = str_replace(' ', '+', $encodedData);
$decodedData = base64_decode($encodedData);
?>
This worked. Thanks, Mekal.
This tip seems to apply to JPGs only. I saw PNGs decoding correctly without the + replacement, and corruptly with it. I can use JPGs, so my personal problem is solved. However, I never saw a PNG that wasn't black, even when it decoded correctly and wasn't truncated.
Kind of a lousy situation either way, I feel. What is up with the +'s?
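As for the +'s: the usual culprit is that the data URL travels in a form-encoded POST body, where + is the encoding for a space. If the base64 text isn't percent-encoded on the client, every literal + arrives at the server as a space, and the str_replace above puts them back. A sketch of a server-side decode that also strips the data: prefix (the dataUrl field name is illustrative):

<?php
// $_POST['dataUrl'] holds e.g. "data:image/png;base64,iVBOR..."
$base64 = preg_replace('#^data:image/\w+;base64,#', '', $_POST['dataUrl']);
$base64 = str_replace(' ', '+', $base64); // undo the +-to-space mangling
file_put_contents('screenshot.png', base64_decode($base64));
?>

Percent-encoding the data URL with encodeURIComponent before POSTing it avoids the mangling in the first place.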
A black texture is a sign that you did not indicate the texture needs to be updated.
Also, you do not need to use canvas.toDataURL(). You can pass in the canvas reference to the THREE.Texture object.
var canvas = document.getElementById('myCanvas'); // no '#' with getElementById
var texture = new THREE.Texture(canvas);
texture.needsUpdate = true;
// Now render the scene
I'm not sure if memory is the culprit here. I am trying to instantiate a GD image from data in memory (it previously came from a database). I try a call like this:
my $image = GD::Image->new($image_data);
$image comes back as undef. The POD for GD says that the constructor will return undef for cases of insufficient memory, so that's why I suspect memory.
The image data is in PNG format. The same thing happens if I call newFromPngData.
This works for very small images, under 30K or so. However, slightly larger images, around 70K, trigger the problem. I wouldn't think that a 70K image should cause these problems, even after it is decompressed.
This script is running under CGI through Apache 2.0, on OS X 10.4, if that matters at all.
Are there any memory limitations imposed by Apache by default? Can they be increased?
Thanks for any insight!
EDIT: For clarification, the GD::Image object never gets created, so clearing out the $image_data from memory isn't really an option.
The GD library eats many bytes of memory per byte of image size; it's well over a 10:1 ratio! A decoded image needs on the order of width × height × 4 bytes of pixel data, no matter how small the compressed file is on disk.
When a user uploads an image to our system, we start by checking the file size before loading it into a GD image. If it's over a threshold (1 Megabyte) we don't use it but instead report an error to the user.
If we really cared we could dump it to disk, use the command line "convert" tool to rescale it to a sane size, then load the output into the GD library and remove the temporary file.
convert -define jpeg:size=800x800 tmpfile.jpg -thumbnail '800x800' -
This will scale the image so it fits within an 800x800 square; its longest edge is then 800px, which should load safely. The command sends the shrunk JPG to STDOUT. The -define jpeg:size= option should tell convert not to bother holding the huge image in memory, but only enough of it to scale down to 800x800.
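A minimal PHP sketch of that flow, assuming a JPEG upload ($tmpPath and the 1MB threshold are illustrative):

<?php
$maxBytes = 1048576; // 1 MB cap before we let GD near the file
if (filesize($tmpPath) > $maxBytes) {
    // Let ImageMagick downscale it; the shrunk JPEG arrives on STDOUT.
    $cmd = 'convert -define jpeg:size=800x800 '
         . escapeshellarg($tmpPath) . " -thumbnail '800x800' -";
    $data = shell_exec($cmd);
} else {
    $data = file_get_contents($tmpPath);
}
$img = imagecreatefromstring($data);
?>

The original question is Perl, but the same shape works there: check the size with -s $path, pipe through convert with backticks or open, and hand the result to GD::Image->new (or newFromPngData).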
I've run into the same problem a few times.
One of my solutions was simply to increase the amount of memory available to my scripts. The other was to free the source image once I was done with it:
Original Script:
$src_img = imagecreatefromstring($userfile2);
imagecopyresampled($dst_img, $src_img, 0, 0, 0, 0, $thumb_width, $thumb_height, $origw, $origh);
Edited Script:
$src_img = imagecreatefromstring($userfile2);
imagecopyresampled($dst_img, $src_img, 0, 0, 0, 0, $thumb_width, $thumb_height, $origw, $origh);
imagedestroy($src_img);
Destroying $src_img freed up enough memory to handle further processing. (The same goes for $dst_img once it has been written out.)
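For the first solution, raising the per-script memory ceiling is one line at the top of the script; the 128M value is just an example:

<?php
// Raise PHP's memory limit before doing any GD work.
ini_set('memory_limit', '128M');
?>

The same value can also be set globally in php.ini, or per directory with php_value memory_limit in an .htaccess file when PHP runs as an Apache module.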