How can I change the brightness of an image with a slider using GPUImageBrightnessFilter?
I tried:
-(void)updateBrightness:(id)sender {
    GPUImageFilter *selectedFilter = nil;
    [selectedFilter removeAllTargets];
    selectedFilter = [[GPUImageFilter alloc] init];
    CGFloat midpoint = [(UISlider *)sender value];
    [(GPUImageBrightnessFilter *)settingsFilter setBrightness:midpoint];
    UIImage *filteredImage = [selectedFilter imageByFilteringImage:_image_view.image];
    fx_imageView.image = filteredImage;
}
There are several problems with the above code.
First, you're not actually running a brightness filter against your image, because you call -imageByFilteringImage: on selectedFilter, a generic GPUImageFilter that you allocated fresh. Your GPUImageBrightnessFilter in settingsFilter is never used.
Second, you don't want to allocate a new filter every time you update a parameter. Allocate your GPUImageBrightnessFilter once and simply update it as values change.
Third, you don't want to keep re-filtering UIImages. Going to and from UIImages is a slow process (and won't work properly when using -imageByFilteringImage: on the same filter, because of some caching I do). Instead, create a GPUImagePicture based on your original image, add a GPUImageBrightnessFilter to that as a target, and target your GPUImageBrightnessFilter at a GPUImageView. Use -processImage every time you update your brightness filter and your updates will be much, much faster. When you need to extract your final image, use -imageFromCurrentlyProcessedOutput.
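A minimal sketch of that pipeline, assuming stillImageSource, brightnessFilter, gpuImageView, and inputImage are ivars/outlets you set up elsewhere (all of these names are illustrative):

// Build the filter chain once, e.g. from -viewDidLoad.
- (void)setupFilterChain
{
    stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
    brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
    [stillImageSource addTarget:brightnessFilter];
    [brightnessFilter addTarget:gpuImageView]; // a GPUImageView in your nib
}

// On each slider change, only update the parameter and reprocess.
- (IBAction)brightnessChanged:(UISlider *)sender
{
    [brightnessFilter setBrightness:sender.value];
    [stillImageSource processImage];
}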
These may help you:
-(void)viewDidLoad
{
    [super viewDidLoad];
    [sliderChange setMinimumValue:-0.5];
    [sliderChange setMaximumValue:0.5];
    [sliderChange setValue:0.0];
    brightnessFilter = [[GPUImageBrightnessFilter alloc] init];
}
-(IBAction)upDateSliderValue:(id)sender
{
    GPUImagePicture *fx_image = [[GPUImagePicture alloc] initWithImage:originalImage];
    [brightnessFilter setBrightness:self.sliderChange.value];
    [fx_image addTarget:brightnessFilter];
    [fx_image processImage];
    UIImage *final_image = [brightnessFilter imageFromCurrentlyProcessedOutput];
    self.selectedImageView.image = final_image;
    [fx_image release]; // balance the alloc under manual reference counting
}
Related
I am working on a picture/video filter effect project. To create the effects I am using the GPUImage project. It works fine for pictures. Now I need to apply the same effect to videos too. I grab images from the video at 30 frames per second, so for a 1 min video I filter about 1800 images, and for each image I allocate GPUImagePicture and GPUImageSepiaFilter instances and release them manually. But these allocations are not released, and after processing about 20 sec of video the application crashes due to a memory warning.
Is it possible to allocate the GPUImagePicture and filter classes once and use them to filter all the images? If yes, how?
Any help would be greatly appreciated.
Here is what I am actually doing:
The getImage method is called by an NSTimer 30 times per second; it fetches an image from the app's Documents directory and passes it to the - (void)imageWithEffect:(UIImage *)image method for filtering.
-(void)getImage
{
    float currentPlayBacktime = slider.value;
    int frames = currentPlayBacktime * 30;
    NSString *str = [NSString stringWithFormat:@"image%i.jpg", frames + 1];
    NSString *fileName = [self.imgFolderPath stringByAppendingString:str];
    if ([fileManager fileExistsAtPath:fileName])
        [self imageWithEffect:[UIImage imageWithContentsOfFile:fileName]];
}
- (void)imageWithEffect:(UIImage *)image
{
    GPUImagePicture *gpuPicture = [[GPUImagePicture alloc] initWithImage:image];
    GPUImageSepiaFilter *filter = [[[GPUImageSepiaFilter alloc] init] autorelease];
    [gpuPicture addTarget:filter];
    [gpuPicture processImage];
    playerImgView.image = [filter imageFromCurrentlyProcessedOutputWithOrientation:0];
    [gpuPicture removeAllTargets];
    [gpuPicture release]; // filter is autoreleased; releasing it again would over-release
}
Why are you processing a movie as a series of still UIImages? Why are you allocating a new filter for each image? That's going to be incredibly slow.
Instead, if you need to process a movie file, use a GPUImageMovie input, and if you need access to the camera, use a GPUImageVideoCamera input. Both of these are tuned to provide fast video feeds through the filter pipeline. There's a lot of wasted processing cycles in converting still frames to and from UIImages.
Also, only set up a filter once, and reuse it as necessary. There's significant overhead in the creation of a GPUImageFilter (setting up an FBO, etc.), so you only want to create it once and then attach inputs and outputs as they change.
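A rough sketch of that movie pipeline (movieFile, sepiaFilter, movieURL, and filteredView are illustrative names; output/writer setup is omitted):

// Build the chain once; GPUImageMovie pushes frames through it with no
// UIImage round-trips.
movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];
sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[movieFile addTarget:sepiaFilter];
[sepiaFilter addTarget:filteredView]; // a GPUImageView for on-screen display
[movieFile startProcessing];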
I have saved my images in a database with type BLOB. Now I am extracting these images from the database using a SQL statement and saving each one in an NSData object; see the following code snippet:
NSData *img = [[NSData alloc] initWithBytes:sqlite3_column_blob(loadInfostmt, 4) length: sqlite3_column_bytes(loadInfostmt, 4)];
self.birdImage = [UIImage imageWithData:img];
After getting the image, I tried setting it as the image attribute of a UIImageView to load it into the image view. I used the following techniques, but nothing seems to work:
1.
self.imgPikr = [[UIImageView alloc] initWithImage:brd.birdImage];
[self.imgPikr setImage:brd.birdImage];
2.
self.imgPikr = [[UIImageView alloc] init];
self.imgPikr.image = brd.birdImage;
3.
self.imgPikr = [[UIImageView alloc] init];
[self.imgPikr setImage:brd.birdImage];
Here imgPikr is a UIImageView object and birdImage is an object of the UIImage class.
Every other piece of data is displayed correctly in the view except the image.
I don't get any error, warning, or runtime exception, but I still can't load the image.
Please tell me the solution. I have been working on this for a week but couldn't find anything that works.
Any answer or experiment is welcome.
Thanks in advance.
Option 1 is the best option, since -[UIImageView initWithImage:] will automatically resize the image view to the same size as the passed image (and it is also unnecessary to call setImage: if you use this initializer).
However, as @Robot K points out in the comments, you still need to add self.imgPikr as a subview of a view that is visible on the screen.
Additionally, if your imgPikr property is declared as (retain), then you have a memory leak.
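Putting that together, a minimal sketch under manual reference counting (assuming imgPikr is a retain property and self.view is on screen):

// Create the view from the image (it sizes itself to fit), add it to the
// visible hierarchy, and balance the alloc so the property doesn't leak.
UIImageView *imageView = [[UIImageView alloc] initWithImage:brd.birdImage];
self.imgPikr = imageView;
[self.view addSubview:self.imgPikr];
[imageView release]; // the property and the superview both retain it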
I'm developing an app for the iPhone and I found that the following code is causing memory allocations to grow.
-(UIImage *)createRecipeCardImage:(Process *)objectTBD atIndex:(int)indx
{
[objectTBD retain];
// bringing the image for the background
UIImage *rCard = [UIImage imageNamed:@"card_bg.png"];
CGRect frame = CGRectMake(00.0f, 80.0f, 330.0f, 330.0f);
// creating he UIImage view to contain the recipe's data
UIImageView *imageView = [[UIImageView alloc] initWithFrame:frame];
imageView.image = rCard;
[rCard release];
imageView.userInteractionEnabled = YES;
float titleLabelWidth = 150.0;
float leftGutter = 5.0;
float titleYPos = 25.0;
float space = 3.0;
float leftYPos = 0;
// locating Title label
float currentHeight = [self calculateHeightOfTextFromWidth:objectTBD.Title :titleFont :titleLabelWidth :UILineBreakModeWordWrap];
UILabel *cardTitle = [[UILabel alloc]initWithFrame:CGRectMake(leftGutter, titleYPos, titleLabelWidth, currentHeight)];
cardTitle.lineBreakMode = UILineBreakModeWordWrap;
cardTitle.numberOfLines = 0;
cardTitle.font = titleFont;
cardTitle.text = objectTBD.Title;
cardTitle.backgroundColor = [UIColor clearColor];
[imageView addSubview:cardTitle];
[cardTitle release];
leftYPos = titleYPos + currentHeight + space;
// locating brown line
UIView *brownLine = [[UIView alloc] initWithFrame:CGRectMake(5.0, leftYPos, 150.0, 2.0)];
brownLine.backgroundColor = [UIColor colorWithRed:0.647 green:0.341 blue:0.122 alpha:1.0];
[imageView addSubview:brownLine];
[brownLine release];
leftYPos = leftYPos + 2 + space + space + space;
// creating the imageView to place the image
UIImageView *processPhoto = [[UIImageView alloc] initWithFrame:CGRectMake(leftGutter, leftYPos, 150, 150)];
if((uniqueIndex == indx) && (uniqueImg.imageData != nil))
{
if([uniqueImg.rcpIden isEqualToString:objectTBD.iden])
{
objectTBD.imageData = [NSString stringWithFormat:@"%@", uniqueImg.imageData];
[recipesFound replaceObjectAtIndex:indx withObject:objectTBD];
NSData * imageData = [NSData dataFromBase64String:objectTBD.imageData];
UIImage *rcpImage = [[UIImage alloc] initWithData:imageData];
[imageData release];
processPhoto.image = rcpImage;
[rcpImage release];
}
}
else if(objectTBD.imageData != nil)
{
NSData * imageData = [NSData dataFromBase64String:objectTBD.imageData];
UIImage *rcpImage = [[UIImage alloc] initWithData:imageData];
processPhoto.image = rcpImage;
[rcpImage release];
[decodedBigImageDataPointers addObject:imageData];
}
else
{
UIImage * rcpImage = [UIImage imageNamed:@"default_recipe_img.png"];
processPhoto.image = rcpImage;
[rcpImage release];
}
NSlog(#" Process Photo Retain Count %i", [processPhoto retainCount]); // this prints a 1
[imageView addSubview:processPhoto];
NSlog(#" Process Photo Retain Count %i", [processPhoto retainCount]); // this prints a 2!!!!
//[processPhoto release]; // this line causes an error :(
// converting the UIImageView into a UIImage
UIGraphicsBeginImageContext(imageView.bounds.size);
[imageView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[objectTBD release];
for(UIView *eachSubview in imageView.subviews)
{
[eachSubview removeFromSuperview];
NSLog(#"each subview retainCount %i despues", [eachSubview retainCount]);
// here I find that the processPhoto view has a retain count of 2 (all other views have their retain count in 1)
}
return viewImage;
}
When I checked the object allocations in Instruments, I found "GeneralBlock-9216" growing.
Breaking down that row, I found that every time I call this code, one instance of:
2 0x5083800 00:18.534 ImageIO initImageJPEG
is being allocated. Checking the call stack, the following line is highlighted:
UIImage * objImage = [UIImage imageWithData:imageData];
Any help finding what the error is?
As TechZen said, the imageWithXXX: convenience methods cache the image internally for the lifetime of the program (even though you release the instances after use). I recommend the initWithXXX:/release pattern instead of imageWithXXX:.
Also, embed some debug logs in your source code to check how many times the method is called, and check the retain counts of the instances.
That is as far as I can explain; I hope you solve the problem.
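For illustration, a sketch of the difference under manual reference counting (path is hypothetical; processPhoto is the image view from the question's code):

// imageNamed:/imageWithData: may keep a cached copy alive for the app's
// lifetime; initWithContentsOfFile: gives you an instance you fully own.
UIImage *cached = [UIImage imageNamed:@"card_bg.png"]; // cached by UIKit
UIImage *owned = [[UIImage alloc] initWithContentsOfFile:path]; // not cached
processPhoto.image = owned; // the image view retains it
[owned release]; // this memory can actually be reclaimed later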
Does anyone have an answer for this? It's tearing me apart trying to figure out why this image information keeps lingering. I've tried every solution.
The situation:
Images get downloaded and stored to the device, then loaded with imageWithContentsOfFile: (or even initWithContentsOfFile:, which doesn't help either). When the view goes away, the images don't. They don't show up as leaks; they're just this initImageJPEG Malloc 9.00 KB that never goes away and keeps ramping up.
UPDATE: I believe I've figured this out: check to make sure everything is actually getting dealloc'd when you release the parents (and/or grandparents, etc.) of the images. If the parents don't get deallocated, they never let go of their child images, and whatever data was in those images sticks around. So check the retain counts of parent objects and make sure that everything goes away all the way up the chain whenever you release the view at the top.
A good way to check for this is to put NSLogs into custom classes' dealloc methods. If they never show up, that object isn't going away, even though the reference to it might, and it (and whatever its subviews and properties are) will never ever disappear. In the case of images, this means a pretty sizable allocation every time that object is generated and never deallocated. It might not show up in leaks, especially if the parent of the topmost object you're thinking you're releasing but actually aren't persists and doesn't itself ever deallocate.
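For example, a minimal check you can drop into any custom class (a sketch, under manual reference counting):

// If this log never appears, the instance is being retained somewhere,
// and everything it owns (including images) stays in memory.
- (void)dealloc
{
    NSLog(@"%@ dealloc", NSStringFromClass([self class]));
    [super dealloc];
}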
I hope this helps. It'll be useful to take some time to read through your code with a fine-toothed comb to make sure you're allocating and releasing things properly. (Search for "alloc]", start at the top of the file, and work your way down to make sure you're releasing and that the release isn't inside of some if() or something.)
Also, running "Build and Analyze" might lock up your machine for a bit, but its results can be really helpful.
Good luck!
I think you're seeing UIImage caching images. There used to be a method, something like initWithData:cache:, that let you turn caching off. Now I think the system always caches images automatically behind the scenes, even after you've deallocated the specific instances.
I don't think it's an error on your part. I think it's the system keeping data around in the OpenGL subsystem. Unless it causes a major leak, I don't think it is a problem.
UPDATED: Scroll down to see the question re-asked more clearly.
If I have the name of a particular UIImageView (IBOutlet) stored in a variable, how can I use it to change the image that is displayed? I tried this, but it does not work.
I'm still new to iPhone programming, so any help would be appreciated.
NSString *TmpImage = @"0.png";
NSString *Tst = @"si1_1_2";
TmpImage = @"1.png";
UIImage *sampleimage = [[UIImage imageNamed:TmpImage] retain];
((UIImageView *)(Tst)).image = sampleimage; // This is the line in question
[sampleimage release];
RESTATED:
I have a bunch of images on the screen: UIImageView *s1, *s2, *s3, and so on up to *s10.
Now suppose I want to set each one to display the same image.
Rather than doing
s1.image = sampleimage;
s2.image = sampleimage;
:
s10.image = sampleimage;
How could I write a for loop that goes from 1 to 10 and uses the loop variable as part of the line that updates the image?
Something like this.
for (i = 1; i <= 10; ++i)
    s(i).image = sample; // I know that does not work
The basic question is how to incorporate a variable into the statement that accesses the image. Don't get hung up on my example; the main question is how to use a variable as part of the access to some element/object.
Bottom line: if I can build the name of a UIImageView into an NSString object, how can I then use that NSString object to manipulate the UIImageView?
Thanks!
Ugh! Your line in question:
((UIImageView *) (Tst)).image = sampleimage;
is casting a string pointer as a UIImageView pointer. You're basically telling the compiler that your pointer to a string is actually a pointer to a UIImageView! It will compile (because the compiler happily accepts your assertion) but will of course crash at runtime.
You need to declare a variable of type UIImageView. This can then hold whichever view you want to set the image of. So your code could look like the following:
NSString *TmpImage = @"0.png";
UIImageView *myImageView;
if (someCondition == YES) {
    myImageView = si1_1_2; // Assuming this is the name of your UIImageView
} else {
    myImageView = si1_1_3; // etc.
}
UIImage *sampleImage = [UIImage imageNamed:TmpImage]; // no need to retain it
myImageView.image = sampleImage;
Hopefully this makes sense!
Edit: I should add, why are you trying to have multiple UIImageViews? Because a UIImageView's image can be changed at any time (and in fact can hold many), would it not be better to have merely one UIImageView and just change the image in it?
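For the restated loop question, the usual approach is to collect the outlets in an array once and iterate over that, rather than building variable names as strings at runtime. A sketch using the names from the question:

// Put the ten outlets in an array (an IBOutletCollection would also work),
// then update them all in one loop.
NSArray *imageViews = [NSArray arrayWithObjects:s1, s2, s3, s4, s5,
                                                s6, s7, s8, s9, s10, nil];
UIImage *sampleImage = [UIImage imageNamed:@"1.png"];
for (UIImageView *imageView in imageViews) {
    imageView.image = sampleImage;
}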
I have a sequence of images that needs to be displayed in a short time (a PNG sequence). There are 31 PNGs in total, each with a file size of about 45KB. I have already loaded them with the following code:
imgArray = [[NSMutableArray alloc] init];
for (int i = 0; i <= 30; i++) {
    NSString *filename = [NSString stringWithFormat:@"img_000%.2d.png", i];
    UIImage *temp = [UIImage imageNamed:filename];
    [imgArray addObject:temp];
    // no release here: imageNamed: returns an autoreleased image
}
I use the following code to display the images:
CGImageRef image = [(UIImage *)[imgArray objectAtIndex:imgFrame] CGImage];
imgLayer.contents = (id)image;
if (imgFrame < 29) {
    imgFrame++;
} else {
    imgFrame = 0;
    imgLayer.hidden = YES;
    [imgTimer invalidate];
}
where imgLayer is a CALayer and imgTimer is a repeating timer with a 0.03 s interval.
But I found that when I bring the images up, it is very laggy the first time. After the first appearance, subsequent appearances have no problem.
Is it related to preloading images? Or are my images too big in file size?
The reason for your lags is hard to tell without profiling data, but here is a trick that might help: join all your images into one large file. Try to make it rectangular (maybe 6x6 in your case, or 4x8). Then load this single file and crop each image out for display (i.e., create a display image the size of one tile and copy one tile after another from the big image into it).
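A sketch of that tiling idea (the sheet name, the 128-point tile size, and the 6x6 layout are assumptions):

// Load one big sheet, then crop each frame out with Core Graphics.
UIImage *sheet = [UIImage imageNamed:@"img_sheet.png"];
CGFloat tileW = 128.0f, tileH = 128.0f;
NSMutableArray *frames = [NSMutableArray array];
for (int i = 0; i <= 30; i++) {
    CGRect tileRect = CGRectMake((i % 6) * tileW, (i / 6) * tileH, tileW, tileH);
    CGImageRef tile = CGImageCreateWithImageInRect(sheet.CGImage, tileRect);
    [frames addObject:[UIImage imageWithCGImage:tile]];
    CGImageRelease(tile);
}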
Images are loaded lazily, when they are first used and displayed.
If you use the Time Profiler in Instruments, you will see the lag you're experiencing. If you then zoom in on that lag and look at what is causing it, you will usually see that copyImageBlockSetPNG is the function taking the time, right before inflate.
What you must find a way to do is create your images and force them to load before you need them. That's another story, apparently.
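One common way to force the load is to draw each image once into a throwaway offscreen context right after loading it, so the PNG is inflated before the animation starts. A sketch, using temp from the loading loop in the question:

// Drawing forces the lazy PNG decode to happen now rather than at first
// display.
UIGraphicsBeginImageContext(CGSizeMake(1.0f, 1.0f));
[temp drawAtPoint:CGPointZero];
UIGraphicsEndImageContext();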