How can I mask an Image into a Layer? - coffeescript

These are my first steps with FramerJS and I would like to know how to mask an Image into a different layer when I scale the image with an animation.
Thanks.

layer = new Layer width: 256, height: 256
layer.style =
    "-webkit-mask-image": "url(images/framer-icon.png)"
You can mask an image using "-webkit-mask-image". This property only works in WebKit-based browsers, but Framer also runs on WebKit.

I reckon the best way is to put the image inside a Framer layer and mask it with a parent Framer layer.

Related

How can I paint or fill an image mask in flutter?

Hello community, I am trying to create a mask with image (A).
I am using ShaderMask (https://www.youtube.com/watch?v=7sUL66pTQ7Q).
And up to here I think that everything is correct, but the complications come when I want to paint the mask.
I don't know how to continue, to be able to fill or paint the mask, as shown in the image (B).
I am using the red color to paint and fill the mask.
Any ideas? Thanks in advance

MonoGame Different Size RenderTargets and Scaling issues

I've got some different controls I have that I'm drawing to different size render targets.
If I draw a 64x64 textureA to an 800x600 render target like this:
Graphics.SetRenderTarget(Canvas);
_spriteBatch.Draw(textureA, new Rectangle(0, 0, 64, 64), srcRectangle, Color.White);
And then draw the 800x600 render target to a 1600x900 screen with a call like this:
Graphics.SetRenderTarget(null);
_spriteBatch.Draw(Canvas, new Rectangle(100, 100, 800, 600), Color.White);
Why is it drawing my textureA warped and small on the screen?
If I make my Canvas the same size as the backbuffer then it shows up fine, but I'm wondering why this thinks it should shrink my texture, and is there a way to turn this off?
I want to able to create render targets of arbitrary sizes, and then draw them to the screen at their original sizes. Is this possible?
When you draw to a RenderTarget, the back buffer (or preferred back buffer) is still used. If you are scaling down from the back buffer dimension then any texture that is drawn to the RenderTarget will also be scaled and modified by any changes in the aspect ratio.
If you want to keep your textures at the same aspect, you will need to track the scale and aspect ratio and apply them to the destination rectangle when you draw. I'd be happy to provide code snippets if you could explain a bit more about what you are trying to accomplish.
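The bookkeeping described above is just arithmetic, so here is a minimal language-agnostic sketch of it in Python. The helper names are illustrative, not MonoGame API: the idea is to compute the scale the canvas-to-screen draw will apply, then pre-scale the texture's destination rectangle to compensate.

```python
def scale_factors(canvas_size, dest_size):
    """Scale applied when a canvas is drawn into a destination rectangle."""
    cw, ch = canvas_size
    dw, dh = dest_size
    return dw / cw, dh / ch

def dest_rect_preserving_size(texture_rect, canvas_size, dest_size):
    """Pre-scale a texture's rectangle on the canvas so it ends up at its
    original on-screen size after the canvas itself is scaled down."""
    sx, sy = scale_factors(canvas_size, dest_size)
    x, y, w, h = texture_rect
    return (x / sx, y / sy, w / sx, h / sy)

# A 64x64 texture on an 800x600 canvas that is later drawn into a 400x300
# destination would be halved in each axis; drawing it at 128x128 on the
# canvas compensates, so it still appears 64x64 on screen.
rect = dest_rect_preserving_size((0, 0, 64, 64), (800, 600), (400, 300))
```

The same two factors also tell you when the aspect ratio changes: if sx and sy differ, the texture will look warped unless you correct the rectangle per axis as above.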

How to scale only specific parts of image in iPhone app?

I want to scale an image in my iPhone app, but not the entire image. I just want to scale specific parts, like the bottom part or the middle part. How do I do it?
Please help.
Thanks
It sounds like you want to do a form of 9-slice scaling or 3-slice scaling. Let's say you have the following image:
and you want to make it look like this:
(the diagonal end pieces do not stretch at all, the top and bottom pieces stretch horizontally, and the left and right pieces stretch vertically)
To do this, use -stretchableImageWithLeftCapWidth:topCapHeight: in iOS 4.x and earlier, or -resizableImageWithCapInsets: starting with iOS 5.
UIImage *myImage = [UIImage imageNamed:@"FancyButton"];
UIImage *myResizableImage = [myImage resizableImageWithCapInsets:UIEdgeInsetsMake(21.0, 13.0, 21.0, 13.0)];
[anImageView setImage:myResizableImage];
To help visualize the scaling, here is an image showing the above cap insets:
I'm not aware of any way to adjust the scale of just a part of a UIImage. I'd approach this slightly differently: create separate images from your primary image using CGImageCreateWithImageInRect, then scale the separate images at the different rates you require.
See:
Cropping a UIImage
CGImage Reference
Quartz 2D Programming Guide
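For either approach, the cap insets partition the image into nine regions (four fixed corners, four one-axis-stretch edges, one fully-stretching center). As a sketch of the geometry only, here is how those nine rectangles could be computed from the insets; this is plain Python, not iOS API, and the function name is hypothetical:

```python
def nine_slice_rects(img_w, img_h, top, left, bottom, right):
    """Partition an image of size img_w x img_h into the nine regions
    implied by cap insets. Returns a dict of (x, y, w, h) tuples:
    corners keep their size, edges stretch along one axis, and the
    center stretches along both."""
    xs = [0, left, img_w - right]            # column origins
    ws = [left, img_w - left - right, right]  # column widths
    ys = [0, top, img_h - bottom]            # row origins
    hs = [top, img_h - top - bottom, bottom]  # row heights
    names = [["top_left", "top", "top_right"],
             ["left", "center", "right"],
             ["bottom_left", "bottom", "bottom_right"]]
    return {names[r][c]: (xs[c], ys[r], ws[c], hs[r])
            for r in range(3) for c in range(3)}

# Using the 21/13 insets from the Objective-C example on a 100x60 image:
rects = nine_slice_rects(100, 60, top=21, left=13, bottom=21, right=13)
```

Each rectangle here is exactly what you would pass to CGImageCreateWithImageInRect if you took the crop-and-scale route from the second answer.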

Edge blending of image with alpha on iOS after masking color

I have an image that has been masked with a range of colors using
CGImageCreateWithMaskingColors()
Everything works fine, but the edge of the image (we are using a green screen for profile pictures) is pixelated. I've tried drawing the image in an image context and a bitmap context with the proper antialiasing, but the edges remain the same.
Any suggestions on how to smooth the sides of the profile after it has been masked?
You could try zooming the mask image larger by some multiple (say 2X, 3X or 4X) before creating the mask. Then a size-reduced copy of this larger mask might be better antialiased.
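The reason this suggestion works is that averaging a hard-edged high-resolution mask down to the target size turns the 0/1 boundary into fractional coverage values, which is exactly what antialiasing is. A minimal numeric illustration in plain Python (no imaging framework; a real app would do this with Core Graphics scaling):

```python
def downsample_box(mask, factor):
    """Average factor x factor blocks of a 2-D 0/1 mask, producing
    fractional (antialiased) coverage values at the smaller size.
    Assumes the mask dimensions are multiples of factor."""
    h, w = len(mask), len(mask[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [mask[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / (factor * factor))
        out.append(row)
    return out

# A hard diagonal edge in a 4x4 mask becomes soft coverage at 2x2:
hi_res = [[1, 1, 0, 0],
          [1, 1, 1, 0],
          [1, 1, 1, 1],
          [0, 1, 1, 1]]
lo_res = downsample_box(hi_res, 2)  # -> [[1.0, 0.25], [0.75, 1.0]]
```

Used as mask values, those fractional edge pixels blend the profile into the background instead of cutting it off at a jagged boundary.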

Apply a mask to an iPhone application

I found a tutorial that explains how to apply a mask to a UIImage, but I have a problem.
This is the link:
http://iosdevelopertips.com/cocoa/how-to-mask-an-image.html
If I apply a mask to an image taken by a picker with "picker.allowsEditing = YES", the mask is applied well and the background of the image is the same color as the application's background, which is good.
But if the option is "picker.allowsEditing = NO", the image's background becomes black when I apply the mask.
I now understand that the problem is the alpha channel.
If an image does not have an alpha channel, the mask cannot be used, and the image background will be colored black.
To add an alpha channel to an image, I found a useful guide here:
http://pastie.org/418627