Can't figure out the answer yet, but I have this code which is called for a view:
func gradient(fillView view: NSView, withGradientFromColors colors: Array<NSColor>) {
    let gradientLayer = CAGradientLayer()
    gradientLayer.frame = view.bounds
    let color1 = colors[0].cgColor
    let color2 = colors[1].cgColor
    gradientLayer.colors = [color1, color2]
    gradientLayer.locations = [0.0, 1.0]
    view.layer?.addSublayer(gradientLayer)
}
This code takes two NSColor values and creates a gradient background.
The code works! But if I try to show something else on this view, for example a label on top of it that needs to be shown, it's actually not shown after this code is executed. I believe it somehow ends up behind the layer I draw?
Is there a quick way to resolve this issue?
Change the way you add the sublayer, as below:
view.layer?.insertSublayer(gradientLayer, at: 0)
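For context, here is a minimal sketch of the whole function with that change applied (my assumption: the view is layer-backed, i.e. wantsLayer is true; otherwise view.layer is nil and the sublayer is never added):

func gradient(fillView view: NSView, withGradientFromColors colors: Array<NSColor>) {
    // Assumption: make sure the NSView actually has a backing layer.
    view.wantsLayer = true

    let gradientLayer = CAGradientLayer()
    gradientLayer.frame = view.bounds
    gradientLayer.colors = [colors[0].cgColor, colors[1].cgColor]
    gradientLayer.locations = [0.0, 1.0]

    // Index 0 keeps the gradient behind the backing layers of the view's
    // subviews, such as the label.
    view.layer?.insertSublayer(gradientLayer, at: 0)
}

On macOS, subviews of a layer-backed view get their own backing layers as siblings of your gradient layer, so a layer appended with addSublayer ends up on top of the label; inserting at index 0 keeps it underneath.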
I have the following extension file defined where I am setting a gradient for my UIView:
import Foundation
import UIKit
extension UIView {
    func setGradientBackground(colorOne: UIColor, colorTwo: UIColor) {
        let gradientLayer = CAGradientLayer()
        gradientLayer.frame = bounds
        gradientLayer.colors = [colorOne.blue, colorTwo.red]
        gradientLayer.locations = [0.0, 1.0]
        gradientLayer.startPoint = CGPoint(x: 1.0, y: 1.0)
        gradientLayer.endPoint = CGPoint(x: 0.0, y: 0.0)
        layer.insertSublayer(gradientLayer, at: 0)
    }
}
This links to a separate structs file, where I have defined four UIColors:
blue, red, black, white
My viewcontroller.swift file has the following code within viewDidLoad:
myView.setGradientBackground(colorOne: Colors.blue, colorTwo: Colors.red)
This seems to work fine.
What I am trying to achieve is this: when I tap the button wired to my IBAction, it changes the myView colors to black and white (the remaining two colors). So I tried this:
@IBAction func hitButton(_ sender: Any) {
    myView.setGradientBackground(colorOne: Colors.black, colorTwo: Colors.white)
}
But this causes a crash: "unrecognized selector sent to instance 0x7fb25060a120"
Several things:
The code you posted creates and adds a gradient layer to the view, and then forgets about it. If you call it more than once, you will wind up with more than one gradient layer. That's not what you want.
You really want to keep the gradient layer as a property of the view. To do that you should subclass the view rather than adding an extension (extensions can't add stored properties to classes.)
If you pass in colors to use for the gradient, it doesn't make sense to then use colorOne.blue and colorTwo.red as the colors. Just use the colors that are passed to you.
Core Animation layers like CAGradientLayer use CGColor, not UIColor. Taking those last two points together, this line:
gradientLayer.colors = [colorOne.blue, colorTwo.red]
Should read
gradientLayer.colors = [colorOne.cgColor, colorTwo.cgColor]
instead.
You don't say which line is crashing, but passing the wrong data type to gradientLayer.colors is a likely culprit.
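Putting those points together, a minimal sketch of the subclass approach could look like this (GradientView is an illustrative name, not your existing code):

class GradientView: UIView {
    // Keep a single gradient layer as a property so repeated calls
    // only update it instead of stacking new layers.
    private let gradientLayer = CAGradientLayer()

    override func awakeFromNib() {
        super.awakeFromNib()
        gradientLayer.locations = [0.0, 1.0]
        gradientLayer.startPoint = CGPoint(x: 1.0, y: 1.0)
        gradientLayer.endPoint = CGPoint(x: 0.0, y: 0.0)
        layer.insertSublayer(gradientLayer, at: 0)
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Keep the gradient sized to the view as it lays out.
        gradientLayer.frame = bounds
    }

    func setGradientBackground(colorOne: UIColor, colorTwo: UIColor) {
        // CAGradientLayer wants CGColors, not UIColors.
        gradientLayer.colors = [colorOne.cgColor, colorTwo.cgColor]
    }
}

With myView's class set to GradientView in the storyboard, the button action can call myView.setGradientBackground(colorOne: Colors.black, colorTwo: Colors.white) as often as needed without stacking extra layers.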
I'm building a drawing application. I'm drawing using CGMutablePath.
I want the user to be able to select a part of the drawn paths and then move that part, like this:
I thought a possible solution would be to mask a view to the drawn area and then take a screenshot of that view.
Here you can see the drawn area in which I want to take the screenshot:
To take the screenshot, I get the last path drawn, being the area the screenshot is to be taken in:
let shapeLayer = CAShapeLayer()
shapeLayer.path = last.closedPath // returns CGPath.closeSubpath()
shapeLayer.lineWidth = 10
I then create an overlayView that's the view I'm taking the screenshot in.
let overlayView = UIView(frame: view.bounds)
overlayView.backgroundColor = .black
overlayView.alpha = 0.4
view.addSubview(overlayView)
view.bringSubview(toFront: overlayView)
I'm then masking the view to the path:
overlayView.mask(withPath: UIBezierPath(cgPath: last.closedPath!))
The .mask(withPath:) method comes from here:
extension UIView {
    func mask(withPath path: UIBezierPath) {
        let maskLayer = CAShapeLayer()
        maskLayer.path = path.cgPath
        self.layer.mask = maskLayer
    }
}
Then, I take the screenshot in overlayView:
let image: UIImage = {
    UIGraphicsBeginImageContextWithOptions(overlayView.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    drawView.drawHierarchy(in: overlayView.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()!
}()
UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
What happens is that the overlayView has the screen's full size and the screenshot is also drawn at full size.
When debugging the view hierarchy, I can also see that the overlayView is still full-size, not masked to the path.
So, instead of getting only the part drawn around as screenshot, I get an image of the whole view / screen.
Question
How do I successfully mask the view to the drawn area so I can take a screenshot in that part of the screen only?
I think the overlayView.frame equals self.view.frame, which is why the image is being taken full screen.
Your issue may be solved as follows (although I may have misunderstood):
let shapeLayer = CAShapeLayer()
shapeLayer.path = last.closedPath // returns CGPath.closeSubpath()
shapeLayer.lineWidth = 10
let rect = shapeLayer.path?.boundingBoxOfPath ?? .zero
let overlayView = UIView(frame: rect)
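Building on that, one way to end up with an image of just that region is to snapshot the whole drawing view and crop to the path's bounding box. This is only a sketch under my reading of the question; last.closedPath and drawView are the names used above and are assumed to exist:

guard let cgPath = last.closedPath else { return }
let rect = cgPath.boundingBoxOfPath

// Render the full drawing view into an image.
let renderer = UIGraphicsImageRenderer(bounds: drawView.bounds)
let fullImage = renderer.image { _ in
    drawView.drawHierarchy(in: drawView.bounds, afterScreenUpdates: true)
}

// Crop to the bounding box; cropping(to:) works in pixels, so scale the rect.
let scale = fullImage.scale
let cropRect = CGRect(x: rect.origin.x * scale,
                      y: rect.origin.y * scale,
                      width: rect.width * scale,
                      height: rect.height * scale)
if let croppedCGImage = fullImage.cgImage?.cropping(to: cropRect) {
    let cropped = UIImage(cgImage: croppedCGImage, scale: scale, orientation: .up)
    UIImageWriteToSavedPhotosAlbum(cropped, nil, nil, nil)
}

This gives a rectangular crop of the bounding box; if you need the result clipped to the path's exact shape, you can still apply the mask(withPath:) extension from the question to whatever view displays the cropped image.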
This is my code:
@objc func drawForm() {
    i = Int(arc4random_uniform(UInt32(formNames.count)))
    var drawPath = actualFormNamesFromFormClass[i]
    shapeLayer.fillColor = UIColor.clear.cgColor
    shapeLayer.strokeColor = UIColor.black.cgColor
    shapeLayer.lineWidth = 6
    shapeLayer.frame = CGRect(x: -115, y: 280, width: 350, height: 350)
    var paths: [UIBezierPath] = drawPath()
    let shapeBounds = shapeLayer.bounds
    let mirror = CGAffineTransform(scaleX: 1, y: -1)
    let translate = CGAffineTransform(translationX: 0, y: shapeBounds.size.height)
    let concatenated = mirror.concatenating(translate)
    for path in paths {
        path.apply(concatenated)
    }
    guard let path = paths.first else {
        return
    }
    paths.dropFirst().forEach {
        path.append($0)
    }
    shapeLayer.transform = CATransform3DMakeScale(0.6, 0.6, 0)
    shapeLayer.path = path.cgPath
    self.view.layer.addSublayer(shapeLayer)
    strokeEndAnimation.duration = 30.0
    strokeEndAnimation.fromValue = 0.0
    strokeEndAnimation.toValue = 1.0
    shapeLayer.add(strokeEndAnimation, forKey: nil)
}
This code animates the drawing of the shapeLayer path. However, I can't find anything online about removing this layer, stopping this basic animation, or removing the CGPath that gets drawn... Any help would be greatly appreciated!
You said:
I can't find anything online about removing this layer ...
It is removeFromSuperlayer().
shapeLayer.removeFromSuperlayer()
You go on to say:
... and stopping this basic animation ...
It is removeAllAnimations().
shapeLayer.removeAllAnimations()
Note, this will immediately change the strokeEnd (or whatever property you were animating) back to its previous value. If you want to "freeze" it where you stopped it, you have to grab the presentation layer (which captures the layer's properties as they are mid-animation), save the appropriate property, and then update the property of the layer upon which you are stopping the animation:
if let strokeEnd = shapeLayer.presentation()?.strokeEnd {
    shapeLayer.removeAllAnimations()
    shapeLayer.strokeEnd = strokeEnd
}
Finally, you go on to say:
... or removing the cgPath that gets drawn.
Just set it to nil:
shapeLayer.path = nil
By the way, when you're browsing the documentation for CAShapeLayer and CABasicAnimation, don't forget to check out the documentation for their superclasses, namely CALayer and CAAnimation » CAPropertyAnimation, respectively. Bottom line: when digging around for documentation on the properties or methods of some particular class, you will often have to dig into the superclasses to find the relevant information.
Finally, the Core Animation Programming Guide is a good intro, and while its examples are in Objective-C, all of the concepts are applicable to Swift.
You can use an animation delegate (CAAnimationDelegate) to execute additional logic when an animation starts or ends. For example, you may want to remove a layer from its parent once a fade-out animation has completed.
The code below is taken from a class that implements CAAnimationDelegate on the layer. When you call the fadeOut function, it animates the opacity of that layer and, once the animation has completed, animationDidStop(_:finished:) removes it from its superlayer.
extension CALayer: CAAnimationDelegate {
    func fadeOut() {
        let fadeOutAnimation = CABasicAnimation()
        fadeOutAnimation.keyPath = "opacity"
        fadeOutAnimation.fromValue = 1
        fadeOutAnimation.toValue = 0
        fadeOutAnimation.duration = 0.25
        fadeOutAnimation.delegate = self
        self.add(fadeOutAnimation, forKey: "fade")
    }

    public func animationDidStop(_ anim: CAAnimation, finished flag: Bool) {
        self.removeFromSuperlayer()
    }
}
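With the extension in place, usage is a one-liner (shapeLayer here stands for whatever layer you want to fade out and remove):

shapeLayer.fadeOut()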
I have UICollectionViewCell with UIView called gradientView that is used as a background for labels on it.
This UIView needs to have a gradient: it is light and almost transparent at the top of the cell and becomes slightly darker toward the bottom of the cell (a simple linear gradient).
So I created a func called addGradient that takes a UIView and adds a gradient layer to it.
func addGradient(view : UIView){
view.backgroundColor = .clear
let color1 = somecolor1.cgColor
let color2 = somecolor2.cgColor
let gradient:CAGradientLayer = CAGradientLayer()
gradient.frame.size = view.frame.size
gradient.colors = [color1, color2]
view.layer.addSublayer(gradient)
}
Inside cellForItemAt I call addGradient(cell.gradientView) and the gradient is shown. But I have 2 problems:
1) each cell becomes darker and darker - it seems that the layers are added one over another, which I don't want
2) sometimes these gradients are slightly misplaced - I also think that is because I don't remove these layers properly
So how and where should I clear the sublayers? Or maybe my method of adding this gradient inside cellForItemAt is not right?
just a guess on number 1 without seeing more of your code: change frame to bounds
func addGradient(view : UIView){
view.backgroundColor = .clear
let color1 = somecolor1.cgColor
let color2 = somecolor2.cgColor
let gradient:CAGradientLayer = CAGradientLayer()
gradient.frame.size = view.bounds.size
gradient.colors = [color1, color2]
view.layer.addSublayer(gradient)
}
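For the stacking problem (1), one common approach (a sketch, not part of the answer above) is to keep a reference to the gradient layer in a custom cell class and remove the old layer before adding a new one; updating its frame in layoutSubviews also helps with the misplacement in (2). MyCell, gradientView, and the color parameters are placeholder names:

class MyCell: UICollectionViewCell {
    @IBOutlet weak var gradientView: UIView!

    // Remember the layer we added so a reused cell never stacks a second one.
    private var gradientLayer: CAGradientLayer?

    func addGradient(colors: [CGColor]) {
        gradientLayer?.removeFromSuperlayer()

        let gradient = CAGradientLayer()
        gradient.frame = gradientView.bounds
        gradient.colors = colors
        gradientView.layer.addSublayer(gradient)
        gradientLayer = gradient
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Resize the gradient once the cell has its final layout,
        // which avoids the misplaced gradients.
        gradientLayer?.frame = gradientView.bounds
    }
}

In cellForItemAt you would then call cell.addGradient(colors: [somecolor1.cgColor, somecolor2.cgColor]).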
Given an arbitrary UIView on iOS, is there a way using Core Graphics (CAGradientLayer comes to mind) to apply a "foreground-transparent" gradient to it?
I can't use a standard CAGradientLayer because the background is more complex than a UIColor. I also can't overlay a PNG because the background will change as my subview is scrolled along its parent vertical scrollview (see image).
I have a non-elegant fallback: have my UIView clip its subviews and move a pre-rendered gradient PNG of the background as the parent scroll view is scrolled.
This was an embarrassingly easy fix: apply a CAGradientLayer as my subview's mask.
CAGradientLayer *gradientLayer = [CAGradientLayer layer];
gradientLayer.frame = _fileTypeScrollView.bounds;
gradientLayer.colors = [NSArray arrayWithObjects:(id)[UIColor whiteColor].CGColor, (id)[UIColor clearColor].CGColor, nil];
gradientLayer.startPoint = CGPointMake(0.8f, 1.0f);
gradientLayer.endPoint = CGPointMake(1.0f, 1.0f);
_fileTypeScrollView.layer.mask = gradientLayer;
Thanks to Cocoanetics for pointing me in the right direction!
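For reference, the same mask written in Swift might look roughly like this (a sketch; fileTypeScrollView is the scroll view from the Objective-C snippet above):

let gradientLayer = CAGradientLayer()
gradientLayer.frame = fileTypeScrollView.bounds
gradientLayer.colors = [UIColor.white.cgColor, UIColor.clear.cgColor]
gradientLayer.startPoint = CGPoint(x: 0.8, y: 1.0)
gradientLayer.endPoint = CGPoint(x: 1.0, y: 1.0)
fileTypeScrollView.layer.mask = gradientLayer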
This is how I would do it.
Step 1 - Define a custom gradient view (Swift 4):
import UIKit
class GradientView: UIView {
    override open class var layerClass: AnyClass {
        return CAGradientLayer.classForCoder()
    }

    required init?(coder aDecoder: NSCoder) {
        super.init(coder: aDecoder)
        let gradientLayer = self.layer as! CAGradientLayer
        gradientLayer.colors = [
            UIColor.white.cgColor,
            UIColor(white: 1, alpha: 0).cgColor
        ]
        backgroundColor = UIColor.clear
    }
}
Step 2 - Drag and drop a UIView in your storyboard and set its custom class to GradientView
As an example, this is what the above gradient view looks like:
https://github.com/yzhong52/GradientViewDemo
I used the accepted (OP's) answer above and ran into the same issue noted in an upvoted comment - when the view scrolls, everything that started offscreen is now transparent, covered by the mask.
The solution was to add the gradient layer as the superview's mask, not the scroll view's mask. In my case, I'm using a text view, which is contained inside a view called contentView.
I added a third color and used locations instead of startPoint and endPoint, so that items below the text view are still visible.
let gradientLayer = CAGradientLayer()
gradientLayer.frame = self.contentView!.bounds
gradientLayer.colors = [UIColor.white.cgColor, UIColor.clear.cgColor, UIColor.white.cgColor]
// choose position for gradient, aligned to bottom of text view
let bottomOffset = (self.textView!.frame.size.height + self.textView!.frame.origin.y + 5)/self.contentView!.bounds.size.height
let topOffset = bottomOffset - 0.1
let bottomCoordinate = NSNumber(value: Double(bottomOffset))
let topCoordinate = NSNumber(value: Double(topOffset))
gradientLayer.locations = [topCoordinate, bottomCoordinate, bottomCoordinate]
self.contentView!.layer.mask = gradientLayer
Before, the text that started offscreen was permanently invisible. With my modifications, scrolling works as expected, and the "Close" button is not covered by the mask.
I just ran into the same issue and wound up writing my own class. It seems like serious overkill, but it was the only way I could find to do gradients with transparency. You can see my writeup and code example here
It basically comes down to a custom UIView that creates two images. One is a solid color, the other is a gradient that is used as an image mask. From there I applied the resulting image to the view's layer.contents.
I hope it helps,
Joe
I hate to say it, but I think you are in custom UIView land here. I would try to implement this in a custom UIView, overriding the drawRect routine.
With this, you could have that view placed on top of your actual scroll view and have your gradient view (if you will) pass on all touch events (i.e. relinquish first responder).
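If you do go that route, a rough sketch of such a view might look like this (the class name and colors are placeholders; draw(_:) paints a top-to-bottom alpha gradient, and isUserInteractionEnabled = false lets touches fall through to the scroll view underneath):

class FadeOverlayView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        backgroundColor = .clear
        // Let touches pass through to the scroll view below.
        isUserInteractionEnabled = false
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func draw(_ rect: CGRect) {
        guard let context = UIGraphicsGetCurrentContext() else { return }
        let colors = [UIColor.white.withAlphaComponent(0).cgColor,
                      UIColor.white.cgColor] as CFArray
        let locations: [CGFloat] = [0.0, 1.0]
        guard let gradient = CGGradient(colorsSpace: CGColorSpaceCreateDeviceRGB(),
                                        colors: colors,
                                        locations: locations) else { return }
        // Fade from fully transparent at the top to opaque at the bottom.
        context.drawLinearGradient(gradient,
                                   start: CGPoint(x: bounds.midX, y: bounds.minY),
                                   end: CGPoint(x: bounds.midX, y: bounds.maxY),
                                   options: [])
    }
}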