SwiftUI: Is it possible with PKCanvasView? [closed] - swift

I have created a drawing app. The canvas and tools work correctly, and I have also added the ability to insert some predefined shapes into the canvas.
Where I am running into difficulty is with the PKLassoTool(): I am unable to resize or rotate anything, only move the selected object around.
Is this by design, or is there a way to extend the tool to support resizing and rotating?
I haven't been able to find anything that would let me accomplish resizing and rotating with the PKLassoTool().

class PKLassoTool: PKTool {
    var scale: CGFloat = 1.0      // scaling factor of the selected object
    var rotation: CGFloat = 0.0   // rotation angle of the selected object

    func setScale(scale: CGFloat) {
        self.scale = scale
    }

    func setRotation(rotation: CGFloat) {
        self.rotation = rotation
    }
}
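For what it's worth, PKLassoTool exposes no public API for resizing or rotating a selection, and PencilKit does not reveal which strokes the lasso has selected, so a subclass like the one above has nothing to hook into. One workaround is to apply a CGAffineTransform to strokes yourself through PKDrawing (iOS 14+). A minimal sketch, which transforms the whole drawing about its center since the lasso selection isn't readable:

import PencilKit

// Sketch: scale and rotate every stroke in the canvas about the
// drawing's center. `canvasView` is assumed to be your PKCanvasView.
func apply(scale: CGFloat, rotation: CGFloat, to canvasView: PKCanvasView) {
    let bounds = canvasView.drawing.bounds
    let center = CGPoint(x: bounds.midX, y: bounds.midY)
    // Translate to the center, rotate and scale, then translate back.
    var transform = CGAffineTransform(translationX: center.x, y: center.y)
    transform = transform.rotated(by: rotation)
    transform = transform.scaledBy(x: scale, y: scale)
    transform = transform.translatedBy(x: -center.x, y: -center.y)
    canvasView.drawing = canvasView.drawing.transformed(using: transform)
}

To limit the effect to particular strokes, you would need to track your own selection (for example, by hit-testing stroke bounds) and rebuild the drawing with only those strokes transformed.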

Related

How to find the CGColor at a CGPoint in a CGImage [closed]

I am trying to find the CGColor at a CGPoint in a CGImage. There are multiple Stack Overflow posts that cover something similar (using UIImage instead of CGImage, or Objective-C instead of Swift), but nothing specific enough to help me.
Does anyone know how to get the color at a point? Thanks.
I'm assuming, based on your macOS tag, that you want to do this on macOS and thus can use NSImage and NSBitmapImageRep:
let cgImage: CGImage = /* your CGImage */
let bitmap = NSBitmapImageRep(cgImage: cgImage)
if let colorAtPoint = bitmap.colorAt(x: 0, y: 0) {
    let cgColor = colorAtPoint.cgColor
    print(cgColor)
}
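If you need the same lookup without AppKit (for example on iOS), a common alternative is to draw the image into a 1×1 CGContext and read the bytes back. A minimal sketch, assuming a device-RGB color space; note that the components come back premultiplied by alpha:

import CoreGraphics

// Sketch: render pixel (x, y) of the image into a 1x1 RGBA8 context
// and build a CGColor from the resulting bytes.
func color(in image: CGImage, atX x: Int, y: Int) -> CGColor? {
    guard x >= 0, x < image.width, y >= 0, y < image.height else { return nil }
    var pixel = [UInt8](repeating: 0, count: 4)
    let drawn = pixel.withUnsafeMutableBytes { buffer -> Bool in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: 1, height: 1,
                                      bitsPerComponent: 8, bytesPerRow: 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return false }
        // Offset the image so pixel (x, y) lands on the 1x1 context;
        // CGContext's origin is bottom-left, hence the y flip.
        context.draw(image, in: CGRect(x: -x, y: y - image.height + 1,
                                       width: image.width, height: image.height))
        return true
    }
    guard drawn else { return nil }
    return CGColor(red: CGFloat(pixel[0]) / 255,
                   green: CGFloat(pixel[1]) / 255,
                   blue: CGFloat(pixel[2]) / 255,
                   alpha: CGFloat(pixel[3]) / 255)
}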

Add a Snapchat-like effect to a live video based on tracking the user's face in real time in Swift

I want to add a Snapchat-like effect to a live video based on tracking the user's face in real time. My design would place streams of particles coming from the eyebrows, eyes, or lips. I already have a flexible effects library that can place the desired streams at any chosen points on the screen and update them in real time.
Apple provides a Swift demo project, which I downloaded from this link:
https://developer.apple.com/documentation/vision/tracking_the_user_s_face_in_real_time
If you download and run that project without any changes, it shows an overlay of face landmarks, such as the left and right eyebrows, eyes, nose, and lips, that tracks a person's face in real time.
There wasn't much documentation on the coordinate system, layer drawing, etc. to enable me to extract CGPoint values corresponding to face landmarks, such as points on the left eyebrow.
I made some progress analyzing the drawing code in the Apple demo, but have had only limited success in getting the desired coordinates.
The left eyebrow appears to consist of an array of 6 points on a path connected by lines. I would just like to get a CGPoint that indicates the current location of one of the points on the left eyebrow.
Apple provides a routine called addPoints, which is called for both open and closed landmarks, once for each face landmark. Since the eyebrow is not a closed path, it is of the type openLandmarkRegions. The mouth and eyes correspond to a slightly different type, closedLandmarkRegions, since they are closed paths whose start and end points are the same.
fileprivate func addPoints(in landmarkRegion: VNFaceLandmarkRegion2D, to path: CGMutablePath, applying affineTransform: CGAffineTransform, closingWhenComplete closePath: Bool)
It really doesn't matter whether the path is open or closed; all I care about is getting a valid CGPoint on any of the landmarks. Eventually I will have effects for the eyes and mouth as well, as soon as I figure out how to get a valid CGPoint for just one of the face landmarks.
This is what I tried: I declared some global variables and added some logic inside Apple's drawing code to try to pick out CGPoints on the left eyebrow.
var sampleLeftEyebrowPoint = false
var mostRecentLeftEyebrowPoint = CGPoint()
Since addPoints is called in for loops over all the landmarks, I had to pick out the iteration that corresponds to the left eyebrow.
In addPoints, Apple has this line of code where they use the points of any given landmark:
let points: [CGPoint] = landmarkRegion.normalizedPoints
I added this code snippet just after that line of code:
if sampleLeftEyebrowPoint {
    mostRecentLeftEyebrowPoint = points[1]
    mostRecentLeftEyebrowPoint = mostRecentLeftEyebrowPoint.applying(affineTransform)
    sampleLeftEyebrowPoint = false
}
Note that points[1] is the second point on the eyebrow, which is one of the middle points, and that I apply the same affine transform to this single point that Apple applies in their logic.
I set sampleLeftEyebrowPoint to true in this Apple routine, using some logic that determines whether the left eyebrow is currently being looped over:
fileprivate func addIndicators(to faceRectanglePath: CGMutablePath, faceLandmarksPath: CGMutablePath, for faceObservation: VNFaceObservation)
In that routine, Apple has a for loop over the open landmarks, shown below. I added some logic to set sampleLeftEyebrowPoint so that the code in addPoints can recognize that the left eyebrow is currently being processed.
for openLandmarkRegion in openLandmarkRegions where openLandmarkRegion != nil {
    if openLandmarkRegion == landmarks.leftEyebrow {
        sampleLeftEyebrowPoint = true
    }
    // ... Apple's existing drawing code ...
}
The mostRecentLeftEyebrowPoint I obtain seems to correlate somewhat with my desired CGPoint, but not fully: the X coordinate seems to track but needs some scaling, while the Y coordinate seems inverted, with maybe something else going on.
Can anyone provide a routine that will get me the desired CGPoint corresponding to mostRecentLeftEyebrowPoint?
Once I have that, I have already figured out how to hide the face landmarks so that only my effect is visible, tracking the left eyebrow in real time. To hide the face detection lines, just comment out Apple's call to:
// self.updateLayerGeometry()
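In case it helps, Vision ships a helper, VNImagePointForFaceLandmarkPoint, that maps a landmark point (normalized to the face's bounding box) into image pixel coordinates, so you don't have to reverse-engineer the demo's path-drawing transform. A sketch of such a routine; captureDeviceResolution is the capture resolution the demo stores, viewSize is the size of the view your effects draw in (both are assumptions about your setup), and the front-camera mirroring applied by the preview layer is not handled here:

import Vision
import simd
import UIKit

// Sketch: convert the n-th left-eyebrow landmark into UIKit view coordinates.
func leftEyebrowPoint(index: Int,
                      faceObservation: VNFaceObservation,
                      captureDeviceResolution: CGSize,
                      viewSize: CGSize) -> CGPoint? {
    guard let eyebrow = faceObservation.landmarks?.leftEyebrow,
          index < eyebrow.pointCount else { return nil }
    let normalized = eyebrow.normalizedPoints[index]
    // Landmark points are normalized to the face bounding box; this Vision
    // helper maps them into image (pixel) coordinates.
    let imagePoint = VNImagePointForFaceLandmarkPoint(
        vector_float2(Float(normalized.x), Float(normalized.y)),
        faceObservation.boundingBox,
        Int(captureDeviceResolution.width),
        Int(captureDeviceResolution.height))
    // Vision's origin is bottom-left; UIKit's is top-left. Flip Y and
    // scale from the capture resolution to the destination view's size.
    return CGPoint(
        x: imagePoint.x * viewSize.width / captureDeviceResolution.width,
        y: (captureDeviceResolution.height - imagePoint.y)
            * viewSize.height / captureDeviceResolution.height)
}

The inverted Y you observed matches Vision's bottom-left origin, and the scaling you suspected on X corresponds to the capture-resolution-to-view-size factor above.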

Object Sticks To The Floor When It Moves [closed]

When the object moves up and down, it sometimes gets stuck on the floor. What could be the reason for this? I am using a Rigidbody2D and a box collider, and my movement code runs in FixedUpdate().
When I start the game, this sometimes happens and sometimes doesn't. I would be very happy if you could help; thanks in advance.
To explain in more detail: I move obstacles up and down from code, so that each obstacle reverses direction whenever it touches the floor or ceiling. But sometimes an obstacle sticks to the ceiling or floor when it should change direction.
I'd just make a fixed turnaround point for each game object to change direction at, instead of checking for collisions with the rigidbody.
void FixedUpdate() {
    // Reverse direction at the upper and lower turnaround points.
    if (obstacle.transform.position.y + obstacle.transform.localScale.y > upperPointOfReturn)
    {
        obstacle.ChangeDirection();
    }
    if (obstacle.transform.position.y - obstacle.transform.localScale.y < lowerPointOfReturn)
    {
        obstacle.ChangeDirection();
    }
    // Move along the y axis, keeping x unchanged.
    obstacle.transform.position = new Vector2(
        obstacle.transform.position.x,
        obstacle.transform.position.y + velocity * Time.deltaTime * direction
    );
}

void ChangeDirection() {
    direction *= -1;
}
Alternatively, create a Physics Material 2D with 0 friction and 0 bounciness and attach it to the player's collider.

Object not rotating properly in Unity? [closed]

I can't really explain what's going on, so here is the video:
I can't explain this
Something went wrong with the rotation axis? I don't know what causes it or how to fix it. It's as if someone messed with the axes, and I don't even know how that could happen. Please help me!
Don't worry: this happens because you modified values in the Transform. The localPosition and localRotation you changed are relative coordinate values; when they change, they are interpreted relative to the parent node.
The Position and Rotation you see on objects in the Scene are world-coordinate values, i.e. the relative coordinates combined with the parent node's world-coordinate values.
So all you need to do is reset the Transform of the GameObject's parent node to its initial values; then your modifications of the relative coordinates will be consistent with the world coordinates.

How do I limit an object's movement in Xcode for iPhone?

This is my first ever question, so apologies if it's not descriptive enough.
I currently have an object that can be dragged along the x axis. I need to limit its movement to half of the screen.
Can anyone help?
Well, that is simple. Assuming we are talking about moving a UIView:
// Only move the object if its right edge stays
// within the left half of the containing view
if someObjectView.frame.maxX < containingView.bounds.width / 2.0 {
    // Move the object
}
You fill in the blanks :-)
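If it helps, here's a fuller sketch of the same idea, assuming the drag is driven by a UIPanGestureRecognizer attached to the draggable view (the clamping logic, not the gesture setup, is the point):

import UIKit

// Sketch: clamp a dragged view to the left half of its container.
@objc func handlePan(_ gesture: UIPanGestureRecognizer) {
    guard let view = gesture.view, let container = view.superview else { return }
    let translation = gesture.translation(in: container)
    // The furthest the origin can go while the view's right edge
    // stays within the left half of the container.
    let maxOriginX = container.bounds.width / 2.0 - view.frame.width
    view.frame.origin.x = min(max(view.frame.origin.x + translation.x, 0),
                              maxOriginX)
    gesture.setTranslation(.zero, in: container)
}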