
How can I use SwiftUI's MagnifyGesture to scale the pinched part of the image, not the center?


The main goal is to use the magnify gesture on the pinched part of the image. By default it works fine, but it always zooms in/out on the center of the given image. How can I make it zoom in on the exact part of the image being pinched? If I magnify the top-left corner of the image, that part of the image should stay in the same place while it zooms.

Where is the problem? When you pinch, the image scales incorrectly. The offset seems to be calculated correctly, but I don't know why it doesn't work.

Dragging works perfectly, while zooming does NOT.
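
For reference, the relation I am trying to reproduce in scaleGesture is the usual "scale about an anchor point" one. Here is a minimal sketch of that math, assuming the anchor and the offset are both measured from the view's center (since .scaleEffect(_:) scales around the .center anchor by default and .offset is applied after it):

// Sketch only: keep the image content under `anchor` fixed while the scale changes.
// A point p (measured from the view's center) is drawn at p * scale + offset, so
// solving "the same image point stays under the anchor" for the new offset gives:
func anchoredOffset(anchor: CGPoint,
                    oldOffset: CGSize, oldScale: CGFloat,
                    newScale: CGFloat) -> CGSize {
    let factor = newScale / oldScale
    return CGSize(width: anchor.x - (anchor.x - oldOffset.width) * factor,
                  height: anchor.y - (anchor.y - oldOffset.height) * factor)
}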

To run the code you just have to pass in any UIImage. That's all.
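
The GradientPoint type and the UIImage(size:gradientPoints:) initializer used below are custom helpers that I have not included here, so any plain UIImage works as a stand-in, for example a solid-color one (the snippet below is just an illustration of that):

// Stand-in image for testing; swap this in for the gradient image if needed.
let placeholderSize = CGSize(width: 2100, height: 1200)
let placeholderImage = UIGraphicsImageRenderer(size: placeholderSize).image { context in
    UIColor.systemTeal.setFill()
    context.fill(CGRect(origin: .zero, size: placeholderSize))
}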

Here is my code:

struct WrapperForCropView: View {
    let image = UIImage(
        size: CGSize(width: 2100, height: 1200),
        gradientPoints: [
            GradientPoint(location: 0, color: #colorLiteral(red: 0.7450980544, green: 0.1568627506, blue: 0.07450980693, alpha: 0.2530534771)),
            GradientPoint(location: 0.2, color: #colorLiteral(red: 0.9686274529, green: 0.78039217, blue: 0.3450980484, alpha: 0.5028884243)),
            GradientPoint(location: 0.4, color: #colorLiteral(red: 0.721568644, green: 0.8862745166, blue: 0.5921568871, alpha: 0.3388534331)),
            GradientPoint(location: 0.6, color: #colorLiteral(red: 0.2588235438, green: 0.7568627596, blue: 0.9686274529, alpha: 0.3458681778)),
            GradientPoint(location: 0.8, color: #colorLiteral(red: 0.2196078449, green: 0.007843137719, blue: 0.8549019694, alpha: 0.3851232394))
        ]
    )!
    var body: some View {
        NavigationView {
            ZStack {
                Color.red
                GeometryReader { proxy in
                    VStack {
                        DemoCroppingView(uiImage: image, bounds: proxy.size)
                    }
                    .frame(width: proxy.size.width, height: proxy.size.height)
                    .clipped()
                }
            }
        }
    }
}

@available(iOS 17.0, *)
struct DemoCroppingView: View {
    var uiImage: UIImage
    var bounds: CGSize
    @State private var offsetLimit: CGSize = .zero
    @State private var offset: CGSize = .zero
    @State private var lastOffset: CGSize = .zero
    @State private var scale: CGFloat = 1
    @State private var lastScale: CGFloat = 1
    @State private var imageViewSize: CGSize = .zero
    private let mask = CGSize(width: 300, height: 300)
    
    // Drag: clamp the offset so the mask never shows empty space. This works as expected.
    private var dragGesture: some Gesture {
        DragGesture()
            .onChanged { gesture in
                offsetLimit = getOffsetLimit()
                
                let width = min(
                    max(-offsetLimit.width, lastOffset.width + gesture.translation.width),
                    offsetLimit.width
                )
                let height = min(
                    max(-offsetLimit.height, lastOffset.height + gesture.translation.height),
                    offsetLimit.height
                )
                
                offset = CGSize(width: width, height: height)
            }
            .onEnded { _ in
                lastOffset = offset
            }
    }
    
    // Magnify: scale the image and adjust the offset so that the pinched point stays in place.
    // This is the part that does not behave as expected.
    private var scaleGesture: some Gesture {
        MagnifyGesture()
            .onChanged { gesture in
                let startX = gesture.startLocation.x
                let startY = gesture.startLocation.y
                let currentScale = lastScale * gesture.magnification
                scale = currentScale
                let offsetWidth = (startX - (startX - lastOffset.width) * scale)
                let offsetHeight = (startY - (startY - lastOffset.height) * scale)
                offset = CGSize(width: offsetWidth, height: offsetHeight)
            }
            .onEnded { _ in
                lastScale = scale
                lastOffset = offset
            }
    }
    
    var body: some View {
        ZStack(alignment: .center) {
            // Dimmed full image in the background.
            ZStack {
                Rectangle()
                    .fill(.black)
                Image(uiImage: uiImage)
                    .resizable()
                    .scaledToFill()
                    .scaleEffect(scale)
                    .offset(offset)
            }
            .overlay {
                Color.black.opacity(0.5)
            }
            // The same image, masked to the circular crop area.
            Image(uiImage: uiImage)
                .resizable()
                .scaledToFill()
                .scaleEffect(scale)
                .offset(offset)
                .mask(
                    Circle()
                        .frame(width: mask.width, height: mask.height)
                )
                .overlay {
                    Circle()
                        .stroke(Color.red, lineWidth: 1)
                        .frame(width: mask.width, height: mask.height)
                }
        }
        .simultaneousGesture(dragGesture)
        .simultaneousGesture(scaleGesture)
        .clipped()
        .onChange(of: bounds) { _, _ in
            calculateImageViewSize()
        }
        .onChange(of: uiImage) { _, _ in
            calculateImageViewSize()
            lastOffset = .zero
            offset = .zero
            offsetLimit = .zero
        }
    }
    
    // The displayed size of the image when it is scaled to fill `bounds`.
    private func calculateImageViewSize() {
        let viewRatio = bounds.width / bounds.height
        let width = uiImage.size.width
        let height = uiImage.size.height
        let imageRatio = width / height
        let factor = viewRatio < imageRatio ? bounds.height / height : bounds.width / width
        imageViewSize.height = height * factor
        imageViewSize.width = width * factor
    }
    
    // Half of how much the scaled image overhangs the mask in each direction.
    private func getOffsetLimit() -> CGSize {
        var offsetLimit: CGSize = .zero
        offsetLimit.width = ((imageViewSize.width * scale) - mask.width) / 2
        offsetLimit.height = ((imageViewSize.height * scale) - mask.height) / 2
        return offsetLimit
    }
}

So, how can I use MagnifyGesture to scale around the pinched part of the image instead of the center?

And here is the full example picture:

[example screenshot]

What's the goal?

To determine the area that will be cropped later. You can just move and scale the image, with the following limitations:

  • 🏆 the image, after scaling, must not be smaller than the size of the mask (see the sketch after this list)
  • 🏆 the offset, after scaling, must not leave any empty space inside the mask
  • dragging already works perfectly and likewise cannot leave any empty space inside the mask.
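
This is roughly how I imagine the first two limitations being enforced after a zoom, reusing the properties and getOffsetLimit() from DemoCroppingView above (just a sketch of the idea, not code from the sample):

// Sketch: clamp scale and offset after a zoom so the mask stays fully covered.
private func clampScaleAndOffset() {
    // The scaled image must never become smaller than the mask.
    let minScale = max(mask.width / imageViewSize.width,
                       mask.height / imageViewSize.height)
    scale = max(scale, minScale)

    // The offset must not expose empty space inside the mask.
    let limit = getOffsetLimit()
    offset.width = min(max(offset.width, -limit.width), limit.width)
    offset.height = min(max(offset.height, -limit.height), limit.height)
}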

🏆 Tomorrow I will open a bounty worth 1000 on this question. 🏆
