Posts

Post not yet marked as solved
0 Replies
10 Views
I'm having an issue where, when my asset catalog has more than 2 images (all in PNG format with @1x, @2x, and @3x variants), the NSImage in my NSImageView cannot be clicked. Does anyone know why this happens? Thanks in advance!

import SwiftUI

struct ContentView: View {
    @State private var window: NSWindow?

    var body: some View {
        VStack {
            Button("Open Window") {
                // Create and show the NSWindow
                self.window = NSWindow(
                    contentRect: NSScreen.main?.frame ?? NSRect.zero,
                    styleMask: [.borderless],
                    backing: .buffered,
                    defer: false
                )

                // Set up window properties
                self.window?.isOpaque = false
                self.window?.hasShadow = false
                self.window?.backgroundColor = .clear
                self.window?.level = .screenSaver
                self.window?.collectionBehavior = [.canJoinAllSpaces]
                self.window?.makeKeyAndOrderFront(nil)

                // Create an NSImageView
                let petView = PetView()

                // Add the NSImageView to the window's content view
                if let contentView = self.window?.contentView {
                    contentView.addSubview(petView)
                    // Center the petView
                    petView.centerXAnchor.constraint(equalTo: contentView.centerXAnchor).isActive = true
                    petView.centerYAnchor.constraint(equalTo: contentView.centerYAnchor).isActive = true
                }
            }
        }
    }
}

class PetView: NSImageView {
    override init(frame frameRect: NSRect = .zero) {
        super.init(frame: frameRect)
        self.image = NSImage(named: "dog_idle-1")
        self.translatesAutoresizingMaskIntoConstraints = false
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
    }

    override func mouseDown(with event: NSEvent) {
        print("woof!")
    }
}

I've tried changing the number of images in my asset catalog and found that 2 is the maximum for my NSImage to remain clickable. It's supposed to print "woof!" when I click on it.
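One thing worth checking, though it does not explain the asset-catalog correlation: borderless windows refuse to become key by default, and a view only receives the activating click if it accepts "first mouse". A hedged sketch of both overrides, assuming the window is created from the ClickableWindow subclass below:

import AppKit

// Borderless windows return false from canBecomeKey by default,
// which can keep clicks from reaching their content.
class ClickableWindow: NSWindow {
    override var canBecomeKey: Bool { true }
}

class PetView: NSImageView {
    // Accept the click that activates the window instead of
    // requiring a second click.
    override func acceptsFirstMouse(for event: NSEvent?) -> Bool { true }

    override func mouseDown(with event: NSEvent) {
        print("woof!")
    }
}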
Posted
by
Post not yet marked as solved
0 Replies
8 Views
I am working on an application, ABC, in which the user can save a configuration and later reopen it from the same file. But when I save this configuration file, it does not show the icon. The application is built using Qt/C++. Here is my plist. What is the possible cause? I tried directly providing the path of the icon in the plist, and I tried changing CFBundleSignature from ???? to ABCC.
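For a custom document type's icon to appear on saved files, the type usually has to be declared under CFBundleDocumentTypes, with the .icns file copied into the bundle's Resources. A hedged sketch; the extension abcconf and icon name config.icns are placeholders, not values from the post:

<key>CFBundleDocumentTypes</key>
<array>
    <dict>
        <key>CFBundleTypeName</key>
        <string>ABC Configuration</string>
        <key>CFBundleTypeExtensions</key>
        <array>
            <string>abcconf</string>
        </array>
        <key>CFBundleTypeIconFile</key>
        <string>config.icns</string>
        <key>CFBundleTypeRole</key>
        <string>Editor</string>
    </dict>
</array>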
Posted
by
Post not yet marked as solved
0 Replies
16 Views
I made a virtual machine running macOS on my M2 MacBook Pro with the help of the tutorial on the Apple developer website. I am wondering how I can integrate my mouse into the virtual machine, instead of it sitting on top of the app.
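If the VM was built from the Apple sample, pointer input is likely the USB screen-coordinate pointing device, which behaves like a cursor floating over the window. On macOS 13 and later (host and guest), the Virtualization framework also offers a Mac trackpad device that integrates the pointer natively. A sketch of the configuration change, assuming you are building the VZVirtualMachineConfiguration yourself:

import Virtualization

// When building the VZVirtualMachineConfiguration for a macOS guest:
let configuration = VZVirtualMachineConfiguration()

if #available(macOS 13.0, *) {
    // Native pointer integration; the guest must also run macOS 13+.
    configuration.pointingDevices = [VZMacTrackpadConfiguration()]
} else {
    // Fallback: the cursor-on-top behavior described in the post.
    configuration.pointingDevices = [VZUSBScreenCoordinatePointingDeviceConfiguration()]
}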
Posted
by
Post not yet marked as solved
0 Replies
19 Views
Hello, I am facing a passkey authentication issue during cross-device authentication. After I call completeAssertionRequest with the passkeyCredential, I encounter the attached error, and I am unable to find where the RPID mismatch is coming from. It would be great to know the source of the RPID hash (found). Thanks.

Returned credential failed validation: Error Domain=com.apple.AuthenticationServicesCore.AuthorizationError Code=14 "RPID hash did not match expected value. Expected xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY=, found: eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=." UserInfo={NSLocalizedFailureReason=RPID hash did not match expected value. Expected xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY=, found: eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=.}
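One observation that may locate the mismatch: the "found" value decodes to text, not to hash bytes. Running the quick check below in a playground prints the base64url spelling of the expected hash, cut off at 32 bytes, which suggests the client side is supplying the base64url string itself where raw RP ID hash bytes are expected:

import Foundation

// Decode the "found" value from the error message.
let found = Data(base64Encoded: "eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=")!
print(String(data: found, encoding: .utf8)!)
// Prints: xMTokW1VIYg2DZVB9lCtheT-0n8NxHvx
// That is the expected hash "xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY="
// re-spelled in base64url and truncated to 32 bytes, i.e. a string
// was hashed/passed where the 32 raw bytes should have been.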
Posted
by
Post not yet marked as solved
0 Replies
32 Views
Hi, I have an idea for an application that could revolutionize and simplify the use of video devices (phones, monitors, TVs) for at least 75-80% of the entire world population. I would like it to be implemented initially on Apple iOS/macOS and later on all platforms. I would like to create, develop, or sell the idea, but I really don't know where to start. Can you give me some info on this? I know it may seem like a strange request, but I'm sure this app would be a real revolution for everyone, or almost everyone. Thank you, Giovanni from Italy [Edited by Moderator]
Posted
by
Post not yet marked as solved
0 Replies
16 Views
I am using PushKit + CallKit and I am facing an issue: I receive an incoming call while the device is in the locked-screen state, then I answer and end the call. The next incoming call always displays in the corner of the screen, as in the image below. If I receive an incoming call while the device is unlocked, my application receives, answers, and ends the call normally.
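Hard to diagnose without code, but this symptom is often reported when the incoming call is not handed to CallKit immediately inside the PushKit delegate. A hedged sketch of the expected flow; the CallManager class and the generic "Caller" handle are placeholders, and the CXProvider configuration is assumed to live elsewhere in the app:

import PushKit
import CallKit

final class CallManager: NSObject, PKPushRegistryDelegate {
    // Assumed CXProvider; configure and set its delegate elsewhere.
    let provider = CXProvider(configuration: CXProviderConfiguration())

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "Caller")

        // Report to CallKit right away, before calling completion;
        // deferring this is a common cause of broken incoming-call UI.
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {}
}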
Posted
by
Post not yet marked as solved
0 Replies
20 Views
Hello, I am trying to detect the orientation of text in images. (Each image has a label with a number, but sometimes the label is not in the right orientation, and I would like to detect these cases and add a prefix to the image files.) This code works well, but when the text is upside down it considers the text well oriented. Is there a way to distinguish the difference? Thanks for your help!

import SwiftUI
import Vision

struct ContentView: View {
    @State private var totalImages = 0
    @State private var processedImages = 0
    @State private var rotatedImages = 0
    @State private var remainingImages = 0

    var body: some View {
        VStack {
            Button(action: chooseDirectory) {
                Text("Choisir le répertoire des images")
                    .padding()
            }
            Text("TOTAL: \(totalImages)")
            Text("TRAITEES: \(processedImages)")
            Text("ROTATION: \(rotatedImages)")
            Text("RESTANT: \(remainingImages)")
        }
        .padding()
    }

    func chooseDirectory() {
        let openPanel = NSOpenPanel()
        openPanel.canChooseDirectories = true
        openPanel.canChooseFiles = false
        openPanel.allowsMultipleSelection = false
        openPanel.begin { response in
            if response == .OK, let url = openPanel.url {
                processImages(in: url)
            }
        }
    }

    func processImages(in directory: URL) {
        DispatchQueue.global(qos: .userInitiated).async {
            do {
                let fileManager = FileManager.default
                let urls = try fileManager.contentsOfDirectory(at: directory, includingPropertiesForKeys: nil)
                let imageUrls = urls.filter {
                    $0.pathExtension.lowercased() == "jpg" || $0.pathExtension.lowercased() == "png"
                }
                DispatchQueue.main.async {
                    self.totalImages = imageUrls.count
                    self.processedImages = 0
                    self.rotatedImages = 0
                    self.remainingImages = self.totalImages
                }
                for url in imageUrls {
                    self.processImage(at: url)
                }
            } catch {
                print("Error reading contents of directory: \(error.localizedDescription)")
            }
        }
    }

    func processImage(at url: URL) {
        guard let image = NSImage(contentsOf: url),
              let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else { return }

        let request = VNRecognizeTextRequest { (request, error) in
            if let error = error {
                print("Error recognizing text: \(error.localizedDescription)")
                return
            }
            if let results = request.results as? [VNRecognizedTextObservation], !results.isEmpty {
                let orientationCorrect = self.isTextOrientationCorrect(results)
                if !orientationCorrect {
                    self.renameFile(at: url)
                    DispatchQueue.main.async {
                        self.rotatedImages += 1
                    }
                }
            }
            DispatchQueue.main.async {
                self.processedImages += 1
                self.remainingImages = self.totalImages - self.processedImages
            }
        }

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Error performing text recognition request: \(error.localizedDescription)")
        }
    }

    func isTextOrientationCorrect(_ observations: [VNRecognizedTextObservation]) -> Bool {
        // Placeholder for the logic to check text orientation
        // This should be implemented based on your specific needs
        for observation in observations {
            if observation.topCandidates(1).first != nil {
                let boundingBox = observation.boundingBox
                let angle = atan2(boundingBox.height, boundingBox.width)
                if abs(angle) > .pi / 4 {
                    return false
                }
            }
        }
        return true
    }

    func renameFile(at url: URL) {
        let fileManager = FileManager.default
        let directory = url.deletingLastPathComponent()
        let newName = "ROTATION_" + url.lastPathComponent
        let newURL = directory.appendingPathComponent(newName)
        do {
            try fileManager.moveItem(at: url, to: newURL)
        } catch {
            print("Error renaming file: \(error.localizedDescription)")
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
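Bounding boxes alone cannot distinguish upside-down text: a 180-degree rotation leaves the box's aspect ratio unchanged. One possible approach, sketched below and not verified against these images, is to run the recognizer twice, once with the image as-is and once declared as rotated 180 degrees, and compare the recognition confidence; the correct orientation usually scores higher. The helper name recognitionScore is invented for this sketch:

import Vision

// Total confidence of the top candidates for one assumed orientation.
func recognitionScore(cgImage: CGImage, orientation: CGImagePropertyOrientation) -> Float {
    var score: Float = 0
    let request = VNRecognizeTextRequest { request, _ in
        guard let results = request.results as? [VNRecognizedTextObservation] else { return }
        for observation in results {
            score += observation.topCandidates(1).first?.confidence ?? 0
        }
    }
    // perform(_:) is synchronous, so `score` is final when it returns.
    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: orientation, options: [:])
    try? handler.perform([request])
    return score
}

// If the image reads better upside down, the text was probably rotated 180°.
func isUpsideDown(cgImage: CGImage) -> Bool {
    recognitionScore(cgImage: cgImage, orientation: .down) >
    recognitionScore(cgImage: cgImage, orientation: .up)
}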
Posted
by
Post not yet marked as solved
0 Replies
21 Views
My AppleScript calls a Perl script to convert a file. When I call the Perl script from the command line, everything works fine. When I call the AppleScript (by dropping a file on it), this error occurs:

Can't load '/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle' for module Encode: dlopen(/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle, 0x0001): tried: '/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), '/

which I interpret to mean that OSA now runs in Rosetta mode. This error did not occur at least until Mar 27, when I last used it, on the same machine. How is it possible that OSA is now run under Rosetta? Has there been a change in Ventura 13.6.6? How can I make sure this OSA script runs as native arm64?
Posted
by
Post not yet marked as solved
0 Replies
17 Views
Can the example from "Support external cameras in your iPadOS app" work on iOS 17.5 on an iPhone 15 Pro? https://developer.apple.com/videos/play/wwdc2023/10106/
Posted
by
Post not yet marked as solved
0 Replies
44 Views
I have an executable on macOS that I'm launching as a User Agent. The same executable can be launched in multiple ways: the user can click it directly, launch it from the terminal using ./, and so on. One such way is launching it as a User Agent (i.e., a daemon in the user session). In this scenario, I want the executable to identify whether the user launched it as an agent, so it can perform a certain task. How can I accurately determine this? I tried figuring out whether there is some unique session that agents operate in, but I could not find anything. Can someone help here? Is this even possible?
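One common approach, sketched below, sidesteps session detection entirely: since you control the agent's launchd plist, add a marker argument to its ProgramArguments and check for it at startup. The flag name --launched-as-agent is invented for this sketch:

import Foundation

// In the agent's launchd plist, ProgramArguments would include the
// hypothetical marker flag "--launched-as-agent" after the binary path.
let launchedAsAgent = CommandLine.arguments.contains("--launched-as-agent")

if launchedAsAgent {
    // Perform the agent-specific task here.
    print("Running as a launchd user agent")
} else {
    print("Launched interactively (Finder, Terminal, etc.)")
}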
Posted
by
Post not yet marked as solved
1 Reply
53 Views
Why does this Regex Builder code in my SwiftUI app not work? I'm parsing a string that might be a date and time, with either AM or PM specified for the time. This bit of code looks for the optional AM or PM. The error I get is: "The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions". What would 'distinct sub-expressions' mean in this case? The code:

let ampmRef = Reference<Substring>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
            }
        }
    } transform: {
        $0.lowercase
    }
}.ignoresCase()

In a related question, is there a way to return a default if the ChoiceOf fails to match both AM and PM?
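"Distinct sub-expressions" here just means pulling nested builder components out into their own named values, so the compiler type-checks smaller units. A hedged sketch, assuming the transform is meant to produce a lowercased string; note that Substring has lowercased(), not .lowercase, and that type mismatch may itself be what stalls the type checker:

import RegexBuilder

let ampmRef = Reference<String>()

// The ChoiceOf broken out into its own sub-expression.
let ampm = ChoiceOf {
    One("am")
    One("pm")
}

let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore { ampm }
    } transform: { substring in
        String(substring).lowercased()
    }
}.ignoresCase()

On the related question: because ZeroOrMore also matches the empty string, the capture simply yields "" when neither AM nor PM is present, which can serve as the default.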
Posted
by
Post not yet marked as solved
0 Replies
64 Views
I've started getting reports of this today and I am able to replicate it on my end, but I'm looking to see if anyone else can verify it, or if it's possibly regional to me (Canada). In Apple Maps (iOS or macOS), if you search a latitude and longitude, for example "49.110,-112.110", it centers on the location as it always has and shows the "Directions" button. When I tap the Directions button, I get "A route can't be shown because of a problem connecting to the server." Alternatively, if you pass the coordinate in via an Apple Maps URL (https://maps.apple.com/?daddr=49.110,-112.110), it will route, but the route is no longer to those specific coordinates; Apple Maps alters them to some nearest known entity (in this case, the RM of Warner County). If you compare the suggested route's end destination with the search result for the coordinates, you will see they are different locations, and routes no longer actually take you to the coordinates. In the last photo attached, the arrow points to where "49.110,-112.110" is actually located; tapping the "Directions" button cannot produce a route because of the server issue, and passing the coordinate in via URL changes the destination and begins a route quite a ways away from the intended point. The problem started happening either this morning or last night. Can anyone else confirm this happens to them? Thanks, Mike
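For anyone who wants to reproduce this outside the Maps UI, here is a sketch using MapKit's public routing API; whether it exercises the same server path as the Maps app is untested. It requests a route to the raw coordinate from the post and prints where the returned route actually ends:

import MapKit

let destination = CLLocationCoordinate2D(latitude: 49.110, longitude: -112.110)

let request = MKDirections.Request()
request.source = MKMapItem.forCurrentLocation()
request.destination = MKMapItem(placemark: MKPlacemark(coordinate: destination))
request.transportType = .automobile

MKDirections(request: request).calculate { response, error in
    if let error = error {
        print("Routing failed: \(error.localizedDescription)")
    } else if let route = response?.routes.first {
        // Compare the route's final point against the requested coordinate.
        let end = route.polyline.points()[route.polyline.pointCount - 1].coordinate
        print("Route ends at \(end.latitude), \(end.longitude)")
    }
}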
Posted
by
Post not yet marked as solved
1 Reply
82 Views
Dear developers, now that we have played with the Vision Pro for 3 months, I am wondering why some features are missing from it, especially ones that seem very basic/fundamental. I would like to see if you know more about the reasons, or correct me if I'm wrong! You are also welcome to share features that you think are fundamental but missing on Vision Pro. My list goes below:
(1) GPS/compass: cost? heat? battery?
(2) Moving-image tracking: is processing the surrounding environment already too computationally intensive?
(3) 3D object tracking: looks like it is only supported on iOS and iPadOS, not visionOS.
(4) No application focus/pause callbacks: maybe I'm wrong, but we were not able to detect when an app is put into the background or brought to the foreground so we could invoke a callback.
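On point (4): SwiftUI's scenePhase environment value does report foreground/background transitions on visionOS, so it may cover the missing callback. A minimal sketch, assuming a plain SwiftUI app:

import SwiftUI

struct LifecycleAwareView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        Text("Hello, visionOS")
            .onChange(of: scenePhase) { _, newPhase in
                switch newPhase {
                case .active:     print("Scene became active")
                case .inactive:   print("Scene inactive")
                case .background: print("Scene backgrounded")
                @unknown default: break
                }
            }
    }
}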
Posted
by
Post not yet marked as solved
0 Replies
56 Views
I'm trying to better understand how to 'navigate' around a large USD scene inside a RealityView in SwiftUI (itself in a volume on visionOS). With a little trial and error I have been able to understand scale and translation transforms, and I can have the USD zoom to 'presets' of different scale and translation transforms. Separately, I can also rotate an unscaled and untranslated USD, having it rotate in place 90 degrees at a time to return to a rotation of 0 degrees. But if I try to combine the two activities, the rotation occurs around the center of the USD, not my zoomed location. Is there a session or sample code available that combines these activities? I think I would understand relatively quickly if I saw it in action. Thanks for any pointers available!
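On combining the two: RealityKit rotates an entity about its own origin, so to spin around the zoomed-in spot you conjugate the rotation with a translation to that pivot and back. A hedged sketch; the entity and pivot names are placeholders, not from any sample:

import RealityKit
import simd

/// Rotate `entity` by `rotation` about an arbitrary `pivot` point
/// expressed in the entity's parent space.
func rotate(_ entity: Entity, by rotation: simd_quatf, around pivot: SIMD3<Float>) {
    var transform = entity.transform
    // Move the pivot to the origin, rotate the offset, then move back.
    let offset = transform.translation - pivot
    transform.translation = pivot + rotation.act(offset)
    transform.rotation = rotation * transform.rotation
    entity.transform = transform
}

// Example: a 90-degree turn about the vertical axis through the zoom target.
// let quarterTurn = simd_quatf(angle: .pi / 2, axis: [0, 1, 0])
// rotate(usdEntity, by: quarterTurn, around: zoomTarget)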
Posted
by
Post not yet marked as solved
0 Replies
63 Views
I'm experiencing an issue with WKWebView and localStorage. I've set up a standard WKWebView with the configuration: configuration.websiteDataStore = WKWebsiteDataStore.default() Everything works fine in the Simulator (iOS 16.x, 17.0), but on my iPhone 13 running iOS 17.4 I encounter a problem: when I set a localStorage value on my local HTML page, navigate to another URL within the web view, and then return to the original page, the localStorage is cleared. This behavior is new and wasn't happening before. Has anyone else encountered this, or does anyone have suggestions on how to fix it? The localStorage should be persistent, as it always has been.
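Not a confirmed fix for the 17.4 behavior, but one common cause of "lost" localStorage is creating a fresh WKWebViewConfiguration (and with it a fresh web-content process environment) for each web view. A sketch that shares a single configuration everywhere; the WebViewFactory name is invented:

import WebKit

enum WebViewFactory {
    // One shared configuration so every web view sees the same
    // default (persistent) website data store.
    static let configuration: WKWebViewConfiguration = {
        let config = WKWebViewConfiguration()
        config.websiteDataStore = WKWebsiteDataStore.default()
        return config
    }()

    static func makeWebView(frame: CGRect) -> WKWebView {
        WKWebView(frame: frame, configuration: configuration)
    }
}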
Posted
by
Post not yet marked as solved
0 Replies
60 Views
I have noticed that my Apple Watch app seems to randomly quit from time to time. It's not crashing, and I have not been able to reproduce it in a controlled setting, but it seems to happen only when I'm in very high or low temperatures. For instance, I was skiing recently and my app was supposed to stay running the whole time, but often when I raised my wrist it would be back on the home screen and my app wasn't running. This also happened on the beach on a very hot day. But when I'm testing at home I can keep it running for hours and it never quits, which leads me to believe it may have to do with the temperature. Does the OS kill apps when running in very high or low temperatures? If so, is there anything I can do to prevent this? Would doing less in my app help? For instance, I have a timer and use a bunch of sensors; would turning those off at times and using less of the display make a difference, or does the OS not care what the apps are actually doing? If not, any other ideas? Thanks
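The OS does throttle, and can terminate, apps under thermal pressure, so reducing work can help. One thing worth trying, sketched below: observe ProcessInfo's thermal state and shed load (sensor sampling, display updates) when it rises. The reaction in the serious/critical branch is a hypothetical hook, not watchOS API:

import Foundation

final class ThermalMonitor {
    private var observer: NSObjectProtocol?

    init() {
        observer = NotificationCenter.default.addObserver(
            forName: ProcessInfo.thermalStateDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.thermalStateChanged()
        }
    }

    private func thermalStateChanged() {
        switch ProcessInfo.processInfo.thermalState {
        case .nominal, .fair:
            break // Run normally.
        case .serious, .critical:
            // Hypothetical hooks: pause the timer, sample sensors less
            // often, simplify the display to reduce load.
            print("Thermal pressure: shedding work")
        @unknown default:
            break
        }
    }
}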
Posted
by
Post not yet marked as solved
0 Replies
60 Views
Hello everyone, I hope you’re all doing well. I have a question regarding the use of Apple's Find My network. I’m in the early stages of developing an app that would track third-party Find My-compatible tags. Before proceeding further, I want to ensure that I am compliant with Apple’s guidelines and policies. Can anyone provide insight into whether Apple allows developers to use the Find My crowd-sourced network for their own apps? Specifically, I'm interested in tracking third-party Find My tags through my app. Any guidance or resources you can share would be greatly appreciated! Thank you!
Posted
by
Post not yet marked as solved
0 Replies
70 Views
I have built the app in Xcode and deployed it to my iOS device. I am using a personal team account with my Apple ID, and it is an Xcode-managed profile. I go to VPN & Device Management and accept the developer, which works, but then when I hit "Verify App" there is a blink, nothing happens, and there is no error code. My internet is fine, I am able to access websites, and I do not have a VPN.
Posted
by
Post not yet marked as solved
0 Replies
63 Views
Hi, I am new to Swift and would like to write a simple Swift script to show some HDR images or video on my Vision Pro. I tried some code I found online (shown in the attachment) to put one HDR image and one SDR image side by side, but it doesn't seem to have any HDR effect. Thanks in advance.
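A hedged starting point, assuming visionOS 1 / the iOS 17 SDK: SwiftUI renders images at standard dynamic range unless you opt in with the allowedDynamicRange modifier, which may explain the missing HDR effect. A side-by-side sketch; the asset names are placeholders:

import SwiftUI

struct HDRComparisonView: View {
    var body: some View {
        HStack {
            // Hypothetical asset names.
            Image("sample_hdr")
                .resizable()
                .scaledToFit()
                .allowedDynamicRange(.high) // Opt this image into HDR rendering.
            Image("sample_sdr")
                .resizable()
                .scaledToFit() // Left at the default (SDR) range.
        }
    }
}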
Posted
by
