Posts

Post not yet marked as solved
0 Replies
73 Views
I made a virtual machine running macOS on my M2 MacBook Pro, following the tutorial on the Apple developer website. I am wondering how I can capture my mouse inside the virtual machine instead of it staying on top of the app.
Posted Last updated
.
Post not yet marked as solved
0 Replies
66 Views
Hello, I am facing a passkey authentication issue during cross-device authentication. After I call completeAssertionRequest with passkeyCredential, I encounter the error attached below, and I am unable to find where the RPID mismatch is coming from. It would be great to know the source of the RPID hash reported as "found". Thanks. Returned credential failed validation: Error Domain=com.apple.AuthenticationServicesCore.AuthorizationError Code=14 "RPID hash did not match expected value. Expected xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY=, found: eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=." UserInfo={NSLocalizedFailureReason=RPID hash did not match expected value. Expected xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY=, found: eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=.}
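Two details may help narrow this down. Per the WebAuthn spec, the expected value is the Base64 encoding of the raw SHA-256 digest of the relying party identifier; here is a minimal sketch for computing it with CryptoKit, assuming a hypothetical RP ID of "example.com":

import CryptoKit
import Foundation

// The RP ID here is a placeholder; substitute your own.
let rpID = "example.com"
let digest = SHA256.hash(data: Data(rpID.utf8))
print(Data(digest).base64EncodedString()) // compare against the "Expected" value

Also, Base64-decoding the "found" value in the error yields the ASCII text "xMTokW1VIYg2DZVB9lCtheT-0n8NxHvx", which is a Base64url rendering of the expected digest, truncated to 32 bytes. That pattern suggests the returned credential carries the textual (Base64url) encoding of the RPID hash where the raw digest bytes are expected, i.e. the hash was encoded twice somewhere in the pipeline.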
Posted Last updated
.
Post not yet marked as solved
0 Replies
80 Views
Hi, I have an idea for an application that could revolutionize and simplify the use of video devices (phones, monitors, TVs) for at least 75-80% of the entire world population. I would like it to be implemented initially by Apple on iOS/macOS and later on all platforms. I would like to create, develop, or sell the idea, but I really don't know where to start. Can you give me info on this? I know it may seem like a strange request, but I am sure this app will be a real revolution for everyone, or almost everyone. Thank you, Giovanni from Cagliari, Sardinia, Italy [Edited by Moderator]
Posted Last updated
.
Post not yet marked as solved
0 Replies
52 Views
I am using PushKit + CallKit and I am facing an issue: I receive an incoming call while the device is in the locked-screen state, then I answer and end the call. The next incoming call always displays in the left corner of the device screen, as in the image below. If I receive an incoming call while the device is in the unlocked-screen state, my application receives, answers, and ends the call normally.
Posted
by ldev_mob.
Last updated
.
Post not yet marked as solved
0 Replies
55 Views
Hello, I am trying to detect the orientation of text in images. (Each image has a label with a number, but sometimes the label is not in the right orientation; I would like to detect these cases and add a prefix to the image files.) This code works well, but when the text is upside down it considers that the text is correctly oriented. Is there a way to distinguish the difference? Thanks for your help!

import SwiftUI
import Vision

struct ContentView: View {
    @State private var totalImages = 0
    @State private var processedImages = 0
    @State private var rotatedImages = 0
    @State private var remainingImages = 0

    var body: some View {
        VStack {
            Button(action: chooseDirectory) {
                Text("Choisir le répertoire des images")
                    .padding()
            }
            Text("TOTAL: \(totalImages)")
            Text("TRAITEES: \(processedImages)")
            Text("ROTATION: \(rotatedImages)")
            Text("RESTANT: \(remainingImages)")
        }
        .padding()
    }

    func chooseDirectory() {
        let openPanel = NSOpenPanel()
        openPanel.canChooseDirectories = true
        openPanel.canChooseFiles = false
        openPanel.allowsMultipleSelection = false
        openPanel.begin { response in
            if response == .OK, let url = openPanel.url {
                processImages(in: url)
            }
        }
    }

    func processImages(in directory: URL) {
        DispatchQueue.global(qos: .userInitiated).async {
            do {
                let fileManager = FileManager.default
                let urls = try fileManager.contentsOfDirectory(at: directory, includingPropertiesForKeys: nil)
                let imageUrls = urls.filter {
                    $0.pathExtension.lowercased() == "jpg" || $0.pathExtension.lowercased() == "png"
                }
                DispatchQueue.main.async {
                    self.totalImages = imageUrls.count
                    self.processedImages = 0
                    self.rotatedImages = 0
                    self.remainingImages = self.totalImages
                }
                for url in imageUrls {
                    self.processImage(at: url)
                }
            } catch {
                print("Error reading contents of directory: \(error.localizedDescription)")
            }
        }
    }

    func processImage(at url: URL) {
        guard let image = NSImage(contentsOf: url),
              let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else { return }
        let request = VNRecognizeTextRequest { request, error in
            if let error = error {
                print("Error recognizing text: \(error.localizedDescription)")
                return
            }
            if let results = request.results as? [VNRecognizedTextObservation], !results.isEmpty {
                let orientationCorrect = self.isTextOrientationCorrect(results)
                if !orientationCorrect {
                    self.renameFile(at: url)
                    DispatchQueue.main.async {
                        self.rotatedImages += 1
                    }
                }
            }
            DispatchQueue.main.async {
                self.processedImages += 1
                self.remainingImages = self.totalImages - self.processedImages
            }
        }
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Error performing text recognition request: \(error.localizedDescription)")
        }
    }

    func isTextOrientationCorrect(_ observations: [VNRecognizedTextObservation]) -> Bool {
        // Placeholder for the logic to check text orientation.
        // This should be implemented based on your specific needs.
        for observation in observations {
            if observation.topCandidates(1).first != nil {
                let boundingBox = observation.boundingBox
                let angle = atan2(boundingBox.height, boundingBox.width)
                if abs(angle) > .pi / 4 {
                    return false
                }
            }
        }
        return true
    }

    func renameFile(at url: URL) {
        let fileManager = FileManager.default
        let directory = url.deletingLastPathComponent()
        let newName = "ROTATION_" + url.lastPathComponent
        let newURL = directory.appendingPathComponent(newName)
        do {
            try fileManager.moveItem(at: url, to: newURL)
        } catch {
            print("Error renaming file: \(error.localizedDescription)")
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
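Since VNRecognizeTextRequest accepts an explicit image orientation, one heuristic for catching upside-down labels is to run the recognizer twice, once with the image as-is and once rotated 180°, and compare the average confidence of the top candidates; upside-down text is usually either unrecognized or recognized with much lower confidence. A minimal sketch under that assumption (a heuristic, not a guaranteed discriminator):

import Vision

func averageConfidence(cgImage: CGImage, orientation: CGImagePropertyOrientation) -> Float {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: orientation, options: [:])
    try? handler.perform([request])
    let confidences = (request.results ?? []).compactMap { $0.topCandidates(1).first?.confidence }
    guard !confidences.isEmpty else { return 0 }
    return confidences.reduce(0, +) / Float(confidences.count)
}

func isProbablyUpsideDown(cgImage: CGImage) -> Bool {
    // If recognition is clearly better on the 180°-rotated image,
    // the original was probably upside down.
    averageConfidence(cgImage: cgImage, orientation: .down) >
        averageConfidence(cgImage: cgImage, orientation: .up)
}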
Posted
by cloum.
Last updated
.
Post not yet marked as solved
0 Replies
48 Views
My AppleScript calls a Perl script to convert a file. When I call the Perl script from the command line, everything works fine. When I call the AppleScript (by dropping a file on it), this error occurs: Can't load '/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle' for module Encode: dlopen(/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle, 0x0001): tried: '/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), '/ which I interpret to mean that OSA now runs in Rosetta mode. This error did not occur at least until Mar 27, when I last used it, on the same machine of course. How is it possible that OSA is now run under Rosetta? Has there been a change in Ventura 13.6.6? How can I make sure this OSA script runs as native arm64?
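If the goal is simply to guarantee a native Perl regardless of how the AppleScript host process is translated, one workaround is to prefix the shell command with arch. A minimal sketch of the droplet, assuming the converter lives at /usr/local/bin/convert.pl (hypothetical path):

on open theFiles
	set thePath to POSIX path of (item 1 of theFiles)
	-- Force the perl process to launch as native arm64, independent of
	-- the architecture the calling OSA environment runs under.
	do shell script "/usr/bin/arch -arm64 /usr/bin/perl /usr/local/bin/convert.pl " & quoted form of thePath
end open

This sidesteps the question of why the OSA environment changed mode, but it makes the conversion immune to it.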
Posted
by Steffie.
Last updated
.
Post not yet marked as solved
0 Replies
49 Views
Can the example from "Support external cameras in your iPadOS app" work on iOS 17.5 with an iPhone 15 Pro? https://developer.apple.com/videos/play/wwdc2023/10106/
Posted
by uwe2.
Last updated
.
Post not yet marked as solved
3 Replies
199 Views
Hello, the financial reports for April 2024 have been ready since the beginning of May in App Store Connect, but the "Expected payment date" is not shown in the top right of the page. Is anyone facing the same issue? Do you know the reason for this?
Posted
by marcozabo.
Last updated
.
Post not yet marked as solved
1 Reply
73 Views
Long story short, I had my app and watch app already uploaded to the App Store. However, I needed to add WatchConnectivity to enable app-to-watch communication. At the beginning, my app bundle ID was com.x and my watch bundle ID was com.x.watchkitapp. While developing Watch Connectivity, I noticed that my apps would not connect unless I changed the watch bundle ID from com.x.watchkitapp to com.x.watch. After changing it, however, I cannot submit my bundle anymore. I'm getting an "Asset validation failed" error: Invalid Bundle Identifier. Attempting to change bundle identifier from com.x.watchkitapp to com.x.watch is disallowed for bundle x.app/Watch/WatchX Watch App.app. (ID: 75a4621a-7e28-411d-a2a7-84674e460656) Any ideas how this could be solved?
Posted Last updated
.
Post not yet marked as solved
0 Replies
72 Views
I have an executable on macOS that I'm launching as a user agent. The same executable can be launched in multiple ways: the user can double-click it to launch it, launch it from Terminal using ./, and so on. One such way is launching it as a user agent (i.e., a daemon in the user session). In this scenario, I want to detect, inside my executable, whether it was launched as an agent, so it can perform certain tasks. How can I accurately determine this? I tried to figure out whether agents operate in some unique session, but I could not find anything. Can someone help here? Is this even possible?
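Since launchd does not hand the process an unambiguous "you are an agent" marker, one reliable approach is to make the LaunchAgent itself say so: add a marker argument to the plist's ProgramArguments that no interactive launch would pass. A minimal sketch, where the --as-agent flag and the paths are assumptions:

import Foundation

// ~/Library/LaunchAgents/com.example.mytool.plist (assumed) contains:
//   <key>ProgramArguments</key>
//   <array>
//     <string>/usr/local/bin/mytool</string>
//     <string>--as-agent</string>
//   </array>
let launchedAsAgent = CommandLine.arguments.contains("--as-agent")

if launchedAsAgent {
    // Agent-only startup work goes here.
} else {
    // Double-click / Terminal launch path.
}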
Posted Last updated
.
Post not yet marked as solved
2 Replies
136 Views
Our app needs to scan QR codes (or a similar mechanism) to populate it with content the user wants to see. Is there any update on QR code scanning availability on this platform? I asked this before, but never got any feedback. I know that there is no way to access the camera (which is an issue in itself), but at least the system could provide an API to scan codes. (It would be also cool if we were able to use the same codes Vision Pro uses for detecting the Zeiss glasses, as long as we could create these via server-side JavaScript code.)
Posted
by waldgeist.
Last updated
.
Post not yet marked as solved
1 Reply
128 Views
Dear developers, now that we have played with Vision Pro for 3 months, I am wondering why some features are missing from Vision Pro, especially ones that seem very basic/fundamental. I would like to hear if you know more about the reasons, or correct me if I'm wrong! You are also welcome to share features that you think are fundamental but missing on Vision Pro. My list goes below:
(1) GPS/compass: cost? heat? battery?
(2) Moving image tracking: is processing the surrounding environment already too computing-intensive?
(3) 3D object tracking: looks like it is only supported on iOS and iPadOS, but not visionOS
(4) No application focus/pause callback is invoked: maybe I'm wrong? But we were not able to detect whether an app has been put in the background or brought to the foreground in order to invoke a callback
Posted
by jjjjjom.
Last updated
.
Post not yet marked as solved
5 Replies
668 Views
In our PacketTunnelProvider we are seeing behavior for enforceRoutes which appears to contradict the documentation. According to the developer documentation (my emphasis): "If this property is YES when the includeAllNetworks property is NO, the system scopes the included routes to the VPN and the excluded routes to the current primary network interface." If we set these IPv4 settings:

IPv4Settings = {
    configMethod = manual
    addresses = ( 172.16.1.1, )
    subnetMasks = ( 255.255.255.255, )
    includedRoutes = (
        {
            destinationAddress = 0.0.0.0
            destinationSubnetMask = 0.0.0.0
        },
    )
    excludedRoutes = (
        {
            destinationAddress = 10.10.0.0
            destinationSubnetMask = 255.255.255.0
        },
    )
    overridePrimary = YES
}

then with enforceRoutes set to YES we do not see traffic for the excluded network, which is the expected behavior. With enforceRoutes set to NO, we do see traffic for the excluded network. In both cases, includeAllNetworks and excludeLocalNetworks are both NO, and the excluded network is not one of the local LANs. Is this a known issue? Is there some documented interaction that I missed here? Is there a workaround we can use to make this function as intended with enforceRoutes set to YES?
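For anyone reproducing this, here is a sketch of how settings matching the dump above would typically be built inside the provider; the tunnel server address is a placeholder. Note that enforceRoutes itself is not part of these per-tunnel settings; it is set on the protocol configuration (NEVPNProtocol.enforceRoutes) by the app that creates the NETunnelProviderManager.

import NetworkExtension

// A sketch matching the dump above; 203.0.113.1 is a placeholder server.
let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "203.0.113.1")
let ipv4 = NEIPv4Settings(addresses: ["172.16.1.1"], subnetMasks: ["255.255.255.255"])
ipv4.includedRoutes = [NEIPv4Route.default()] // 0.0.0.0/0, which implies overridePrimary
ipv4.excludedRoutes = [NEIPv4Route(destinationAddress: "10.10.0.0", subnetMask: "255.255.255.0")]
settings.ipv4Settings = ipv4
// Applied from startTunnel(options:completionHandler:) via
// setTunnelNetworkSettings(settings) { error in ... }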
Posted
by kbrock.
Last updated
.
Post not yet marked as solved
4 Replies
117 Views
I've read the definitive "Recording Private Data in the System Log" by @eskimo and the words at man 5 os_log, and written code to, specifically, turn on "Enable-Private-Data" in my app. My application is a command-line tool, and I've configured Xcode to insert what I believe to be the appropriate incantations from an Info.plist file into the unstructured executable binary. When I run the app from Terminal, I see <private> output in the Console app where I expect values to be displayed in a public manner. Nothing I've read says that <key>Enable-Private-Data</key><true/> doesn't apply to command-line apps, and my own understanding of the value of the logging mechanism rejects that notion, because logging is performed all over macOS, not just in a *.app environment. At this point, I'm firmly convinced this unexpected behavior is of my own doing, but I have paused the search for my (probably embarrassing) mistake to write this note, because of a 1% doubt that I'm wrong. I'd be very happy to receive the expected assurance that logging configuration via an embedded Info.plist in a command-line app does influence logging behavior. With that assurance, I'll know it's my problem and I'll search/find/fix. On the way there, I'll create the simplest command-line app that exhibits this anomaly, which will likely reveal my error; if not, it'll be fodder for a bug report. Embedding an Info.plist into a command-line app is a tad out of the ordinary, but I've done it before (using Xcode or SPM) to carry knowledge into a CLI via mainBundle.infoDictionary, and in the particular case described above, I've printed that infoDictionary to show the successful embedding, viz:

. . . .
"OSLogPreferences": {
    "com.ramsaycons" = {
        "DEFAULT-OPTIONS" = {
            "Enable-Private-Data" = 1;
        };
    };
},
. . . .

Sonoma 14.5 / Xcode 15.4 / MBP (Apple M1 Max)
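For anyone comparing notes, this is the standard incantation for getting an Info.plist into a standalone executable (paths here are illustrative), plus a quick way to confirm the section actually landed in the binary:

Create Info.plist Section in Binary (CREATE_INFOPLIST_SECTION_IN_BINARY) = YES
Info.plist File (INFOPLIST_FILE) = Tool/Info.plist

# or equivalently, pass the linker flag directly:
OTHER_LDFLAGS = -sectcreate __TEXT __info_plist Tool/Info.plist

# verify the embedded section:
otool -s __TEXT __info_plist /path/to/tool

Given that mainBundle.infoDictionary already shows the OSLogPreferences dictionary, the embedding itself clearly worked; the open question is only whether the logging subsystem consults it for a non-bundled executable.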
Posted Last updated
.
Post not yet marked as solved
1 Reply
82 Views
Why does this Regex Builder code in my SwiftUI app not work? I'm parsing a string that might be a date and time, with either AM or PM specified for the time. This bit of code looks for the optional AM or PM. The error I get is: "The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions." What would 'distinct sub-expressions' mean in this case? The code:

let ampmRef = Reference<Substring>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
            }
        }
    } transform: {
        $0.lowercase
    }
}.ignoresCase()

In a related question, is there a way to return a default if the ChoiceOf fails both AM and PM?
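As a sketch of what "distinct sub-expressions" can mean here: hoist the inner alternation into its own constant so the result builder has less to infer in one expression. Note also that Substring has no lowercase property (the method is lowercased(), which returns a String), so the transform as written cannot type-check, and that mismatch may be exactly what sends the compiler spinning. The following restructuring, with the reference retyped to String, is an assumption-laden rework rather than the one confirmed fix:

import RegexBuilder

let ampmRef = Reference<String>()

// Sub-expression 1: the alternation on its own.
let ampm = ChoiceOf {
    One("am")
    One("pm")
}

// Sub-expression 2: the capture that uses it.
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore { ampm }
    } transform: { substring in
        // ZeroOrMore can match the empty string, so an absent AM/PM arrives
        // here as an empty substring; map it to a default if desired.
        substring.isEmpty ? "am" : substring.lowercased()
    }
}.ignoresCase()

That also touches the related question: since ZeroOrMore matches the empty string, the transform closure is a natural place to substitute a default (the "am" fallback above is just an example).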
Posted
by RJStover.
Last updated
.
Post not yet marked as solved
1 Reply
90 Views
I've started getting reports of this today and I am able to replicate it on my end, but I am looking to see if anyone else can verify it, or if it's possibly regional to me (Canada). In Apple Maps (iOS or macOS), if you search for a latitude and longitude, for example "49.110,-112.110", it centers on the location as it always has and shows the "Directions" button. When you tap the Directions button, I get "A route can't be shown because of a problem connecting to the server." Alternatively, if you pass the coordinates in via an Apple Maps URL (https://maps.apple.com/?daddr=49.110,-112.110), it will route, but the route is no longer to those specific coordinates; Apple Maps alters them to some nearest known entity (in this case, the RM of Warner County). If you compare the suggested route's end destination with the search results for the coordinates entered directly, you will see they are different locations, and mapped routes no longer actually take you to the coordinates. In the last photo attached, the arrow points to where "49.110,-112.110" is actually located; tapping the "Directions" button cannot figure out a route because of a server issue, and passing it in via URL changes the destination coordinates and begins a route quite a ways away from the intended coordinate. The problem started happening either this morning or last night. Can anyone else confirm this happens to them? Thanks, Mike
Posted Last updated
.
Post not yet marked as solved
3 Replies
641 Views
I have an app which uses SwiftUI and Mac Catalyst. When running on a Mac, I want to provide a Preferences menu entry with the usual keyboard shortcut Command + ,. An implementation via the Settings bundle is out of the question, since my preferences are too complex for that. Here is a reduced example of my implementation:

import SwiftUI

@main
struct PreferencesMenuTestApp: App {
    @UIApplicationDelegateAdaptor private var appDelegate: AppDelegate

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {
    override func buildMenu(with builder: UIMenuBuilder) {
        let preferencesCommand = UIKeyCommand(title: "Preferences…",
                                              action: #selector(showPreferences),
                                              input: ",",
                                              modifierFlags: .command)
        // let preferencesCommand = UIAction(title: "Preferences…") { action in
        //     debugPrint("show preferences")
        // }
        let menu = UIMenu(title: "Preferences…", options: .displayInline, children: [preferencesCommand])
        builder.insertSibling(menu, afterMenu: .about)
    }

    @objc func showPreferences() {
        debugPrint("show preferences")
    }
}

The problem is that the menu entry is disabled; obviously the provided selector is not recognized. When I mark the AppDelegate with @main, the menu entry is enabled, but of course then the app's window is empty. When I switch to the UIAction implementation (the commented-out code), it works fine. But since one cannot provide a keyboard shortcut for a UIAction, this is not a good solution. What am I missing? How would one implement a Preferences menu entry that actually works?
Posted
by RayWo.
Last updated
.
Post not yet marked as solved
1 Reply
108 Views
I've created a full immersive visionOS project and added a spatial video player in the ImmersiveView Swift file. I have a few buttons in a different VideosView Swift file, in a floating window, and I'd like to switch the video playing in ImmersiveView when I click a button in the VideosView file. The video player works great in ImmersiveView:

RealityView { content in
    if let videoEntity = try? await Entity(named: "Video", in: realityKitContentBundle) {
        guard let url = Bundle.main.url(forResource: "video1", withExtension: "mov") else {
            fatalError("Video was not found!")
        }
        let asset = AVURLAsset(url: url)
        let playerItem = AVPlayerItem(asset: asset)
        let player = AVPlayer()
        videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: player)
        content.add(videoEntity)
        player.replaceCurrentItem(with: playerItem)
        player.play()
    } else {
        print("file not found!")
    }
}

Buttons in the floating window from VideosView:

struct VideosView: View {
    var body: some View {
        VStack {
            Button(action: {}) {
                Text("video 1").font(.title)
            }
            Button(action: {}) {
                Text("video 2").font(.title)
            }
            Button(action: {}) {
                Text("video 3").font(.title)
            }
        }
    }
}

In general, how do I control the video player across views, and how do I replace the video when each button is selected? Any help/code/links would be greatly appreciated.
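One common pattern is to let a single observable model own the AVPlayer, attach that player to the entity once in ImmersiveView, and have the buttons merely swap the AVPlayerItem. A minimal sketch under those assumptions (VideoLibrary and the file names are hypothetical):

import SwiftUI
import AVKit

final class VideoLibrary: ObservableObject {
    let player = AVPlayer()

    func play(_ resource: String) {
        guard let url = Bundle.main.url(forResource: resource, withExtension: "mov") else { return }
        player.replaceCurrentItem(with: AVPlayerItem(url: url))
        player.play()
    }
}

struct VideosView: View {
    @EnvironmentObject var library: VideoLibrary

    var body: some View {
        VStack {
            Button("video 1") { library.play("video1") }
            Button("video 2") { library.play("video2") }
            Button("video 3") { library.play("video3") }
        }
    }
}

// In ImmersiveView, attach the shared player instead of creating a local one:
//     videoEntity.components[VideoPlayerComponent.self] = .init(avPlayer: library.player)
// and inject one VideoLibrary instance into both scenes with .environmentObject(_:).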
Posted Last updated
.
Post not yet marked as solved
3 Replies
90 Views
Xcode 15 crash on reloadItems: "Invalid update: invalid number of items in section". When building with Xcode 14, there was no crash when calling collectionView reloadItems. If you build with Xcode 15, a crash pops up saying "Invalid update: invalid number of items in section". The immediate issue has been resolved, but I would like to know the exact cause of the crash in Xcode 15. Does anyone know?
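For context, the exception is about a data-source invariant: reloadItems(at:) must not change the number of items the data source reports for the section; count changes have to go through insertItems/deleteItems. One plausible explanation (an assumption, not confirmed) is that building against the iOS 17 SDK makes UIKit assert this invariant more strictly than before. A minimal sketch of the contract, with illustrative names:

import UIKit

var items: [String] = ["a", "b", "c"] // backs section 0

func update(_ collectionView: UICollectionView, index: Int, with newValue: String) {
    // OK: in-place mutation keeps the count unchanged, so reloadItems is valid.
    items[index] = newValue
    collectionView.reloadItems(at: [IndexPath(item: index, section: 0)])
}

func append(_ collectionView: UICollectionView, _ newValue: String) {
    // A count change reported via reloadItems triggers
    // "Invalid update: invalid number of items in section".
    items.append(newValue)
    collectionView.insertItems(at: [IndexPath(item: items.count - 1, section: 0)])
}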
Posted
by kodaewon.
Last updated
.
