Posts

Post not yet marked as solved
0 Replies
2 Views
No matter what I have tried, I can't get my text "What would you like to do?" to appear at the top of the screen. What have I missed? Is it something to do with the ZStack?

import SwiftUI
import SwiftData

struct ContentView: View {
    @Environment(\.modelContext) private var context
    @Query private var readings: [Readings]
    @State private var readingTimeStamp: Date = Date()

    var body: some View {
        ZStack {
            Image("IPhone baqckgound 3")
                .resizable()
                .aspectRatio(contentMode: .fill)
                .padding(.top, 40)
            VStack {
                Text("What would you like to do?")
                    .font(.title)
                    .fontWeight(.bold)
            }
            .padding()
            Spacer()
        }
    }
}
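A likely cause, noted as an editorial suggestion rather than a confirmed answer: Spacer() has no effect as a direct child of a ZStack, which simply layers its children and centers them by default; a Spacer only expands inside an HStack or VStack. Moving it into the VStack pins the text to the top:

VStack {
    Text("What would you like to do?")
        .font(.title)
        .fontWeight(.bold)
    Spacer() // expands to fill the remaining height, pushing the text up
}
.padding()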
Post not yet marked as solved
0 Replies
18 Views
When attempting to add a recent build to an External test group, I am receiving the error:

This build is using a beta version of Xcode and can't be submitted. Make sure you're using the latest version of Xcode or the latest seed release found on the releases tab in News and Updates.

This build is a macOS app built with Unity, which is then codesigned, packaged with productbuild, and uploaded with xcrun altool --upload-package. The version of Xcode being used was released about a week ago: Xcode Version 15.4 (15F31d), building on an Intel Mac mini running macOS 14.4.1 (23E224). I have seen threads about this in the past (1, 2), but nothing recent.
Post not yet marked as solved
0 Replies
21 Views
I get a crash every time I try to swap this texture for a drawable queue. I have a DrawableQueue leftQueue created on the main thread, and I invoke this block on the main thread. Scene.usda contains a reference to a model cinema. It crashes on the line with replace().

if let shaderMaterial = try? await ShaderGraphMaterial(named: "/Root/cinema/_materials/Screen",
                                                       from: "Scene.usda",
                                                       in: theaterBundle) {
    if let imageParam = shaderMaterial.getParameter(name: "image"),
       case let .textureResource(imageTexture) = imageParam {
        imageTexture.replace(withDrawables: leftQueue)
    }
}

One weird thing: imageParam has an invalid value when it crashes.

imageParam RealityFoundation.MaterialParameters.Value <invalid> (0x0)

Here is the stack trace:

* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x9)
    frame #0: 0x0000000191569734 libobjc.A.dylib`objc_release + 8
    frame #1: 0x00000001cb9e5134 CoreRE`re::SharedPtr<re::internal::AssetReference>::reset(re::internal::AssetReference*) + 64
    frame #2: 0x00000001cba77cf0 CoreRE`re::AssetHandle::operator=(re::AssetHandle const&) + 36
    frame #3: 0x00000001ccc84d14 CoreRE`RETextureAssetReplaceDrawableQueue + 228
    frame #4: 0x00000001acf9bbcc RealityFoundation`RealityKit.TextureResource.replace(withDrawables: RealityKit.TextureResource.DrawableQueue) -> () + 164
  * frame #5: 0x00000001006d361c Screenlit`TheaterView.setupMaterial(self=Screenlit.TheaterView @ 0x000000011e2b7a30) at TheaterView.swift:245:74
    frame #6: 0x00000001006e0848 Screenlit`closure #1 in closure #1 in closure #1 in closure #1 in TheaterView.body.getter(self=Screenlit.TheaterView @ 0x000000011e2b7a30) at TheaterView.swift:487
    frame #7: 0x00000001006f1658 Screenlit`partial apply for closure #1 in closure #1 in closure #1 in closure #1 in TheaterView.body.getter at <compiler-generated>:0
    frame #8: 0x00000001004fb7d8 Screenlit`thunk for @escaping @callee_guaranteed @Sendable @async () -> (@out τ_0_0) at <compiler-generated>:0
    frame #9: 0x0000000100500bd4 Screenlit`partial apply for thunk for @escaping @callee_guaranteed @Sendable @async () -> (@out τ_0_0) at <compiler-generated>:0
Post not yet marked as solved
0 Replies
23 Views
Hello, I can't delete my app because my subscription group was in 'Waiting for Review' status. I then deleted the subscriptions and all subscription groups. The steps I followed:

1. Submitted the app for review (new app)
2. App Store team rejected it
3. I decided to make changes in the app
4. Submitted a new build for review
5. The app was in "Waiting for Review" status
6. I decided not to publish my app
7. Removed the subscription groups and removed the app from sale
8. Tried to delete the app

Now I can't delete the app. Is there a way to delete this app? I don't like having an app that isn't published to the App Store; in other words, I don't want to see this app in App Store Connect.
Post not yet marked as solved
0 Replies
26 Views
I'm having an issue where, when my asset catalog has more than 2 images (all have @1x, @2x, and @3x variants in PNG format), the NSImage in my NSImageView cannot be clicked. Does anyone know why this happens? Thanks in advance!

import SwiftUI
import AppKit

struct ContentView: View {
    @State private var window: NSWindow?

    var body: some View {
        VStack {
            Button("Open Window") {
                // Create and show the NSWindow
                self.window = NSWindow(
                    contentRect: NSScreen.main?.frame ?? NSRect.zero,
                    styleMask: [.borderless],
                    backing: .buffered,
                    defer: false
                )

                // Set up window properties
                self.window?.isOpaque = false
                self.window?.hasShadow = false
                self.window?.backgroundColor = .clear
                self.window?.level = .screenSaver
                self.window?.collectionBehavior = [.canJoinAllSpaces]
                self.window?.makeKeyAndOrderFront(nil)

                // Create an NSImageView
                let petView = PetView()

                // Add the NSImageView to the window's content view
                if let contentView = self.window?.contentView {
                    contentView.addSubview(petView)
                    // Center the petView
                    petView.centerXAnchor.constraint(equalTo: contentView.centerXAnchor).isActive = true
                    petView.centerYAnchor.constraint(equalTo: contentView.centerYAnchor).isActive = true
                }
            }
        }
    }
}

class PetView: NSImageView {
    override init(frame frameRect: NSRect = .zero) {
        super.init(frame: frameRect)
        self.image = NSImage(named: "dog_idle-1")
        self.translatesAutoresizingMaskIntoConstraints = false
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
    }

    override func mouseDown(with event: NSEvent) {
        print("woof!")
    }
}

I've tried changing the number of images in my asset catalog and found that 2 is the maximum for my NSImage to remain clickable. It is supposed to print "woof!" when I click on it.
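Not a confirmed diagnosis, but one thing worth ruling out in this setup: a borderless NSWindow returns false from canBecomeKey by default, which can interfere with mouse event handling. A minimal sketch of opting in (the subclass name is hypothetical):

import AppKit

// Borderless windows refuse key/main status by default; overriding
// these properties lets the window take part in normal event routing.
class ClickableWindow: NSWindow {
    override var canBecomeKey: Bool { true }
    override var canBecomeMain: Bool { true }
}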
Post not yet marked as solved
0 Replies
24 Views
I am working on an application, ABC, where the user can save a configuration and later open it again from the same file. But when I save this configuration file, it does not show the icon. The application is built using Qt/C++. Here is my plist. What is the possible cause? I tried directly providing the path of the icon in the plist, and I tried changing CFBundleSignature from ???? to ABCC.
Post not yet marked as solved
0 Replies
28 Views
I made a virtual machine, with the help of the tutorial on the Apple Developer website, running macOS on my M2 MacBook Pro. I am wondering how I can integrate my mouse into the virtual machine instead of it being on top of the app.
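Offered as an assumption, since the tutorial's configuration isn't shown here: the Virtualization framework shares the host cursor with a macOS guest when the configuration uses the Mac trackpad device rather than the USB screen-coordinate pointing device (this requires macOS 13 or later on both host and guest):

import Virtualization

// Sketch of the relevant part of a VZVirtualMachineConfiguration.
// VZMacTrackpadConfiguration integrates the host pointer with the
// guest instead of capturing the mouse inside the VM window.
let configuration = VZVirtualMachineConfiguration()
configuration.pointingDevices = [VZMacTrackpadConfiguration()]
configuration.keyboards = [VZUSBKeyboardConfiguration()]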
Post not yet marked as solved
0 Replies
27 Views
Hello, I am facing a passkey authentication issue during cross-device authentication. After I call completeAssertionRequest with the passkeyCredential, I encounter the error attached below, and I am unable to find where the RPID mismatch is coming from. It would be great if I could know the source of the found RPID hash. Thanks.

Returned credential failed validation: Error Domain=com.apple.AuthenticationServicesCore.AuthorizationError Code=14 "RPID hash did not match expected value. Expected xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY=, found: eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=." UserInfo={NSLocalizedFailureReason=RPID hash did not match expected value. Expected xMTokW1VIYg2DZVB9lCtheT+0n8NxHvx4HaxTPhH4bY=, found: eE1Ub2tXMVZJWWcyRFpWQjlsQ3RoZVQtMG44TnhIdng=.}
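An observation that may locate the bug, offered with hedging since only the log is available: base64-decoding the found value yields the ASCII text xMTokW1VIYg2DZVB9lCtheT-0n8NxHvx, which is exactly the first 32 characters of the base64url encoding of the expected hash. That suggests the returned authenticator data carries the base64url string of the RP ID hash (truncated to the 32-byte field) instead of the raw SHA-256 digest. A sketch of producing the field correctly, assuming you assemble authenticatorData yourself in a credential provider:

import CryptoKit
import Foundation

// The first 32 bytes of authenticatorData must be the raw SHA-256
// digest of the RP ID, not a base64/base64url string of it.
let rpID = "example.com" // hypothetical relying party identifier
let rpIDHash = Data(SHA256.hash(data: Data(rpID.utf8))) // 32 raw bytes

var authenticatorData = rpIDHash
authenticatorData.append(0x05) // flags: user present + user verified (example)
authenticatorData.append(contentsOf: [0, 0, 0, 0]) // 32-bit signature counter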
Post not yet marked as solved
0 Replies
41 Views
Hi, I have an idea for an application that could revolutionize and simplify the use of video devices (phones, monitors, TVs) for at least 75-80% of the entire world population. I would like it to be implemented initially on Apple iOS/macOS and later on all platforms. I would like to create, develop, or sell the idea, but I really don't know where to start. Can you give me info on this? I know it may seem like a strange request, but I'm sure this app would be a real revolution for everyone, or almost everyone. Thank you, Giovanni from Italy [Edited by Moderator]
Post not yet marked as solved
0 Replies
21 Views
I am using PushKit + CallKit and I am facing an issue: I receive an incoming call while the device is in the locked-screen state, then I answer and end the call. The next incoming call always displays in the left corner of the screen, as in the image below. If I receive an incoming call while the device is unlocked, my application receives, answers, and ends the call normally.
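For reference, and only as a sketch of the standard path rather than a confirmed fix: CallKit expects the incoming call to be reported from inside the PushKit callback, with the push completion handler invoked only after reporting finishes. The class and handle value below are hypothetical:

import CallKit
import PushKit

final class CallManager: NSObject, PKPushRegistryDelegate {
    let provider = CXProvider(configuration: CXProviderConfiguration())

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to your VoIP server.
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "caller") // hypothetical
        // Report before calling completion; running it early can leave
        // the system call UI in an inconsistent state.
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}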
Post not yet marked as solved
0 Replies
25 Views
Hello, I am trying to detect the orientation of text in images. (Each image has a label with a number, but sometimes the label is not in the right orientation, and I would like to detect these cases and add a prefix to the image files.) This code is working well, but when the text is upside down it considers that the text is well oriented. Is there a way to distinguish the difference? Thanks for your help!

import SwiftUI
import Vision

struct ContentView: View {
    @State private var totalImages = 0
    @State private var processedImages = 0
    @State private var rotatedImages = 0
    @State private var remainingImages = 0

    var body: some View {
        VStack {
            Button(action: chooseDirectory) {
                Text("Choisir le répertoire des images")
                    .padding()
            }
            Text("TOTAL: \(totalImages)")
            Text("TRAITEES: \(processedImages)")
            Text("ROTATION: \(rotatedImages)")
            Text("RESTANT: \(remainingImages)")
        }
        .padding()
    }

    func chooseDirectory() {
        let openPanel = NSOpenPanel()
        openPanel.canChooseDirectories = true
        openPanel.canChooseFiles = false
        openPanel.allowsMultipleSelection = false
        openPanel.begin { response in
            if response == .OK, let url = openPanel.url {
                processImages(in: url)
            }
        }
    }

    func processImages(in directory: URL) {
        DispatchQueue.global(qos: .userInitiated).async {
            do {
                let fileManager = FileManager.default
                let urls = try fileManager.contentsOfDirectory(at: directory, includingPropertiesForKeys: nil)
                let imageUrls = urls.filter {
                    $0.pathExtension.lowercased() == "jpg" || $0.pathExtension.lowercased() == "png"
                }
                DispatchQueue.main.async {
                    self.totalImages = imageUrls.count
                    self.processedImages = 0
                    self.rotatedImages = 0
                    self.remainingImages = self.totalImages
                }
                for url in imageUrls {
                    self.processImage(at: url)
                }
            } catch {
                print("Error reading contents of directory: \(error.localizedDescription)")
            }
        }
    }

    func processImage(at url: URL) {
        guard let image = NSImage(contentsOf: url),
              let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else { return }

        let request = VNRecognizeTextRequest { (request, error) in
            if let error = error {
                print("Error recognizing text: \(error.localizedDescription)")
                return
            }
            if let results = request.results as? [VNRecognizedTextObservation], !results.isEmpty {
                let orientationCorrect = self.isTextOrientationCorrect(results)
                if !orientationCorrect {
                    self.renameFile(at: url)
                    DispatchQueue.main.async {
                        self.rotatedImages += 1
                    }
                }
            }
            DispatchQueue.main.async {
                self.processedImages += 1
                self.remainingImages = self.totalImages - self.processedImages
            }
        }

        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        do {
            try handler.perform([request])
        } catch {
            print("Error performing text recognition request: \(error.localizedDescription)")
        }
    }

    func isTextOrientationCorrect(_ observations: [VNRecognizedTextObservation]) -> Bool {
        // Placeholder for the logic to check text orientation
        // This should be implemented based on your specific needs
        for observation in observations {
            if let recognizedText = observation.topCandidates(1).first {
                let boundingBox = observation.boundingBox
                let angle = atan2(boundingBox.height, boundingBox.width)
                if abs(angle) > .pi / 4 {
                    return false
                }
            }
        }
        return true
    }

    func renameFile(at url: URL) {
        let fileManager = FileManager.default
        let directory = url.deletingLastPathComponent()
        let newName = "ROTATION_" + url.lastPathComponent
        let newURL = directory.appendingPathComponent(newName)
        do {
            try fileManager.moveItem(at: url, to: newURL)
        } catch {
            print("Error renaming file: \(error.localizedDescription)")
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
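One approach, offered as an assumption rather than a confirmed fix: Vision recognizes text even when it is rotated 180 degrees, so the bounding boxes look "correct" for upside-down labels. Running the same request with explicit orientations (.up and .down) and comparing recognition confidence can distinguish the two cases:

import Vision

// Sketch: recognition confidence is generally higher when the image
// is fed to Vision in its true orientation.
func textConfidence(for cgImage: CGImage,
                    orientation: CGImagePropertyOrientation) throws -> Float {
    let request = VNRecognizeTextRequest()
    let handler = VNImageRequestHandler(cgImage: cgImage,
                                        orientation: orientation,
                                        options: [:])
    try handler.perform([request])
    let scores = (request.results ?? []).compactMap { $0.topCandidates(1).first?.confidence }
    return scores.isEmpty ? 0 : scores.reduce(0, +) / Float(scores.count)
}

// Usage: treat the image as upside down if .down scores higher than .up.
// let upsideDown = try textConfidence(for: cgImage, orientation: .down)
//                > try textConfidence(for: cgImage, orientation: .up)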
Post not yet marked as solved
0 Replies
27 Views
My AppleScript calls a Perl script to convert a file. When I call the Perl script from the command line, everything works fine. When I call the AppleScript (by dropping a file on it), this error occurs:

Can't load '/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle' for module Encode: dlopen(/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle, 0x0001): tried: '/Library/Perl/5.30/darwin-thread-multi-2level/auto/Encode/Encode.bundle' (mach-o file, but is an incompatible architecture (have 'arm64', need 'x86_64')), '/

which I interpret to mean that OSA now runs in Rosetta mode. This error did not occur at least until Mar 27, when I used it last, on the same machine. How is it possible that OSA is now run under Rosetta? Has there been a change in Ventura 13.6.6? How can I ensure that this OSA script runs as native arm64?
Post not yet marked as solved
0 Replies
24 Views
Can the example from "Support external cameras in your iPadOS app" work on iOS 17.5 on an iPhone 15 Pro? https://developer.apple.com/videos/play/wwdc2023/10106/
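For reference, a minimal discovery sketch, assuming the underlying question is whether the .external device type (introduced alongside that session) returns anything on iPhone hardware:

import AVFoundation

// If iOS on iPhone does not expose external cameras, this
// discovery session simply returns an empty device list.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified
)
print(discovery.devices)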
Post not yet marked as solved
0 Replies
51 Views
I have an executable on macOS that I'm launching as a User Agent. The same executable can be launched in multiple ways: the user can directly click it to launch it, launch it from the terminal using ./, and so on. One such way is when the user launches the executable as a User Agent (i.e., a daemon in the user session). In this scenario, I want my executable to identify whether it has been launched as an agent to perform a certain task. How can I accurately determine this? I have tried figuring out if there is some unique session that agents operate in, but I could not find anything. Can someone help here? Is this even possible?
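One heuristic, offered as a sketch rather than a definitive answer: a LaunchAgent is spawned directly by launchd, whose PID is 1, so checking the parent PID distinguishes it from a Finder or Terminal launch. A dedicated command-line argument declared in the agent's ProgramArguments is a more robust alternative.

import Darwin

// LaunchAgents are children of launchd (PID 1); processes started
// from Terminal or Finder have a different parent process.
let launchedByLaunchd = (getppid() == 1)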
Post not yet marked as solved
1 Reply
58 Views
Why does this Regex Builder code in my SwiftUI app not work? I'm parsing a string that might be a date and time with either AM or PM specified for the time. This bit of code looks for the optional AM or PM. The error I get is:

The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions

What would 'distinct sub-expressions' mean in this case? The code:

let ampmRef = Reference<Substring>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
            }
        }
    } transform: {
        $0.lowercase
    }
}.ignoresCase()

In a related question, is there a way to return a default if the ChoiceOf fails both AM and PM?
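A sketch of one reading of "distinct sub-expressions", with the ChoiceOf pulled out into its own named component. The transform is dropped here so the capture type stays Substring; note also that Substring has lowercased(), not lowercase, which may itself be what defeats the type checker. On the related question: the ZeroOrMore wrapper already yields an empty match when neither "am" nor "pm" is present, which can serve as the default.

import RegexBuilder

let ampmRef = Reference<Substring>()

// Naming the inner alternation gives the type checker a smaller
// expression to solve at each step.
let ampm = ChoiceOf {
    One("am")
    One("pm")
}

let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore { ampm }
    }
}.ignoresCase()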
Post not yet marked as solved
0 Replies
69 Views
I've started getting reports of this today and I am able to replicate it on my end, but I'm looking to see if anyone else can verify, or if it's possibly regional to me (Canada). In Apple Maps (iOS or macOS), if you search a latitude and longitude -- for example "49.110,-112.110" -- it centers on the location as it always has and shows the "Directions" button. When I tap the Directions button, I get "A route can't be shown because of a problem connecting to the server." Alternatively, if you pass the coordinate in via an Apple Maps URL (https://maps.apple.com/?daddr=49.110,-112.110), it will route, but the route is no longer to those specific coordinates; Apple Maps alters them to some nearest known entity (in this case, the RM of Warner County). If you compare the suggested route's end destination with the search results from entering the coordinates directly, you will see they are different locations, and mapping routes no longer actually take you to the coordinates. In the last photo attached, the arrow points to where "49.110,-112.110" is actually located; tapping the "Directions" button cannot figure out a route to it because of a server issue. If you pass it in via URL, it changes the destination coordinates and begins a route quite a ways away from the intended coordinate. The problem started happening either this morning or last night. Can anyone else confirm this happens to them? Thanks, Mike
Post not yet marked as solved
1 Reply
91 Views
Dear developers, now that we have played with Vision Pro for 3 months, I am wondering why some features are missing on Vision Pro, especially ones that seem very basic/fundamental. So I would like to see if you know more about the reasons, or correct me if I'm wrong! You are also welcome to share features that you think are fundamental but missing on Vision Pro. My list goes below:

(1) GPS/Compass: cost? heat? battery?
(2) Moving image tracking: is surrounding-environment processing already too computing-intensive?
(3) 3D object tracking: looks like it is only supported on iOS and iPadOS, but not visionOS
(4) Does not invoke application focus/pause callbacks: maybe I'm wrong? But we were not able to detect whether an app has been put in the background or brought to the foreground via a callback
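On point (4), a sketch worth trying, offered as an assumption about the app's lifecycle API rather than a confirmed answer: in SwiftUI, the scenePhase environment value does report background/foreground transitions on visionOS:

import SwiftUI

struct LifecycleAwareView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        Text("Hello")
            .onChange(of: scenePhase) { _, newPhase in
                // Fires on .active / .inactive / .background transitions,
                // e.g. when the app is brought forward or put away.
                print("Scene phase changed to \(newPhase)")
            }
    }
}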
Post not yet marked as solved
0 Replies
61 Views
I'm trying to understand better how to 'navigate' around a large USD scene inside a RealityView in SwiftUI (itself in a volume on visionOS). With a little trial and error I have been able to understand scale and translate transforms, and I can have the USD zoom to 'presets' of different scale and translation transforms. Separately, I can also rotate an unscaled and untranslated USD, and have it rotate in place 90 degrees at a time to return to a rotation of 0 degrees. But if I try to combine the two activities, the rotation occurs around the center of the USD, not my zoomed location. Is there a session or sample code available that combines these activities? I think I would understand relatively quickly if I saw it in action. Thanks for any pointers available!
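A sketch of the usual fix, under the assumption that the transforms are applied to a single entity: rotation is always about the entity's own origin, so to rotate about the zoomed-in point you conjugate the rotation with translations to and from that pivot:

import RealityKit
import simd

// Rotate an entity by `rotation` about an arbitrary pivot point
// (expressed in the entity's parent space) instead of its own origin.
func rotate(_ entity: Entity, by rotation: simd_quatf, around pivot: SIMD3<Float>) {
    var transform = entity.transform
    let offset = transform.translation - pivot
    transform.translation = pivot + rotation.act(offset)
    transform.rotation = rotation * transform.rotation
    entity.transform = transform
}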
