Posts

Post not yet marked as solved
0 Replies
11 Views
If I annotate a class with @Observable, I get this error in the @Query expansion: "Expansion of macro 'Query()' produced an unexpected 'init' accessor". If I remove @Observable, the error goes away. But elsewhere I have .environment referencing the class, and without @Observable that call complains that the class needs to be observable. I am mystified. Does anyone have a suggestion? (A sketch of the setup is below.)
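For reference, a minimal sketch of the failing combination as I understand it (Library, Book, and RootView are placeholder names; this shows the setup that errors, not working code):

```swift
import SwiftData
import SwiftUI

@Model
final class Book {
    var title: String
    init(title: String) { self.title = title }
}

// The combination that appears to trigger the clash: an @Observable class
// holding a @Query property. Both macros synthesize accessors for stored
// properties, which is where the unexpected 'init' accessor shows up.
@Observable
final class Library {
    @Query var books: [Book]  // error: Expansion of macro 'Query()' produced an unexpected 'init' accessor
}

// Elsewhere: .environment(_:) requires an @Observable object, so removing
// the annotation silences the macro error but breaks this call site.
struct RootView: View {
    @State private var library = Library()
    var body: some View {
        Text("Books: \(library.books.count)")
            .environment(library)
    }
}
```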
Posted by
Post not yet marked as solved
0 Replies
15 Views
I'm trying to download artifacts from some recent Xcode Cloud builds, and in both Xcode and App Store Connect I'm getting errors. Xcode says: "Error Fetching Test Results: API Invalid status code: 501." App Store Connect says: "artifacts could not be found." Filed as FB13773789 (Xcode Cloud: Service returning 501 in Xcode when trying to view artifacts of successful build from minutes ago). I have tried several projects to rule out project-specific issues, and it is happening in all of my Xcode Cloud-enabled projects. Both Xcode 15.3 and 15.4 beta exhibit this behavior. Is anyone else running into this issue? I noticed it yesterday, and it continues into this morning.
Posted by
Post not yet marked as solved
0 Replies
19 Views
I am trying to map the 3D skeleton joint positions of an ARBodyAnchor to the real body on the camera image. I know I could simply use the "detectedBody" of the ARFrame, which already delivers the normalized 2D position of each joint, but what I am mostly interested in is the z-axis (the distance of each joint to the camera). I am starting an ARBodyTrackingConfiguration, setting the world alignment to ARWorldAlignmentCamera (in which case the camera transform is an identity matrix), and multiplying each joint transform in model space (via modelTransformForJointName:) with the transform of the ARBodyAnchor. I then tried many different ways to get the joints to line up with the image, for example by multiplying the transforms with the projectionMatrix of the ARCamera. But whatever I do, it never lines up correctly. For example, there doesn't really seem to be a scale factor in the projectionMatrix or the ARBodyAnchor transform: no matter the distance of the camera to the detected body, the scale of the body is always the same. This means I am missing something important, and I haven't figured out what. Does anyone have an example of how I can get the body aligned to the camera image (or get the distance to each joint in any other way)? Thanks!
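For concreteness, here is a sketch of the transform chain in question. The function name is mine, and the final step uses ARCamera's projectPoint (rather than multiplying by projectionMatrix by hand), which is one variant of what I tried:

```swift
import ARKit
import UIKit
import simd

// Sketch: compose each joint's model-space transform with the anchor
// transform to get world space, then let the camera do the projection.
func projectJoints(bodyAnchor: ARBodyAnchor, frame: ARFrame, viewportSize: CGSize) {
    for jointName in ARSkeletonDefinition.defaultBody3D.jointNames {
        guard let model = bodyAnchor.skeleton.modelTransform(
            for: ARSkeleton.JointName(rawValue: jointName)) else { continue }

        // World space = anchor transform * model-space joint transform.
        let world = bodyAnchor.transform * model
        let position = simd_make_float3(world.columns.3)

        // 2D image position, projected by the camera itself.
        let imagePoint = frame.camera.projectPoint(position,
                                                   orientation: .portrait,
                                                   viewportSize: viewportSize)

        // Per-joint distance: move the point into camera space; the camera
        // looks down -Z, so depth is the negated z component.
        let cameraSpace = frame.camera.transform.inverse * world.columns.3
        print(jointName, imagePoint, "depth:", -cameraSpace.z)
    }
}
```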
Posted by
Post not yet marked as solved
0 Replies
20 Views
Dear Sirs, I'm writing an audio application that should show up to 128 horizontal peakmeters (each about 150 points wide and 8 high) stacked inside a ScrollViewReader. For the actual value of each peakmeter I have a binding to a CGFloat value. The peakmeters work as expected and refresh correctly. For testing I added a timer to my Swift application that fires every 0.05 s, meaning I want to show 20 values per second. Inside the timer func I just create random CGFloat values in the range 0...1 for the bound values. The peakmeters refresh and flicker as expected, but I see a CPU load of 40-50% in Activity Monitor on my MacBook Air with Apple M2, even when compiled in release mode. I think this is quite high and I'd like to reduce it. Should this be possible? For instance, I thought about blocking the refresh until I've set all values (see the sketch below). How could this be done, and would it help? What else could I do? Thanks and best regards, JFreyberger
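Here is a minimal sketch of that batching idea: one published array instead of 128 individual bindings, so all meters refresh in a single update pass. MeterModel, MeterList, and MeterBar are simplified stand-ins for my real views, and I use a plain ScrollView for brevity:

```swift
import SwiftUI
import Combine
import Observation

// One observable container for all 128 values; assigning the whole array
// once per tick means SwiftUI sees a single change, not 128 of them.
@Observable final class MeterModel {
    var levels: [CGFloat] = Array(repeating: 0, count: 128)
}

struct MeterList: View {
    @State private var model = MeterModel()
    let timer = Timer.publish(every: 0.05, on: .main, in: .common).autoconnect()

    var body: some View {
        ScrollView {
            VStack(spacing: 2) {
                ForEach(model.levels.indices, id: \.self) { i in
                    MeterBar(level: model.levels[i])
                        .frame(width: 150, height: 8)
                }
            }
        }
        .onReceive(timer) { _ in
            // Batch: set every value, then publish once.
            model.levels = (0..<128).map { _ in CGFloat.random(in: 0...1) }
        }
    }
}

struct MeterBar: View {
    let level: CGFloat
    var body: some View {
        GeometryReader { geo in
            Rectangle()
                .fill(.green)
                .frame(width: geo.size.width * level)
        }
        .background(.black)
    }
}
```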
Posted by
Post not yet marked as solved
0 Replies
27 Views
On macOS, that is. The goals are largely for testing, where we'd like to know the maximum and minimum memory our processes are using, but we'd also like to know it on crash. Our current method is to run ps periodically and grab the appropriate field (see the sketch below), but is there a better way? (I looked at MetricKit, but it's not as useful on macOS; I filed FB13640765 "MetricKit would be awesome with more mac features" a couple of months ago.)
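For reference, our current approach boils down to something like this sketch (function name is ours; error handling trimmed):

```swift
import Foundation

// Sample the resident set size (in KB) of a process by shelling out to
// /bin/ps, as described above. Illustrative, not production code.
func residentSizeKB(of pid: pid_t) -> Int? {
    let ps = Process()
    ps.executableURL = URL(fileURLWithPath: "/bin/ps")
    ps.arguments = ["-o", "rss=", "-p", "\(pid)"]
    let pipe = Pipe()
    ps.standardOutput = pipe
    do { try ps.run() } catch { return nil }
    ps.waitUntilExit()
    let output = String(decoding: pipe.fileHandleForReading.readDataToEndOfFile(),
                        as: UTF8.self)
    return Int(output.trimmingCharacters(in: .whitespacesAndNewlines))
}

// Example: print our own footprint once.
// print(residentSizeKB(of: getpid()) ?? -1)
```

A native call like proc_pid_rusage from libproc might avoid spawning a process entirely, but whether that's the better way is part of what I'm asking.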
Posted by
Post not yet marked as solved
0 Replies
26 Views
Hello, I am working on a fairly complex iPhone app that controls the front built-in wide angle camera. I need to take and display a sequence of photos that cover the whole range of focus values available. Here is how I do it:

- call setExposureModeCustom to set the first lens position
- wait for the completionHandler to be called back
- capture a photo
- do it again for the next lens position, etc.

This works fine, but it takes longer than I expected for the completionHandler to be called back. From what I've seen, the delay scales with the exposure duration. When I set the exposure duration to the max value:

- on the iPhone 14 Pro, it takes about 3 seconds (3 times the max exposure)
- on the iPhone 8, about 1.3 s (4 times the max exposure)

I was expecting a delay of two times the exposure duration (take a photo, throw one away while changing lens position, take the next photo, etc.), but it takes more than that. I also tried the same thing with changing the ISO instead of the focus position, and I get the same kind of delays. I do not think the problem is linked to the way I process the images, because I get the same delay even if I do nothing with the output. Is there something I could do to make things go faster for this use case? Any input would be appreciated. Thanks. I created a minimal testing app to reproduce the issue:

```swift
import Foundation
import AVFoundation

class Main: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let dispatchQueue = DispatchQueue(label: "VideoQueue", qos: .userInitiated)
    let session: AVCaptureSession
    let videoDevice: AVCaptureDevice
    var focus: Float = 0

    override init() {
        session = AVCaptureSession()
        session.beginConfiguration()
        session.sessionPreset = .photo
        videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)!
        super.init()
        let videoDeviceInput = try! AVCaptureDeviceInput(device: videoDevice)
        session.addInput(videoDeviceInput)
        let videoDataOutput = AVCaptureVideoDataOutput()
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
            videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
            videoDataOutput.setSampleBufferDelegate(self, queue: dispatchQueue)
        }
        session.commitConfiguration()
        dispatchQueue.async {
            self.startSession()
        }
    }

    func startSession() {
        session.startRunning()
        // lock max exposure duration
        try! videoDevice.lockForConfiguration()
        let exposure = videoDevice.activeFormat.maxExposureDuration.seconds * 0.5
        print("set max exposure", exposure)
        videoDevice.setExposureModeCustom(duration: CMTime(seconds: exposure, preferredTimescale: 1000),
                                          iso: videoDevice.activeFormat.minISO) { time in
            print("did set max exposure")
            self.changeFocus()
        }
        videoDevice.unlockForConfiguration()
    }

    func changeFocus() {
        let date = Date.now
        print("set focus", focus)
        try! videoDevice.lockForConfiguration()
        videoDevice.setFocusModeLocked(lensPosition: focus) { time in
            // Measure how long the lens-position change took, in seconds and
            // in multiples of the current exposure duration.
            let dt = abs(date.timeIntervalSinceNow)
            print("did set focus - took:", dt, "frames:", dt / self.videoDevice.exposureDuration.seconds)
            self.next()
        }
        videoDevice.unlockForConfiguration()
    }

    func next() {
        focus += 0.02
        if focus > 1 {
            print("done")
            return
        }
        changeFocus()
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("did receive video frame")
    }
}
```
Posted by
Post not yet marked as solved
2 Replies
40 Views
I am trying to disable screenshots in an iOS app, but Apple does not expose any API for this. As a workaround, when the iPhone takes a screenshot, we have a text field in secure mode covering the whole screen, so the screenshot comes out blank (a sketch of the trick is below). Has anyone tried the same approach to avoid screenshots? If yes, did Apple act against those applications, or reject them when found? It would be a great help if Apple could provide some insight on this.
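For context, the widely circulated form of this workaround looks roughly like the sketch below. It leans on undocumented rendering behavior of isSecureTextEntry (the system omits a secure field's layer contents from screenshots), so treat it as an illustration of the approach, not a supported API; the method name is mine:

```swift
import UIKit

extension UIView {
    // Re-parent this view's layer under a secure text field's internal
    // canvas layer so the system blanks it in screenshots and recordings.
    // Undocumented behavior; may break in any iOS release.
    func hideFromScreenshots() {
        let field = UITextField()
        field.isSecureTextEntry = true
        field.isUserInteractionEnabled = false
        field.translatesAutoresizingMaskIntoConstraints = false
        addSubview(field)
        field.centerXAnchor.constraint(equalTo: centerXAnchor).isActive = true
        field.centerYAnchor.constraint(equalTo: centerYAnchor).isActive = true
        layer.superlayer?.addSublayer(field.layer)
        field.layer.sublayers?.first?.addSublayer(layer)
    }
}
```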
Posted by
Post not yet marked as solved
0 Replies
67 Views
Our keyboard extension can be used independently in the China region with native apps like Notes or Safari; however, in the Taiwan region the keyboard can only be opened inside the app from the same project. I've checked some articles about how MDM manages extensions, and made sure the RequestsOpenAccess option in our keyboard extension's Info.plist is set to YES (see the fragment below). I'm not sure whether there is anything I missed, or whether I just need to inform the client that they should reach out to their MDM manager and modify some restrictions. For reference, from the documentation: "If a keyboard supports mobile device management (MDM), it can work with managed apps." "App extensions give third-party developers a way to provide functionality to other apps or even to key systems built into the operating systems." See also: "Allow full access to custom keyboard in iOS".
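For reference, here is the relevant fragment of the keyboard extension's Info.plist (note the key is "RequestsOpenAccess", plural, under NSExtensionAttributes):

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>IsASCIICapable</key>
        <false/>
        <key>RequestsOpenAccess</key>
        <true/>
    </dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.keyboard-service</string>
</dict>
```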
Posted by
Post not yet marked as solved
0 Replies
52 Views
I'm adding many users for internal testing purposes. A few users didn't receive the "Welcome to the Apple Developer Program" email after signing in from the invitation email. I've checked, and they don't appear on the user list, so I can't add them to the internal tester group. We've tried many times. I noticed that every user who had this problem didn't see the "trust this device" page during the sign-in process. How should I fix this problem?
Posted by
Post not yet marked as solved
3 Replies
74 Views
Hello, I am new to iOS development and I am trying to create a new project, but the SwiftUI view is not showing.
Posted by
Post not yet marked as solved
1 Reply
53 Views
Hello everyone! I'm currently working on an iOS app developed in Swift that involves connecting to a specific Bluetooth device and exchanging data even when the app is terminated or running in the background. I just want to understand why the CBCentralManager should be implicitly unwrapped. I have checked a couple of Apple developer sample projects, and it was declared implicitly unwrapped in each. Can someone help me understand the reason behind this, and also what possible issues/scenarios could be triggered if we declare the central manager as an optional, "CBCentralManager?", instead (see the sketch below)? Thanks in advance!
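For reference, here is a minimal sketch of the pattern those sample projects use (class name is mine). The manager can't be created before super.init() because it takes self as its delegate, so the implicitly unwrapped declaration keeps every later call site free of unwrapping while the property is effectively guaranteed non-nil after init:

```swift
import CoreBluetooth

final class BluetoothController: NSObject, CBCentralManagerDelegate {
    // Implicitly unwrapped: nil only for the instant between super.init()
    // and the assignment below, never nil at any later call site.
    var centralManager: CBCentralManager!

    override init() {
        super.init()
        centralManager = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        // Scanning is only valid once the manager reports .poweredOn.
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil)
        }
    }
}
```

Declaring it as "CBCentralManager?" instead should behave the same, at the cost of optional chaining or unwrapping at every use.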
Posted by
Post not yet marked as solved
0 Replies
49 Views
Hello! Is it possible to turn on hand pass-through (hand cutout? not sure what the correct name is; the real hands from the camera) in WebXR when in immersive-vr with the hand-tracking feature enabled? I only see my hands when I disable the hand-tracking feature in WebXR, but then I don't get the joint positions and orientations.
Posted by
Post not yet marked as solved
0 Replies
59 Views
Hi everyone! We are wondering whether it's possible for two macOS apps to use the Voice Processing mode of AVAudioEngine at the same time, since we have had issues trying to do so. Specifically, our app seems to cut off the input stream of the other app, but only when it has Voice Processing enabled. We are developing a macOS app that records microphone input simultaneously with videoconference apps like Zoom. We enable Voice Processing as in this sample: https://developer.apple.com/documentation/avfaudio/audio_engine/audio_units/using_voice_processing (see the snippet below). We have also noticed this behaviour in Safari when recording audio with the JavaScript Web Audio API, which also seems to use Voice Processing under the hood for its echo cancellation. Any leads on this would be greatly appreciated! Thanks
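For context, this is how we enable it, following the linked sample (sketch trimmed to the relevant call):

```swift
import AVFoundation

let engine = AVAudioEngine()
do {
    // Enables Apple's voice processing (echo cancellation etc.) on the
    // engine's I/O path, as in the sample project linked above.
    try engine.inputNode.setVoiceProcessingEnabled(true)
} catch {
    print("Could not enable voice processing: \(error)")
}
```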
Posted by
Post not yet marked as solved
1 Reply
58 Views
Hi everyone! We used to have an Intel Mac machine where we generated the Developer ID Installer & Application certificates for our signing and notarization process. That process works fine. Now we've moved from the Intel machine to an M1 Mac, where we want to do the same as before. I have tried two different approaches, but both end up failing:

- I exported the cert with the private key from my Intel machine to the M1 machine, but when I try to sign, I get: "Invalid signature." (Not sure what this error means in this case, as everything works on the Intel machine. I am guessing the cipher used for creating either the private key or the signature differs between the architectures.)
- I tried to generate new certs on the M1 machine, but I get the following error: "You already have a current Developer ID installer certificate or a pending certificate request." I tried with the same account and also with a different account; in both cases I got the same error.

I created a ticket with Apple, where they said to expect a reply within one to two business days, but no luck yet.
Posted by
Post not yet marked as solved
0 Replies
46 Views
Per the Apple API documentation (https://developer.apple.com/documentation/tipkit/tipuipopoverviewcontroller/imagesize), imageSize is meant to control the size of the image within the tip, but it doesn't seem to be working. My code is as follows, taken from the Apple docs example:

```swift
tipObservationTask = tipObservationTask ?? Task { @MainActor [weak controller] in
    for await shouldDisplay in tip.shouldDisplayUpdates {
        if shouldDisplay {
            let popoverController = TipUIPopoverViewController(tip, sourceItem: sourceItem)
            popoverController.imageSize = CGSize(width: 10, height: 10)
            controller?.present(popoverController, animated: true)
            tipPopoverController = popoverController
        } else if controller?.presentedViewController is TipUIPopoverViewController {
            controller?.dismiss(animated: true)
            tipPopoverController = nil
        }
    }
}
```
Posted by L4D
Post not yet marked as solved
0 Replies
50 Views
Hello, I've got an "Other Swift Flags" entry set up for one of my schemes. It works when I build locally, but not when I build using Xcode Cloud. My Other Swift Flags is set up like this (screenshot not shown). My code looks something like this:

```swift
class Config {
    #if LIMITED
    static let configProperty = 1
    #endif
}
```

However, Xcode Cloud says "Config has no member 'configProperty'" when I build my 'Limited' scheme.
Posted by
Post not yet marked as solved
0 Replies
50 Views
What documents would count for verifying my address for the EU DSA? Do I have to own the address, or just be reachable at it? My family shares a PO Box that I was hoping to use so I don't have to list my home address, or pay for my own box that would rarely receive any mail.
Posted by
Post not yet marked as solved
3 Replies
68 Views
We have a random issue where, when ARKitSession.run() is called, monitorSessionEvents() receives .paused and it never transitions to .running. If we exit the Immersive Space and call ARKitSession.run() again, it works fine. Unfortunately this is very difficult to manage in the flow of our app. (A sketch of our monitoring loop is below.)
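For reference, a simplified sketch of the monitoring loop (assuming `session` is the ARKitSession instance we pass to run(); exact handling trimmed):

```swift
import ARKit

func monitorSessionEvents(_ session: ARKitSession) async {
    for await event in session.events {
        switch event {
        case .dataProviderStateChanged(_, let newState, let error):
            // This is where we observe .paused with no later transition
            // to .running after the first run() call.
            print("Providers state: \(newState), error: \(String(describing: error))")
        case .authorizationChanged(let type, let status):
            print("Authorization changed: \(type) -> \(status)")
        default:
            break
        }
    }
}
```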
Posted by
Post not yet marked as solved
1 Reply
52 Views
Hello, I am trying to enumerate all the ways on macOS to launch an application when a user opens a session. Please note I am not looking for ways that require root or sudo privileges. I have found these:

- ~/Library/LaunchAgents/ (see the example agent below)
- Login Items (in macOS System Settings)

But are there others? Thanks
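For the LaunchAgents mechanism, a per-user agent requires no elevated privileges; a minimal example plist (the label and program path are hypothetical) would be:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Hypothetical identifier and app path, for illustration only -->
    <key>Label</key>
    <string>com.example.sessionlauncher</string>
    <key>ProgramArguments</key>
    <array>
        <string>/Applications/Example.app/Contents/MacOS/Example</string>
    </array>
    <!-- Launch once when the agent is loaded at login -->
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
```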
Posted by
Post not yet marked as solved
1 Reply
55 Views
I had the DuckDuckGo browser installed and working fine on my MacBook Pro for years. Currently DDG version 1.85.0, MBP M2 Max, Sonoma 14.4.1.

Up until two days ago everything worked fine. Then, while using DDG to access a copyright.gov site, a security alert popped up. Because I had followed a link from an email from the copyright.gov office, I suspected it might be suspicious, so I hit escape until it canceled the load, and exited DDG. I used the browser successfully the rest of the day with no issues. The following day, DDG would not open any websites I tried except two (google and duotrope.com). The next day, no website would open. The error messages I received were "DuckDuckGo Can't Load This Page" and "The network connection was lost."

There is no problem with my network. All email, other browsers, streaming services, everything works fine. But not DDG. I have reinstalled DDG several times, deleting any remaining DDG files I could find in between. I even tried it in safe mode, and I rebooted the WiFi router. Still the same error. I've checked network settings, keychain permissions, and antivirus permissions (I use Intego). I cannot seem to figure out what could be causing the issue. I have another MacBook Pro with DDG installed and it works fine. I tried copying that installation to this MBP and I get the same error. Hell, at this point I'd light incense candles and sacrifice a chicken if it would work. Does anyone have any idea what could be causing my issue?
Posted by
