Posts

We have a workspace with three projects in it. Trying to export localizations for the workspace fails with the "ComputeTargetDependencyGraph failed with a nonzero exit code" error, but with no additional information to track down the failure. Here are the exact steps I've tried:
1. Click Menu Bar > Product > Export Localizations > Workspace (the first item in the menu).
2. A few moments later, an error alert pops up that says "Unable to build project for localization string extraction".
3. In the build log tab, it shows this:
If I try running xcodebuild -exportLocalizations -localizationPath ~/ExportedWorkspaceLocalizations -workspace <workspaceLocation> -exportLanguage en, the same "ComputeTargetDependencyGraph failed with a nonzero exit code" error message appears. Exporting the three projects individually works great when I go to Menu Bar > Product > Export Localizations and select one of the three projects instead of the workspace. Has anyone else run into this error? I haven't been able to find any additional build logs that would point to a more concrete error.
I have a Canvas inside a ScrollView on a Mac. The Canvas's size is determined by a model (for the example below, I am simply drawing a grid of circles of a given radius). Everything appears to work fine. However, I am wondering if it is possible for the Canvas rendering code to know what portion of the Canvas is actually visible in the ScrollView? For example, if the Canvas is large but the visible portion is small, I would like to avoid drawing content that is not visible. Is this possible? Example of the Canvas in a ScrollView I am using for testing:

struct MyCanvas: View {
    @ObservedObject var model: MyModel

    var body: some View {
        ScrollView([.horizontal, .vertical]) {
            Canvas { context, size in
                // Placeholder rendering code
                for row in 0..<model.numOfRows {
                    for col in 0..<model.numOfColumns {
                        let left: CGFloat = CGFloat(col * model.radius * 2)
                        let top: CGFloat = CGFloat(row * model.radius * 2)
                        let size: CGFloat = CGFloat(model.radius * 2)
                        let rect = CGRect(x: left, y: top, width: size, height: size)
                        let path = Circle().path(in: rect)
                        context.fill(path, with: .color(.red))
                    }
                }
            }
            .frame(width: CGFloat(model.numOfColumns * model.radius * 2),
                   height: CGFloat(model.numOfRows * model.radius * 2))
        }
    }
}
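One possible approach, sketched below under assumptions not in the original post: give the ScrollView a named coordinate space, read the content's frame in that space with a GeometryReader to recover the scroll offset, and clamp the drawing loops to the visible rectangle. VisibleAwareCanvas, visibleRect, and the "scroll" space name are illustrative placeholders; newer SDKs also offer scroll-geometry callbacks that could replace the GeometryReader trick.

import SwiftUI

struct VisibleAwareCanvas: View {
    @ObservedObject var model: MyModel
    @State private var visibleRect: CGRect = .zero

    var body: some View {
        let diameter = CGFloat(model.radius * 2)
        let canvasSize = CGSize(width: CGFloat(model.numOfColumns) * diameter,
                                height: CGFloat(model.numOfRows) * diameter)

        GeometryReader { viewport in
            ScrollView([.horizontal, .vertical]) {
                Canvas { context, _ in
                    // Convert the visible rectangle into clamped row/column ranges
                    // so only circles that can actually be seen are drawn.
                    let firstRow = min(model.numOfRows, max(0, Int(visibleRect.minY / diameter)))
                    let lastRow = min(model.numOfRows, max(firstRow, Int((visibleRect.maxY / diameter).rounded(.up))))
                    let firstCol = min(model.numOfColumns, max(0, Int(visibleRect.minX / diameter)))
                    let lastCol = min(model.numOfColumns, max(firstCol, Int((visibleRect.maxX / diameter).rounded(.up))))
                    for row in firstRow..<lastRow {
                        for col in firstCol..<lastCol {
                            let rect = CGRect(x: CGFloat(col) * diameter, y: CGFloat(row) * diameter,
                                              width: diameter, height: diameter)
                            context.fill(Circle().path(in: rect), with: .color(.red))
                        }
                    }
                }
                .frame(width: canvasSize.width, height: canvasSize.height)
                .background(GeometryReader { content in
                    // The content's origin in the "scroll" space is the negative scroll offset,
                    // so the visible rect in canvas coordinates starts at (-x, -y).
                    let origin = content.frame(in: .named("scroll")).origin
                    Color.clear
                        .onAppear {
                            visibleRect = CGRect(origin: CGPoint(x: -origin.x, y: -origin.y),
                                                 size: viewport.size)
                        }
                        .onChange(of: origin) { newOrigin in
                            visibleRect = CGRect(origin: CGPoint(x: -newOrigin.x, y: -newOrigin.y),
                                                 size: viewport.size)
                        }
                })
            }
            .coordinateSpace(name: "scroll")
        }
    }
}

This keeps the frame (and scroll bars) at full content size while limiting the fill work to roughly what the viewport shows.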
Posted by Todd2.
I'm currently working with a complication using WidgetKit for watchOS. When I select the complication from the Watch app on the iPhone, the complication does not show content; in the complication gallery, an untitled complication is selected. But when I select the complication from the watch, it's OK. This bug occurs on both real devices and the simulator, but only with some pairings. For example:
Apple Watch Ultra (watchOS 10.4) paired with iPhone 14 Pro (iOS 17.0): NG
Apple Watch Ultra (watchOS 10.4) paired with iPhone 14 Pro (iOS 16.1): NG
Apple Watch Ultra (watchOS 10.0) paired with iPhone 14 Pro (iOS 17.0): OK
I tried creating a simple project to check this bug, but it still occurs. This is the sample project: Github
Posted by Kaze2910.
Hi! I was trying to port our SDK to visionOS. I was going through the documentation and saw this video: https://developer.apple.com/videos/play/wwdc2023/10089/ Is there any working code sample for it? The same goes for the ARKit C API; I couldn't find any links. Thanks in advance. Sahil
Posted by saagn.
Hello everyone. A few months ago I submitted my new app for review, but got an automated message back: "We need additional time to evaluate your submission and Apple Developer Program account. Your submission status will appear as "Rejected" in App Store Connect while we investigate. However, we do not require a revised binary or additional information from you at this time." I thought it wouldn't take long (I was wrong), but I waited for a whole month and nothing happened. Then I started researching, and some other devs suggested cancelling that submission and resubmitting, so I did just that. The same thing happened this time, and it's been almost 3 weeks that I've been waiting. If anyone from Apple sees this, please help me :(
Posted by ExoB.
Hello! I need to display a .scnz 3D model in an iOS app. I tried converting the file to a .scn file so I could use it with SCNScene, but the file became corrupted. I also tried to instantiate a SCNScene with the .scnz file directly, but that didn't work either (it crashed when instantiating). After all this, what would be the best way to use this file, given that converting or exporting it to a .scn file with scntool hasn't worked? Thank you!
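For reference, a minimal sketch of the two standard SceneKit loading paths for a bundled scene archive; the helper name, the "model.scnz" file name, and the fallback logic are assumptions for illustration, not from the post, and are worth verifying against the file that is crashing:

import SceneKit

// Hypothetical helper: try SCNScene(named:) first, then an explicit URL so the error is visible.
func loadBundledScene(named name: String) -> SCNScene? {
    // SCNScene(named:) resolves supported scene files in the main bundle.
    if let scene = SCNScene(named: name) {
        return scene
    }
    // Fall back to loading from a URL, which throws a descriptive error instead of returning nil.
    guard let url = Bundle.main.url(forResource: (name as NSString).deletingPathExtension,
                                    withExtension: (name as NSString).pathExtension) else {
        return nil
    }
    do {
        return try SCNScene(url: url, options: nil)
    } catch {
        print("SceneKit could not open \(name): \(error)")
        return nil
    }
}

// Usage, assuming an SCNView outlet:
// sceneView.scene = loadBundledScene(named: "model.scnz")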
The status remains "In App Review" continuously. I deleted the submission and resubmitted it, but it still remains in "In App Review" status. Apple ID: 6480371419
I am developing an immersive application featuring hands interacting with my virtual objects. When my hand passes through an object, the rendered color of my hand looks like the hand color and the object's color blended together, both semi-transparent. I wonder if it is possible to make my hand always "opaque", i.e. the alpha value of the rendered hand (since it's video see-through) is always 1, while the object's alpha value could vary depending on whether it is interacting with the hand. (I was thinking this kind of feature might be supported by a specific component, just like HoverEffectComponent, but I didn't find one.)
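Not a per-object alpha control, but one related knob worth checking is the upperLimbVisibility scene modifier on an ImmersiveSpace, which asks the system to keep the passthrough hands and arms rendered in front of virtual content. A minimal sketch, assuming a RealityView-based immersive space (the app and space names are placeholders):

import SwiftUI
import RealityKit

@main
struct HandsOverContentApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "Immersive") {
            RealityView { content in
                // Add the interactive entities here.
            }
        }
        // .visible requests that hands/arms draw on top of virtual content;
        // .automatic (the default) lets the system blend them, which can look semi-transparent.
        .upperLimbVisibility(.visible)
    }
}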
Posted by milanowth.
Good day. I'm inquiring whether there is a way to test functionality between the Apple Pencil Pro and the Apple Vision Pro. I'm trying to work on an idea that would require a tool like the Pencil as an input device. Will there be an SDK for this kind of connectivity?
Posted by MrDanger.
Steps to reproduce:
1. Connect the iPhone to a Bluetooth keyboard and enable "Full Keyboard Access" in Settings.
2. Go to https://material.angular.io/components/select/examples
3. Open any dropdown and use the keyboard to tab away.
Actual behavior: Focus moves to the next control and the dropdown panel is still open.
Expected behavior: The dropdown should be collapsed when the user tabs away using the keyboard.
Posted by denises0.
Hello,
iPhone XS for testing, Xcode 15.3.
I've successfully managed to select the AID and send other commands to a SmartCard using the following code:

import CoreNFC

class NFCReader: NSObject, ObservableObject, NFCTagReaderSessionDelegate {
    var session: NFCTagReaderSession?

    func startNFCSession() {
        session = NFCTagReaderSession(pollingOption: [.iso14443], delegate: self, queue: nil)
        session?.alertMessage = "Hold your iPhone near the NFC tag."
        session?.begin() // Start the session
    }

    func tagReaderSessionDidBecomeActive(_ session: NFCTagReaderSession) {
        // Required by NFCTagReaderSessionDelegate; nothing to do here.
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didInvalidateWithError error: Error) {
        // Required by NFCTagReaderSessionDelegate; this is where the Code=200 "Session invalidated by the user" error arrives.
        print("Session invalidated: \(error.localizedDescription)")
    }

    func tagReaderSession(_ session: NFCTagReaderSession, didDetect tags: [NFCTag]) {
        guard let tag = tags.first else { // Safely unwrap the first tag
            session.invalidate(errorMessage: "No SmartCard detected.")
            return
        }
        session.connect(to: tag, completionHandler: { (error: Error?) in
            if let error = error {
                session.invalidate(errorMessage: "Unable to connect to SmartCard: \(error.localizedDescription)")
                return
            }
            switch tag {
            case let .iso7816(iso7816Tag):
                // sendAPDUCommandSelect(to:) and sendAPDUCommand_2(to:) are my own APDU helpers (not shown).
                self.sendAPDUCommandSelect(to: iso7816Tag)
                self.sendAPDUCommand_2(to: iso7816Tag)
            default:
                print("Not iso7816 type") // Update the SwiftUI view
                break
            }
        })
    }
}

I encounter an issue when attempting to interact with the user for specific commands. I need to maintain the session active in the background while waiting for additional commands. However, upon closing the NFC dialog, I receive the NFCError Code=200 "Session invalidated by the user" error.
My ideal use case:
Button1: Initiates the connection and waits for the SmartCard to be presented; tagReaderSession() connects to the SmartCard and sends the AID.
Button2: Sends APDU command 2.
Button3: Sends APDU command 3.
To proceed with clicking the other buttons, the NFC dialog must be closed, but doing so invalidates the session with error 200. Is there a way to keep the connection valid in the background while waiting for additional commands?
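There is no supported way to keep an NFCTagReaderSession alive after its sheet is dismissed, so a common workaround is to gather everything the user wants to do before starting the session and run the whole queue inside a single didDetect callback. A rough sketch under that assumption; the helper and its parameters are made up for illustration:

import CoreNFC

extension NFCReader {
    // Hypothetical helper: send a queue of APDUs sequentially while the tag stays connected.
    func send(_ commands: [NFCISO7816APDU], to tag: NFCISO7816Tag, in session: NFCTagReaderSession) {
        guard let command = commands.first else {
            session.alertMessage = "Done."
            session.invalidate() // All commands sent; end the session cleanly.
            return
        }
        tag.sendCommand(apdu: command) { _, sw1, sw2, error in
            if let error = error {
                session.invalidate(errorMessage: "APDU failed: \(error.localizedDescription)")
                return
            }
            guard sw1 == 0x90, sw2 == 0x00 else {
                session.invalidate(errorMessage: String(format: "Card returned %02X%02X", sw1, sw2))
                return
            }
            // Recurse with the remaining commands while the connection is still valid.
            self.send(Array(commands.dropFirst()), to: tag, in: session)
        }
    }
}

With this pattern, the buttons would only assemble the list of NFCISO7816APDU values, and the reader session would be started once to present the card and run them all in one pass.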
We've switched over to using the new App Store Server API on our server instead of the deprecated Apple verify receipt method. But we've noticed some differences between our purchase records approved by the App Store Server API and what's shown on the Apple iTunes dashboard for purchases made on May 7th, 2024. Just to clarify, the purchase dates are in UTC and are taken from the App Store Server API transaction requests. Below are the purchase dates and the subscription names for the 7th:
2024-05-07 22:46:16 - monthly subscription
2024-05-07 22:31:53 - monthly subscription
2024-05-07 22:26:53 - six months subscription
2024-05-07 22:19:16 - monthly subscription
2024-05-07 22:01:04 - monthly subscription
2024-05-07 21:28:56 - six months subscription
2024-05-07 20:09:04 - six months subscription
2024-05-07 20:01:31 - six months subscription
2024-05-07 19:43:24 - monthly subscription
2024-05-07 19:12:30 - annual subscription
2024-05-07 18:30:36 - six months subscription
2024-05-07 00:43:33 - monthly subscription
While our records indicate these transactions, the Apple iTunes dashboard only displays 3 monthly subscriptions for May 7th. May 6th and 5th are worse; barely 1 or 2 monthly subscriptions are showing. We think there's a problem that Apple engineers or developers need to look into. Before, when we used the verify receipt method in our API, purchases would show up on the iTunes dashboard the next day. But now that we've switched to the new transaction method, there's a delay in seeing these purchases on the dashboard. Have you migrated to the new App Store Server API and encountered a similar issue? What could be causing this delay?
Posted by thus78.
6 weeks ago, we received an email from the Apple Developer Program requesting certain documents. After we uploaded the requested documents, we received a message stating, "Thank you for providing the documents we requested. We will review them and follow up with you within two business days." However, we are still awaiting a response after 6 weeks. What steps can we take to expedite communication with the Apple Developer Program? Has anyone else experienced a similar issue? Additionally, how long does it typically take for the Apple Developer Program to reply after requesting documents to be uploaded? Thank you.
Posted by Simona_.
Hi, I was working with URLSession.upload with a background configuration. I came across this cancellation reason in URLError.BackgroundTaskCancelledReason: backgroundUpdatesDisabled. The docs suggest it is triggered while background tasks are disabled. Does that mean disabled by the user? Can anyone please shed light on how this cancellation reason can occur? Who can disable the background upload (the user or the system ...), and how?
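For context, a minimal sketch of where this reason surfaces, assuming a background session with a task delegate; the session identifier, delegate name, and log strings are placeholders. The error handed to urlSession(_:task:didCompleteWithError:) can be cast to URLError, whose backgroundTaskCancelledReason property carries this value:

import Foundation

final class UploadDelegate: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession, task: URLSessionTask, didCompleteWithError error: Error?) {
        // backgroundTaskCancelledReason is only populated when the system cancelled a background task.
        guard let reason = (error as? URLError)?.backgroundTaskCancelledReason else {
            if let error = error { print("Upload failed: \(error)") }
            return
        }
        if reason == .backgroundUpdatesDisabled {
            // Reported when background updates are disabled for the app, e.g. Background App Refresh
            // turned off for it in Settings (assumption based on the documented description).
            print("Upload cancelled: background updates disabled")
        } else {
            // Other documented reasons: .insufficientSystemResources, .userForceQuitApplication.
            print("Upload cancelled: \(reason)")
        }
    }
}

// Hypothetical setup for the background upload session that uses this delegate.
let config = URLSessionConfiguration.background(withIdentifier: "com.example.uploads")
let session = URLSession(configuration: config, delegate: UploadDelegate(), delegateQueue: nil)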
This is Saeid from the Drion.ai AG company in Germany. We are a marketing startup developing a marketing platform, and we need to give our users the option to sign in using their Apple ID. I have sent all of the company documents, my ID card, and every other document we have in the company just to activate the Apple ID login API, and nothing has happened after more than 2 months. I had a call from Apple support and she told me to send some documents, which I did, but nothing yet. Is this Apple's support for developers?!
I have a SwiftUI iPhone app running on the "iOS Apps on Mac" simulator. What I'm trying to do is get a notification when the window size changes, but nothing seems to work. I have tried:

.onReceive(NotificationCenter.default.publisher(for: UIContentSizeCategory.didChangeNotification)) { _ in
    updateStuff()
}

I also tried:

.onAppear() {
    updateStuff()
}

but neither seems to get called. Any suggestions?
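Note that UIContentSizeCategory.didChangeNotification relates to Dynamic Type, not window geometry, which may be why it never fires here. A minimal sketch of one way to observe the root view's size from SwiftUI itself, assuming the existing root view can be wrapped in a GeometryReader; ContentView and updateStuff(size:) are placeholders for the app's own view and update code:

import SwiftUI

struct ResizableRootView: View {
    var body: some View {
        GeometryReader { proxy in
            ContentView()
                .onAppear {
                    updateStuff(size: proxy.size) // Initial size once layout has happened.
                }
                .onChange(of: proxy.size) { newSize in
                    updateStuff(size: newSize) // Fires whenever the window is resized on the Mac.
                }
        }
    }

    // Placeholder for whatever the app needs to recompute.
    private func updateStuff(size: CGSize) {
        print("Window content size is now \(size)")
    }
}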