Posts

Post not yet marked as solved
0 Replies
13 Views
I connect two AVAudioNodes using

- (void)connectMIDI:(AVAudioNode *)sourceNode to:(AVAudioNode *)destinationNode format:(AVAudioFormat * __nullable)format eventListBlock:(AUMIDIEventListBlock __nullable)tapBlock

and add an AUMIDIEventListBlock tap to it to capture the MIDI events. Both AUAudioUnits of the AVAudioNodes involved in this connection are set to use MIDI 1.0 UMP events:

[[avAudioUnit AUAudioUnit] setHostMIDIProtocol:kMIDIProtocol_1_0];

But all the MIDI channel voice events the tap receives are automatically converted to UMP MIDI 2.0 format. Is there something else I need to set so that the tap receives MIDI 1.0 UMPs? (Note: my app can handle MIDI 2.0, so it is not really a problem. This question is mainly to find out whether I forgot to set the protocol somewhere.) Thanks!!
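In case it helps anyone reproduce this, here is a minimal Swift sketch of the setup described above. The sampler nodes are placeholders (a real graph would use a MIDI-producing source), and it assumes the iOS 16+ connectMIDI(_:to:format:eventListBlock:) API; the tap simply reports which UMP protocol actually arrives.

import AVFoundation
import CoreMIDI

let engine = AVAudioEngine()
let source = AVAudioUnitSampler()        // placeholder MIDI source node
let destination = AVAudioUnitSampler()   // placeholder MIDI destination node
engine.attach(source)
engine.attach(destination)

// Ask both units to exchange MIDI 1.0 UMPs with the host.
source.auAudioUnit.hostMIDIProtocol = ._1_0
destination.auAudioUnit.hostMIDIProtocol = ._1_0

// Tap the MIDI connection and log which UMP protocol the packets use.
engine.connectMIDI(source, to: destination, format: nil, eventListBlock: { _, _, eventList in
    let proto = eventList.pointee.`protocol`
    print("Tap received", proto == ._1_0 ? "MIDI 1.0 UMPs" : "MIDI 2.0 UMPs")
    return noErr
})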
Post not yet marked as solved
0 Replies
8 Views
Hello, I have an app with two AppIntents. I invoke the first AppIntent with my voice command, and then I would like it to invoke the second AppIntent. I have implemented the necessary return type on the first AppIntent. The first AppIntent is invoked and executes successfully; however, when it calls the second AppIntent, everything inside the second AppIntent's perform() method works except the Siri dialog — Siri does not speak the dialog of the second AppIntent. Below are the implementation details for my two AppIntents, together with the AppShortcutsProvider. I appreciate any help.

struct MyFirstAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Show My Usages"

    func perform() async throws -> some ProvidesDialog & OpensIntent {
        let dialogStr = IntentDialog(stringLiteral: "You have a package")
        print("I'm in first AppIntent")
        return .result(opensIntent: MySecondAppIntent(), dialog: dialogStr)
    }
}

struct MySecondAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Show My Usages"

    func perform() async throws -> some IntentResult & ReturnsValue<String> & ProvidesDialog {
        print("I'm in second AppIntent")
        return .result(value: "Listing Packages", dialog: "You have activated Default")
    }
}

struct MyVoiceShortcutProvider: AppShortcutsProvider {
    @AppShortcutsBuilder static var appShortcuts: [AppShortcut] {
        AppShortcut(intent: MyFirstAppIntent(), phrases: ["Call my first intent \(.applicationName)"])
        AppShortcut(intent: MySecondAppIntent(), phrases: ["Call my second intent \(.applicationName)"])
    }
}
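One hedged workaround sketch (not confirmed Apple guidance): since the dialog of the chained intent isn't being spoken, you could run the second intent's work directly from the first intent and return a single dialog yourself. This assumes MySecondAppIntent exactly as defined above.

import AppIntents

struct MyFirstAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Show My Usages"

    func perform() async throws -> some ProvidesDialog {
        // Run the second intent's work in-process instead of chaining it.
        _ = try await MySecondAppIntent().perform()
        // Speak one combined dialog from the intent Siri actually invoked.
        return .result(dialog: "You have a package. You have activated Default.")
    }
}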
Post not yet marked as solved
0 Replies
10 Views
We are developing an app which connects to a BLE peripheral in the background whenever it gets close to it. So far we have monitored a circular region: when the phone enters the region, the app starts scanning for peripherals, and when it discovers the peripheral it connects to it. This worked pretty well for the last few iOS versions, roughly iOS 14-16. It wasn't perfect, but for the most part it felt like the connection happened rather quickly as you approached the BLE peripheral. If you were listening to music or talking to someone over a BLE headset, it could work noticeably worse; still, for the most part it worked satisfactorily.

Starting with iOS 17, and analyzing the functionality over the past six months or so, we've noticed a clear regression. It does generally connect to the peripheral, but the user often has to wait quite some time, and rather frequently the user even has to light up the phone's screen before anything happens. It appears that some sort of resource allocation, battery-saving feature, or similar is significantly affecting this functionality. The time between entering a region and physically reaching the device is generally around 2-3 minutes.

We have tried to do it more in line with the documentation and follow the guidelines in: https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html So instead of starting a scan when a region is entered, we directly invoke connectPeripheral:options:, offloading to the system our wish to connect to that peripheral. However, when testing this we see no real improvement. Sometimes it connects satisfactorily; sometimes it doesn't really connect at all; many times it connects only once the user lights up the screen. Judging purely by what the user experiences, our analysis is that this approach works even worse than what we previously did.

I understand that the system has many resources to consider, and that some requests may have to wait while others run. But is there any documentation on what to expect from initiating connectPeripheral:options: from the background? The link above simply states that the iOS device will reconnect when the user returns home — not much detail in terms of performance, which is crucial for our application.

If there aren't any further details on performance, are there other ways to improve this functionality? We are not looking to drain the battery whatsoever, but we need our background app to be as responsive as possible for a few minutes after region monitoring launches it. We understand that battery life is important, but since this happens rarely and sparsely (no more than a few times per day), it seems reasonable that there should be a way to make it function properly.
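For reference, a minimal sketch of the "offload to the system" variant described above, assuming a known peripheral identifier (the UUID here is hypothetical) and that location and Bluetooth permissions are already in place. The key points are keeping a strong reference to the peripheral and letting the pending connect(_:options:) wait indefinitely:

import CoreLocation
import CoreBluetooth

final class ReconnectManager: NSObject, CLLocationManagerDelegate, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    private var peripheral: CBPeripheral?
    private let knownPeripheralID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!  // hypothetical

    override init() {
        super.init()
        // State restoration so the system can relaunch the app for the connect event.
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "reconnect"])
    }

    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        guard central.state == .poweredOn else { return }
        if let p = central.retrievePeripherals(withIdentifiers: [knownPeripheralID]).first {
            peripheral = p                    // keep a strong reference or the request is dropped
            central.connect(p, options: nil)  // pending connect: no timeout, survives backgrounding
        }
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        // Discover services and do the time-critical work here.
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) { }
    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) { }
}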
Post not yet marked as solved
0 Replies
15 Views
This was mentioned in another thread 4 years ago:

"This whole discussion assumes that every network connection requires a socket. This isn't the case on most Apple platforms, which have a user-space networking stack that you can access via the Network framework [1].

[1] The one exception here is macOS, where Network framework has to run through the kernel in order to support NKEs. This is one of the reasons we're in the process of phasing out NKE support, starting with their deprecation in the macOS 10.15 SDK."

Is macOS still an unfortunate exception that requires a socket per Network framework connection?
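For context, the user-space path in question is what an ordinary Network framework connection uses, e.g.:

import Network

// A plain NWConnection: on iOS this runs through the user-space networking
// stack; the question above is whether macOS still backs it with a socket.
let connection = NWConnection(host: "example.com", port: 443, using: .tls)
connection.stateUpdateHandler = { state in print("state:", state) }
connection.start(queue: .main)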
Post not yet marked as solved
0 Replies
9 Views
I am getting the following message after accepting the agreement: "Your enrollment in the Apple Developer Program could not be completed at this time." This is consistent, and I have not been able to get past it. I added enough money to my bank account, as I thought that could have been the issue.
Post not yet marked as solved
1 Reply
15 Views
Initially, my task was to determine which type of connection is being used at the moment: 5G or 4G. I found CTTelephonyNetworkInfo().serviceCurrentRadioAccessTechnology, but there is a problem when the device has more than one SIM. My iPhone has two SIMs, one physical and one electronic, and I need to determine which one is used to access the network. serviceCurrentRadioAccessTechnology is a [String: String] dictionary that only indicates the radio technology of each card; it is not possible to tell from this dictionary which one is active. So how can I determine which of the two cards is currently being used to access the Internet?
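One possible approach (worth verifying on-device): CTTelephonyNetworkInfo also exposes dataServiceIdentifier (iOS 13+), which names the service currently carrying cellular data, and that identifier is a key into serviceCurrentRadioAccessTechnology:

import CoreTelephony

let info = CTTelephonyNetworkInfo()
// dataServiceIdentifier names the SIM/service carrying cellular data right now.
if let activeID = info.dataServiceIdentifier,
   let rat = info.serviceCurrentRadioAccessTechnology?[activeID] {
    let is5G = (rat == CTRadioAccessTechnologyNR) || (rat == CTRadioAccessTechnologyNRNSA)
    print("Active data service uses \(rat); 5G: \(is5G)")
}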
Post not yet marked as solved
0 Replies
13 Views
We are facing an issue on Catalyst when building our app with Xcode 15.4. The issue is related to precompiled frameworks and seems to be widespread, as it happens with multiple vendors (such as Firebase or Braze). We use SPM to add these dependencies, for instance:

.package(url: "https://github.com/braze-inc/braze-swift-sdk", from: "8.2.1"),

When building, we get the following error:

clang:1:1: invalid version number in '-target arm64-apple-ios10.15-macabi'

Our macOS deployment target is 12.3. Our iOS deployment target is 15.4. I will try to create a reproducer I can share, but I wanted to post this in case there's a known workaround. Thanks in advance!
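One thing worth trying while reproducing (a sketch, not a confirmed fix): pin an explicit Mac Catalyst deployment target in the package manifest (supported since swift-tools-version 5.5), so the triple SPM builds for Catalyst doesn't inherit a macOS-style version number. Product and version numbers below are illustrative:

// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [
        .iOS(.v15),
        .macOS(.v12),
        .macCatalyst(.v15)   // explicit Catalyst deployment target
    ],
    dependencies: [
        .package(url: "https://github.com/braze-inc/braze-swift-sdk", from: "8.2.1")
    ],
    targets: [
        .target(name: "MyApp", dependencies: [
            .product(name: "BrazeKit", package: "braze-swift-sdk")
        ])
    ]
)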
Post not yet marked as solved
0 Replies
12 Views
Hello, an element with position: fixed; bottom: 0; is cut off in iPhone Safari when the application snippet (app preview banner) is present. Clicking "x" or scrolling down fixes it. I am not quite sure how to address this. Fixing this bug is very problematic, since this preview mode cannot be recreated on a Mac/PC.
Post not yet marked as solved
0 Replies
15 Views
Hello fellow developers,

I've been working with the CallKit framework on iOS, specifically handling incoming calls. One issue I've encountered: when "read caller name" is enabled, Siri announces the caller name/surname set via localizedCallerName and then reads the generic handle value (usually alphanumeric) too! Has anyone encountered a similar situation, or is there a solution to prioritize the localizedCallerName over the generic handle value without using CXHandleType.emailAddress? Alternatively, any insights or workarounds you know would be greatly appreciated.

TL;DR: even when I correctly configure the localizedCallerName property, Siri persists in reading the CXHandleType.generic value.

The original implementation with CXHandleType.generic: the issue arises when using CXHandleType.generic for alphanumeric IDs (or even URLs, as stated by the documentation: https://developer.apple.com/documentation/callkit/cxhandle). Despite correctly setting the localizedCallerName, Siri continues to announce the generic handle value.

Expected behavior: Siri should read only the localizedCallerName when it is set, and ignore the generic handle value when announcing incoming calls.

Workaround: currently, the only workaround is to use CXHandleType.emailAddress for alphanumeric IDs. However, this is not ideal, since it repurposes an email-related handle type for a different purpose.

Steps to reproduce (see the sketch after this list):
1. Create a CallKit app that handles incoming calls (the example app from the documentation can be used too).
2. On an incoming call, create a CXCallUpdate object.
3. Create a CXHandle with CXHandleType.generic and an alphanumeric value (e.g., "ABC123").
4. Assign the CXHandle to the CXCallUpdate object's remoteHandle.
5. Set the localizedCallerName property of the CXCallUpdate object to a custom caller name/surname.
6. Report the call with reportNewIncomingCallWithUUID.
7. Observe that Siri reads both the localizedCallerName and the generic handle value during call announcements.

While we are here, a feature request: developers should be able to provide a user-friendly caller name without resorting to workarounds like using CXHandleType.emailAddress. I kindly request that Apple consider enhancing Siri's behavior in the following ways:
- Allow developers to suppress the reading of generic handle values while still using the correct handle type.
- Introduce additional type options for call announcements that don't read the generic value.
- Or both of the above.

Thank you for your help! 🙌
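For anyone wanting to reproduce quickly, a minimal sketch of steps 2-6 above (provider setup reduced to the bare minimum; the handle value and caller name are illustrative):

import CallKit

let provider = CXProvider(configuration: CXProviderConfiguration())
let update = CXCallUpdate()
update.remoteHandle = CXHandle(type: .generic, value: "ABC123")
update.localizedCallerName = "Jane Appleseed"   // what Siri should read

provider.reportNewIncomingCall(with: UUID(), update: update) { error in
    if let error = error { print("report failed:", error) }
    // With Siri call announcements enabled, both the caller name and the
    // generic handle value are read out.
}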
Post not yet marked as solved
0 Replies
8 Views
I am trying to develop a website that can upload an Instagram story from the web. I can open the Instagram app and even bring up the story upload pop-up, but I can't send the image data. I need your help. The source_application is detected by Instagram, but the backgroundImage is not received by Instagram. How can I send it?
Post not yet marked as solved
0 Replies
10 Views
We're looking forward to completing our review, but we need additional information about your app's cryptocurrency exchange services. This information is required to review the app for compliance with App Review Guideline 3.1.5(iii), which requires exchange services only be offered in countries or regions where the app has appropriate licensing and permissions to provide a cryptocurrency exchange.

Next Steps

Please provide the following information and/or documentation:
- Confirm in which specific countries or regions you intend to make your app's cryptocurrency exchange services available.
- Provide documentary evidence of the licensing and permissions for the cryptocurrency exchange services in your app for all of the countries or regions where your app is available. The documentation you provide should indicate where you intend to distribute your app.
- Provide links to government websites that display the licenses you've secured to provide exchange services in all the countries or regions where your app is available.
- Provide information on the third-party exchange APIs that your app connects with, including links to public APIs and documentation of partnership with specific third-party exchanges.
- Do the cryptocurrency exchange transactions occur between the users and the exchange(s), or do you, as the developer, handle the transaction requests with the exchange(s) directly?
- Are the cryptocurrency exchange features provided in your app decentralized, centralized, or a mix of decentralized and centralized exchange features?
- Does your app offer new or exclusive tokens or cryptocurrency to users? If so, which ones, and on which exchanges can users obtain the currency?
- Explain the precautions you've taken to comply with anti-money laundering ("AML") and Know Your Customer ("KYC") requirements.
- If you intend to distribute your app in the United States, provide a copy of your Money Services Business (MSB) registration. Additionally, confirm that you will restrict your app's availability to the states listed on your MSB registration.

I'm confused about this. The reviewer mentions cryptocurrency exchange services, but we enrolled as an organization and our app doesn't provide a cryptocurrency exchange. We just provide a decentralized application that lets users easily interact with our smart contracts. Basically, all transactions calling smart contracts are generated by our DApp and presented to the users for further verification (MetaMask, Trust Wallet, etc.) before being sent to the blockchain. So our app doesn't have any control over users' crypto assets. I have no idea whether we need licensing and permissions for cryptocurrency exchange services, because we don't provide a cryptocurrency exchange in our app. Am I missing something? Can anyone help explain this case? Thanks
Post not yet marked as solved
1 Reply
17 Views
My app has been rejected for not complying with Criminal Activity Reporting Guideline 1.7. I have to provide documentation showing partnership with local law enforcement (wherever the app is used). Here's the response I got:

"We noticed that your app allows users to report criminal activity, but we need additional information before continuing our review. Specifically, it is unclear if you have partnered with local law enforcement to respond to the reports of alleged criminal activity. To provide users a safe and reliable experience, apps may only be used to report criminal activity in locations where you have the active involvement of local law enforcement.

Next Steps

To ensure that your app is partnering with the appropriate institutions, you must provide documentation or evidence of your relationship with local law enforcement wherever your app is distributed. Please attach the documentation in the App Review Information section of App Store Connect. Once you have shared this documentation, we will proceed with our review and will let you know if there are any further issues.

Resources

Learn more about our requirements for ensuring user safety in App Review Guideline 1.0 - Safety."

My app provides a list of police stations, ambulances, etc. to use for emergency purposes only. It does not allow users to report anything anywhere! Please help!
Post not yet marked as solved
0 Replies
20 Views
Hi, is this possible? I would like to:
- Store a biometrically secured key in the Secure Enclave.
- Perform multiple cryptographic operations using that key within a short period of time (say 5 seconds), not all at once.
- Require only one Face ID prompt for that whole set.

So far I've only gotten either multiple flashing Face ID requests or failing operations. Is it possible to set a time window in which the first Face ID authentication is accepted? Should I do something else? Thanks!
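A sketch of one approach, assuming a Secure Enclave key created with .privateKeyUsage and .biometryCurrentSet and a hypothetical tag "com.example.key": evaluate the access control once on a single LAContext, then attach that same context to each keychain lookup via kSecUseAuthenticationContext, so the batch of operations reuses the one Face ID authentication.

import LocalAuthentication
import Security

let context = LAContext()
let access = SecAccessControlCreateWithFlags(
    nil, kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
    [.privateKeyUsage, .biometryCurrentSet], nil)!

// One Face ID prompt up front...
context.evaluateAccessControl(access, operation: .useKeySign,
                              localizedReason: "Sign pending requests") { success, error in
    guard success else { return }
    // ...then reuse the evaluated context for every subsequent operation.
    let query: [String: Any] = [
        kSecClass as String: kSecClassKey,
        kSecAttrApplicationTag as String: "com.example.key".data(using: .utf8)!,  // hypothetical tag
        kSecUseAuthenticationContext as String: context,
        kSecReturnRef as String: true
    ]
    var item: CFTypeRef?
    if SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess {
        let key = item as! SecKey
        // Perform several SecKeyCreateSignature calls with `key` here; they
        // should not re-prompt while `context` remains valid.
        _ = key
    }
}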
Post not yet marked as solved
0 Replies
13 Views
Hello, I am working on an AR application to visualize a life-size room. I am working with Unity 2023.3, Apple ARKit XR Plugin 6.0.0-pre.8, and a 2021 5th-gen iPad. First I scan a room with RoomPlan to get a USDZ file. I open it with Blender to make sure I have the right data (I do) and export it to FBX to use in Unity. Then I import the FBX into Unity and use it as a prefab, instantiating it when I click on a detected floor. I build my application in Unity, then in Xcode, to use it on my iPad. But when the room is displayed, it is far too small. I tried adding a slider to scale up the room's GameObject, and I added a plugin to visualize my Unity scene in the built application. The room scales up in the Unity scene but not in the application. Has anyone ever had this issue, and if so, how did you fix it? Best regards, Angel Garcia
Post not yet marked as solved
0 Replies
11 Views
Hi, guys. I am preparing to develop a Vision Pro app with Unity. Play to Device, which connects the Unity editor and the Vision Pro, works well, and there is no problem connecting to the Vision Pro simulator. But when I try to connect Xcode and the Vision Pro, I can't see the Vision Pro itself in the device list. (An iPhone 11 connected by cable as a test is recognized fine.) I looked it up on the forum, and connecting seemed simple; the link to the post I found is below. https://forums.developer.apple.com/forums/thread/746464 I don't know why it's not working, even after looking it up on YouTube. I'm leaving my work environment below and would appreciate a helpful answer.
MacBook: M2 MacBook
Xcode version: 15.3
Vision Pro version: 1.1.2
Developer accounts: all use the same Apple developer account
Post not yet marked as solved
0 Replies
22 Views
I'm currently facing an issue with my RADIUS server's EAP configuration and Apple devices. I'm using a certificate signed by "DigiCert Global Root G2", which is included in Apple's trusted root CA store (https://support.apple.com/en-us/105116). However, DigiCert uses an intermediate authority, "DigiCert Global G2 TLS RSA SHA256 2020 CA1", to sign customer certificates, and this seems to be causing a problem. When an Apple device tries to connect to the Wi-Fi, the RADIUS server presents its certificate, but the device doesn't trust it due to the untrusted intermediate certificate.

Here's my current configuration:
- Root CA in FreeRADIUS: "DigiCert Global Root G2"
- Server certificate in FreeRADIUS: intermediate + server certificate

I have also tried to extend the CA with the full chain, but since the final certificate is issued by the intermediate authority, my Apple devices continue to report that they don't trust the certificate. Has anyone else experienced this issue and found a solution? It seems unlikely that DigiCert would sign certificates with their (presumably offline) root authority. Any help or suggestions would be greatly appreciated. Thanks!
Post not yet marked as solved
0 Replies
14 Views
I'm trying to cast the screen from an iOS device to an Android device. I'm leveraging ReplayKit on iOS to capture the screen, and VideoToolbox to compress the captured video data into H.264 format using CMSampleBuffers. Both iOS and Android are configured for H.264 compression and decompression. While screen casting works flawlessly within the same platform (iOS to iOS or Android to Android), I'm encountering an error ("not in avi mode") on the Android receiver when casting from iOS. My research suggests that the underlying container formats for H.264 might differ between iOS and Android. Data transmission over the TCP socket is functioning correctly. My question is: is there a way to ensure a common container format for H.264 compression and decompression across iOS and Android platforms?

Here's a breakdown of the iOS sender:
- Device: iPhone 13 mini running iOS 17
- Development environment: Xcode 15 with a minimum deployment target of iOS 16
- Screen capture: ReplayKit, yielding CMSampleBuffers
- Video compression: VideoToolbox, H.264
- Compression properties:
  - kVTCompressionPropertyKey_ConstantBitRate: 6144000 (bit rate)
  - kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Main_AutoLevel (profile and level)
  - kVTCompressionPropertyKey_MaxKeyFrameInterval: 60 (maximum keyframe interval)
  - kVTCompressionPropertyKey_RealTime: true (real-time encoding)
  - kVTCompressionPropertyKey_Quality: 1 (quality, where 1.0 is the maximum on the 0.0-1.0 scale)
- NAL unit handling: a custom header is added to NAL units

Android receiver:
- Device: Redmi 7A running Android 10
- Video decoding: MediaCodec API for receiving and decoding the H.264 stream
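On the container question: VideoToolbox emits AVCC-style samples (a 4-byte big-endian length before each NAL unit), while Android's MediaCodec generally expects an Annex B elementary stream (00 00 00 01 start codes). A hedged sketch of converting one compressed CMSampleBuffer before sending it over the socket (the SPS/PPS from the sample's format description must also be sent with start codes, which is omitted here):

import CoreMedia
import Foundation

func annexBData(from sampleBuffer: CMSampleBuffer) -> Data? {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else { return nil }
    var totalLength = 0
    var dataPointer: UnsafeMutablePointer<CChar>?
    guard CMBlockBufferGetDataPointer(blockBuffer, atOffset: 0,
                                      lengthAtOffsetOut: nil,
                                      totalLengthOut: &totalLength,
                                      dataPointerOut: &dataPointer) == kCMBlockBufferNoErr,
          let base = dataPointer else { return nil }

    let startCode: [UInt8] = [0, 0, 0, 1]
    var out = Data()
    var offset = 0
    while offset + 4 <= totalLength {
        // Read the 4-byte big-endian NAL length, then replace it with a start code.
        var nalLength: UInt32 = 0
        memcpy(&nalLength, base + offset, 4)
        nalLength = CFSwapInt32BigToHost(nalLength)
        out.append(contentsOf: startCode)
        out.append(Data(bytes: base + offset + 4, count: Int(nalLength)))
        offset += 4 + Int(nalLength)
    }
    return out
}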
Post not yet marked as solved
0 Replies
47 Views
I need to enable a user to log in to an SAP system. The login does not work on the Apple Watch. Is this because JavaScript is not supported? Is there another way for a user to log in to an SAP system from an Apple Watch?