Posts

Post not yet marked as solved · 0 Replies · 7 Views
Hi all, I'm working on a small PoC to get Content Filtering (FilterDataProvider) working on macOS without any user interaction. So far, I've pushed two payloads to my machine using user-approved MDM enrollment:

com.apple.system-extension-policy
com.apple.webcontent-filter

The application containing the network extension is present in /Applications. Both profiles install successfully, and I can see a Content Filter is created in the Network section of System Settings. The status even says "Enabled", but the dot remains orange. Inspecting the system logs (specifically, filtering on process:neagent) shows me the following error:

Failed to find a com.apple.networkextension.filter-data extension inside of app com.my.app.containing.the.ext

The network extension only starts (without prompts, as expected) and everything works once I submit an activation request using OSSystemExtensionRequest.activationRequest. Is this expected behaviour? Do I need to submit an activation request through code regardless of the fact that MDM pre-approved the System Extension prompts and created the Content Filter in System Settings?
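For reference, the code-based activation mentioned above looks roughly like this; a minimal sketch, assuming a hypothetical extension bundle identifier of com.my.app.containing.the.ext.extension (the identifier and delegate wiring are illustrative, not taken from the post):

import SystemExtensions

final class ExtensionActivator: NSObject, OSSystemExtensionRequestDelegate {
    func activate() {
        // Ask the system to activate the network extension bundled in the app.
        let request = OSSystemExtensionRequest.activationRequest(
            forExtensionWithIdentifier: "com.my.app.containing.the.ext.extension",
            queue: .main
        )
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
    }

    // MARK: OSSystemExtensionRequestDelegate

    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties) -> OSSystemExtensionRequest.ReplacementAction {
        .replace
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {
        // With an MDM-pushed com.apple.system-extension-policy payload in place,
        // this callback should not fire.
    }

    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) {
        print("Activation finished: \(result)")
    }

    func request(_ request: OSSystemExtensionRequest, didFailWithError error: Error) {
        print("Activation failed: \(error)")
    }
}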
Post not yet marked as solved · 0 Replies · 4 Views
Hello, We have discovered an app on the App Store that has copied our application entirely, including the UI, icons, and artwork. The only difference is a single letter in the app name, while the logo remains remarkably similar. We couldn't find a report button on the App Store page. How can we report it? Thank you.
Post not yet marked as solved · 0 Replies · 5 Views
Hi! I would like to release my app in the EU, where I would be classified as a trader. I therefore have to provide some data such as my address, phone number, etc. My question is: can I use a VoIP number as the phone number? I'm just concerned about the SMS verification. Much appreciated!
Post not yet marked as solved · 0 Replies · 6 Views
Hi everybody, I'm trying to implement Firebase messaging in a new iOS app. I've done everything (adding the Background Modes and Push Notifications capabilities, importing my key into Firebase, ...) as described in the various tutorials I found on the internet. The problem is that the console doesn't show me a token. Does anyone have an idea? Here's my code:

// pushnotifApp.swift
// pushnotif
//
// Created by Thibault COLLIN on 13/05/2024.

import SwiftUI
import FirebaseCore
import FirebaseMessaging
import UserNotifications

class AppDelegate: NSObject, UIApplicationDelegate, MessagingDelegate, UNUserNotificationCenterDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        print("Application did finish launching")
        FirebaseApp.configure()
        Messaging.messaging().delegate = self
        UNUserNotificationCenter.current().delegate = self

        let authOptions: UNAuthorizationOptions = [.alert, .badge, .sound]
        UNUserNotificationCenter.current().requestAuthorization(options: authOptions) { (granted, error) in
            if granted {
                print("Notification authorization granted")
                DispatchQueue.main.async {
                    application.registerForRemoteNotifications()
                }
            } else {
                print("Notification authorization denied")
            }
        }
        return true
    }

    func messaging(_ messaging: Messaging, didReceiveRegistrationToken fcmToken: String?) {
        print("Firebase registration token received: \(String(describing: fcmToken))")
    }
}

@main
struct pushnotifApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) var delegate

    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

And the console result:

Application did finish launching
10.26.0 - [FirebaseMessaging][I-FCM001000] FIRMessaging Remote Notifications proxy enabled, will swizzle remote notification receiver handlers. If you'd prefer to manually integrate Firebase Messaging, add "FirebaseAppDelegateProxyEnabled" to your Info.plist, and set it to NO. Follow the instructions at: https://firebase.google.com/docs/cloud-messaging/ios/client#method_swizzling_in_firebase_messaging to ensure proper integration.
Notification authorization granted
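One piece the code above never handles is the APNs device token, which FCM needs before it can mint a registration token. This is a common missing piece rather than a confirmed diagnosis; a minimal sketch of the relevant callbacks (the extension placement is illustrative):

import UIKit
import FirebaseMessaging

extension AppDelegate {
    func application(_ application: UIApplication,
                     didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
        // FCM can only mint a registration token once it has the APNs token,
        // so hand it over explicitly here.
        Messaging.messaging().apnsToken = deviceToken
    }

    func application(_ application: UIApplication,
                     didFailToRegisterForRemoteNotificationsWithError error: Error) {
        // If this fires (e.g. on some simulators), no FCM token will follow.
        print("APNs registration failed: \(error)")
    }
}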
Post not yet marked as solved · 0 Replies · 13 Views
Hello, I'm working on an app that makes use of Screen Time features by leveraging the Family Controls, Device Activity and Managed Settings frameworks. The main app works fine, shielding/unshielding apps with a toggle. When it comes to monitoring time intervals with the Device Activity Monitor (DAM) extension (e.g. lock X apps for Y minutes), I'm experiencing several issues. To shield/unshield apps and kick off the monitoring, I perform the following instructions:

let timeInMinutes: TimeInterval = 15
let startDate = Date(timeIntervalSinceNow: 1.0) // padding added to avoid invalid DAM ranges < 15 mins
let endDate = startDate.addingTimeInterval(timeInMinutes * 60.0)

let components: Set<Calendar.Component> = [.day, .month, .year, .hour, .minute, .second]
let calendar = Calendar.current
let intervalStart = calendar.dateComponents(components, from: startDate)
let intervalEnd = calendar.dateComponents(components, from: endDate)
let schedule = DeviceActivitySchedule(intervalStart: intervalStart, intervalEnd: intervalEnd, repeats: false)

try deviceActivityCenter.startMonitoring(.definiteShield, during: schedule)

let managedSettingsStore = ManagedSettingsStore()
managedSettingsStore.shield.applications = selection.applicationTokens // `selection` being an instance of `FamilyActivitySelection`

The main pain points are:

1. After this code runs, I would expect the Device Activity Monitor extension to start, or at least to start once the app goes to the background. To check whether the DAM extension is running or not, I attach to the extension process manually (Product > Attach to Process by PID or Name). But I can see the extension correctly running only after 3-4 attempts at calling startMonitoring.
2. Even when the DAM extension runs, the intervalDidStart and intervalDidEnd methods in the extension are called quite randomly - most of the time not being called at all - thus making the extension hugely unreliable.

Please note:

- I already ask for Screen Time permissions during onboarding by calling AuthorizationCenter.shared.requestAuthorization(for: .individual), so by the time the user shields the apps, these permissions are already granted.
- I already have Family Controls entitlements for development and distribution, for both the main target and the DAM extension target.
- In the intervalDidEnd method, I simply call ManagedSettingsStore().clearAllSettings() and DeviceActivityCenter().stopMonitoring(). This appears to be enough to stay well below the 6MB memory limit.

Am I doing something wrong, is there a way to fix this, or is it just that the Device Activity framework is unstable?
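For context, the extension side described in the list above would look roughly like this; a minimal sketch, assuming .definiteShield is defined in a shared DeviceActivityName extension and MyDeviceActivityMonitor is the (hypothetically named) principal class of the DAM extension target:

import DeviceActivity
import ManagedSettings

extension DeviceActivityName {
    // Assumed shared definition of the name passed to startMonitoring above.
    static let definiteShield = Self("definiteShield")
}

class MyDeviceActivityMonitor: DeviceActivityMonitor {
    override func intervalDidStart(for activity: DeviceActivityName) {
        super.intervalDidStart(for: activity)
        // Shields could also be applied here rather than in the main app.
    }

    override func intervalDidEnd(for activity: DeviceActivityName) {
        super.intervalDidEnd(for: activity)
        // Mirrors the post: clear shields and stop monitoring when the window ends.
        ManagedSettingsStore().clearAllSettings()
        DeviceActivityCenter().stopMonitoring()
    }
}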
Post not yet marked as solved · 0 Replies · 19 Views
I am running two different background modes (not at the same time): one with a workout and one with location. I noticed that the app logo appears above the watch face for both background modes, but it does not show up consistently. I wonder what the significance of the logo showing up above the watch face is? Additionally, why does it show up sometimes but not others? Thanks
Post not yet marked as solved · 0 Replies · 11 Views
I am unsure what happened, but almost all my documents are missing from multiple folders on my Mac and in the cloud. I have reviewed all the diagnostics posted on Apple's websites and Q&As to recover them, and I have tried everything. When I can find some of those documents listed (some being Excel spreadsheets) in their app and try to open them, it says that the file has been deleted or is currently not accessible. All of these documents were saved on my desktop; I searched the computer far and wide, downloaded recovery apps to find them, and found nothing. I am not even sure how they disappeared. I tried updating my Mac to the newest software, as I read they may reappear after that, but there is still nothing.
Post not yet marked as solved · 0 Replies · 19 Views
Hi, how do I change the title, subtitle, primaryButtonLabel, and secondaryButtonLabel values according to the iPhone language? I noticed that the Shield Configuration extension only runs once, when I turn on the shield. Currently I can't find a way to make the Shield Configuration extension run again. Thanks!
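For readers unfamiliar with the extension in question, those values come from a ShieldConfigurationDataSource subclass in the shield configuration extension; a minimal sketch, assuming hypothetical localization keys (shield.title etc.) in the extension target's strings file:

import ManagedSettings
import ManagedSettingsUI
import UIKit

class ShieldConfigurationExtension: ShieldConfigurationDataSource {
    override func configuration(shielding application: Application) -> ShieldConfiguration {
        // NSLocalizedString resolves against the extension's own localizations,
        // so each key needs a translation in every supported language.
        ShieldConfiguration(
            backgroundBlurStyle: .systemMaterial,
            backgroundColor: .systemBackground,
            icon: UIImage(systemName: "lock.fill"),
            title: ShieldConfiguration.Label(
                text: NSLocalizedString("shield.title", comment: "Shield title"),
                color: .label),
            subtitle: ShieldConfiguration.Label(
                text: NSLocalizedString("shield.subtitle", comment: "Shield subtitle"),
                color: .secondaryLabel),
            primaryButtonLabel: ShieldConfiguration.Label(
                text: NSLocalizedString("shield.primary", comment: "Primary button"),
                color: .white),
            primaryButtonBackgroundColor: .systemBlue,
            secondaryButtonLabel: ShieldConfiguration.Label(
                text: NSLocalizedString("shield.secondary", comment: "Secondary button"),
                color: .systemBlue)
        )
    }
}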
Post not yet marked as solved · 0 Replies · 23 Views
I'm building a visionOS app which loads a Reality Composer scene with a large number of models. The app includes several of these scenes and allows the user to switch between them. Because the scenes have a large number of models, I want to unload the currently loaded scene before loading a different one. So far I have been unable to reclaim all of the used memory by removing the entities from the scene.

I've made a few small changes to the Mixed Immersive app template which demonstrate this behavior, included below (apparently I'm unable to upload a zip file with the entire project). Using just the two spheres included in the RealityKit content, the leaked memory is fairly small, but if you add a couple of larger models to the scene (I was able to easily find free ones online) then the memory leak becomes much more obvious.

When the immersive space is initially opened, I'm seeing roughly 44MB of used memory (as shown in the Xcode Debug navigator). Each time I tap the "Load Models" and then "Unload Models" buttons, the memory use decreases but does not get back down to the initial amount. Subsequent loads and unloads will continue to increase the used memory (the amount of increase depends on the models you add to the scene). Also note that I've seen similar memory increases when dynamically creating the entities; inside ViewModel.loadModels I've included some commented-out code that dynamically creates entities instead of loading a Reality Composer scene.

Is there a way to fully reclaim the used memory? I've tried many different ways to clear the RealityKit entities but so far have been unsuccessful.

struct RKMemTestApp: App {
    private var viewModel = ViewModel()

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(viewModel)
        }

        ImmersiveSpace(id: "ImmersiveSpace") {
            ImmersiveView()
                .environment(viewModel)
        }
    }
}

Add this above the body in ContentView:

@Environment(ViewModel.self) private var viewModel

The ContentView body should be:

VStack {
    Toggle("Show ImmersiveSpace", isOn: $showImmersiveSpace)
        .font(.title)
        .frame(width: 360)
        .padding(24)
        .glassBackgroundEffect()

    Button("Load Models") {
        viewModel.loadModels()
    }
    Button("Unload Models") {
        viewModel.unloadModels()
    }
}

ImmersiveView:

struct ImmersiveView: View {
    @Environment(ViewModel.self) private var viewModel

    var body: some View {
        RealityView { content in
            if let rootEntity = viewModel.rootEntity {
                content.add(rootEntity)
            }
        } update: { content in
            if viewModel.rootEntity == nil && !content.entities.isEmpty {
                content.entities.removeAll()
            } else if let rootEntity = viewModel.rootEntity, content.entities.isEmpty {
                content.add(rootEntity)
            }
        }
    }
}

ViewModel:

import Foundation
import Observation
import RealityKit
import RealityKitContent

@Observable
class ViewModel {
    var rootEntity: Entity?

    init() { }

    func loadModels() {
        Task {
            if let scene = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                Task { @MainActor in
                    if rootEntity == nil {
                        rootEntity = Entity()
                    }
                    rootEntity!.addChild(scene)
                }
            }
        }

        /*if rootEntity == nil {
            rootEntity = Entity()
        }
        for _ in 0..<1000 {
            let mesh = MeshResource.generateSphere(radius: 0.1)
            let material = SimpleMaterial(color: .blue, roughness: 0, isMetallic: true)
            let entity = ModelEntity(mesh: mesh, materials: [material])
            entity.position = [Float.random(in: 0.0..<1.0), Float.random(in: 0.5..<1.5), -Float.random(in: 1.5..<2.5)]
            rootEntity!.addChild(entity)
        }*/
    }

    func unloadModels() {
        rootEntity?.children.removeAll()
        rootEntity?.removeFromParent()
        rootEntity = nil
    }
}
Post not yet marked as solved · 0 Replies · 27 Views
I'm developing an app with a chart in SwiftUI. I want the following block of code to run when the chart is clicked. On my personal iPhone, the app works flawlessly, but when I try it in the simulator it crashes and gives me about 5-10 of the following errors. When I remove the @Query macro from the code block, the application does not crash in the simulator, but I continue to get the errors I mentioned. If I do not run the following code block, I do not get those errors at all.

struct SaleDetailView: View {
    @Query(filter: #Predicate<Registration> { !$0.activeRegistration })
    private var regs: [Registration]

    var body: some View {
        VStack {
            DailySaleView()
        }
        .padding()
    }
}

Thank you in advance for your answers. Do not hesitate to ask if you have any questions. Thanks, MFS
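The error text itself is missing from the post, but for reference, a @Query-backed view only works when a SwiftData model container is present in the environment; a minimal sketch of that wiring, assuming Registration is a @Model type (the SalesApp entry point is hypothetical, not from the post):

import SwiftUI
import SwiftData

@main
struct SalesApp: App {
    var body: some Scene {
        WindowGroup {
            SaleDetailView()
        }
        // Without a model container in the environment, any view that
        // declares @Query crashes at runtime when it is first evaluated.
        .modelContainer(for: Registration.self)
    }
}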
Post not yet marked as solved · 1 Reply · 27 Views
Guideline 5.2.3 - Legal

Your app contains content or features that may violate the rights of one or more third parties. Specifically, your app provides potentially unauthorized access to third-party audio or video streaming, catalogs, and discovery services. Your app and its contents should not infringe upon the rights of another party. In the event your app infringes another party's rights, you are responsible for any liability to Apple because of a claim.
Post not yet marked as solved · 0 Replies · 26 Views
I've been investigating an issue with the SO_OOBINLINE socket option. When that option is disabled, the expectation is that out-of-band data sent on the socket will not be available through read() or recv() calls on that socket. What we have been noticing is that when the socket is bound to a non-loopback address (and the communication is happening over that non-loopback address), then even when SO_OOBINLINE is disabled for the socket, the read()/recv() calls both return the out-of-band data. The issue however isn't reproducible with a loopback address, where read()/recv() both correctly exclude the out-of-band data.

This issue is only seen on macOS. I have been able to reproduce it on an M1 Mac with the version below, but the original report which prompted me to look into this was on x64 macOS. My M1 OS version is:

sw_vers
ProductName: macOS
ProductVersion: 14.3.1
BuildVersion: 23D60

Attached is a reproducer (main.c.txt - rename it to main.c after downloading) that I have been able to develop which reproduces this issue on macOS. When you compile and run it:

./a.out

it binds to a non-loopback address by default and you should see the failure log, resembling:

... ERROR: expected: 1234512345 but received: 12345U12345

To run the same reproducer against the loopback address, run it as:

./a.out loopback

and that should succeed (i.e. no out-of-band data) with logs resembling:

... SUCCESS: completed successfully, expected: 1234512345, received: 1234512345

Is this a bug in the OS? I would have reported this directly through Feedback Assistant, but my past few open issues (over more than a year) have not even seen an acknowledgement or a reply, so I decided to check here first.

main.c.txt
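The attached reproducer is the authoritative test case; for readers who just want the gist of what it exercises, here is a minimal Swift rendering of the relevant option setup (illustrative only, not the attached C code):

import Darwin

// Create a TCP socket and explicitly disable inline delivery of OOB data.
let fd = socket(AF_INET, SOCK_STREAM, 0)
var off: Int32 = 0
guard setsockopt(fd, SOL_SOCKET, SO_OOBINLINE, &off,
                 socklen_t(MemoryLayout<Int32>.size)) == 0 else {
    perror("setsockopt")
    exit(1)
}
// A peer sends urgent data with send(..., MSG_OOB). With SO_OOBINLINE off,
// that byte should be excluded from the normal stream returned by
// read()/recv() and only be retrievable via recv(..., MSG_OOB).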
Post not yet marked as solved · 0 Replies · 23 Views
I installed @react-native-firebase/firestore in my React Native project and changed my Podfile as I did previously when I installed Firebase for push notifications. Now I'm trying to store my FCM token in Firestore, and while building the app I got this error: Framework 'FirebaseFirestoreInternal' not found. But I have installed it; it's in my Podfile:

pod 'Firebase', :modular_headers => true
pod 'FirebaseCoreInternal', :modular_headers => true
pod 'GoogleUtilities', :modular_headers => true
pod 'FirebaseCore', :modular_headers => true
pod 'FirebaseFirestore', :modular_headers => true
pod 'FirebaseCoreExtension', :modular_headers => true
pod 'FirebaseFirestoreInternal', :modular_headers => true

and even in my Podfile.lock: FirebaseFirestoreInternal (10.24.0). The first four lines caused no issue - I successfully installed and tested push notifications - but FirebaseFirestoreInternal has some problem and I don't know why. Can anyone help?
Post not yet marked as solved · 0 Replies · 25 Views
Hey everyone, I've noticed a recent issue in my app where users are not receiving automatic updates (via a blank push) to passes added to their wallet. Instead, they have to manually refresh the pass by dragging down on the back of the pass in Apple Wallet to see any updates. This might be because the blank push notification is not updating the passes. If you've encountered this problem or have any insights into what might be causing it, I'd love to hear from you. Your feedback is invaluable to me.
Post not yet marked as solved · 1 Reply · 37 Views
I need to use the Network Link Conditioner tool to test an issue with possibly dropped packets. However, every link Google turns up says you need to download "Additional Tools for Xcode", but those links just go to the developer downloads home page, and the option isn't available anywhere on there. Likewise, Xcode > Open Developer Tool > More Developer Tools... goes to the same location. It seems like the download has either been removed or moved elsewhere - how do we get this tool now?
Post not yet marked as solved · 0 Replies · 27 Views
I connect two AVAudioNodes by using

- (void)connectMIDI:(AVAudioNode *)sourceNode to:(AVAudioNode *)destinationNode format:(AVAudioFormat * __nullable)format eventListBlock:(AUMIDIEventListBlock __nullable)tapBlock

and add an AUMIDIEventListBlock tap block to it to capture the MIDI events. Both AUAudioUnits of the AVAudioNodes involved in this connection are set to use MIDI 1.0 UMP events:

[[avAudioUnit AUAudioUnit] setHostMIDIProtocol:kMIDIProtocol_1_0];

But all the MIDI voice channel events received are automatically converted to UMP MIDI 2.0 format. Is there something else I need to set so that the tap receives MIDI 1.0 UMPs? (Note: my app can handle MIDI 2.0, so it is not really a problem. This question is mainly to find out if I forgot to set the protocol somewhere.) Thanks!!
Post not yet marked as solved · 0 Replies · 25 Views
Hello, I have an app with two AppIntents. I invoke the first AppIntent with my voice command, and then I would like to invoke the second AppIntent after the first one. I have implemented the necessary return type for the first AppIntent. The first AppIntent is invoked and executes successfully; however, when it calls the second AppIntent, everything inside the perform() method of the second AppIntent works except the Siri dialog: Siri doesn't speak the dialog of the second AppIntent. Below are the implementation details for my two AppIntents together with the AppShortcutsProvider. I appreciate any help.

struct MyFirstAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Show My Usages"

    func perform() async throws -> some ProvidesDialog & OpensIntent {
        let dialogStr = IntentDialog(stringLiteral: "You have a package")
        print("I'm in first AppIntent")
        return .result(opensIntent: MySecondAppIntent(), dialog: dialogStr)
    }
}

struct MySecondAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Show My Usages"

    func perform() async throws -> some IntentResult & ReturnsValue<String> & ProvidesDialog {
        print("I'm in second AppIntent")
        return .result(value: "Listing Packages", dialog: "You have activated Default")
    }
}

struct MyVoiceShortcutProvider: AppShortcutsProvider {
    @AppShortcutsBuilder
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MyFirstAppIntent(),
            phrases: ["Call my first intent \(.applicationName)"]
        )
        AppShortcut(
            intent: MySecondAppIntent(),
            phrases: ["Call my second intent \(.applicationName)"]
        )
    }
}
Post not yet marked as solved · 1 Reply · 40 Views
We are developing an app which connects to a BLE peripheral in the background whenever it gets close to it. What we have used so far is monitoring a circular region: when the phone enters the region, the app starts scanning for peripherals, and when it discovers the peripheral it connects to it. This worked pretty well for the last few iOS versions, perhaps iOS 14-16. It wasn't perfect, but for the most part it would connect rather quickly when you approached the BLE peripheral. If you were listening to music over BLE or talking to someone using a BLE headset, it could sometimes work noticeably worse. But, as said, for the most part it worked satisfactorily.

Starting with iOS 17, and analyzing the functionality over the past 6 months or so, we've noticed a clear worsening. It does generally connect to the peripheral, but the user might often have to wait for quite some time. Rather frequently the user must even light up the screen of the phone before anything happens. It appears that some sort of resource allocation, battery-saving feature or similar has affected this functionality greatly. The time difference between entering a region and physically approaching the device is generally around 2-3 minutes.

We have tried to do it more in line with the documentation and follow the guidelines in: https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html

So in doing this, we do not start scanning for peripherals when a region is entered but instead directly invoke connectPeripheral:options:, offloading to the system that we want to connect to that peripheral. However, when testing this we see no real improvement. Sometimes it connects satisfactorily; sometimes it doesn't really connect at all. Many times it connects only if the user lights up the screen. So, just looking at what the user experiences, our analysis is that doing it this way works even worse than what we previously did.

I understand that the system has many resources to consider and that some tasks may have to wait while others run. But is there any documentation on what one can expect from initiating connectPeripheral:options: from the background? The link I posted simply states that the iOS device will reconnect when the user returns home - not much detail in terms of performance, which is crucial for our application.

If there aren't any further details on the performance, are there other ways to improve this functionality? We are not looking to drain the battery whatsoever, but we simply need our background-running app to be as responsive as possible for a few minutes after it has been launched by the region monitoring. We understand that battery life is important, but since this happens rarely and sparsely (not more than a few times per day), it seems reasonable that there should be a way to make it function properly.
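For concreteness, this is roughly the pending-connect pattern described above; a minimal sketch, assuming a known peripheral identifier and a restore key of "com.example.central" (both illustrative, not from the post):

import CoreBluetooth

final class ReconnectManager: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!
    // Hypothetical identifier of the previously discovered peripheral.
    private let peripheralID = UUID(uuidString: "00000000-0000-0000-0000-000000000000")!
    private var peripheral: CBPeripheral?

    override init() {
        super.init()
        // A restore identifier lets the system relaunch the app to finish
        // a pending connection initiated earlier.
        central = CBCentralManager(delegate: self, queue: nil,
                                   options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.central"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        if let known = central.retrievePeripherals(withIdentifiers: [peripheralID]).first {
            peripheral = known
            // A pending connect has no timeout; the system completes it
            // whenever the peripheral comes into range, including in background.
            central.connect(known, options: nil)
        }
    }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // Reclaim peripherals the system was managing on the app's behalf.
        peripheral = (dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral])?.first
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        // Discover services and start the session here.
    }
}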
