Posts

Post not yet marked as solved
2 Replies
71 Views
We have been trying to programmatically send data to Final Cut Pro using Apple events, as described in "Sending Data Programmatically to Final Cut Pro":

tell application "Final Cut Pro"
    activate
    open POSIX file "/Users/JohnDoe/Documents/UberMAM/MyEvents.fcpxml"
end tell

This works fine in Script Editor, but we run into problems when trying to do the same in our macOS app. We found interesting information in "Workflow Extensions SDK 1.0.2 Release Notes.pdf".

A) Hardened Runtime has "Apple Events" enabled.

B) Info.plist contains NSAppleEventsUsageDescription:

<key>NSAppleEventsUsageDescription</key>
<string>Test string</string>

C) We added the following entitlements:

<key>com.apple.security.scripting-targets</key>
<dict>
    <key>com.apple.FinalCut</key>
    <array>
        <string>com.apple.FinalCut.library.inspection</string>
    </array>
    <key>com.apple.FinalCutTrial</key>
    <array>
        <string>com.apple.FinalCut.library.inspection</string>
    </array>
</dict>
<key>com.apple.security.automation.apple-events</key>
<true/>

With this configuration in place, our app is able to call AppleScript to activate the Final Cut Pro application, but it is unable to open the file. The following error is returned:

Error executing AppleScript: {
    NSAppleScriptErrorAppName = "Final Cut Pro Trial";
    NSAppleScriptErrorBriefMessage = "A privilege violation occurred.";
    NSAppleScriptErrorMessage = "Final Cut Pro Trial got an error: A privilege violation occurred.";
    NSAppleScriptErrorNumber = "-10004";
    NSAppleScriptErrorRange = "NSRange: {56, 64}";
}

Also, there is no prompt asking the user to allow Automation from our app to Final Cut Pro; I am not sure whether that prompt is to be expected when developing an application in Xcode.

Our current workaround is to add (or even replace com.apple.security.scripting-targets with) the com.apple.security.temporary-exception.apple-events entitlement, like this:

<key>com.apple.security.temporary-exception.apple-events</key>
<array>
    <string>com.apple.FinalCutTrial</string>
</array>

However, while this approach might work in development, we know it would probably prevent us from publishing the app to the Mac App Store. I think we are missing something obvious. Could you help? :-)
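As a diagnostic aid, here is a minimal sketch (assuming the com.apple.security.automation.apple-events entitlement and the usage description above are in place) of querying the Automation consent state with AEDeterminePermissionToAutomateTarget, which can also trigger the user prompt. The helper name and the use of the trial bundle identifier are illustrative only.

import Foundation
import CoreServices

// Illustrative helper: ask TCC whether this app may send Apple events to the
// target app, optionally showing the consent prompt. noErr means allowed;
// -1743 means the user denied Automation for this pairing.
func automationConsentStatus(for bundleID: String, promptIfNeeded: Bool) -> OSStatus {
    let target = NSAppleEventDescriptor(bundleIdentifier: bundleID)
    guard let addressDesc = target.aeDesc else { return OSStatus(procNotFound) }
    return AEDeterminePermissionToAutomateTarget(addressDesc,
                                                 AEEventClass(typeWildCard),
                                                 AEEventID(typeWildCard),
                                                 promptIfNeeded)
}

// Example: check against the trial version mentioned in the error above.
let status = automationConsentStatus(for: "com.apple.FinalCutTrial", promptIfNeeded: true)
print("Automation consent status:", status)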
Posted. Last updated.
Post not yet marked as solved
0 Replies
23 Views
I am trying to find the following icon. I have gone through all the icons in SF Symbols 5.1 but have been unable to locate it. Does anyone know what this icon is or how I can get it?
Posted. Last updated.
Post not yet marked as solved
0 Replies
11 Views
Hi there, I'm trying to upload a new build of my app, but I have a Windows computer, so it seems Transporter is out of the equation. Any recommendations for uploading the new IPA? A contractor with a Mac took care of the original publishing of the app, so I'm on my own for this one. Thanks! Evan
Posted. Last updated.
Post not yet marked as solved
1 Reply
13 Views
Is anyone else having the same problems? We have been contacting Apple for almost two weeks and have raised at least 3 or 4 claim tickets to get our issue resolved, but Apple has not responded. Does anyone have a recommendation as to what we can do? We are losing money, as we cannot open up our app. Thanks, CDL
Posted. Last updated.
Post not yet marked as solved
0 Replies
9 Views
Hello Developer Community, I hope you are all doing well. We need your help with an issue in deploying a Mac application outside the App Store. Notarization is required if we want to release the product outside the App Store. We have built the application in ElectronJS and signed it with a Developer ID Installer certificate. The next step is notarization, which is where we hit the issue: it says, "Team is not yet configured for notarization."

We raised the problem with the Apple team 6 months ago and are still getting the same response from them, "Our engineering team is working on it," without a timeline. I want to ask whether someone else has had the same issue, how long it can take to resolve, or whether you have any solutions. Your support means a lot to us. Thanks. Dhiren Patel
Posted by dhipatel. Last updated.
Post not yet marked as solved
0 Replies
11 Views
This code to write UIImage data as HEIC works in the iOS simulator with iOS < 17.5:

import AVFoundation
import UIKit

extension UIImage {
    public var heic: Data? { heic() }

    public func heic(compressionQuality: CGFloat = 1) -> Data? {
        let mutableData = NSMutableData()
        guard let destination = CGImageDestinationCreateWithData(mutableData, AVFileType.heic as CFString, 1, nil),
              let cgImage = cgImage
        else { return nil }
        let options: NSDictionary = [
            kCGImageDestinationLossyCompressionQuality: compressionQuality,
            kCGImagePropertyOrientation: cgImageOrientation.rawValue,
        ]
        CGImageDestinationAddImage(destination, cgImage, options)
        guard CGImageDestinationFinalize(destination) else { return nil }
        return mutableData as Data
    }

    public var isHeicSupported: Bool {
        (CGImageDestinationCopyTypeIdentifiers() as! [String]).contains("public.heic")
    }

    var cgImageOrientation: CGImagePropertyOrientation { .init(imageOrientation) }
}

extension CGImagePropertyOrientation {
    init(_ uiOrientation: UIImage.Orientation) {
        switch uiOrientation {
        case .up: self = .up
        case .upMirrored: self = .upMirrored
        case .down: self = .down
        case .downMirrored: self = .downMirrored
        case .left: self = .left
        case .leftMirrored: self = .leftMirrored
        case .right: self = .right
        case .rightMirrored: self = .rightMirrored
        @unknown default: fatalError()
        }
    }
}

But with the iOS 17.5 simulator it seems to be broken. The call to CGImageDestinationFinalize writes this error to the console:

writeImageAtIndex:962: *** CMPhotoCompressionSessionAddImage: err = kCMPhotoError_UnsupportedOperation [-16994] (codec: 'hvc1')

On physical devices it still seems to work. Is there any known workaround for the iOS simulator?
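As a possible interim workaround, assuming a JPEG fallback is acceptable whenever the current environment cannot produce HEIC data, here is a sketch building on the extension above (heicOrJPEGData is a hypothetical helper name, not an existing API):

import UIKit

extension UIImage {
    // Illustrative helper: prefer HEIC, fall back to JPEG when HEIC encoding
    // is unsupported or fails (as it currently does in the iOS 17.5 simulator).
    func heicOrJPEGData(compressionQuality: CGFloat = 1) -> Data? {
        if isHeicSupported, let heicData = heic(compressionQuality: compressionQuality) {
            return heicData
        }
        return jpegData(compressionQuality: compressionQuality)
    }
}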
Posted. Last updated.
Post not yet marked as solved
2 Replies
114 Views
Hello, I am developing a private internal Flutter app for our customer, which will not be published on the App Store. One of the key features of this app is to collect RF strength metrics to share the user's experience with the network.

For Android, we successfully implemented the required functionality and are able to collect the following metrics:

- Signal strength level (0-4)
- Signal strength in dBm
- RSSI
- RSRQ
- Cell ID
- Location Area Code
- Carrier name
- Mobile country code
- Mobile network code
- Radio access technology
- Connection status
- Duplex mode

However, for iOS, we are facing challenges with CoreTelephony, which is not returning the necessary data. We are aware that CoreTelephony is deprecated and are looking for alternatives. We noticed that a lot of the information we need is available via FTMInternal-4. Is there a way to access this data for a private app? Are there any other recommended approaches or frameworks that can be used to gather cellular network information on iOS for an app that won't be distributed via the App Store?

My Swift code:

import Foundation
import CoreTelephony

class RfSignalStrengthImpl: RfSignalStrengthApi {
    func getCellularSignalStrength(completion: @escaping (Result<CellularSignalStrength, Error>) -> Void) {
        let networkInfo = CTTelephonyNetworkInfo()
        guard let carrier = networkInfo.serviceSubscriberCellularProviders?.values.first else {
            completion(.failure(NSError(domain: "com.xxxx.yyyy", code: 0,
                                        userInfo: [NSLocalizedDescriptionKey: "Carrier not found"])))
            return
        }
        let carrierName = carrier.carrierName ?? "Unknown"
        let mobileCountryCode = carrier.mobileCountryCode ?? "Unknown"
        let mobileNetworkCode = carrier.mobileNetworkCode ?? "Unknown"
        let radioAccessTechnology = networkInfo.serviceCurrentRadioAccessTechnology?.values.first ?? "Unknown"
        var connectionStatus = "Unknown"
        ...
        ...
    }

Thank you for your assistance.
Posted by raiton. Last updated.
Post not yet marked as solved
0 Replies
18 Views
I was wondering if anyone knows why the sample project uses Task.detached everywhere, because it seems highly non-standard, e.g. in ContentView:

.task {
    Task.detached { @MainActor in
        await flightData.load()
    }
}

Instead, I would expect to see something like:

.task {
    flightData = await controller.loadFlightData()
}

Or:

.task {
    await controller.load(flightData: flightData)
}

Is the use of detached perhaps an attempt to work around some issue with ObservableObject published updates?
Posted by malc. Last updated.
Post not yet marked as solved
1 Reply
598 Views
While I'm keen on being on the latest version of macOS, I'm having trouble meeting the requirement that my app can run on a version of macOS that doesn't exist :) I'm using Xcode 15.2 (15C500b), targeting iOS 17.2. I've elected to have the App Store choose the minimum version for Catalyst. I've also tried selecting a version.

ITMS-90899: Apple silicon Mac support issue - The app is not compatible with the provided minimum macOS version of 14.2. It can run on macOS 14.4 or later. Please specify an LSMinimumSystemVersion value of 14.4 or later in a new build, or select a compatible version in App Store Connect. For details, visit: ...

Is there something I could be doing wrong to prompt this behaviour?
Posted by Slif. Last updated.
Post marked as solved
2 Replies
31 Views
I want to automatically load different views depending on the OS (macOS or iOS). Is there a way that I can do this without the user having to click on a link? This is my code so far:

struct ContentView: View {
    #if os(iOS)
    var myOS = "iOS"
    #elseif os(OSX)
    var myOS = "OSX"
    #else
    var myOS = "Something Else"
    #endif

    var body: some View {
        NavigationStack {
            VStack {
                Text("PLEASE WAIT....")
                    .font(.system(size: 24))
                    .fontWeight(.bold)
            }
            .padding()

            if myOS == "OSX" {
                // Go to screen for Mac
            } else {
                // Go to screen for iOS
            }
        }
    }
}

If I use NavigationLink, my understanding is that the user would need to click on a link. Is there some way to do this without user interaction?
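One possible approach, sketched below: branch on the platform at compile time inside the view body, so no NavigationLink and no user interaction are needed. MacRootView and PhoneRootView are hypothetical placeholder views standing in for the two screens.

import SwiftUI

struct RootView: View {
    var body: some View {
        // The compiler picks one branch per platform at build time.
        #if os(macOS)
        MacRootView()      // hypothetical macOS-specific screen
        #else
        PhoneRootView()    // hypothetical iOS-specific screen
        #endif
    }
}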
Posted by jamesm46. Last updated.
Post not yet marked as solved
1 Reply
62 Views
I'm trying to track the user's location on every 10-20 meters of location change. I've managed to obtain location updates while the application is in the foreground or background using background modes. My issue is that I cannot get any updates after the application is terminated, either by the user or by the system. I've tried using startMonitoringSignificantLocationChanges(), but this does not fit my purpose, since you only get updates roughly every 500 m. Is it possible to get updates every 10 meters using region monitoring? Or is mobile device management (MDM) the only way to achieve this on an iOS device?
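For reference, a minimal sketch of the standard-updates configuration described above, assuming the Location Updates background mode and "Always" authorization are in place; on its own this does not survive app termination:

import Foundation
import CoreLocation

final class LocationTracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.distanceFilter = 10                      // deliver updates roughly every 10 m
        manager.allowsBackgroundLocationUpdates = true   // requires the background mode capability
        manager.pausesLocationUpdatesAutomatically = false
        manager.requestAlwaysAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // Handle the new locations here.
    }
}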
Posted by skavouras. Last updated.
Post not yet marked as solved
1 Reply
81 Views
Hello everyone, I hope you’re all doing well. I have a question regarding the use of Apple's Find My network. I’m in the early stages of developing an app that would track third-party Find My-compatible tags. Before proceeding further, I want to ensure that I am compliant with Apple’s guidelines and policies. Can anyone provide insight into whether Apple allows developers to use the Find My crowd-sourced network for their own apps? Specifically, I'm interested in tracking third-party Find My tags through my app. Any guidance or resources you can share would be greatly appreciated! Thank you!
Posted by va81092. Last updated.
Post not yet marked as solved
0 Replies
28 Views
I have an app that uses a Network Extension (Packet Tunnel Provider), but it also uses mDNS to find local devices for data transfer via Network Extensions. However, once connected over peer-to-peer using awdl0 or NWConnection, it works as expected until the user shuts the screen off. It looks like there's a difference in behavior when the device is plugged in versus when it's on battery alone. So we can be happily sending data over p2p (awdl0), then the screen shuts off and it kills the connection. Is this expected behavior, and if so, is there documentation? Also, Network Extensions do not appear to be able to discover over p2p; they can only connect to endpoints directly. Is this expected behavior?

My thoughts: if a user allows both the Network Extension permission and the Local Network permission, the Network Extension should be able to discover peers via p2p, and the connections (if not asleep) should stay active while in use.
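For reference, a minimal sketch of peer-to-peer Bonjour discovery with Network.framework from the app side; "_example._tcp" is a hypothetical service type, and includePeerToPeer is what opts the browser into AWDL where available:

import Network

let parameters = NWParameters()
parameters.includePeerToPeer = true   // allow discovery over peer-to-peer (AWDL) links

let browser = NWBrowser(for: .bonjour(type: "_example._tcp", domain: nil), using: parameters)
browser.browseResultsChangedHandler = { results, _ in
    for result in results {
        print("Found peer:", result.endpoint)
    }
}
browser.start(queue: .main)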
Posted by mikeKane. Last updated.
Post not yet marked as solved
2 Replies
76 Views
I've implemented a custom system extension VPN for macOS using a Packet Tunnel Provider. In the protocol configuration, the includeAllNetworks flag is unset. In the provider, I included all routes (an IPv4 default route). What is the expected behavior for LAN traffic? Should LAN traffic go via the VPN? By "LAN traffic", I'm referring to local hosts, ssh, printer access, etc.
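For reference, a minimal sketch of the kind of full-tunnel route setup described above; the addresses are documentation placeholders, not real values:

import NetworkExtension

// Builds settings with a default IPv4 route through the tunnel, to be applied
// from the provider via setTunnelNetworkSettings(_:completionHandler:).
func makeFullTunnelSettings() -> NEPacketTunnelNetworkSettings {
    let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "203.0.113.1")
    let ipv4 = NEIPv4Settings(addresses: ["10.0.0.2"], subnetMasks: ["255.255.255.0"])
    ipv4.includedRoutes = [NEIPv4Route.default()]   // 0.0.0.0/0 through the tunnel
    settings.ipv4Settings = ipv4
    return settings
}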
Posted by roee84. Last updated.
Post not yet marked as solved
2 Replies
72 Views
I'm debugging some Regex Builder code in my playground. I run the following piece of code:

let timeMatchWithout = possibleTime.firstMatch(of: timeWithoutSec)

and I get this error message:

Regex.Match optional storedCapture contains no some

What could this possibly mean? "contains no some"??? Here is a more complete snippet, if it helps:

let hourRef = Reference<Substring>()
let minuteRef = Reference<Substring>()

let hourReg = Regex {
    ChoiceOf {
        Capture(as: hourRef) {
            One(.digit)
            One(.digit)
        }
        Capture(as: hourRef) {
            One(.digit)
        }
    }
}

let minuteReg = Regex {
    ChoiceOf {
        Capture(as: minuteRef) {
            One(.digit)
            One(.digit)
        }
        Capture(as: minuteRef) {
            One(.digit)
        }
    }
}

let ampmRef = Reference<Substring>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
                One("a.m.")
                One("p.m.")
            }
        }
    }
    /* transform: { $0.lowercase } */
}.ignoresCase()

let timeWithoutSec = Regex {
    hourReg
    One(":")
    minuteReg
    ZeroOrMore(.whitespace)
    ampmReg
}.ignoresCase()

let possibleTime = "10:20 AM"
let timeMatchWithout = possibleTime.firstMatch(of: timeWithoutSec)

The last line produces the error message. Thanks for the help. Note the removed transform: on the ampmReg definition; if it is included, the compiler times out, as noted in my previous post yesterday.
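A possible simplification to try, sketched under the assumption that the hour and minute are simply one or two digits each: capturing a single Repeat means no Reference appears in more than one ChoiceOf branch, which may be what trips the storedCapture error.

import RegexBuilder

// One Capture per Reference; Repeat(1...2) replaces the two-branch ChoiceOf.
let hourRef = Reference<Substring>()
let minuteRef = Reference<Substring>()

let timeWithoutSec = Regex {
    Capture(as: hourRef) {
        Repeat(1...2) { One(.digit) }
    }
    One(":")
    Capture(as: minuteRef) {
        Repeat(1...2) { One(.digit) }
    }
}

if let match = "10:20 AM".firstMatch(of: timeWithoutSec) {
    print(match[hourRef], match[minuteRef])   // "10" "20"
}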
Posted by RJStover. Last updated.
Post not yet marked as solved
1 Reply
53 Views
I have a macOS application and I want to provide an action for it in the Shortcuts app's Quick Action list. I was using in-app handling to present my intent, created in the intent.intentdefinition file, as an action in the Shortcuts app, and it was working. However, this action performs a very lightweight task, so I intend to have the action implemented as an extension in my Xcode project. Given my minimum deployment target (macOS 11.0), I found that an 'Intents Extension' could be used. I have added the 'Intents Extension' target to my main application and created an intent using the intent.intentdefinition file. However, my intent does not appear in the Shortcuts app. I have verified it multiple times to ensure I am not missing anything, but the intent is still not present among the Shortcuts app actions.

I want to know: is this even possible? This Apple documentation only mentions iOS and watchOS apps, and it does not mention whether a custom intent (created using an Intents extension) can be exposed to the Shortcuts app. For macOS 13.0+, I have used the 'AppIntents extension' and I am able to achieve this. So I suppose the same should be possible using the 'Intents extension'.
Posted. Last updated.
Post not yet marked as solved
2 Replies
105 Views
Why does this Regex Builder code in my SwiftUI app not work? I'm parsing a string that might be a date and time with either AM or PM specified for the time. This bit of code looks for the optional AM or PM. The error I get is:

The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions

What would 'distinct sub-expressions' mean in this case? The code:

let ampmRef = Reference<Substring>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
            }
        }
    } transform: {
        $0.lowercase
    }
}.ignoresCase()

In a related question, is there a way to return a default if the ChoiceOf fails both AM and PM?
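One reading of "distinct sub-expressions", sketched below with two assumptions: the ChoiceOf is hoisted into its own Regex, and because the transform produces a String (Substring offers lowercased(), not lowercase), the reference is declared as Reference<String>. Since ZeroOrMore allows an empty match, the capture comes through as "" when neither "am" nor "pm" is present, which can serve as the default.

import RegexBuilder

// Hoist the alternation into its own sub-expression...
let ampmChoice = Regex {
    ChoiceOf {
        One("am")
        One("pm")
    }
}

// ...then capture and transform it separately.
let ampmRef = Reference<String>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore { ampmChoice }
    } transform: {
        $0.lowercased()   // "" when no am/pm was matched
    }
}.ignoresCase()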
Posted by RJStover. Last updated.
Post not yet marked as solved
0 Replies
93 Views
Hi, I have an idea for an application that could revolutionize and simplify the use of video devices (phones, monitors, TVs) for at least 75-80% of the entire world population. I would like it to be implemented initially on Apple iOS/macOS and later on all platforms. I would like to create, develop, or sell the idea, but I really don't know where to start. Can you give me some information on this? I know it may seem like a strange request, but I'm sure this app would be a real revolution for everyone, or almost everyone. Thank you, Giovanni from Italy [Edited by Moderator]
Posted. Last updated.
Post not yet marked as solved
1 Reply
56 Views
My Xcode workspace contains build settings for a macOS, iOS, and tvOS application. My sandboxed macOS app builds just fine and works great, and is on the App Store. I am in the process of creating a new build/branch of this app that is not sandboxed so that I can add IPC (Syphon support), as I don't think I can use App Groups to enable CFMessagePort support (which Syphon requires), because Syphon (a third-party framework) uses its own naming convention for the ports. Anyway, sandbox support for a Syphon app is a topic for another day (it's actually quite disappointing that I can't release a Syphon version on the App Store).

The trouble I am having is that even after deleting the App Sandbox entitlement from my project, my app still seems to be running in the App Sandbox, and I can't figure out how to remove the App Sandbox entitlement completely. What I am seeing is that even after deleting the App Sandbox entitlement (using the project settings, deleting it in the "Signing and Capabilities" tab, and also checking the entitlements file manually to doubly make sure it is gone), I am still seeing the following error message:

*** CFMessagePort: bootstrap_register(): failed 1100 (0x44c) 'Permission denied', port = 0x8703, name = 'info.v002.Syphon.332143F7-0916-428A-A88A-59B752F95304'
See /usr/include/servers/bootstrap_defs.h for the error codes.

It is also saving my Application Support data in the ~/Library/Containers folder, and not in ~/Library/Application Support. What step am I missing?
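As a quick diagnostic, here is a sketch of an informal runtime check; it relies on the APP_SANDBOX_CONTAINER_ID environment variable that sandboxed processes are launched with, not on a documented API:

import Foundation

// Informal check: sandboxed apps are launched with APP_SANDBOX_CONTAINER_ID set.
let isSandboxed = ProcessInfo.processInfo.environment["APP_SANDBOX_CONTAINER_ID"] != nil
print("Running in App Sandbox:", isSandboxed)

Inspecting the built product rather than the project settings can also help: codesign -d --entitlements - /path/to/YourApp.app prints the entitlements actually embedded in the binary being run.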
Posted by EulerDev. Last updated.
Post not yet marked as solved
0 Replies
63 Views
Good afternoon. After a long time of using my MacBook, security popups requesting access for apps started appearing. For example, today I opened VS Code to work with Nuxt.js and three popups appeared: VS Code requests access to photos, calendar, contacts, desktop, iCloud, etc. The same happens with PhpStorm, and if I open Terminal, the same thing happens with Terminal. I haven't installed anything and haven't updated anything. Then I decided to update to the latest macOS, thinking it might help, but it didn't.

My questions are:
- How do I fix this? All applications, even Terminal, should not make such permission requests.
- Is it a bug that will be fixed in a patch?
- Why do these popups keep appearing if I clicked "Don't Allow"?

OS: macOS Sonoma 14.5, MacBook Pro 2019
Posted by EugeneXO. Last updated.
