I have a macOS application and I want to provide an action for it in the 'Shortcuts' app's Quick Action list. I was using in-app intent handling to present the intent created in my intent.intentdefinition file as an action in the Shortcuts app, and it was working.
However, this action performs a very lightweight task, so I intend to implement the action as an extension in my Xcode project.
Given my minimum deployment target (macOS 11.0), I found that an 'Intents Extension' could be used.
I added the 'Intents Extension' target to my main application and created an intent using the intent.intentdefinition file. However, my intent does not appear in the Shortcuts app. I have checked multiple times to make sure I am not missing anything, but the intent is still not among the Shortcuts app's actions.
I wanted to know: is this even possible? This Apple documentation only mentions iOS and watchOS apps, and it does not say whether a custom intent created in an Intents extension can be exposed to the Shortcuts app.
For macOS 13.0+, I used an 'AppIntents extension' and was able to achieve this, so I would assume the same should be possible with an 'Intents extension'.
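For comparison, this is roughly what I did in the AppIntents extension for macOS 13.0+ (a minimal sketch; the type name and title are placeholders), and it does show up in Shortcuts:

import AppIntents

// Placeholder intent; appears as a Shortcuts action without any in-app handling.
struct LightweightTaskIntent: AppIntent {
    static var title: LocalizedStringResource = "Perform Lightweight Task"

    func perform() async throws -> some IntentResult {
        // ... the lightweight work itself ...
        return .result()
    }
}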
Hi!
I am developing an application that should use ScriptingBridge.framework to interact with another process. First, I created a separate test application, added the Apple Events entitlement via the 'Signing & Capabilities' section in Xcode, and updated its Info.plist to include 'Privacy - AppleEvents Sending Usage Description'. While the test app works fine (I see the automation request popup and the process executes as expected), the main application where I want to integrate this functionality gets closed immediately after reaching the code that calls Scripting Bridge.
On its launch, I see the following error message from tccd in Console:
Prompting policy for hardened runtime; service: kTCCServiceAppleEvents requires entitlement com.apple.security.automation.apple-events but it is missing for accessing={TCCDProcess: identifier=<app bundleID>, ..., binary_path=<path to the app's binary>}
I had no such issues with the test app. I should also mention that the bundle that needs this functionality is nested inside another bundle, neither the outer nor the inner bundle is sandboxed, and the target app has the 'Application is agent (UIElement)' key set in its Info.plist.
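For reference, this is the configuration of the working test app (a sketch; the description string is just an example):

<!-- Test app .entitlements -->
<key>com.apple.security.automation.apple-events</key>
<true/>

<!-- Test app Info.plist -->
<key>NSAppleEventsUsageDescription</key>
<string>This app sends Apple events to control another application.</string>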
Can you suggest any ideas as to why the processes behave so differently despite having pretty much the same build configuration?
Hello,
I am developing a private internal Flutter app for our customer, which will not be published on the App Store. One of the key features of this app is to collect RF signal strength metrics so we can understand the user's experience with the network.
For Android, we successfully implemented the required functionality and are able to collect the following metrics:
Signal strength level (0-4)
Signal strength in dBm
RSSI
RSRQ
Cell ID
Location Area Code
Carrier name
Mobile country code
Mobile network code
Radio access technology
Connection status
Duplex mode
However, for iOS, we are facing challenges with CoreTelephony, which is not returning the necessary data. We are aware that CoreTelephony is deprecated and are looking for alternatives.
We noticed that a lot of the information we need is available via FTMInternal-4. Is there a way to access this data from a private app? Are there any other recommended approaches or frameworks for gathering cellular network information on iOS in an app that won't be distributed via the App Store?
My Swift code:
import Foundation
import CoreTelephony

class RfSignalStrengthImpl: RfSignalStrengthApi {
    func getCellularSignalStrength(completion: @escaping (Result<CellularSignalStrength, Error>) -> Void) {
        let networkInfo = CTTelephonyNetworkInfo()
        guard let carrier = networkInfo.serviceSubscriberCellularProviders?.values.first else {
            completion(.failure(NSError(domain: "com.xxxx.yyyy", code: 0,
                                        userInfo: [NSLocalizedDescriptionKey: "Carrier not found"])))
            return
        }
        // CTCarrier fields are deprecated and often come back as placeholder values on recent iOS releases.
        let carrierName = carrier.carrierName ?? "Unknown"
        let mobileCountryCode = carrier.mobileCountryCode ?? "Unknown"
        let mobileNetworkCode = carrier.mobileNetworkCode ?? "Unknown"
        let radioAccessTechnology = networkInfo.serviceCurrentRadioAccessTechnology?.values.first ?? "Unknown"
        var connectionStatus = "Unknown"
        // ... build the CellularSignalStrength value and call completion(.success(...)) ...
    }
}
Thank you for your assistance.
In Declarative Device Management there is the Get Server Supported Declarations endpoint that is sent via an MDM Check-In request. Is this supposed to return all of the declarations supported by the server, or only the ones that are intended for the device making the request?
This seems like a poor choice of naming for that endpoint; if my assumption is correct, it should be named something more like "Get Device Declarations".
Or am I fundamentally misunderstanding DDM, and should our server be sending every declaration we have to the device, with the device controlling them via activations? That seems counter to the pitch about the scalability and performance improvements DDM offers if we have to send literally everything to the device even when it's known not to be needed; similarly, if the device doesn't support a declaration but the server does, then the server obviously(?) shouldn't send it to the device.
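For context, this is the shape I understand the declaration items response to have, based on the DeclarationItemsResponse documentation (a sketch; identifiers and tokens are placeholders):

{
  "Declarations": {
    "Activations": [
      { "Identifier": "com.example.activation.test", "ServerToken": "token-1" }
    ],
    "Configurations": [
      { "Identifier": "com.example.config.test", "ServerToken": "token-2" }
    ],
    "Assets": [],
    "Management": []
  },
  "DeclarationsToken": "overall-token"
}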
Can someone please explain the purpose of the ManagementServerCapabilities declaration in Declarative Device Management?
I understand from the documentation that it contains a "dictionary that contains the server's optional protocol features", but what would be an example of an "optional protocol feature"?
Hello
Please can you tell me if this 'warning' was given because of a Translocation Error?
Howard Oakley, the developer of Mints, will not talk to me about this. The image has not been tampered with in any way.
Your advice will be most welcome.
Thanks.
Hi,
Thanks for getting back to me regarding my query. I'm developing a Cordova app that includes games with an initial free tier and additional tiers available through in-app purchases. Here’s a detailed explanation of what I’m aiming to achieve:
Users can play a set of games for free initially (Free Tier).
After playing a certain number of games, users can purchase additional game tiers:
Tier 1: Adds 3 more games.
Tier 2: Adds 6 more games.
Tier 3: Adds 9 more games.
Users can continue playing the games in any purchased tier indefinitely.
If users do not wish to purchase additional tiers, they can continue playing the free tier games with limited themes but without restrictions on play count.
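To illustrate, my current plan is to model each tier as a non-consumable in-app purchase and gate the extra games in app logic on the native side; a rough StoreKit 2 sketch (product identifiers are placeholders):

import StoreKit

// Placeholder non-consumable product IDs, one per tier.
let tierIDs = ["com.example.games.tier1",
               "com.example.games.tier2",
               "com.example.games.tier3"]

// Non-consumables stay in the user's entitlements, so ownership can be
// rechecked on every launch to decide which games to unlock.
func ownedTiers() async -> Set<String> {
    var owned: Set<String> = []
    for await entitlement in Transaction.currentEntitlements {
        if case .verified(let transaction) = entitlement,
           tierIDs.contains(transaction.productID) {
            owned.insert(transaction.productID)
        }
    }
    return owned
}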
Questions:
How can I configure App Store Connect to offer the Free Tier initially and then present the in-app purchase options for the additional tiers?
Is there a specific configuration in App Store Connect that supports this model, or do I need to handle this logic within the app itself?
I appreciate any guidance you can provide on setting this up correctly.
Best regards,
T
Hi, I want to create an offline mode for my iOS app. Where is it better to keep the files in terms of download speed: on a CDN like Cloudflare, or using On-Demand Resources? The total file size is 2 GB.
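For reference, this is roughly how I would fetch an On-Demand Resources pack if I go that route (a sketch; the tag name is a placeholder, and the 2 GB would have to be split across tags):

import Foundation

let request = NSBundleResourceRequest(tags: ["offline-pack-1"])
request.beginAccessingResources { error in
    if let error = error {
        print("ODR fetch failed: \(error)")
        return
    }
    // Resources tagged "offline-pack-1" are now accessible via Bundle.main.
    // Call request.endAccessingResources() when finished with them.
}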
Does anyone know what this might be?
/private/tmp/game-porting-toolkit-20240516-2293-dagdsr/wine/dlls/wow64cpu/cpu.c:356:18: error: assignment to 'DWORD64' {aka 'long long unsigned int'} from 'PVOID' {aka 'void *'} makes integer from pointer without a cast [-Wint-conversion]
  356 | context->Rsp = NtCurrentTeb()->TlsSlots[2]; /* WOW64_TLS_WINEHYBRID_RESERVED_R14 */
      |              ^
make: *** [dlls/wow64cpu/cpu.cross.o] Error 1
make: *** Waiting for unfinished jobs....
Path: /usr/local/Homebrew/Library/Taps/apple/homebrew-apple/Formula/game-porting-toolkit.rb
==> Configuration
HOMEBREW_VERSION: 4.3.0
ORIGIN: https://github.com/Homebrew/brew
HEAD: 8378cc825d83acffd125fb0fec041793df378a57
Last commit: 3 days ago
Core tap JSON: 17 May 05:28 UTC
Core cask tap JSON: 17 May 05:28 UTC
HOMEBREW_PREFIX: /usr/local
HOMEBREW_CASK_OPTS: []
HOMEBREW_MAKE_JOBS: 10
Homebrew Ruby: 3.1.4 => /usr/local/Homebrew/Library/Homebrew/vendor/portable-ruby/3.1.4/bin/ruby
CPU: 10-core 64-bit westmere
Clang: 15.0.0 build 1500
Git: 2.39.3 => /Library/Developer/CommandLineTools/usr/bin/git
Curl: 8.4.0 => /usr/bin/curl
macOS: 14.4.1-x86_64
CLT: 15.1.0.0.1.1700200546
Xcode: 15.1 => /Users/raiderpig/Downloads/Xcode.app/Contents/Developer
Rosetta 2: true
// Time 1000 consecutive Core ML inferences on the same pixel buffer.
for (int i = 0; i < 1000; i++) {
    double st_tmp = CFAbsoluteTimeGetCurrent();
    retBuffer = [self.enhancer enhance:pixelBuffer error:&error];
    double et_tmp = CFAbsoluteTimeGetCurrent();
    NSLog(@"[enhance once] %f ms", (et_tmp - st_tmp) * 1000);
}
When I run a Core ML model using the code above, I notice that the runtime gradually decreases over the first iterations.
Output:
[enhance once] 14.965057 ms
[enhance once] 12.727022 ms
[enhance once] 12.818098 ms
[enhance once] 11.829972 ms
[enhance once] 11.461020 ms
[enhance once] 10.949016 ms
[enhance once] 10.712981 ms
[enhance once] 10.367990 ms
[enhance once] 10.077000 ms
[enhance once] 9.699941 ms
[enhance once] 9.370089 ms
[enhance once] 8.634090 ms
[enhance once] 7.659078 ms
[enhance once] 7.061005 ms
[enhance once] 6.729007 ms
[enhance once] 6.603003 ms
[enhance once] 6.427050 ms
[enhance once] 6.376028 ms
[enhance once] 6.509066 ms
[enhance once] 6.452084 ms
[enhance once] 6.549001 ms
[enhance once] 6.616950 ms
[enhance once] 6.471038 ms
[enhance once] 6.462932 ms
[enhance once] 6.443977 ms
[enhance once] 6.683946 ms
[enhance once] 6.538987 ms
[enhance once] 6.628990 ms
...
In most deep learning inference frameworks there is usually a warmup process, but typically only the first inference is slower. Why does Core ML show a gradually decreasing runtime at the beginning? Is there a way to make only the first inference take longer while keeping the rest consistent?
I use the Core ML model in the (void)display_pixels:(IJKOverlay *)overlay function.
URLSession.shared.downloadTask doesn't work on Apple Watch. I monitored network traffic, but there is no activity at all. Data(contentsOf: URL(string: url)!) works perfectly, and it also works perfectly in the preview. What do I do? Ask for more context if necessary.
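This is roughly what I'm doing (the URL is a placeholder):

let url = URL(string: "https://example.com/file.json")!
let task = URLSession.shared.downloadTask(with: url) { location, response, error in
    if let location = location {
        // The file must be moved out of this temporary location before the handler returns.
        print("Downloaded to \(location)")
    } else if let error = error {
        print("Download failed: \(error)")
    }
}
task.resume()  // no traffic is observed even after this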
When I access the "Certificates, Identifiers & Profiles" page, it shows:
"Unable to find a team with the given Team ID 'XXXXXXXXXX' to which you belong. Please contact Apple Developer Program Support. https://developer.apple.com/support".
I confirmed that my developer account was paid within the last 6 months and should be valid. I accessed the page 2 weeks ago without any problem.
Is it possible to initiate an in-app purchase flow from an App Clip?
There's no clear answer to this in the docs. StoreKit is NOT listed as an unapproved framework for App Clips, but in-app purchases ARE listed as "not recommended" for App Clip functionality.
I tried setting up a test with a StoreKit configuration file on the App Clip in Xcode, and the products weren't returned, but... StoreKit testing... it would be great to get confirmation on this functionality after the iOS 17 updates.
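For reference, the test was essentially this (a sketch; the product ID is a placeholder, and it runs in an async context):

import StoreKit

let products = try await Product.products(for: ["com.example.appclip.upgrade"])
print("count: \(products.count)")  // came back empty in my App Clip test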
WindowGroup {
    SolarDisplayView()
        .environment(model)
}
.windowStyle(.plain)
Why does the code above compile correctly while the code below reports an error? How should I modify the code below to make it work?
WindowGroup {
    SolarDisplayView()
        .environment(model)
}
.windowStyle(model.isShow ? .plain : .automatic)
I uploaded a build to TestFlight and released it to internal testers; however, when the testers tried downloading the app, an alert dialog shows the following:
Could not install [App] The requested app is not available or doesn't exist.
I also tried to submit it via external testers but got another error.
Hi,
I wanted to use the Siri capability for a watchOS app; however, in Xcode, in a watchOS project, the option to add Siri is not present.
In an iOS project it is visible, but if you are not part of the ADP or ADEP you do not have access to it; the message appears in red if you try to select it with a personal team.
I am considering paying to join the ADP, but I am unsure whether it will unlock the ability to use the Siri capability on watchOS. It looks like it is completely unsupported, as it cannot even be selected from the capabilities section in Xcode, even though Apple states on their website that it is supported under the ADP and ADEP. I am a little confused.
Does anyone else have this issue, or is Siri present under capabilities for you in a watchOS project?
Hi everyone,
I'm encountering an intermittent issue with my Xcode Cloud CI/CD pipeline when pulling Swift Package Manager (SPM) dependencies from AWS CodeArtifact. The build process occasionally fails with an SSL error, but other times it succeeds without any issues. This inconsistency is causing significant disruption to our continuous integration process.
Environment:
Xcode Cloud
Swift Package Manager (SPM) for dependency management
AWS CodeArtifact as the package registry
Error Message:
Error: registry login using https://xx-xx-xx.codeartifact.eu-central-1.amazonaws.com/swift/***/login failed: The certificate for this server is invalid. You might be connecting to a server that is pretending to be "xx-xx-xx.codeartifact.eu-central-1.amazonaws.com" which could put your confidential information at risk.. Would you like to connect to the server anyway?
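For reference, the login step in our CI script looks roughly like this (a sketch; the repository path is redacted as in the error, and the token variable is one we define ourselves):

# ci_scripts/ci_post_clone.sh
swift package-registry login \
    "https://xx-xx-xx.codeartifact.eu-central-1.amazonaws.com/swift/<repository>/login" \
    --token "$CODEARTIFACT_AUTH_TOKEN"

Nothing in this configuration changes between the passing and failing builds, which is what makes the intermittent certificate error so puzzling.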
We currently have a PacketTunnelProvider providing VPN to managed devices. Our profile locks this down with OnDemandEnabled and OnDemandUserOverrideDisabled set to true.
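For reference, the relevant part of our VPN payload looks roughly like this (a sketch; value types as I recall them from the VPN payload documentation):

<key>OnDemandEnabled</key>
<integer>1</integer>
<key>OnDemandUserOverrideDisabled</key>
<integer>1</integer>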
We've received some reports that on device startup there is a period after Wi-Fi connects but before the On-Demand VPN kicks in, during which users can directly reach IPs that are meant to be captured by the VPN tunnel.
Is there an expectation regarding when the On-Demand VPN is allowed to kick in to enable the VPN? Is there anything we can do to minimize this delay?
I'm working on an Angular application that retrieves static data (JSON, MP3, and images) from a backend server, with a cache control response header set to Cache-Control: public, max-age=2592000. I expect these files to be served from either disk or memory cache after the initial request. However, in Safari, the browser sometimes fetches the data from the cache and other times makes a network call. This inconsistent behavior is particularly noticeable with MP3 files, whereas JSON and image files are consistently served from the cache as expected.
I've tested this on multiple Safari versions and observed the same issue:
Version 17.2 (19617.1.17.11.9)
Version 17.1 (19616.2.9.11.7)
Version 17.3 (19617.2.4.11.8)
I confirmed that the "Disable Cache" option is not enabled in the developer tools, so the MP3 files should be cached. This functionality works correctly in Chrome and Firefox without any issues.
In the past, Apple recommended restricting USDZ models to a maximum of 100,000 triangles and a texture size of 2048x2048 for Apple QuickLook (and I think for RealityKit on iOS in general).
Does Apple have any recommended max polygon counts for visionOS? Is it the same for models running in a Volumetric window in the shared space and in ImmersiveSpace?
What is the recommended texture size for visionOS? (I seem to recall 8192x8192, but I can't find it now)