Hello, I'm applying to the Apple Developer Program, but an unknown error appears in my account. Can you help identify this error?
We are unable to process your request.
An unknown error occurred.
The error appears after I fill out the form under Enroll Today and select Sole Ent.
I wrote to support 5 days ago but have received no response.
Hi,
I just generated an HDR10 MV-HEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
Then I uploaded the segment files and prog_index.m3u8 to a web server, but found that Safari cannot play the HLS stream. The URL is http://ip/vod/prog_index.m3u8
I also checked: if I remove the Transfer characteristics : PQ tag when generating the MV-HEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play on Safari.
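One thing I'm unsure about: I'm pointing Safari directly at the media playlist, with no multivariant playlist. My understanding is that HDR content needs a multivariant playlist whose EXT-X-STREAM-INF declares VIDEO-RANGE=PQ. A sketch of what I mean is below; the BANDWIDTH, CODECS, and REQ-VIDEO-LAYOUT values are placeholders rather than values from my actual stream, and I'm not certain which protocol version my segmenter targets:
#EXTM3U
#EXT-X-VERSION:12
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ,REQ-VIDEO-LAYOUT="CH-STEREO"
prog_index.m3u8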
Is there any way to play HLS PQ video on Safari? Thanks.
Hello,
I use Xcode Previews to quickly test functionality. However, I found that unified logging does not output to the Preview console, so I have to call both log.debug and print. Is there a way to enable logging in the Preview console?
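As a stopgap I'm using a small wrapper so I don't have to write both calls everywhere. A minimal sketch, relying on the undocumented XCODE_RUNNING_FOR_PREVIEWS environment variable (the subsystem and category names are made up):
import Foundation
import os

// Mirrors log output to print() when running in an Xcode Preview,
// since unified logging doesn't show up in the Preview console.
let logger = Logger(subsystem: "com.example.app", category: "general")

func debugLog(_ message: String) {
    logger.debug("\(message)")
    // XCODE_RUNNING_FOR_PREVIEWS is undocumented, but appears to be "1" in Previews.
    if ProcessInfo.processInfo.environment["XCODE_RUNNING_FOR_PREVIEWS"] == "1" {
        print(message)
    }
}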
Hi,
I just found that I cannot play the 3D movie stream (MV-HEVC/UHD/Dolby Vision/Dolby Digital/Dolby Atmos) from https://developer.apple.com/streaming/examples.
When I click View 3D example (fMP4), I get the issue below. My macOS is 14.4.1 on an M2 chip.
Hi everyone,
I wanted to ask if anybody knows the current status of the required reason API declarations.
Before May 1, when I uploaded a new build to App Store Connect and added it to a group with external testers, I got an email notification like the following:
ITMS-91053: Missing API declaration - Your app’s code in the [...] file references one or more APIs that require reasons, including the following API categories: NSPrivacyAccessedAPICategoryDiskSpace. While no action is required at this time, starting May 1, 2024, when you upload a new app or app update, you must include a NSPrivacyAccessedAPITypes array in your app’s privacy manifest to provide approved reasons for these APIs used by your app’s code.
In an article published by Apple (https://developer.apple.com/documentation/bundleresources/privacy_manifest_files/describing_use_of_required_reason_api), it is even stated that after May 1, apps that do not comply are not accepted by App Store Connect. According to my interpretation, even the upload should be rejected.
I am currently in the process of adding a privacy manifest with the declarations. For testing purposes, I wanted to add the declarations step by step and see where I still need to fix anything. My problem is that the warnings from Apple are no longer being sent. I uploaded a new build after May 1 with no privacy manifest, and therefore no API declarations; it was accepted by App Store Connect and even passed the review for an external testers group.
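For reference, this is roughly what I'm adding. A minimal PrivacyInfo.xcprivacy sketch for the disk-space category; the reason code E174.1 ("check whether there is sufficient disk space") is only an example, and the approved code has to match the app's actual usage:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSPrivacyAccessedAPITypes</key>
    <array>
        <dict>
            <key>NSPrivacyAccessedAPIType</key>
            <string>NSPrivacyAccessedAPICategoryDiskSpace</string>
            <key>NSPrivacyAccessedAPITypeReasons</key>
            <array>
                <string>E174.1</string>
            </array>
        </dict>
    </array>
</dict>
</plist>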
Does anybody have information about the following questions?
Did Apple shift the deadline?
How can I trigger the warning emails again, so that I know what to fix and can see when my app is compliant?
Thanks in advance!
It seems like this may have been an issue for a while based on what I've seen: I added a toolbar item to a text field's keyboard, and it doesn't show. The only way I can get it to show is by opening the keyboard, typing something, closing the keyboard, and then reopening it. Does anyone have a workaround for this? It's like Apple purposely wants to make it difficult to close the keyboard.
TextField("Something here...", text: $text, axis: .vertical)
.multilineTextAlignment(.leading)
.toolbar {
ToolbarItemGroup(placement: .keyboard, content: {
Button("Close") { }
})
}
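In case it helps anyone reproduce, here is a self-contained sketch (iOS 16+ for the axis parameter). The Close button dismisses the keyboard by resigning focus, but the toolbar still only appears after the type/close/reopen dance:
struct CloseableTextFieldView: View {
    @State private var text = ""
    @FocusState private var isFocused: Bool // tracks whether the field has keyboard focus

    var body: some View {
        TextField("Something here...", text: $text, axis: .vertical)
            .multilineTextAlignment(.leading)
            .focused($isFocused)
            .toolbar {
                ToolbarItemGroup(placement: .keyboard) {
                    Spacer()
                    Button("Close") {
                        isFocused = false // resigning focus dismisses the keyboard
                    }
                }
            }
    }
}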
I have recently started using Apple Shortcuts to connect to a server and control a remote device.
The remote device replies using JSON and, depending on the request, replies with a different status message.
I would like to read the reply in the shortcut and change the icon and/or the colour of the shortcut on the Home Screen depending on the reply.
This could be used for locking a door, where the reply indicates whether the door is locked or unlocked by way of a closed or open padlock, or by changing the background colour of the shortcut.
I'm developing a tvOS app using SwiftUI, and I'm encountering an issue with the color of the tab item. I want the tab item to turn green when focused; for instance, when the tab item is selected, its background color should be green. Below is my current code, but it's not working. I've tried several other approaches, but none have worked.
struct ContentView: View {
    @State private var selectedTab = 0

    var body: some View {
        ZStack(alignment: .topLeading) {
            TabView(selection: $selectedTab) {
                HomeView()
                    .tabItem {
                        AppColors.gradientColor1
                        Label("Home", image: "home-dash")
                    }
                    .tag(0)
                ProductsView()
                    .tabItem {
                        Label("Products", image: "open box-search")
                    }
                    .tag(1)
                SearchView()
                    .tabItem {
                        Label("Search", image: "search")
                    }
                    .tag(2)
            }
            .accentColor(Color.green)
            .background(.black)
            Image("amazontabbarlogo")
        }
    }
}
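One workaround I've seen suggested but haven't verified myself: falling back to UIKit's appearance proxy, which exposes the focused pill color on tvOS. A sketch, assuming TabView is still backed by UITabBar, by adding an init like this to ContentView:
init() {
    let appearance = UITabBarAppearance()
    // selectionIndicatorTintColor controls the pill drawn behind the focused tab item on tvOS.
    appearance.selectionIndicatorTintColor = .green
    UITabBar.appearance().standardAppearance = appearance
}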
How can a program be launched at startup if it is not in Login Items or LaunchDaemons/LaunchAgents? Spotify, for example.
If my app utilizes the RoomPlan API to create a parametric representation of a room, would it open on iPhones that don't have LiDAR? I'm aware that the iPhone models equipped with LiDAR are the iPhone 12 Pro & Pro Max, iPhone 13 Pro & Pro Max, iPhone 14 Pro & Pro Max, and iPhone 15 Pro & Pro Max.
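Rather than keying off the device model, I was planning to gate the feature at runtime. A minimal sketch using RoomCaptureSession.isSupported, which as I understand it is false on devices without LiDAR:
import RoomPlan

func presentScannerIfSupported() {
    if RoomCaptureSession.isSupported {
        // start the RoomPlan capture flow
    } else {
        // fall back: the app still opens, just without room scanning
    }
}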
I was following the SwiftUI tutorial at section 6 (https://developer.apple.com/tutorials/swiftui/creating-and-combining-views), using a circle image to create an overlapping effect on the map. It turns out that when using a GeometryReader, the bottom padding does not work at all:
VStack {
    MapView()
        .frame(height: 300)
    CircleImage()
        .offset(y: -130)
        .padding(.bottom, -130)
    VStack(alignment: .leading) {
        Text("Turtle Rock")
            .font(.title)
        HStack {
            Text("Joshua Tree National Park")
                .font(.subheadline)
            Spacer()
            Text("California")
                .font(.subheadline)
        }
    }
    .padding(10)
}
This is the code for the CircleImage view:
GeometryReader { geometry in
    let _ = print(geometry.size.width)
    AsyncImage(
        url: URL(string: "https://cms.rationalcdn.com/v3/assets/blteecf9626d9a38b03/bltf5486c52361f2012/6144fafd39dff133fc23de9f/img-ios.png")
    )
    .frame(width: geometry.size.width)
    .clipShape(Circle())
    .overlay {
        Circle().stroke(.white, lineWidth: 4)
    }
    .shadow(radius: 7)
}
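My current suspicion, for anyone else who hits this: GeometryReader accepts the entire height proposed to it, so CircleImage ends up far taller than the visible circle and the -130 bottom padding gets swallowed. Constraining the view to an explicit frame (size chosen arbitrarily here) seems to restore the tutorial behavior:
CircleImage()
    .frame(width: 260, height: 260) // arbitrary fixed size; without it, GeometryReader fills all proposed space
    .offset(y: -130)
    .padding(.bottom, -130)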
Is it possible to create a material that behaves the same as the AR ground plane, but in a non-AR setup?
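In case it helps to clarify what I'm after: in RealityKit I'd imagine something like an invisible plane with OcclusionMaterial. A sketch; whether it also receives grounding shadows outside an AR session presumably depends on the lighting setup:
let groundPlane = ModelEntity(
    mesh: .generatePlane(width: 2, depth: 2),
    materials: [OcclusionMaterial()] // invisible, but hides geometry behind/below it
)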
Any help appreciated.
My MacBook Air (mid-2013) running macOS Big Sur 11.7.10 has broken System Preferences. The padlock for making changes doesn't do anything when clicked, and neither does any other tab or setting. I can get into panes like Sound or Accessibility, but everything that I SHOULD be able to do within them just... doesn't work. Nothing happens when I click on them. It's like they're just images.
Hi all, I need some help debugging some code I wrote. Just as a preface, I'm an extremely new VR/AR developer and also very new to using ARKit + RealityKit. So please bear with me :) I'm just trying to make a simple program that will track an image and place an entity on it. The image is tracked correctly, but the moment the program recognizes the image and tries to place an entity on it, the program crashes. Here’s my code:
VIEWMODEL CODE:
@Observable
class ImageTrackingModel {
    var session = ARKitSession() // ARKit session used to manage AR content
    var imageAnchors = [UUID: Bool]() // Tracks whether specific anchors have been processed
    var entityMap = [UUID: ModelEntity]() // Maps anchors to their corresponding ModelEntity
    var rootEntity = Entity() // Root entity to which all other entities are added

    let imageInfo = ImageTrackingProvider(
        referenceImages: ReferenceImage.loadReferenceImages(inGroupNamed: "referancePaper")
    )

    init() {
        setupImageTracking()
    }

    func setupImageTracking() {
        if ImageTrackingProvider.isSupported {
            Task {
                try await session.run([imageInfo])
                for await update in imageInfo.anchorUpdates {
                    updateImage(update.anchor)
                }
            }
        }
    }

    func updateImage(_ anchor: ImageAnchor) {
        let entity = ModelEntity(mesh: .generateSphere(radius: 0.05)) // THIS IS WHERE THE CODE CRASHES
        if imageAnchors[anchor.id] == nil {
            rootEntity.addChild(entity)
            imageAnchors[anchor.id] = true
            print("Added new entity for anchor \(anchor.id)")
        }
        if anchor.isTracked {
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
            print("Updated transform for anchor \(anchor.id)")
        }
    }
}
APP:
@main
struct MyApp: App {
    @State var session = ARKitSession()
    @State var immersionState: ImmersionStyle = .mixed
    private var viewModel = ImageTrackingModel()

    var body: some Scene {
        WindowGroup {
            ModeSelectView()
        }
        ImmersiveSpace(id: "appSpace") {
            ModeSelectView()
        }
        .immersionStyle(selection: $immersionState, in: .mixed)
    }
}
Content View:
RealityView { content in
    Task {
        viewModel.setupImageTracking()
    }
} // I'm seriously so clueless about how to use this view
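One variant I'm about to try, in case the problem is threading. The assumptions here are that RealityKit entities should be created and mutated on the main actor, and that entityMap should cache one entity per anchor instead of creating a new sphere on every update:
func updateImage(_ anchor: ImageAnchor) {
    Task { @MainActor in
        let entity: ModelEntity
        if let cached = entityMap[anchor.id] {
            entity = cached
        } else {
            entity = ModelEntity(mesh: .generateSphere(radius: 0.05))
            entityMap[anchor.id] = entity
            rootEntity.addChild(entity)
            print("Added new entity for anchor \(anchor.id)")
        }
        if anchor.isTracked {
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
        }
    }
}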
Hello, I'll be objective: when I compile any app in Xcode and transfer it to my iPhone, including the "Hello world!" test app, whether via network or USB cable, the app simply doesn't work when I open it, and the iPhone crashes. That's it. My Xcode is 15.3, iPhone 14 Pro Max, iOS 17.5, latest macOS version.
I am using @AppStorage in a model object (see the code below, which works as expected).
class ModelObject {
    static let shared = ModelObject()
    @AppStorage("enhanced") var scriptPickers: Bool = true

    var defaultDependentValue: String {
        scriptPickers ? "Enhanced" : "NOT enhanced"
    }
}

struct ContentView: View {
    @AppStorage("enhanced") var scriptPickers: Bool = true

    var body: some View {
        VStack {
            Toggle(isOn: $scriptPickers, label: {
                Text("userDefault val")
            })
            Text("value: \(ModelObject.shared.defaultDependentValue)")
        }
    }
}
Now I want to test my model object in a way that allows me to use a mock instance of UserDefaults, but I'm having trouble with the syntax. I tried adding a userDefaults var and referring to it in the @AppStorage:
class ModelObject {
    static let shared = ModelObject()
    let userDefaults: UserDefaults

    init(userDefaults: UserDefaults = .standard) {
        self.userDefaults = userDefaults
    }

    @AppStorage("enhanced", store: userDefaults) var scriptPickers: Bool = true

    var defaultDependentValue: String {
        scriptPickers ? "Enhanced" : "NOT enhanced"
    }
}
However, I can't find a way to avoid the syntax error this generates:
Cannot use instance member 'userDefaults' within property initializer; property initializers run before 'self' is available
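The closest I've gotten is initializing the wrapper's backing storage inside init instead of at the declaration. A sketch (note that @AppStorage in a plain class still won't publish changes to SwiftUI views):
import SwiftUI

class ModelObject {
    static let shared = ModelObject()
    let userDefaults: UserDefaults
    @AppStorage var scriptPickers: Bool // backing storage assigned in init

    init(userDefaults: UserDefaults = .standard) {
        self.userDefaults = userDefaults
        // AppStorage(wrappedValue:_:store:) lets the store be injected after self's
        // other properties are set, avoiding the property-initializer restriction.
        _scriptPickers = AppStorage(wrappedValue: true, "enhanced", store: userDefaults)
    }

    var defaultDependentValue: String {
        scriptPickers ? "Enhanced" : "NOT enhanced"
    }
}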
Any guidance on how I might be able to:
continue using @AppStorage
be able to test my class in a way that doesn't force me to use UserDefaults.standard
Thanks in advance,
Mike
Can I use Apple's sound recognition in my augmented reality app to trigger content? Or is there another source I can use?
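To make the question concrete, this is the kind of integration I have in mind. A sketch using the SoundAnalysis framework's built-in classifier (assumes iOS 15+ and microphone permission; error handling omitted, and "clapping" is just an illustrative label):
import AVFoundation
import SoundAnalysis

// Classifies microphone audio with the system's built-in sound classifier
// and fires a callback that AR code could use to trigger content.
final class SoundTrigger: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?
    var onDetect: ((String) -> Void)? // called with the top label, e.g. "clapping"

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
        self.analyzer = analyzer
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first, top.confidence > 0.8 else { return }
        onDetect?(top.identifier) // trigger the AR content from here
    }
}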
Since updating to Xcode 15.3, I'm seeing the following, whether initializing Map() in SwiftUI or using Apple's example Overlay project:
Thread Performance Checker: Thread running at User-interactive quality-of-service class waiting on a thread without a QoS class specified (base priority 0). Investigate ways to avoid priority inversions
PID: 2148, TID: 42369
Backtrace
=================================================================
3 VectorKit 0x00007ff81658b145 ___ZN3geo9TaskQueue5applyEmNSt3__18functionIFvmEEE_block_invoke + 38
4 libdispatch.dylib 0x00000001036465c2 _dispatch_client_callout2 + 8
5 libdispatch.dylib 0x000000010365d79b _dispatch_apply_invoke3 + 527
6 libdispatch.dylib 0x000000010364658f _dispatch_client_callout + 8
7 libdispatch.dylib 0x0000000103647c6d _dispatch_once_callout + 66
8 libdispatch.dylib 0x000000010365c89b _dispatch_apply_redirect_invoke + 214
9 libdispatch.dylib 0x000000010364658f _dispatch_client_callout + 8
10 libdispatch.dylib 0x000000010365a67f _dispatch_root_queue_drain + 1047
11 libdispatch.dylib 0x000000010365af9d _dispatch_worker_thread2 + 277
12 libsystem_pthread.dylib 0x00000001036e2b43 _pthread_wqthread + 262
13 libsystem_pthread.dylib 0x00000001036e1acf start_wqthread + 15
Posting this on behalf of my colleague, who has a project in mind that requires a huge amount of RAM. Is it true that modern Mac Pros can only have up to 192 GB of RAM, which is about 8 times less than the 5-year-old Intel-based Mac Pros?