Posts

Post not yet marked as solved
0 Replies
1 View
Dear all, I'm building an app leveraging SwiftData and I have the following two classes.

Stagione:

import SwiftData

@Model
class Stagione {
    @Attribute(.unique) var idStagione: String
    var categoriaStagione: String
    var miaSquadra: String
    @Relationship(deleteRule: .cascade) var rosa: [Rosa]?
    @Relationship(deleteRule: .cascade) var squadra: [Squadre]?
    @Relationship(deleteRule: .cascade) var partita: [CalendarioPartite]?

    init(idStagione: String, categoriaStagione: String, miaSquadra: String) {
        self.idStagione = idStagione
        self.categoriaStagione = categoriaStagione
        self.miaSquadra = miaSquadra
    }
}

Squadre:

import SwiftData

@Model
class Squadre {
    var squadraCampionato: String
    var stagione: Stagione?

    init(squadraCampionato: String) {
        self.squadraCampionato = squadraCampionato
    }
}

Now, I have a view in which I'm presenting a sheet to insert some Squadre:

// Presenta il foglio per aggiungere una nuova partita
GroupBox(label: Text("Dettagli Partita").font(.headline).padding()) {
    VStack {
        HStack {
            Text("Giornata:")
            TextField("Giornata", text: $idGiornata)
                .frame(width: 30)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .padding()
        }
        DatePicker("Data Partita:", selection: $dataPartita, displayedComponents: .date)
            .padding()
        HStack {
            Text("Squadra Casa:")
                .frame(width: 150)
            TextField("Squadra Casa", text: $squadraCasa)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .padding()
            TextField("Gol Casa", text: $golCasa)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .padding()
        }
        HStack {
            Text("Squadra Trasferta:")
                .frame(width: 150)
            TextField("Squadra Trasferta", text: $squadraTrasferta)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .padding()
            TextField("Gol Trasferta", text: $golTrasferta)
                .textFieldStyle(RoundedBorderTextFieldStyle())
                .padding()
        }
        HStack {
            Button("Salva") {
                if let partitaSelezionata = partitaSelezionata {
                    // Se è stata selezionata una partita, aggiorna i suoi dati
                    if let index = partite.firstIndex(where: { $0.id == partitaSelezionata.id }) {
                        partite[index].idGiornata = Int(idGiornata) ?? 0
                        partite[index].dataPartita = dataPartita
                        partite[index].squadraCasa = squadraCasa
                        partite[index].golCasa = Int(golCasa) ?? 0
                        partite[index].squadraTrasferta = squadraTrasferta
                        partite[index].golTrasferta = Int(golTrasferta) ?? 0
                    }
                } else {
                    // Altrimenti, aggiungi una nuova partita
                    aggiungiPartita(stagione: stagione)
                }
                // Chiudi il foglio di presentazione
                mostraAggiungiPartita = false
                // Resetta il campo di input
                idGiornata = ""
                dataPartita = Date()
                squadraCasa = ""
                golCasa = ""
                squadraTrasferta = ""
                golTrasferta = ""
            }
            .buttonStyle(.borderedProminent)
            .disabled(idGiornata.isEmpty || squadraCasa.isEmpty || squadraTrasferta.isEmpty || golCasa.isEmpty || golTrasferta.isEmpty)
            // Bottone Chiudi
            Button("Chiudi") {
                mostraAggiungiPartita = false
            }
            .buttonStyle(.borderedProminent)
        }
    }
    .padding()
}
}

I'd like to add an autocomplete function to the text fields "Squadra Casa" and "Squadra Trasferta", based on the list of Squadre contained in the class "Squadre" and filtered for a specific Stagione. Have any of you made something similar? Do you have any suggestions or a code example I could use? Thanks, A.
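A minimal sketch of one way this could look (the view name TeamAutocompleteField is hypothetical, and it assumes the Squadre of a Stagione are reachable through the squadra relationship shown above): filter the teams of the selected Stagione as the user types and show the matches as tappable suggestions under the field.

import SwiftUI
import SwiftData

// Hypothetical illustration, not the poster's code: a text field that offers
// the Squadre of one Stagione as suggestions while the user types.
struct TeamAutocompleteField: View {
    let stagione: Stagione
    @Binding var text: String

    // Teams of this Stagione whose name contains the typed text.
    private var suggestions: [Squadre] {
        guard !text.isEmpty else { return [] }
        return (stagione.squadra ?? []).filter {
            $0.squadraCampionato.localizedCaseInsensitiveContains(text)
                && $0.squadraCampionato != text
        }
    }

    var body: some View {
        VStack(alignment: .leading) {
            TextField("Squadra", text: $text)
                .textFieldStyle(RoundedBorderTextFieldStyle())
            ForEach(suggestions, id: \.squadraCampionato) { team in
                Button(team.squadraCampionato) {
                    text = team.squadraCampionato   // accept the suggestion
                }
                .buttonStyle(.plain)
            }
        }
    }
}

In the sheet it would replace the plain TextField, for example TeamAutocompleteField(stagione: stagione, text: $squadraCasa).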
Post not yet marked as solved
0 Replies
11 Views
I just bought a Vision Pro to build and run my app on it, but I'm encountering this error from Xcode: Developer Mode is enabled. Computer is paired. Device is paired, and it's connected to my Mac via the Developer Strap. In the "VPN & Device Management" setting that it tells me to navigate to, there is no "Developer App certificate". Others have suggested clearing the ModuleCache in DerivedData, but that has also been fruitless. I've cleaned the build multiple times and restarted both devices and Xcode. I have no idea what else I can possibly do to resolve this. Does anyone have any other ideas?
Post not yet marked as solved
0 Replies
25 Views
Hello. How do I write this command correctly in Script Editor on a MacBook, so that I can click the "Run script" button and the script reports the result: if there is no folder, report that there is no folder; if there is a folder, report that the folder exists.

do shell script "test -d 'Users/user/Desktop/New folder'"

Right now, if the folder exists, an empty string ("") is returned; if the folder does not exist, Script Editor reports that an error has occurred. In short, my task is to write a script that checks for the existence of a folder.
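One possible shape for this, as a minimal AppleScript sketch (it assumes the absolute path /Users/user/Desktop/New folder): do shell script raises an error whenever the shell command exits with a non-zero status, so wrapping the test in a try block gives the two branches.

set folderPath to "/Users/user/Desktop/New folder"
try
	-- test -d exits non-zero when the directory is missing,
	-- which do shell script reports as an error
	do shell script "test -d " & quoted form of folderPath
	display dialog "The folder exists."
on error
	display dialog "There is no folder."
end try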
Post not yet marked as solved
0 Replies
18 Views
Users have reported unusually high data usage with my app, so to investigate I have profiled it in Instruments. My app, as expected, is using minimal data. However, in Instruments I see an "Unknown" process, which sends around 1 MB of data every 2 seconds. Can anyone explain what this unknown process is? Sorry my question is vague, but I'm at the beginning of understanding the Instruments output, so your help is very much appreciated.
Post not yet marked as solved
0 Replies
73 Views
Some Macs recently received a macOS system update which disabled the simulator runtimes used by Xcode 15, including the simulators for iOS, tvOS, watchOS, and visionOS. If your Mac received this update, you will receive the following error message and will be unable to use the simulator:

The com.apple.CoreSimulator.SimRuntime.iOS-17-2 simulator runtime is not available.
Domain: com.apple.CoreSimulator.SimError
Code: 401
Failure Reason: runtime profile not found using "System" match policy
Recovery Suggestion: Download the com.apple.CoreSimulator.SimRuntime.iOS-17-2 simulator runtime from the Xcode

To resume using the simulator, please reboot your Mac. After rebooting, check Xcode Preferences → Platforms to ensure that the simulator runtime you would like to use is still installed. If it is missing, use the Get button to download it again. The Xcode 15.3 Release Notes are also updated with this information.
Post not yet marked as solved
0 Replies
39 Views
There is some new UI in Xcode for upgrading OS versions. I clicked the new Get button at the top of the screen, and it went and got the new iOS 17.4. Great, but then it just sat there. After much floundering around, I eventually found the UI that has the Install button. Clicked it. Nothing happened. I have restarted Xcode several times and went looking for an Xcode update, no joy. My development has now been dead for two days. Really not happy.
Post marked as solved
1 Reply
39 Views
I keep receiving this message: "Your payment authorisation failed on card •••8090. Please verify your information and try again, or try another payment method." The problem is that I already went through this with my bank and they confirmed nothing is wrong with my bank account; they even checked with the Visa team, who confirmed nothing is wrong. The bank adviser informed me that no payment was even attempted. I got on the phone with Apple customer support and she couldn't find anything wrong either; we tried to pay from my iPhone and then from my MacBook, but I keep getting this error. As I have 3 different bank accounts, I have tried to pay with 3 different Visa debit cards, but I get the same error, so I believe it's not the bank but Apple. Anyone in the same boat?
Post not yet marked as solved
0 Replies
35 Views
I've been using the App Store Analytics API for a few weeks now with no problem, but yesterday I started to get no data from the ONE_TIME_SNAPSHOT. It's simply returning empty, although we haven't changed anything in the code (already checked 100 times). Anyone here with the same problem?
Post not yet marked as solved
1 Reply
34 Views
If I annotate a class with @Observable, I get this error in @Query:

Expansion of macro 'Query()' produced an unexpected 'init' accessor

If I remove @Observable the error goes away. Elsewhere I have .environment referencing the class; without @Observable, that complains that the class needs to be @Observable. I am mystified. Does anyone have a suggestion?
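For comparison, here is a minimal sketch of the arrangement that usually avoids that macro error (AppState and Item are hypothetical names): @Query lives inside a SwiftUI view and targets a @Model type, while the @Observable class stays a plain class passed around with .environment, without any @Query properties of its own.

import SwiftUI
import SwiftData
import Observation

// Plain observable app state, passed via .environment(appState).
@Observable
final class AppState {
    var selectedName: String = ""
}

// SwiftData model that the query targets.
@Model
final class Item {
    var name: String
    init(name: String) { self.name = name }
}

struct ItemList: View {
    @Environment(AppState.self) private var appState
    @Query(sort: \Item.name) private var items: [Item]   // the query stays in the view

    var body: some View {
        List(items) { item in
            Text(item.name)
        }
    }
}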
Post not yet marked as solved
0 Replies
36 Views
I'm trying to download artifacts from some recent Xcode Cloud builds. In both Xcode and App Store Connect I'm getting errors. Xcode says: "Error Fetching Test Results: API Invalid status code: 501." App Store Connect says: "artifacts could not be found." FB13773789 - Xcode Cloud: Service returning 501 in Xcode when trying to view artifacts of successful build from minutes ago. I have tried several projects to rule out project-specific issues, and it is happening to all of my Xcode Cloud enabled projects. Both Xcode 15.3 and 15.4 beta exhibit this behavior. Is anyone else running into this issue? I noticed it yesterday, and it continues into this morning.
Post not yet marked as solved
1 Reply
40 Views
I am trying to map the 3D skeleton joint positions of an ARBodyAnchor to the real body in the camera image. I know I could simply use the "detectedBody" of the ARFrame, which would already deliver the normalized 2D position of each joint, but what I am mostly interested in is the z-axis (the distance of each joint to the camera). I am starting an ARBodyTrackingConfiguration, setting the world alignment to ARWorldAlignmentCamera (in which case the camera transform is an identity matrix) and multiplying each joint transform in model space (via modelTransformForJointName:) with the transform of the ARBodyAnchor. I then tried many different ways to get the joints to line up with the image, for example by multiplying the transforms with the projectionMatrix of the ARCamera. But whatever I do, it never lines up correctly. For example, there doesn't really seem to be a scale factor in the projectionMatrix or the ARBodyAnchor transform; no matter the distance of the camera to the detected body, the scale of the body is always the same. Which means I am missing something important, and I haven't figured out what. So does anyone have an example of how I can get the body to line up with the camera image? (Or get the distance to each joint in any other way?) Thanks!
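For what it's worth, here is a hedged sketch (a hypothetical function, not tested against the setup above) of the more common route: keep the default gravity world alignment, take each joint to world space through the anchor transform, let ARCamera.projectPoint handle the projection into the viewport, and read the distance from the joint's camera-space z.

import ARKit
import UIKit

// Hypothetical illustration: project every joint of an ARBodyAnchor into the
// viewport and compute its distance in front of the camera.
func jointScreenPositions(for bodyAnchor: ARBodyAnchor,
                          in frame: ARFrame,
                          viewportSize: CGSize) -> [(point: CGPoint, depth: Float)] {
    let camera = frame.camera
    let worldToCamera = camera.transform.inverse   // camera.transform is camera-to-world

    return bodyAnchor.skeleton.jointModelTransforms.map { jointModel in
        // Joint position in world space: anchor transform * joint model transform.
        let world = bodyAnchor.transform * jointModel
        let p = world.columns.3
        let worldPosition = SIMD3<Float>(p.x, p.y, p.z)

        // 2D position in the given viewport.
        let point = camera.projectPoint(worldPosition,
                                        orientation: .portrait,
                                        viewportSize: viewportSize)

        // The camera looks down -z, so the distance in front of it is -z in camera space.
        let inCamera = worldToCamera * world
        let depth = -inCamera.columns.3.z

        return (point: point, depth: depth)
    }
}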
Post not yet marked as solved
1 Reply
38 Views
Dear Sirs, I'm writing an audio application that should show up to 128 horizontal peak meters (the width of each is about 150, the height 8) stacked inside a ScrollViewReader. For the actual value of each peak meter I have a binding to a CGFloat value. The peak meter works as expected and refreshes correctly. For testing I added a timer to my Swift application that fires every 0.05 secs, meaning I want to show 20 values per second. Inside the timer func I'm just creating random CGFloat values in the range of 0...1 for the bound values. The peak meters refresh and flicker as expected, but I can see a CPU load of 40-50% in Activity Monitor on my MacBook Air with Apple M2, even when compiled in release mode. I think this is quite high and I'd like to reduce this CPU load. Should this be possible? For example, I thought about blocking the refresh until I've set all values. How could this be done, and would it help? What else could I do? Thanks and best regards, JFreyberger
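As an illustration of the batching idea mentioned above, here is a minimal sketch (MeterModel and PeakMeter are hypothetical names): all 128 values are published as a single array once per timer tick, and each row reads a plain value instead of owning its own binding, so SwiftUI invalidates the list in one pass per tick.

import SwiftUI
import Combine

// Hypothetical illustration: one published array holds every meter value.
final class MeterModel: ObservableObject {
    @Published var levels: [CGFloat] = Array(repeating: 0, count: 128)
}

struct MeterList: View {
    @ObservedObject var model: MeterModel
    let timer = Timer.publish(every: 0.05, on: .main, in: .common).autoconnect()

    var body: some View {
        ScrollView {
            LazyVStack(spacing: 2) {
                ForEach(model.levels.indices, id: \.self) { i in
                    // Each meter takes a plain value, not a separate binding.
                    PeakMeter(level: model.levels[i])
                        .frame(width: 150, height: 8)
                }
            }
        }
        .onReceive(timer) { _ in
            // Write the whole batch once per refresh.
            model.levels = (0..<128).map { _ in CGFloat.random(in: 0...1) }
        }
    }
}

// Placeholder meter view for the sketch.
struct PeakMeter: View {
    var level: CGFloat
    var body: some View {
        GeometryReader { geo in
            Rectangle()
                .fill(.green)
                .frame(width: geo.size.width * level)
        }
        .background(.gray.opacity(0.3))
    }
}

Whether this actually lowers the load would need to be measured in Instruments; drawing all bars in a single Canvas is another direction worth trying.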
Post not yet marked as solved
0 Replies
39 Views
On macOS, that is. The goals are largely for testing, where we'd like to know the maximum and minimum memory our processes are using, but we'd also like to know it on crash. Our current method is to use ps periodically and grab the appropriate field, but is there a better way? (I looked at MetricKit, but it's not as useful on macOS; I filed FB13640765 "MetricKit would be awesome with more mac features" a couple of months ago.)
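One in-process alternative, as a hedged sketch: ask Mach for the task's TASK_VM_INFO, which carries both the current physical footprint and the peak resident size, so a test harness can log it at whatever points it likes without shelling out to ps. The crash case would still need its own reporting path.

import Darwin

// Hedged sketch: read the current process's memory numbers via task_info(TASK_VM_INFO).
func memoryFootprint() -> (footprint: UInt64, peakResident: UInt64)? {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(MemoryLayout<task_vm_info_data_t>.size
                                       / MemoryLayout<integer_t>.size)
    let kr = withUnsafeMutablePointer(to: &info) { infoPtr in
        infoPtr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) { intPtr in
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), intPtr, &count)
        }
    }
    guard kr == KERN_SUCCESS else { return nil }
    // phys_footprint is the figure the system's memory accounting uses;
    // resident_size_peak is the high-water mark of resident memory.
    return (info.phys_footprint, info.resident_size_peak)
}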
Post not yet marked as solved
0 Replies
41 Views
Hello, I am working on a fairly complex iPhone app that controls the front built-in wide angle camera. I need to take and display a sequence of photos that cover the whole range of focus values available. Here is how I do it:
- call setExposureModeCustom to set the first lens position
- wait for the completionHandler to be called back
- capture a photo
- do it again for the next lens position, etc.

This works fine, but it takes longer than I expected for the completionHandler to be called back. From what I've seen, the delay scales with the exposure duration. When I set the exposure duration to the max value:
- on the iPhone 14 Pro, it takes about 3 seconds (3 times the max exposure)
- on the iPhone 8, 1.3 s (4 times the max exposure)

I was expecting a delay of two times the exposure duration: take a photo, throw one away while changing lens position, take the next photo, etc., but it takes more than that. I also tried the same thing with changing the ISO instead of the focus position and I get the same kind of delays. Also, I do not think the problem is linked to the way I process the images, because I get the same delay even if I do nothing with the output. Is there something I could do to make things go faster for this use case? Any input would be appreciated. Thanks!

I created a minimal testing app to reproduce the issue:

import Foundation
import AVFoundation

class Main: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let dispatchQueue = DispatchQueue(label: "VideoQueue", qos: .userInitiated)
    let session: AVCaptureSession
    let videoDevice: AVCaptureDevice
    var focus: Float = 0

    override init() {
        session = AVCaptureSession()
        session.beginConfiguration()
        session.sessionPreset = .photo
        videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back)!
        super.init()
        let videoDeviceInput = try! AVCaptureDeviceInput(device: videoDevice)
        session.addInput(videoDeviceInput)
        let videoDataOutput = AVCaptureVideoDataOutput()
        if session.canAddOutput(videoDataOutput) {
            session.addOutput(videoDataOutput)
            videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
            videoDataOutput.setSampleBufferDelegate(self, queue: dispatchQueue)
        }
        session.commitConfiguration()
        dispatchQueue.async {
            self.startSession()
        }
    }

    func startSession() {
        session.startRunning()
        // lock max exposure duration
        try! videoDevice.lockForConfiguration()
        let exposure = videoDevice.activeFormat.maxExposureDuration.seconds * 0.5
        print("set max exposure", exposure)
        videoDevice.setExposureModeCustom(duration: CMTime(seconds: exposure, preferredTimescale: 1000), iso: videoDevice.activeFormat.minISO) { time in
            print("did set max exposure")
            self.changeFocus()
        }
        videoDevice.unlockForConfiguration()
    }

    func changeFocus() {
        let date = Date.now
        print("set focus", focus)
        try! videoDevice.lockForConfiguration()
        videoDevice.setFocusModeLocked(lensPosition: focus) { time in
            let dt = abs(date.timeIntervalSinceNow)
            print("did set focus - took:", dt, "frames:", dt / self.videoDevice.exposureDuration.seconds)
            self.next()
        }
        videoDevice.unlockForConfiguration()
    }

    func next() {
        focus += 0.02
        if focus > 1 {
            print("done")
            return
        }
        changeFocus()
    }

    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        print("did receive video frame")
    }
}
Post not yet marked as solved
2 Replies
57 Views
I am trying to disable screenshots in an iOS app, but Apple does not expose any API for this. So I added a workaround: when the iPhone tries to take a screenshot, we have a secure text field which covers the whole screen, so the screenshot comes out blank. Has anyone tried the same approach to prevent screenshots? If yes, did Apple act against those applications, or do they reject them if they find it? It would be a great help if Apple could provide us some insight on this.
Post not yet marked as solved
0 Replies
100 Views
Our keyboard extension can be used independently in the China region with native apps like Notes or Safari; however, in the Taiwan region the keyboard can only be opened in the app from the same project. I've checked some articles about how MDM manages extensions, and also made sure the RequestsOpenAccess option in the keyboard extension's Info.plist is set to Yes. I'm not sure if there is anything I missed, or if I just need to inform the client that they need to reach out to their MDM manager and modify some restrictions?

"If keyboard supports mobile device management (MDM), it can work with managed apps."
"App extensions give third-party developers a way to provide functionality to other apps or even to key systems built into the operating systems."
Allow full access to custom keyboard in iOS
Post not yet marked as solved
0 Replies
63 Views
I'm adding many users for internal testing purposes. A few users didn't receive a welcome to the Apple Developer Program after signing in from the invitation email. I've checked and they aren't on the user list, so I can't add them to the internal tester group. We've tried many times. I noticed that every user who had the same problem didn't see the "Trust this device" page during the sign-in process. How should I fix this problem?
Post not yet marked as solved
3 Replies
90 Views
Hello, I am new to iOS development and I am trying to create a new project, but the SwiftUI view is not showing.
Post not yet marked as solved
1 Reply
65 Views
Hello everyone! I'm currently working on an iOS app developed with Swift that involves connecting to a specific Bluetooth device and exchanging data even when the app is terminated or running in the background. I just want to understand why the CBCentralManager should be implicitly unwrapped to use it. I have checked out a couple of Apple developer sample projects and it was declared as implicitly unwrapped. Can someone help me understand the reason behind this, and also what possible issues/scenarios could be triggered if we declare the central manager as an optional, "CBCentralManager?"? Thanks in advance!
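A hedged sketch of the pattern those samples tend to follow: the manager wants its delegate at creation time, and self is not usable until the object is fully initialized, so the property is declared implicitly unwrapped (or optional) and assigned immediately after super.init(). Declaring it as CBCentralManager? instead mostly means unwrapping everywhere; the risk is that a call through a still-nil optional is silently skipped rather than failing loudly.

import CoreBluetooth

// Hypothetical illustration: why the manager is created after super.init().
final class BluetoothController: NSObject, CBCentralManagerDelegate {
    // Implicitly unwrapped: assigned once in init and never nil afterwards.
    private var centralManager: CBCentralManager!

    override init() {
        super.init()
        // self can only be passed as the delegate once initialization is complete.
        centralManager = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        if central.state == .poweredOn {
            central.scanForPeripherals(withServices: nil)
        }
    }
}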
Post not yet marked as solved
0 Replies
59 Views
Hello! Is it possible to turn on hand pass-through (hand cutout, not sure what the correct name is; real hands from the camera) in WebXR when in immersive-vr and the hand-tracking feature is enabled? I only see my hands when I disable the hand-tracking feature in WebXR, but then I don't get the joint positions and orientations.
