Posts

Post not yet marked as solved
1 Reply
24 Views
I keep getting the same error when trying to build my project. It seems to stem from a problem entering the path to the Info.plist file in Packaging under Build Settings. I have repeatedly tried to enter the proper path as "$(SRCROOT)/Main Project Group/Info.plist", yet as soon as I hit return, the path reverts to "/Users/josephnicholas/Documents/dEATour/Xcode/dEATourSunday/Main Project Group/Info.plist" and I get the following error when I clean/build the project: "/Users/josephnicholas/Documents/dEATour/Xcode/dEATourSunday/dEATourSunday.xcodeproj One of the paths in DEVELOPMENT_ASSET_PATHS does not exist: /Users/josephnicholas/Documents/dEATour/Xcode/dEATourSunday/dEATourSunday/Preview Content". I have deleted the Info.plist file and re-added it, and I have even started a new project this morning, transferring all of my files. I cannot figure out how to prevent the path in Packaging from reverting to the absolute path and throwing this error. I'd really appreciate any help anyone can offer. Thanks.
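For illustration only, a minimal sketch of how the two settings look when kept project-relative (for example in an .xcconfig file). The folder names are taken from the paths above; everything else is an assumption, not the actual project configuration:

INFOPLIST_FILE = $(SRCROOT)/Main Project Group/Info.plist
DEVELOPMENT_ASSET_PATHS = "$(SRCROOT)/dEATourSunday/Preview Content"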
Posted
by
Post not yet marked as solved
0 Replies
18 Views
I have an iMac and a Mac Studio. The iMac works perfectly fine on Xcode 15.4, but somehow the Mac Studio on Xcode 15.2 (Sonoma 14.2) is not working properly. When I try to add a team in Xcode on each device, the iMac is fine, but on the Mac Studio Xcode force-closes. I'm not sure whether upgrading Xcode 15.2 to 15.4 on my Mac Studio would make it better. Any advice?
Posted
by
Post not yet marked as solved
0 Replies
32 Views
I've implemented a custom system extension VPN for macOS using a Packet Tunnel Provider. In the protocol configuration, the 'includeAllNetworks' flag is unset. In the provider, I included all routes (the IPv4 default route). What is the expected behavior for LAN traffic? Should LAN traffic go via the VPN? By 'LAN traffic', I'm referring to local hosts, ssh, printer access, etc.
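For reference, a minimal sketch of the routing setup described above; the provider subclass name and the addresses are placeholders, not the actual project code:

import NetworkExtension

class PacketTunnelProvider: NEPacketTunnelProvider {
    override func startTunnel(options: [String: NSObject]?,
                              completionHandler: @escaping (Error?) -> Void) {
        // Placeholder tunnel server address and virtual interface address
        let settings = NEPacketTunnelNetworkSettings(tunnelRemoteAddress: "203.0.113.1")
        let ipv4 = NEIPv4Settings(addresses: ["10.0.0.2"], subnetMasks: ["255.255.255.0"])
        ipv4.includedRoutes = [NEIPv4Route.default()]   // claim the IPv4 default route
        settings.ipv4Settings = ipv4

        setTunnelNetworkSettings(settings) { error in
            completionHandler(error)
        }
    }
}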
Posted
by
Post not yet marked as solved
0 Replies
21 Views
There is a coming wave of SoftPOS-class apps that turn the iPhone into a regular payment terminal. So far Apple is enabling the service country by country, and as of today Tap on Mobile can be used in only a limited number of regions. While preparing for that wave, we are considering possible integration scenarios and found the following obstacle: Originally it was possible to integrate app-to-app, meaning Tap on Mobile to other apps, where both apps are installed separately on the iPhone. But now Apple does not allow app-to-app integration; integration is only possible via embedded libraries (special libraries are embedded in the master app, so there is only one app on the iPhone, which also covers the Tap on Mobile features). 1/ Do you know the reason for restricting integration to the embedded-libraries method only (although originally app-to-app integration, i.e. Tap on Mobile to another app, was possible)? 2/ Do you think Apple will release app-to-app integration again, as it originally allowed?
Posted
by
Post not yet marked as solved
0 Replies
21 Views
Good afternoon. After a long time of using my MacBook, security popups requesting access for apps have started appearing. For example, today I opened VS Code to work with Nuxt.js and 3 popups appeared: VS Code requests access to Photos, Calendar, Contacts, Desktop, iCloud, etc. The same happens with PhpStorm, and if I open Terminal, the same thing happens with Terminal. I haven't installed or updated anything. Then I decided to update to the latest macOS, thinking it might help, but it didn't.
My questions are:
How do I fix this? Applications, even Terminal, should not make such permission requests.
Is this a bug that will be fixed in a patch?
Why do these popups keep appearing even though I clicked Don't Allow?
OS: macOS Sonoma 14.5, MacBook Pro 2019
Posted
by
Post not yet marked as solved
1 Reply
31 Views
So I meant to make a shared album but made a shared library. That being said, I deleted the shared album, but my family cannot remove it from their phones.
Posted
by
Post not yet marked as solved
0 Replies
46 Views
The App Store Connect API documentation still doesn't list the new 13" iPad display type: https://developer.apple.com/documentation/appstoreconnectapi/screenshotdisplaytype When adding screenshots for 13" iPads on the website, they still seem to use the display type APP_IPAD_PRO_3GEN_129 when listed by the API, and uploading to that same type uploads them to the 13" display type instead. But then there is the requirement that one still has to upload screenshots for the 12.9" display type, without an apparent way of doing so. I would expect an option to upload to a 13" display type that is also used for the 12.9" display type. Do we have to wait for Apple to update the documentation, or does someone know a workaround?
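For context, a rough sketch (the localization ID and bearer token are placeholders, and error handling is omitted) of listing the existing screenshot sets so their screenshotDisplayType values can be inspected; the path is the appScreenshotSets relationship of an appStoreVersionLocalization:

import Foundation

let token = "<ASC_JWT>"            // placeholder signed App Store Connect API token
let localizationID = "<LOC_ID>"    // placeholder appStoreVersionLocalization ID
let url = URL(string: "https://api.appstoreconnect.apple.com/v1/appStoreVersionLocalizations/\(localizationID)/appScreenshotSets")!

var request = URLRequest(url: url)
request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")

URLSession.shared.dataTask(with: request) { data, _, _ in
    // Each returned set carries attributes.screenshotDisplayType (e.g. APP_IPAD_PRO_3GEN_129)
    if let data { print(String(decoding: data, as: UTF8.self)) }
}.resume()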
Posted
by
Post not yet marked as solved
0 Replies
17 Views
Dear all, I have several scenes, each with its own camera at a different position. The scenes are loaded with transitions. If I set the pointOfView in every scene to that scene's camera, the transitions don't work properly: the active scene view jumps to the position of the camera of the scene that is fading in. If I comment the pointOfView out, the transitions work fine, but the following error message appears: Error: camera node already has an authoring node - skip Does someone have an idea how to fix this? Many thanks, Ray
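For what it's worth, a minimal sketch of one way to drive such a transition, assuming an SCNView and a camera node named "camera" in the incoming scene (both names are placeholders):

import SceneKit
import SpriteKit

func transition(to scene: SCNScene, in view: SCNView) {
    let camera = scene.rootNode.childNode(withName: "camera", recursively: true)
    // Cross-fade into the new scene, telling SceneKit which node to use as the incoming point of view
    view.present(scene,
                 with: SKTransition.fade(withDuration: 0.75),
                 incomingPointOfView: camera) {
        // Alternatively, assign the point of view only after the transition completes
        view.pointOfView = camera
    }
}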
Posted
by
Post not yet marked as solved
0 Replies
20 Views
I’m trying to track the location of the user on every 10-20 meters of location change. I’ve managed to obtain location updates while the application is in the foreground or background using Background Modes. My issue is that I cannot get any updates after the application is terminated, either by the user or by the system. I’ve tried using startMonitoringSignificantLocationChanges(), but this does not fit my purpose since you only get updates roughly every 500 m. Is it possible to get updates every 10 meters using "Region Monitoring"? Or is mobile device management (MDM) the only way to achieve this on an iOS device?
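For reference, a minimal sketch of the CLLocationManager configuration this describes (the class name and structure are placeholders). Note that after the app is terminated, the system relaunches it only for significant-change or region events, not for standard updates:

import CoreLocation

final class Tracker: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.requestAlwaysAuthorization()
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.distanceFilter = 10                         // deliver an update roughly every 10 m
        manager.allowsBackgroundLocationUpdates = true
        manager.pausesLocationUpdatesAutomatically = false
        manager.startUpdatingLocation()
        manager.startMonitoringSignificantLocationChanges() // relaunch path after termination
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        // handle the incoming locations here
    }
}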
Posted
by
Post not yet marked as solved
2 Replies
46 Views
Hello, I have a view with three textfields and a button. I wrote the following code to move between the textfields using the return key.

func textFieldShouldReturn(_ textField: UITextField) -> Bool {
    if textField == self.A {
        self.B.becomeFirstResponder()
    } else if textField == self.B {
        self.C.becomeFirstResponder()
    }
    return true
}

When I use the return key to go from A to B, the above code works properly, but when I use the return key to go from B to C, it doesn't work. I couldn't figure out what's wrong with this. If anyone can point out my mistake and suggest a solution for it, I'd really appreciate it. Thanks, c00012
Posted
by
Post not yet marked as solved
1 Reply
46 Views
I'm trying to start and stop recording periodically while my app is in the background. I implemented it using Timer and DispatchQueue. However, whenever I try to initiate the recording I get this error. The issue does not exist in the foreground. Here is the current state of my app and configuration. I have added the "Background Modes" capability under Signing & Capabilities and checked Audio and Self Care. Here is my Info.plist:

<plist version="1.0">
<dict>
    <key>UIBackgroundModes</key>
    <array>
        <string>audio</string>
    </array>
    <key>WKBackgroundModes</key>
    <array>
        <string>self-care</string>
    </array>
</dict>
</plist>

I also used AVAudioSession with the .record category and activated it. Here is the code snippet:

func startPeriodicMonitoring() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(AVAudioSession.Category.record, mode: .default, options: [.mixWithOthers])
        try session.setActive(true, options: [])
        print("Session Activated")
        print(session)
        // Start recording.
        measurementTimer = Timer.scheduledTimer(withTimeInterval: measurementInterval, repeats: true) { _ in
            self.startMonitoring()
            DispatchQueue.main.asyncAfter(deadline: .now() + self.recordingDuration) {
                self.stopMonitoring()
            }
        }
        measurementTimer?.fire() // Start immediately
    } catch let error {
        print("Unable to set up the audio session: \(error.localizedDescription)")
    }
}

Any thoughts on this? I have tried most of the ways but the issue is still there.
Posted
by
Post not yet marked as solved
0 Replies
64 Views
I am trying to store usdz files with SwiftData for now. I am converting the usdz to Data, then storing it with SwiftData.

My model:

import Foundation
import SwiftData
import SwiftUI

@Model
class Item {
    var name: String
    @Attribute(.externalStorage) var usdz: Data? = nil
    var id: String

    init(name: String, usdz: Data? = nil) {
        self.id = UUID().uuidString
        self.name = name
        self.usdz = usdz
    }
}

My function to convert the usdz to Data. I am currently using a local usdz just to test whether it is going to work.

func usdzData() -> Data? {
    do {
        guard let usdzURL = Bundle.main.url(forResource: "tv_retro", withExtension: "usdz") else {
            fatalError("Unable to find USDZ file in the bundle.")
        }
        let usdzData = try Data(contentsOf: usdzURL)
        return usdzData
    } catch {
        print("Error loading USDZ file: \(error)")
    }
    return nil
}

Loading the items:

@Query private var items: [Item]
...
var body: some View {
    ...
    ForEach(items) { item in
        HStack {
            Model3D(?????) { model in
                model
                    .resizable()
                    .scaledToFit()
            } placeholder: {
                ProgressView()
            }
        }
    }
    ...
}

How can I load the Model3D? I have tried:

Model3D(data: item.usdz)

which gives me the errors:

Cannot convert value of type '[Item]' to expected argument type 'Binding<C>'
Generic parameter 'C' could not be inferred

Both errors are given in the ForEach. I am able to print the content inside item:

ForEach(items) { item in
    HStack {
        Text("\(item.name)")
        Text("\(item.usdz)")
    }
}

The above works fine for me. The item.usdz prints something like Optional(10954341 bytes). I would like to know 2 things: Is this the correct way to save usdz files into SwiftData, or should I use FileManager? If so, how should I do that? Also, how can I get the usdz from storage (SwiftData) back into my code and use it in Model3D?
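One possible direction, as a minimal sketch only: write the stored Data to a temporary .usdz file and pass that file URL to Model3D(url:). The helper name, file naming, and error handling below are illustrative assumptions, not a confirmed approach:

import Foundation

func temporaryUSDZURL(for item: Item) throws -> URL {
    // Write the SwiftData-stored bytes to a temp file once, then reuse that file
    let url = FileManager.default.temporaryDirectory
        .appendingPathComponent(item.id)
        .appendingPathExtension("usdz")
    if !FileManager.default.fileExists(atPath: url.path), let data = item.usdz {
        try data.write(to: url)
    }
    return url
}

// Hypothetical usage inside the ForEach:
// if let url = try? temporaryUSDZURL(for: item) {
//     Model3D(url: url) { model in
//         model.resizable().scaledToFit()
//     } placeholder: {
//         ProgressView()
//     }
// }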
Posted
by
Post not yet marked as solved
0 Replies
55 Views
I'm working on an app that does peer-to-peer communication between Apple devices. As far as I understand, the Network framework is a good choice for this. I have something that works, but I'm curious about the details of how this works and if I might somehow optimize this. My current understanding is that the best connection I can get between two devices is over AWDL. Is this true? If so, does Network use this? Can I ask it to use it preferentially? What kind of bandwidth and latency should I expect out of this, and are there any drawbacks to using it like power usage or transport limitations? If both devices are on the same LAN, I assume they can also talk to each other over Wi-Fi (or a wired connection if both are plugged in, I guess). If I use Bonjour service discovery, is this what I will be getting? What does Network do if the LAN network does not perform well? Will it swap the underlying connection if it figures out there is something better? I am not tied to any particular API or transport protocol, so any input on tradeoffs between ease of implementation/performance/reliability/whatever would be welcome :)
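For reference, a minimal sketch of Bonjour advertising and browsing with peer-to-peer (AWDL) links enabled; the service type "_myapp._tcp", the service name, and the queue choices are placeholders:

import Network

final class PeerDiscovery {
    private var listener: NWListener?
    private var browser: NWBrowser?
    private var connection: NWConnection?

    func start() throws {
        let params = NWParameters.tcp
        params.includePeerToPeer = true   // allow AWDL in addition to infrastructure Wi-Fi/Ethernet

        // Advertise ourselves so peers can find us
        listener = try NWListener(using: params)
        listener?.service = NWListener.Service(name: "MyDevice", type: "_myapp._tcp")
        listener?.newConnectionHandler = { inbound in
            inbound.start(queue: .main)   // retain and handle inbound connections in a real app
        }
        listener?.start(queue: .main)

        // Browse for peers and connect to the first result
        browser = NWBrowser(for: .bonjour(type: "_myapp._tcp", domain: nil), using: params)
        browser?.browseResultsChangedHandler = { [weak self] results, _ in
            guard let self = self, let endpoint = results.first?.endpoint else { return }
            self.connection = NWConnection(to: endpoint, using: params)
            self.connection?.start(queue: .main)
        }
        browser?.start(queue: .main)
    }
}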
Posted
by
Post not yet marked as solved
2 Replies
65 Views
I have an app that keeps the camera continuously running; it does its own AI, I have zero need for Apple's video effects, and I am seeing a 200% performance hit after updating to Sonoma. The video effects are the "heaviest stack trace" when profiling my app with the Instruments CPU profiler (see below). Is forcing your software onto developers not something Microsoft would do? Is there really no way to opt out?

6671 Jamscape_exp (23038)
2697 start_wqthread
2697 _pthread_wqthread
2183 _dispatch_workloop_worker_thread
2156 _dispatch_root_queue_drain_deferred_wlh
2153 _dispatch_lane_invoke
2146 _dispatch_lane_serial_drain
1527 _dispatch_client_callout
1493 _dispatch_call_block_and_release
777 __88-[PTHandGestureDetector initWithFrameSize:asyncInitQueue:externalHandDetectionsEnabled:]_block_invoke
777 -[VCPHandGestureVideoRequest initWithOptions:]
508 -[VCPHandGestureClassifier initWithMinHandSize:]
508 -[VCPCoreMLRequest initWithModelName:]
506 +[MLModel modelWithContentsOfURL:configuration:error:]
506 -[MLModelAsset modelWithError:]
506 -[MLModelAsset load:]
506 +[MLLoader loadModelFromAssetAtURL:configuration:error:]
506 +[MLLoader _loadModelFromAssetAtURL:configuration:loaderEvent:error:]
505 +[MLLoader _loadModelFromArchive:configuration:loaderEvent:useUpdatableModelLoaders:error:]
505 +[MLLoader _loadWithModelLoaderFromArchive:configuration:loaderEvent:useUpdatableModelLoaders:error:]
505 +[MLLoader _loadModelFromArchive:configuration:modelVersion:compilerVersion:loaderEvent:useUpdatableModelLoaders:loadingClasses:error:]
505 +[MLLoader _loadModelWithClass:fromArchive:modelVersionInfo:compilerVersionInfo:configuration:error:]
445 +[MLMultiFunctionProgramEngine loadModelFromCompiledArchive:modelVersionInfo:compilerVersionInfo:configuration:error:]
333 -[MLMultiFunctionProgramEngine initWithProgramContainer:configuration:error:]
333 -[MLNeuralNetworkEngine initWithContainer:configuration:error:]
318 -[MLNeuralNetworkEngine _setupContextAndPlanWithConfiguration:usingCPU:reshapeWithContainer:error:]
313 -[MLNeuralNetworkEngine _addNetworkToPlan:error:]
313 espresso_plan_add_network
313 EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t)
313 EspressoLight::espresso_plan::add_network(char const*, espresso_storage_type_t, std::__1::shared_ptrEspresso::net)
313 Espresso::load_network(std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, std::__1::shared_ptrEspresso::abstract_context const&, Espresso::compute_path, bool)
235 Espresso::reload_network_on_context(std::__1::shared_ptrEspresso::net const&, std::__1::shared_ptrEspresso::abstract_context const&, Espresso::compute_path)
226 Espresso::load_and_shape_network(std::__1::shared_ptrEspresso::SerDes::generic_serdes_object const&, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, std::__1::shared_ptrEspresso::abstract_context const&, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptrEspresso::blob_storage_abstract const&, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&)
214 Espresso::load_network_layers_internal(std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, std::__1::basic_string<char, std::__1::char_traits, std::__1::allocator> const&, std::__1::shared_ptrEspresso::abstract_context const&, Espresso::network_shape const&, std::__1::basic_istream<char, std::__1::char_traits>, Espresso::compute_path, bool, std::__1::shared_ptrEspresso::blob_storage_abstract const&)
208 Espresso::run_dispatch_v2(std::__1::shared_ptrEspresso::abstract_context, std::__1::shared_ptrEspresso::net, std::__1::vector<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::allocator<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object>> const&, Espresso::network_shape const&, Espresso::compute_path const&, std::__1::basic_istream<char, std::__1::char_traits>)
141 try_dispatch(std::__1::shared_ptrEspresso::abstract_context, std::__1::shared_ptrEspresso::net, std::__1::vector<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::allocator<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object>> const&, Espresso::network_shape const&, Espresso::compute_path const&, std::__1::basic_istream<char, std::__1::char_traits>, Espresso::platform const&, Espresso::compute_path const&)
131 Espresso::get_net_info_ir(std::__1::shared_ptrEspresso::abstract_context, std::__1::shared_ptrEspresso::net, std::__1::vector<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::allocator<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object>> const&, Espresso::network_shape const&, Espresso::compute_path const&, Espresso::platform const&, Espresso::compute_path const&, std::__1::shared_ptrEspresso::cpu_context_transfer_algo_t&, std::__1::shared_ptrEspresso::net_info_ir_t&, std::__1::shared_ptrEspresso::kernels_validation_status_t&)
131 Espresso::cpu_context_transfer_algo_t::create_net_info_ir(std::__1::vector<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::allocator<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object>> const&, std::__1::shared_ptrEspresso::abstract_context, Espresso::network_shape const&, Espresso::compute_path, std::__1::shared_ptrEspresso::net_info_ir_t)
120 Espresso::cpu_context_transfer_algo_t::check_all_kernels_availability_on_context(std::__1::vector<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::allocator<std::__1::shared_ptrEspresso::SerDes::generic_serdes_object>> const&, std::__1::shared_ptrEspresso::abstract_context&, Espresso::compute_path, std::__1::shared_ptrEspresso::net_info_ir_t&)
120 is_kernel_available_on_engine(unsigned long, std::__1::shared_ptrEspresso::base_kernel, Espresso::kernel_info_t const&, std::__1::shared_ptrEspresso::SerDes::generic_serdes_object, std::__1::shared_ptrEspresso::abstract_context, Espresso::compute_path, std::__1::shared_ptrEspresso::net_info_ir_t, std::__1::shared_ptrEspresso::kernels_validation_status_t)
83 Espresso::ANECompilerEngine::mix_reshape_kernel::is_valid_for_engine(std::__1::shared_ptrEspresso::kernels_validation_status_t, Espresso::base_kernel::validate_for_engine_args_t const&) const
45 int ValidateLayer<ANECReshapeLayerDesc, ZinIrReshapeUnit, ZinIrReshapeUnitInfo, ANECReshapeLayerDescAlternate>(void, ANECReshapeLayerDesc const*, ANECTensorDesc const*, unsigned long, unsigned long*, ANECReshapeLayerDescAlternate**, ANECTensorValueDesc const*)
45 void ValidateLayer_Impl<ANECReshapeLayerDesc, ZinIrReshapeUnit, ZinIrReshapeUnitInfo, ANECReshapeLayerDescAlternate>(void*, ANECReshapeLayerDesc const*, ANECTensorDesc const*, unsigned long, unsigned long*, ANECReshapeLayerDescAlternate**, ANECTensorValueDesc const*)
(...)
Posted
by
Post not yet marked as solved
0 Replies
57 Views
I started to use Xcode Cloud recently, trying to understand how the whole build process etc. works. I created some workflows, integrated CI scripts to let fastlane create snapshots at the end, and everything seemed to work while I was making progress step by step to get it up and running (struggling with the environment, etc.). Then last night it suddenly stopped working, in that Xcode shows the builds, but the last build still shows a spinner although the job has already finished. When I try to cancel the last run from Xcode (I thought it was not finished) I get an error: Failed to Cancel Build 57. An internal error occurred while authenticating. Try again later. (FB13802231) When I open Manage Workflows…, Xcode suddenly shows "This operation could not be completed" and the details reveal: The operation couldn’t be completed. ((extension in XcodeCloudKit):XcodeCloudAPI.Client.HTTPClientError error 0.) (FB13802952) Trying to access the Xcode Cloud tab of the app, or in general from Users and Permissions, shows a white page with a spinner. The console, filtered by Xcode, shows (around the time the error happens): Failed to query public key: 0xe800000c The Safari browser console shows: An internal error occurred while authenticating. Try again later. What works: I can log in to App Store Connect without problems. I am logged in in Xcode (but there's no option to log out and back in). I can use tools like fastlane with API calls (or some xcc command line tool) and list the products, workflows, and runs using my API key. From this information I also figured out that the build in question (57), where Xcode still shows the spinner, has already finished (failing). I was also able to start a new build manually (58) via the API, which however finished 3 seconds after creation but never started (startedDate is empty while createdDate and finishedDate contain data). I was able to delete the workflow via the API in the hope that the problem would go away, but there's no change in Xcode (it still shows the same picture with the workflow and the last build 57 spinning). I should have used <4 / 25 hours of the Xcode Cloud free plan this month, so the limit is not exceeded. I'm on macOS 14.4.1, Xcode 15.3. It seems I can't make use of Xcode Cloud at all anymore right now, which keeps me from proceeding on my way to delivering to the App Store for the first time (still understanding and learning). I also provided info about the 2 Xcode errors as feedback, which I linked above. Any help on how to reset / do whatever is necessary to make it work again is very much appreciated.
Posted
by
Post not yet marked as solved
0 Replies
49 Views
I'm debugging some Regex Builder code in my Playground. I run the following piece of code:

let timeMatchWithout = possibleTime.firstMatch(of: timeWithoutSec)

and I get this error message:

Regex.Match optional storedCapture contains no some

What could this possibly mean? Contains no some??? Here is a more complete snippet, if this helps:

let hourRef = Reference<Substring>()
let minuteRef = Reference<Substring>()

let hourReg = Regex {
    ChoiceOf {
        Capture(as: hourRef) {
            One(.digit)
            One(.digit)
        }
        Capture(as: hourRef) {
            One(.digit)
        }
    }
}

let minuteReg = Regex {
    ChoiceOf {
        Capture(as: minuteRef) {
            One(.digit)
            One(.digit)
        }
        Capture(as: minuteRef) {
            One(.digit)
        }
    }
}

let ampmRef = Reference<Substring>()
let ampmReg = Regex {
    Capture(as: ampmRef) {
        ZeroOrMore {
            ChoiceOf {
                One("am")
                One("pm")
                One("a.m.")
                One("p.m.")
            }
        }
    } /* transform: { $0.lowercase } */
}.ignoresCase()

let timeWithoutSec = Regex {
    hourReg
    One(":")
    minuteReg
    ZeroOrMore(.whitespace)
    ampmReg
}.ignoresCase()

let possibleTime = "10:20 AM"
let timeMatchWithout = possibleTime.firstMatch(of: timeWithoutSec)

The last line produces the error message. Thanks for the help. Note the removed transform: on the ampmReg definition. If that is included, the compiler times out, as noted in my previous post yesterday.
Posted
by
Post not yet marked as solved
2 Replies
79 Views
I am trying to load and view several locations on a map from a JSON file in my SwiftUI project, but I continually encounter the error "no exact matches in call to initializer" in my ContentView.swift file. What I Am Trying to Do: I am working on a SwiftUI project where I need to display several locations on a map. These locations are stored in a JSON file, which I have successfully loaded into Swift. My goal is to display these locations as annotations on a Map view.

JSON File Contents:
coordinates: latitude and longitude
name: name of the location
uuid: unique identifier for each location

Code and Screenshots: Here are the relevant parts of my code and the error I keep encountering:

import SwiftUI
import MapKit

struct ContentView: View {
    @State private var mapPosition = MapCameraPosition.region(
        MKCoordinateRegion(
            center: CLLocationCoordinate2D(latitude: 37.7749, longitude: -122.4194),
            span: MKCoordinateSpan(latitudeDelta: 0.05, longitudeDelta: 0.05)
        )
    )
    @State private var features: [Feature] = []

    var body: some View {
        NavigationView {
            Map(position: $mapPosition, interactionModes: .all, showsUserLocation: true) {
                ForEach(features) { feature in
                    Marker(coordinate: feature.coordinate) {
                        FeatureAnnotation(feature: feature)
                    }
                }
            }
            .onAppear {
                POILoader.loadPOIs { result in
                    switch result {
                    case .success(let features):
                        self.features = features
                    case .failure(let error):
                        print("Error loading POIs: \(error.localizedDescription)")
                    }
                }
            }
            .navigationBarTitle("POI Map", displayMode: .inline)
        }
    }
}

struct FeatureAnnotation: View {
    let feature: Feature

    var body: some View {
        VStack {
            Circle()
                .strokeBorder(Color.red, lineWidth: 2)
                .background(Circle().foregroundColor(.red))
                .frame(width: 20, height: 20)
            Text(feature.name)
        }
    }
}

I have not had any luck searching for solutions to my problem using the error messages that keep arising. Does anyone have any advice for how to move forward?
Posted
by
Post marked as solved
1 Reply
69 Views
Map(initialPosition: .camera(mapCamera)) {
    Marker("Here", coordinate: location)
}
.frame(height: 300)
.clipShape(RoundedRectangle(cornerSize: CGSize(width: 10, height: 10)))
.onMapCameraChange(frequency: .continuous) { cameraContext in
    locationManager.location = cameraContext.camera.centerCoordinate
}
.onReceive(locationManager.$location, perform: { location in
    if let location {
        mapCamera.centerCoordinate = location
    }
})

class LocationDataManager: NSObject, CLLocationManagerDelegate, ObservableObject {
    enum LoadingState {
        case loading
        case notLoading
        case finished
    }

    static let shared = LocationDataManager()
    private let locationManager = CLLocationManager()
    @Published var location: CLLocationCoordinate2D? = nil
    @Published var loading: LoadingState = .notLoading

    override init() {
        super.init()
        locationManager.delegate = self
    }

    func resetLocation() {
        loading = .notLoading
        location = nil
    }

    func getLocation() {
        locationManager.requestLocation()
        loading = .loading
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        location = locations.first?.coordinate
        if location != nil {
            loading = .finished
        }
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: any Error) {
        print("Failed to retrieve location: \(error.localizedDescription)")
        loading = .notLoading
    }
}

So when the LocationButton is selected, the location is found and the marker is set correctly. You can also move the camera around to adjust the marker position, which works correctly. However, if you press the LocationButton again, it updates the marker position but won't move the MapCamera to the new location. I can see the marker move. mapCamera.centerCoordinate = location should be doing it, but it's not. Anyone know how to fix this?
Posted
by
