Posts

Post not yet marked as solved
0 Replies
18 Views
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get the following error:

"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}

It doesn't matter whether the songs are downloaded to the device or not. I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer unusable for me. I tried everything I could think of, including falling back to MPMusicPlayerController and passing it an array of MPMusicPlayerPlayParameters, but the result was the same.

```swift
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared

let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
```
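A possible workaround, not from the original post: seed the queue with a smaller first batch so that prepare-to-play stays under the timeout, then append the remaining songs once playback has started. This is an untested sketch; it assumes MusicKit's Queue.insert(_:position:) API, and chunkSize is an arbitrary guess.

```swift
// Untested sketch: start playback from a small initial queue,
// then append the remaining songs in chunks.
let songs: [Song] = tracks.compactMap { $0 as? Song }
let chunkSize = 100 // assumption: small enough to avoid the remote-call timeout

Task {
    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: Array(songs.prefix(chunkSize)))

    do {
        try await player.play()

        // Append the rest of the songs after playback has started.
        var index = chunkSize
        while index < songs.count {
            let chunk = Array(songs[index ..< min(index + chunkSize, songs.count)])
            try await player.queue.insert(chunk, position: .tail)
            index += chunkSize
        }
    } catch {
        print(error)
    }
}
```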
Post not yet marked as solved
0 Replies
22 Views
Please take a look at the following simple SwiftUI View:

```swift
struct ContentView: View {
    var body: some View {
        ForEach(1...1, id: \.self) { i in
            subview(i)
        }
    }

    func subview(_ i: Int) -> some View {
        print("creating subview \(i)")
        return Text("Hello, world!")
    }
}
```

When this View is displayed, all subviews are created twice, as the print statements show. (Unfortunately the Apple Developer Forums UI does not let me attach my sample Xcode project.) This happens on macOS 14.4.1. Am I doing something wrong, or is this a SwiftUI bug? (In a real-world application the View creation can be expensive…)
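Not part of the original post: SwiftUI is free to evaluate body, and therefore subview(_:), any number of times, so a common pattern is to do the expensive work once outside of body and let the view only render the cached result. A minimal sketch, using a hypothetical ExpensiveModel:

```swift
import SwiftUI

// Hypothetical model that performs the expensive work exactly once.
final class ExpensiveModel: ObservableObject {
    @Published var items: [String]

    init() {
        // Imagine this is the costly computation.
        items = (1...3).map { "Item \($0)" }
    }
}

struct CachedContentView: View {
    // A StateObject is created once per view identity,
    // not once per body evaluation.
    @StateObject private var model = ExpensiveModel()

    var body: some View {
        // body may still run multiple times, but it only reads precomputed data.
        ForEach(model.items, id: \.self) { item in
            Text(item)
        }
    }
}
```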
Posted by rx8
Post not yet marked as solved
0 Replies
25 Views
Our app exports a number of file types in its Info.plist. For each of these file types, we export several possible file name extensions. For example, for one of our file types we specify the extensions ".data", ".fitdat", and ".profitdata". In the app's settings, we allow the user to select their preferred extension, i.e. the extension used when, for example, saving a document from the "Save As..." panel without explicitly typing an extension. For this, we override NSDocument's fileNameExtensionForType:saveOperation: to return the currently preferred extension. This has stopped working, probably starting with macOS 14: if the user does not specify an extension in the NSSavePanel, the first extension (".data") is always the one appended to the file name. I guess this is a consequence of the introduction of UTType, which has its own preferredFilenameExtension, which in turn probably just grabs the first extension we specify in our Info.plist. Any advice on how to resolve this? Is there any way to override NSSavePanel's choice of extension when the user does not specify one? Thanks in advance for any advice. Kurt
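For context, not from the original post: the override described above would look roughly like this in Swift. The preferredExtension property is hypothetical and would be backed by the app's settings.

```swift
import Cocoa

class MyDocument: NSDocument {
    // Hypothetical user preference, e.g. read from UserDefaults.
    var preferredExtension: String {
        UserDefaults.standard.string(forKey: "PreferredExtension") ?? "data"
    }

    // Return the user's preferred extension instead of the type's default one.
    override func fileNameExtension(forType typeName: String,
                                    saveOperation: NSDocument.SaveOperationType) -> String? {
        return preferredExtension
    }
}
```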
Post not yet marked as solved
0 Replies
26 Views
I have tried to insert 100,000 (1 lakh) key-value pairs into the user defaults. It works fine up to about 50,000 key-value pairs. After some time, I get:

Not updating lastKnownShmemState in CFPrefsPlistSource<0x2825b0c60> (Domain: peformance_validator, User: kCFPreferencesCurrentUser, ByHost: No, Container: (null), Contents Need Refresh: No)

and the key-value pairs also stop being written to the plist. peformance_validator is my user defaults suite name. How can I solve this?
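Not from the original post: UserDefaults is intended for small amounts of preference data, so one alternative for a data set of this size is to write it to a property list file (or a database) instead. A minimal sketch; the file name, location, and key/value format are assumptions.

```swift
import Foundation

// Sketch: persist a large key-value set as a plist file instead of UserDefaults.
let pairs: [String: String] = Dictionary(
    uniqueKeysWithValues: (0..<100_000).map { ("key_\($0)", "value_\($0)") }
)

let url = FileManager.default
    .urls(for: .applicationSupportDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("performance_validator.plist")

do {
    try FileManager.default.createDirectory(at: url.deletingLastPathComponent(),
                                            withIntermediateDirectories: true)
    let data = try PropertyListEncoder().encode(pairs)
    try data.write(to: url, options: .atomic)
} catch {
    print("Failed to write plist:", error)
}
```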
Post not yet marked as solved
1 Reply
50 Views
```objc
NSString *jsonString = @"{\"key1\":\"value1\",\"key2\":\"value2\",\"key3\":\"value3\",\"key4\":\"value4\"}";
NSString *jsonString2 = @"{\"key2\":\"value2\",\"key1\":\"value1\",\"key4\":\"value4\",\"key3\":\"value3\"}";

NSData *jsonData = [jsonString dataUsingEncoding:NSUTF8StringEncoding];
NSData *jsonData2 = [jsonString2 dataUsingEncoding:NSUTF8StringEncoding];

NSDictionary *dict1 = [NSJSONSerialization JSONObjectWithData:jsonData options:NSJSONReadingMutableContainers error:nil];
NSDictionary *dict2 = [NSJSONSerialization JSONObjectWithData:jsonData2 options:NSJSONReadingMutableContainers error:nil];
```

The expected results are that each dictionary keeps its original key order:

dict1: key1, key2, key3, key4
dict2: key2, key1, key4, key3

Is there any way to make that happen?
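Not from the original post: Foundation dictionaries are unordered, so NSJSONSerialization cannot preserve key order. If the order matters, one option is to recover it separately from the raw JSON text. A rough Swift sketch for flat objects like the ones above; the regular expression is an assumption and not a general JSON parser.

```swift
import Foundation

// Rough sketch: extract the top-level key order from a flat JSON object string.
// Nested objects and escaped quotes inside keys are not handled.
func topLevelKeyOrder(in json: String) -> [String] {
    let pattern = #""([^"]+)"\s*:"#
    guard let regex = try? NSRegularExpression(pattern: pattern) else { return [] }
    let range = NSRange(json.startIndex..., in: json)
    return regex.matches(in: json, range: range).compactMap { match in
        Range(match.range(at: 1), in: json).map { String(json[$0]) }
    }
}

let jsonString2 = #"{"key2":"value2","key1":"value1","key4":"value4","key3":"value3"}"#
print(topLevelKeyOrder(in: jsonString2)) // ["key2", "key1", "key4", "key3"]
```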
Post not yet marked as solved
0 Replies
40 Views
Hi,

My application doesn't start playback anymore after signing it with the following entitlements:

```xml
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>com.apple.security.app-sandbox</key>
    <true/>
    <key>com.apple.security.files.user-selected.read-only</key>
    <true/>
    <key>com.apple.security.device.audio-input</key>
    <true/>
    <key>com.apple.security.device.microphone</key>
    <true/>
    <key>com.apple.security.assets.music.read-write</key>
    <true/>
    <key>com.apple.security.network.server</key>
    <true/>
</dict>
</plist>
```

Regards,
Joël
Post not yet marked as solved
0 Replies
37 Views
Hi all, I have a strange issue. I am using enableBackgroundDelivery to update the user's step count in the background with HealthKit. It works fine when I run the app by pressing Run in Xcode, but the code is not triggered when I launch the app directly on my device. I have tried many different things over the past two days but cannot figure out the issue 😭. I would really appreciate any suggestions. Thanks.
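For reference, not from the original post: a minimal sketch of the setup being described, registering an HKObserverQuery and enabling background delivery for step count. It assumes HealthKit read authorization has already been granted; the observer query is meant to be registered early at launch, and the completion handler must always be called.

```swift
import HealthKit

let healthStore = HKHealthStore()
let stepType = HKQuantityType.quantityType(forIdentifier: .stepCount)!

// Call this early at launch, e.g. from application(_:didFinishLaunchingWithOptions:).
func startObservingSteps() {
    let query = HKObserverQuery(sampleType: stepType, predicate: nil) { _, completionHandler, error in
        if error == nil {
            // Fetch the updated step count here, e.g. with an HKStatisticsQuery.
        }
        // Always call the completion handler so HealthKit keeps delivering updates.
        completionHandler()
    }
    healthStore.execute(query)

    healthStore.enableBackgroundDelivery(for: stepType, frequency: .immediate) { success, error in
        if let error {
            print("enableBackgroundDelivery failed: \(error)")
        }
    }
}
```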
Post not yet marked as solved
0 Replies
30 Views
I have an old ScreenCaptureKit sample that I downloaded in October 2022. It worked back then, but as of April 2024 it no longer works on Sonoma 14.4.1 on an M1 MacBook: it only shows a black screen. I also downloaded the updated ScreenCaptureKit sample and tested it; that one works on Sonoma 14.4.1 on the same machine. I noticed the latest sample has SCContentSharingPicker and other changes. My own screen capture application is based on the old ScreenCaptureKit sample, and it also only shows a black screen. Do I have to adopt SCContentSharingPicker and SCContentSharingPickerObserver in my application to capture the screen on Sonoma? Is the old way of capturing the screen, without SCContentSharingPicker, no longer supported on Sonoma?
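For comparison, not from the original post: a minimal sketch of the non-picker capture path on current SDKs, assuming the app has been granted the Screen Recording permission. Frame handling is left as a stub.

```swift
import ScreenCaptureKit
import CoreMedia

final class ScreenCapturer: NSObject, SCStreamOutput {
    private var stream: SCStream?

    func start() async throws {
        // Enumerate shareable content and capture the first display.
        let content = try await SCShareableContent.excludingDesktopWindows(false,
                                                                           onScreenWindowsOnly: true)
        guard let display = content.displays.first else { return }

        let filter = SCContentFilter(display: display, excludingWindows: [])

        let config = SCStreamConfiguration()
        config.width = display.width
        config.height = display.height
        config.minimumFrameInterval = CMTime(value: 1, timescale: 60)

        let stream = SCStream(filter: filter, configuration: config, delegate: nil)
        try stream.addStreamOutput(self, type: .screen, sampleHandlerQueue: .main)
        try await stream.startCapture()
        self.stream = stream
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer,
                of type: SCStreamOutputType) {
        // Handle captured frames here.
    }
}
```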
Post not yet marked as solved
0 Replies
45 Views
I'm using Core Data + CloudKit. It works fine in development, so I deployed the schema to production. But I cannot see the data in the CloudKit database from a TestFlight build. I don't know what I missed. Can I just run the app in Release mode to check whether it works?
Post not yet marked as solved
0 Replies
48 Views
I am currently writing an app for writing stories. As of now it is fairly simple: ContentView is your "collection" of stories. PopupView appears when you tap a button in ContentView; there you enter the story title. Once you do that, you are brought to a blank page, StoryView, whose navigation title is your story title. When I finish the story and leave StoryView, it is still there, but once I close the app on my phone and reopen it, the story is gone and is not saved. I am a relatively new developer, so I've been relying on ChatGPT and Google Gemini for the saving parts, but it rarely works; the furthest I've gotten is that it saves the story title but doesn't save the content of the story. I have a feeling the AI is overdoing it as well. If anyone could help, please do so. I've been trying to fix this for days. If you need me to provide any code, I am happy to do so. [Edited by Moderator]
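Not from the original post: one simple way to persist both the title and the content is to make the story model Codable and write the whole collection to a file whenever it changes. A minimal sketch with hypothetical Story and StoryStore types:

```swift
import SwiftUI

// Hypothetical story model.
struct Story: Codable, Identifiable {
    var id = UUID()
    var title: String
    var content: String
}

// Hypothetical store that loads and saves all stories as JSON in Documents.
final class StoryStore: ObservableObject {
    @Published var stories: [Story] = [] {
        didSet { save() }
    }

    private let fileURL = FileManager.default
        .urls(for: .documentDirectory, in: .userDomainMask)[0]
        .appendingPathComponent("stories.json")

    init() {
        if let data = try? Data(contentsOf: fileURL),
           let decoded = try? JSONDecoder().decode([Story].self, from: data) {
            stories = decoded
        }
    }

    private func save() {
        if let data = try? JSONEncoder().encode(stories) {
            try? data.write(to: fileURL, options: .atomic)
        }
    }
}
```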
Post marked as solved
1 Reply
50 Views
I have a two-view app where the main view is a procedural animation and a secondary view controls settings for the animation. I want to use Play/Pause to toggle between the views, but can't figure out how to do this. Ideally the main view does not have any visible control, so the whole screen can be dedicated to the animation. Attaching onPlayPauseCommand to the main view does not work. I've also tried managing focus using onFocus without success. I'm open to other ways to toggle between the main and settings views; it's just that Play/Pause seems the most intuitive.
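Not from the original post: one approach is to make the container view focusable and toggle a state flag from onPlayPauseCommand, so neither subview needs a visible control. A rough sketch with hypothetical AnimationView and SettingsView types, assuming tvOS 15 or later for .focusable(); whether this plays well with the rest of the app's focus handling would need to be verified.

```swift
import SwiftUI

struct RootView: View {
    @State private var showingSettings = false

    var body: some View {
        ZStack {
            if showingSettings {
                SettingsView()
            } else {
                AnimationView()
            }
        }
        .focusable()          // keep the container in the focus chain
        .onPlayPauseCommand { // toggle views on the Play/Pause button
            showingSettings.toggle()
        }
    }
}

// Hypothetical placeholder views.
struct AnimationView: View { var body: some View { Color.black } }
struct SettingsView: View { var body: some View { Text("Settings") } }
```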
Post not yet marked as solved
0 Replies
46 Views
Hi, how do I add a button to my application's settings screen that lets the user allow the use of the camera?
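Not from the original post, and the intent is a bit ambiguous: if the goal is a button that triggers the camera permission prompt, or sends the user to the app's page in the system Settings once access has been denied, a SwiftUI sketch could look like this. It assumes NSCameraUsageDescription is present in the app's Info.plist.

```swift
import SwiftUI
import AVFoundation
import UIKit

struct CameraPermissionButton: View {
    var body: some View {
        Button("Allow Camera Access") {
            switch AVCaptureDevice.authorizationStatus(for: .video) {
            case .notDetermined:
                // First request: show the system permission prompt.
                AVCaptureDevice.requestAccess(for: .video) { granted in
                    print("Camera access granted:", granted)
                }
            default:
                // Already decided: open the app's page in the Settings app.
                if let url = URL(string: UIApplication.openSettingsURLString) {
                    UIApplication.shared.open(url)
                }
            }
        }
    }
}
```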
Post not yet marked as solved
0 Replies
61 Views
In this code, I aim to enable users to select an image from their phone gallery and display it with reduced opacity on top of the z-stack. The selected image should appear on top of the user's camera feed, so that they can see both the canvas they are drawing on and the low-opacity image. The app's purpose is to let users trace an image on the canvas while simultaneously seeing the camera feed.

CameraView.swift

```swift
import SwiftUI
import AVFoundation

struct CameraView: View {
    let selectedImage: UIImage

    var body: some View {
        ZStack {
            CameraPreview()
            Image(uiImage: selectedImage)
                .resizable()
                .aspectRatio(contentMode: .fill)
                .opacity(0.5) // Adjust the opacity as needed
                .edgesIgnoringSafeArea(.all)
        }
    }
}

struct CameraPreview: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let cameraPreview = CameraPreviewView()
        return cameraPreview
    }

    func updateUIView(_ uiView: UIView, context: Context) {}
}

class CameraPreviewView: UIView {
    private let captureSession = AVCaptureSession()
    private var previewLayer: AVCaptureVideoPreviewLayer?

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupCamera()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        // Keep the preview layer sized to the view; the bounds are still zero
        // when the view is first created by the representable.
        previewLayer?.frame = bounds
    }

    private func setupCamera() {
        guard let backCamera = AVCaptureDevice.default(for: .video) else {
            print("Unable to access camera")
            return
        }
        do {
            let input = try AVCaptureDeviceInput(device: backCamera)
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)

                let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                previewLayer.videoGravity = .resizeAspectFill
                previewLayer.frame = bounds
                layer.addSublayer(previewLayer)
                self.previewLayer = previewLayer

                // startRunning blocks, so call it off the main thread.
                DispatchQueue.global(qos: .userInitiated).async { [weak self] in
                    self?.captureSession.startRunning()
                }
            }
        } catch {
            print("Error setting up camera input:", error.localizedDescription)
        }
    }
}
```

Thanks for your help and your time.
Post not yet marked as solved
0 Replies
58 Views
I have an iPhone app, created in Xcode, that downloads some data from Google Firebase (Firestore). Today I got a scary email from Google Firebase (as seen in the screenshot below). I'm not an expert on Google Firebase and honestly don't understand what this means. Two questions: Does this mean our database is publicly available on the surface web (so, for example, search engines can crawl/index it)? What do I do about this? Extremely thankful for any help!
Post not yet marked as solved
0 Replies
45 Views
Please run the following UIKit app. It uses a collection view with a compositional layout (list layout) and a diffable data source. It has one section with one row. The cell has an image view as a leading accessory. Unfortunately, as soon as I set an image on the image view, the accessory is no longer centered:

```swift
import UIKit

class ViewController: UIViewController {
    var collectionView: UICollectionView!
    var dataSource: UICollectionViewDiffableDataSource<String, String>!

    override func viewDidLoad() {
        super.viewDidLoad()
        configureHierarchy()
        configureDataSource()
    }

    func configureHierarchy() {
        collectionView = .init(frame: .zero, collectionViewLayout: createLayout())
        view.addSubview(collectionView)
        collectionView.frame = view.bounds
    }

    func createLayout() -> UICollectionViewLayout {
        UICollectionViewCompositionalLayout { section, layoutEnvironment in
            let config = UICollectionLayoutListConfiguration(appearance: .insetGrouped)
            return NSCollectionLayoutSection.list(using: config, layoutEnvironment: layoutEnvironment)
        }
    }

    func configureDataSource() {
        let cellRegistration = UICollectionView.CellRegistration<UICollectionViewListCell, String> { cell, indexPath, itemIdentifier in
            let iv = UIImageView()
            iv.backgroundColor = .systemRed
            // iv.image = .init(systemName: "camera")
            iv.contentMode = .scaleAspectFit
            iv.frame.size = .init(
                width: 40,
                height: 40
            )
            cell.accessories = [.customView(configuration: .init(
                customView: iv,
                placement: .leading(),
                reservedLayoutWidth: .actual,
                maintainsFixedSize: true
            ))]
        }
        dataSource = .init(collectionView: collectionView) { collectionView, indexPath, itemIdentifier in
            collectionView.dequeueConfiguredReusableCell(using: cellRegistration, for: indexPath, item: itemIdentifier)
        }
        var snapshot = NSDiffableDataSourceSnapshot<String, String>()
        snapshot.appendSections(["main"])
        snapshot.appendItems(["demo"])
        dataSource.apply(snapshot, animatingDifferences: false)
    }
}
```

This seems like a bug, but then if I set the image view's size to 100x100, even without giving it an image, the cell doesn't resize, which makes me think I'm making a mistake.