Posts

Post not yet marked as solved
0 Replies
3 Views
I've tried uploading a build through Xcode and the Transporter app with no luck; I have waited hours and it never completes. I have also tried fastlane, our main way of uploading builds, with a verbose flag to see why the build upload never finishes or errors out. The problem is an endless loop of the following error:

```
DEBUG [2024-05-09 16:11:52.04]: [altool]: "LocalUploadTask <EF4ED1D8-1CB0-47F7-876B-C126DF9A3696>.<302624>"
DEBUG [2024-05-09 16:11:52.04]: [altool]: ), NSLocalizedDescription=A server with the specified hostname could not be found., NSErrorFailingURLStringKey=https://northamerica-1.object-storage.apple.com/itmspod11-assets-massilia-200001/PurpleSource211%2Fv4%2F22%2F2d%2F17%2F222d1786-e205-57c4-7c62-a8e88503ff18%2FNRkOOpVWKZwGMd9KSYr92VImlIecE45alTBupZzt5Z0_U003d-1715285171894?uploadId=951d5230-0e3f-11ef-be19-783fd2f0b3f1&Signature={REDACTED}&AWSAccessKeyId={REDACTED}&partNumber=1&Expires=1715889972, NSErrorFailingURLKey=https://northamerica-1.object-storage.apple.com/itmspod11-assets-massilia-200001/PurpleSource211%2Fv4%2F22%2F2d%2F17%2F222d1786-e205-57c4-7c62-a8e88503ff18%2FNRkOOpVWKZwGMd9KSYr92VImlIecE45alTBupZzt5Z0_U003d-1715285171894?uploadId={REDACTED}&Signature={REDACTED}&AWSAccessKeyId={REDACTED}&partNumber=1&Expires=1715889972, _kCFStreamErrorDomainKey=12}
DEBUG [2024-05-09 16:11:52.04]: [altool]: 2024-05-09 16:11:52.048 DEBUG: [ContentDelivery.Uploader] Created new upload task (0x139d071e0) for part 1.
DEBUG [2024-05-09 16:11:52.04]: [altool]: 2024-05-09 16:11:52.048 DEBUG: [ContentDelivery.Uploader] Saving uploader state (CDUploaderStateUploadAssetDescription) for identifier 'com.apple.cds_0C29B49B-B8A0-45FA-BFF3-10A7424FE286'.
DEBUG [2024-05-09 16:11:52.04]: [altool]: 2024-05-09 16:11:52.049 DEBUG: [ContentDelivery.Uploader] There are 2 parts remaining to upload.
DEBUG [2024-05-09 16:11:52.04]: [altool]: 2024-05-09 16:11:52.049 DEBUG: [ContentDelivery.Uploader] LOST 0 bytes for part 2.
DEBUG [2024-05-09 16:11:52.04]: [altool]: 2024-05-09 16:11:52.049 DEBUG: [ContentDelivery.Uploader] Adding upload task 302627 for part 2.
```
Posted by
Post not yet marked as solved
0 Replies
5 Views
I'm having trouble modifying an optional environment object. I'm using the .environment modifier to pass an optional object along to other views. To access it in other views, I have to get it through an @Environment property wrapper, but I can't modify it even if I redeclare it in the body as @Bindable. Here's some example code:

```swift
@main
struct MyApp: App {
    @State private var mySession: MySession?

    var body: some Scene {
        HomeScreen()
            .environment(mySession)
    }
}
```

Now for the HomeScreen:

```swift
struct HomeScreen: View {
    @Environment(MySession.self) private var mySession: MySession?

    var body: some View {
        @Bindable var mySession = mySession
        Button {
            mySession = MySession()
        } label: {
            Text("Create Session")
        }
    }
}
```

An error shows up on the @Bindable declaration saying 'init(wrappedValue:)' is unavailable: The wrapped value must be an object that conforms to Observable. But MySession is declared as @Observable. In fact it works just fine if I don't make the environment optional, but then I have to set up MySession in the root of the app, which goes against the app flow.
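One workaround I've been sketching (the SessionStore wrapper type here is my own invention, not an Apple API): keep a non-optional @Observable container in the environment and make the session optional inside it, so @Bindable only ever wraps a non-optional object and child views can create the session by mutating the container:

```swift
import SwiftUI
import Observation

@Observable
class MySession {
    var username = ""
}

// Hypothetical wrapper: the container itself is never nil, only its contents.
@Observable
class SessionStore {
    var session: MySession?
}

@main
struct MyApp: App {
    @State private var store = SessionStore()

    var body: some Scene {
        WindowGroup {
            HomeScreen()
                .environment(store)   // non-optional, so the @Environment lookup is straightforward
        }
    }
}

struct HomeScreen: View {
    @Environment(SessionStore.self) private var store

    var body: some View {
        if let session = store.session {
            // @Bindable works here because `session` is non-optional.
            @Bindable var session = session
            TextField("Username", text: $session.username)
        } else {
            Button("Create Session") {
                store.session = MySession()   // mutates the shared container, not a local copy
            }
        }
    }
}
```

This also sidesteps the other problem in the original HomeScreen: assigning to the local `@Bindable var mySession` would only rebind a local reference and never change the app-level @State.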
Posted by
Post not yet marked as solved
0 Replies
17 Views
Steps to reproduce: install and launch the app. When push notifications are registered, the push notification token received is a "production" token rather than a "development" token. We are trying to test in the sandbox environment with development push notification tokens; however, the aps-environment for all builds we release via TestFlight is set to "production". We wish to distribute builds via TestFlight with aps-environment set to "development". At the moment we have found only one way to run the app with aps-environment set to "development": "Export" a Debug build, then manually install it on a device via iTunes on Windows machines. This method is not efficient or seamless enough for non-technical testers and stakeholders. They need a seamless way to receive Debug builds via TestFlight without resorting to third-party platforms that allow us to manually upload the exported Debug build. If anyone knows how to upload a "Debug" build to TestFlight that will let the user receive a sandbox development push token with aps-environment set to "development", I would really appreciate it.
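For reference, the entitlement in question looks like this in a locally signed Debug build (a minimal .entitlements sketch; my understanding is that TestFlight builds are signed for distribution, which forces this value to "production"):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <!-- Selects the sandbox APNs environment; distribution signing rewrites this key. -->
    <key>aps-environment</key>
    <string>development</string>
</dict>
</plist>
```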
Posted by
Post not yet marked as solved
0 Replies
15 Views
I have a problem like this: I am a developer who has been developing iOS applications for many years, and I ran into this situation when I opened an individual account: "Your enrollment in the Apple Developer Program is under review. Please contact us." When I try to open a ticket, I am asked to choose an organization for the ticket, but my individual account has nothing to do with my organizations.
Posted by
Post not yet marked as solved
1 Reply
23 Views
First off, I have read and fully understand this post: Apple doesn't want us abusing users' hardware, so as to maximize the quality of experience across apps for their customers. I am 100% on board with this philosophy; I understand the design decisions and agree with them.

To the problem: I have an app that takes photo assets, processes them for network (exportSession.shouldOptimizeForNetworkUse = true), and then uploads them. Some users have been having trouble; I did some digging, and they're trying to upload 4K 60FPS videos. I think that is ridiculous, but it's not my place. The issue is that the export time for a 4K 60FPS video that is ~40s long can be as long as 2m. So if they select a video to upload and then background the app, that upload will ALWAYS fail because the processing fails (I have BG uploads working just fine).

My understanding is that by default I have 30s to run things in the background. I can use UIApplication.pleasegivemebackgroundtime to request up to 30 more seconds. That is obviously not enough. So my second option is BGProcessingTask, but that's not guaranteed to ever run. Which I understand and agree with, but when the user selects a video while the app is in the foreground, the expectation is that it immediately begins processing. So I can't use a BGProcessingTask?

Just wondering what the expected resolution here is. I run tasks, beg for time, and if it doesn't complete I queue up a BGTask that may or may not ever run? That seems ****** for a user: they start the process, see it begin, but then if the video is too big they just have to deal with it possibly not happening until later? They open up the app and the progress bar has magically regressed. That would infuriate me. Is there another option I'm not seeing? Some way to say "this is a large background task that will ideally take 30-60s, but it may take up to ~5-7m. It is user-facing though, so it must start right away"? (I have never seen 5-7m, but 1-2m is common.)

Another option is to just upload the full 4K 60FPS behemoth and do the processing on my end, but that seems worse for the user? We're gobbling upload bandwidth for something we're going to downsample anyway. Seems worse to waste data than battery (since that's the tradeoff at the end of the day).

Anyway, just wondering what the right way to do this is. Trivially reproducible: record a 1m 4K 60FPS video, create an export session, export, background, enjoy failure.
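For concreteness, here is the "beg for time" pattern I'm using today (a sketch; `uploadProcessedFile` is a stand-in for my existing background-upload hand-off, and the real API behind my joke name above is beginBackgroundTask(withName:expirationHandler:)):

```swift
import UIKit
import AVFoundation

// Wrap the export in a background task assertion so it can keep running briefly
// after the user backgrounds the app. This buys roughly 30 extra seconds at most.
func exportForUpload(asset: AVAsset, to outputURL: URL,
                     uploadProcessedFile: @escaping (URL) -> Void) {
    var taskID: UIBackgroundTaskIdentifier = .invalid
    taskID = UIApplication.shared.beginBackgroundTask(withName: "video-export") {
        // Expiration handler: out of time; end the assertion so the app isn't killed.
        UIApplication.shared.endBackgroundTask(taskID)
        taskID = .invalid
    }

    guard let session = AVAssetExportSession(asset: asset,
                                             presetName: AVAssetExportPreset1280x720) else {
        UIApplication.shared.endBackgroundTask(taskID)
        return
    }
    session.outputURL = outputURL
    session.outputFileType = .mp4
    session.shouldOptimizeForNetworkUse = true

    session.exportAsynchronously {
        if session.status == .completed {
            uploadProcessedFile(outputURL)   // hands off to a URLSession background upload
        }
        UIApplication.shared.endBackgroundTask(taskID)
        taskID = .invalid
    }
}
```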
Posted by
Post not yet marked as solved
0 Replies
14 Views
Hello, I was wondering: is it possible to run SMAppService.daemon... as root?

```swift
let service = SMAppService.daemon(plistName: "myApp.agent.plist")
```

Also, is it possible to launch the SMAppService.daemon without the XPC connection? The daemon currently supports gRPC. I was thinking about running it via Process?
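From what I've read (worth verifying), launchd daemons run as root by default unless the daemon's plist sets a UserName key, and XPC is not required: launchd can start the daemon on demand and it can serve gRPC over its own socket. A minimal registration sketch:

```swift
import ServiceManagement

// Register a LaunchDaemon via SMAppService; the plist name is the poster's.
let service = SMAppService.daemon(plistName: "myApp.agent.plist")

do {
    switch service.status {
    case .notRegistered, .notFound:
        try service.register()   // may require user approval in System Settings
    case .requiresApproval:
        SMAppService.openSystemSettingsLoginItems()
    case .enabled:
        print("daemon already registered")
    @unknown default:
        break
    }
} catch {
    print("registration failed: \(error)")
}
```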
Posted by
Post not yet marked as solved
0 Replies
19 Views
When rendering a gradient of luma values, you can clearly see that the monitor output from the Apple Silicon GPU has lifted/curved blacks (values 0-31 out of 1024 in 10-bit output). Any idea what is going on and how to remediate it? It looks like some kind of compensation/calibration for the screen panel response, but it is baked into the output regardless of what kind of monitor (or non-monitor) is connected. The same rendering on the Intel architecture produces correct linear gradation. We have tried M1, M2, and M3, and they all seem to be affected.
Posted by
Post not yet marked as solved
0 Replies
21 Views
Hi everyone, I'm just starting with Swift and Xcode and have a basic question. I have the following code, which I found online, for an app that generates math addition questions. I would like to run this Math app on my iPhone just before I open the apps I use most often (let's say Mail, WhatsApp, Calendar, and Notes): it should ask me a math question and, if I answer correctly, carry on to the app I originally intended to open. I can handle opening the Math app before my most-used apps with Shortcuts. I would like to modify the code below so that if I answer correctly it "closes" itself and returns to the originally intended app. With that intention I included the exit(0), but I get an error. Thanks for your help in advance! Best, Tom

```swift
import SwiftUI

struct ContentView: View {
    @State private var correctAnswer = 0
    @State private var choiceArray: [Int] = [0, 1, 2, 3]
    @State private var firstNumber = 0
    @State private var secondNumber = 0
    @State private var difficulty = 1000

    var body: some View {
        VStack {
            Text("\(firstNumber) + \(secondNumber)")
                .font(.largeTitle)
                .bold()
            HStack {
                ForEach(0..<2) { index in
                    Button {
                        answerIsCorrect(answer: choiceArray[index])
                        generateAnswers()
                    } label: {
                        AnswerButton(number: choiceArray[index])
                    }
                }
            }
            HStack {
                ForEach(2..<4) { index in
                    Button {
                        answerIsCorrect(answer: choiceArray[index])
                        generateAnswers()
                    } label: {
                        AnswerButton(number: choiceArray[index])
                    }
                }
            }
        }
    }

    func answerIsCorrect(answer: Int) {
        // exit(0) compiles, but iOS apps are not supposed to terminate themselves,
        // and there is no supported API to return control to the previous app.
        if answer == correctAnswer { exit(0) }
    }

    func generateAnswers() {
        firstNumber = Int.random(in: 0...(difficulty / 2))
        secondNumber = Int.random(in: 0...(difficulty / 2))
        var answerList = [Int]()
        correctAnswer = firstNumber + secondNumber
        for _ in 0...2 {
            answerList.append(Int.random(in: 0...difficulty))
        }
        answerList.append(correctAnswer)
        choiceArray = answerList.shuffled()
    }
}

// Minimal stand-in for the AnswerButton view referenced but not shown in the post.
struct AnswerButton: View {
    let number: Int
    var body: some View {
        Text("\(number)")
            .font(.title)
            .padding()
            .background(Color.blue)
            .foregroundColor(.white)
            .clipShape(Capsule())
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}
```
Posted by
Post not yet marked as solved
1 Reply
24 Views
My app downloads files from AWS S3. What we'd like to do is replicate our files across several of Amazon's data centers (regions) to put the content closer to our users, who are worldwide. What I need is a way to determine, in a very ***** way, which data center would be best to use: for example North America, Europe, Asia, etc. I don't want to use location services since I don't really need the exact location. Is there a simpler way to do this? I suppose I could use the localization settings, but I don't think that's really guaranteed to represent their actual location. Thanks, Frank
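One idea I've been toying with (a sketch with placeholder URLs, not our real buckets): time a tiny HEAD request against each regional endpoint at startup and pick the fastest, which measures network proximity directly instead of guessing from locale settings:

```swift
import Foundation

// Placeholder regional endpoints; in practice these would point at small objects
// in each replicated bucket.
let candidates = [
    "us-east-1": URL(string: "https://bucket-us.s3.amazonaws.com/ping")!,
    "eu-west-1": URL(string: "https://bucket-eu.s3.amazonaws.com/ping")!,
    "ap-southeast-1": URL(string: "https://bucket-ap.s3.amazonaws.com/ping")!,
]

func fastestRegion() async -> String? {
    var best: (region: String, seconds: Double)?
    for (region, url) in candidates {
        var request = URLRequest(url: url)
        request.httpMethod = "HEAD"   // headers only, no body transfer
        let start = Date()
        if (try? await URLSession.shared.data(for: request)) != nil {
            let elapsed = Date().timeIntervalSince(start)
            if best == nil || elapsed < best!.seconds {
                best = (region, elapsed)
            }
        }
    }
    return best?.region
}
```

Locale.current.region would be a cheaper but rougher hint, with exactly the "doesn't guarantee actual location" caveat mentioned above.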
Posted by
Post not yet marked as solved
1 Reply
28 Views
I was watching https://developer.apple.com/videos/play/wwdc2023/10248/. In this video it is advised to make the shared service below async to benefit from concurrency (at 40:04, exact time). Do you know how to do it?

```swift
class ColorizingService {
    static let shared = ColorizingService()

    func colorize(_ grayscaleImage: CGImage) async throws -> CGImage
    // [...]
}

struct ImageTile: View {
    // [...]
    // implicit @MainActor
    var body: some View {
        mainContent
            .task() { // inherits @MainActor isolation
                // [...]
                result = try await ColorizingService.shared.colorize(image)
            }
    }
}
```
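My current reading of the advice (a sketch; correct me if this is the wrong interpretation of the video): turn the class into an actor, so colorize() is isolated off the main actor and runs concurrently with UI work, with no change needed at the call site:

```swift
import CoreGraphics

// As an actor, ColorizingService does its work outside the main actor; callers
// on the main actor simply `await` the cross-actor call, exactly as in the view above.
actor ColorizingService {
    static let shared = ColorizingService()

    func colorize(_ grayscaleImage: CGImage) async throws -> CGImage {
        // ... expensive, non-main-actor work happens here ...
        return grayscaleImage   // placeholder so the sketch compiles
    }
}
```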
Posted by
Post not yet marked as solved
0 Replies
27 Views
I have a data object that dynamically changes the UIImage assigned to one of its instance variables, but when showing this image in SwiftUI, it's always black and white. The following sample code shows the difference between the same image, first using the native constructor Image(systemName:) and then Image(uiImage:). When using AppKit and Image(nsImage:), this issue doesn't happen.

```swift
import SwiftUI
import UIKit

struct ContentView: View {
    @State var object = MyObject()

    var body: some View {
        Image(systemName: "exclamationmark.triangle.fill")
            .symbolRenderingMode(.palette)
            .foregroundStyle(.white, .yellow)
        Image(uiImage: object.image)
    }
}

class MyObject {
    var image = UIImage(systemName: "exclamationmark.triangle.fill")!
        .applyingSymbolConfiguration(.init(paletteColors: [.white, .systemYellow]))!
}

#Preview {
    ContentView()
}
```
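One workaround I've been experimenting with (unverified, so treat it as a guess rather than a confirmed fix): forcing the rendering mode so SwiftUI doesn't re-template the symbol and discard the palette colors:

```swift
import UIKit

// Keep the symbol's own colors instead of letting SwiftUI tint it as a template.
let image = UIImage(systemName: "exclamationmark.triangle.fill")!
    .applyingSymbolConfiguration(.init(paletteColors: [.white, .systemYellow]))!
    .withRenderingMode(.alwaysOriginal)
```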
Posted by
Post not yet marked as solved
0 Replies
26 Views
I've been sending a lot of requests to the enrollment program, entering my card info and receiving the emails that give a case number and say to wait two business days, but I get no response at all. I tried the same through the Apple Developer app, and it says: "Enrollment through the Apple Developer app is not available for this Apple ID." So I tried the web, got the email just described, and never got any response back. I used my personal Apple ID from my iPhone, and the app let me go all the way through, but when I hit pay it says: APPLE ID ISSUE, the region of the Apple ID is not the same as the region in the system configuration. I've checked both and they are the same. I send emails every day through the contact options and have been calling Apple Support, and no one can help me.
Posted by
Post not yet marked as solved
0 Replies
30 Views
I'd like to know how to test behavior in Swift on the desktop that needs to interact with external elements, in my case the Finder. My goal is simple: add an option to the Finder's right-click menu that opens my application with the entry or entries (file or folder) selected in the Finder. I have set up the NSMenuItem, NSMessage, NSPortName, NSRequiredContext (NSApplicationIdentifier: com.apple.finder) elements, etc. I also created a class FinderService with a function performService having this declaration:

```swift
func performService(_ pboard: NSPasteboard, userData: String, error: AutoreleasingUnsafeMutablePointer<NSString>) {
    NSLog("performService called!")
}
```

And I instantiated my class like this: NSApplication.shared.servicesProvider = FinderService(). However, when I build and launch the application, nothing happens. Well, my application runs fine and the instantiation of the class seems to be correctly called, but when I open my Finder, my action is not displayed in the right-click context menu. And in the logs of my application, no error appears. How can I test this?
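Two things worth double-checking here (assumptions on my part, sketched below): the provider method has to be reachable from Objective-C, which means an NSObject subclass with an @objc method whose name matches NSMessage, and the services cache may need a flush via NSUpdateDynamicServices() before the menu item appears:

```swift
import AppKit

// A minimal services-provider sketch; the selector derived from this method name
// must match the NSMessage value ("performService") declared in Info.plist.
class FinderService: NSObject {
    @objc func performService(_ pboard: NSPasteboard,
                              userData: String,
                              error: AutoreleasingUnsafeMutablePointer<NSString>) {
        // Read the files/folders the user right-clicked in the Finder.
        let urls = pboard.readObjects(forClasses: [NSURL.self], options: nil) as? [URL] ?? []
        NSLog("performService called with: \(urls)")
    }
}

// At startup:
// NSApp.servicesProvider = FinderService()
// NSUpdateDynamicServices()   // flush the pasteboard-services cache
```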
Posted by
Post marked as solved
2 Replies
42 Views
I have been working on an iOS application in which I decided to use the new SwiftData architecture, and I have now realized that SwiftData does not support public or shared databases. I am a new and upcoming Swift developer who has been self-learning the Apple Swift technology. I just learned in this forum that SwiftData does not support public and shared data, and I also understand that no plans have been announced to address this feature in the iOS 18 release time frame. The use case for my application is to implement a social-type application, something like X, Facebook, etc. These types of applications appear to use public, shared, and private data, as various existing Apple applications do. I would like to complete an application in Swift using iCloud, but I must assume that the current social applications are using other technologies. I am assuming you can do this in Core Data, but my understanding is that Apple is asking us to use SwiftData to replace Core Data. I also assume that SwiftData is built on the Core Data technology layers. So, to me, it seems hopeful, somehow, to accomplish this using iOS 17 and following the recommended SwiftData path. What are my options for completing these data objectives in iOS? I hope I am addressing these questions in the proper and best forum. Many thanks for any suggestions.
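To help frame the options: my understanding (worth verifying) is that Core Data's NSPersistentCloudKitContainer can mirror to the CloudKit public database, which SwiftData does not currently expose. A sketch with placeholder names:

```swift
import CoreData

// The Core Data route: mirror a store to the CloudKit *public* database.
// "SocialModel" and the container identifier are placeholders.
let container = NSPersistentCloudKitContainer(name: "SocialModel")

if let description = container.persistentStoreDescriptions.first {
    let options = NSPersistentCloudKitContainerOptions(
        containerIdentifier: "iCloud.com.example.social")
    options.databaseScope = .public   // .private is the default scope
    description.cloudKitContainerOptions = options
}

container.loadPersistentStores { _, error in
    if let error {
        fatalError("Store failed to load: \(error)")
    }
}
```

Shared (per-record-zone) data needs the separate sharing APIs on NSPersistentCloudKitContainer, which go beyond this sketch.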
Posted by
Post not yet marked as solved
0 Replies
27 Views
Hi Team, is there a way to extract a colorized scan as well when using the RoomPlan SDK? If yes, can you point me to the right reference link? Does the RoomPlan SDK provide dimensions of the room?
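On the dimensions half: the CapturedRoom result exposes per-surface and per-object dimensions (as far as I can tell, RoomPlan's output is untextured geometry, so there is no colorized mesh to extract from the SDK itself). A small sketch:

```swift
import RoomPlan

// Each surface and object in a CapturedRoom carries a dimensions vector, in meters.
func printRoomDimensions(_ room: CapturedRoom) {
    for wall in room.walls {
        // For a flat wall, x and y are width and height.
        print("wall: \(wall.dimensions.x)m x \(wall.dimensions.y)m")
    }
    for object in room.objects {
        print("\(object.category): \(object.dimensions)")
    }
}
```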
Posted by
Post not yet marked as solved
4 Replies
49 Views
I just bought an M2 MacBook Pro, running the latest Xcode / Sonoma, from Apple refurbished. I am porting a project that uses pthreads from an older MacBook Pro with an Intel chip. As this is my first week using the laptop, I recompiled the project, but the Apple linker does not find the libpthread library. I see it is not included in the installed Xcode. How do I get this library from Apple? Regards, Dave
Posted by
