Post not yet marked as solved
In our app, we are retrieving the SIM card country code.
CTTelephonyNetworkInfo().serviceSubscriberCellularProviders
But 'serviceSubscriberCellularProviders' was deprecated in iOS 16.0 with no replacement.
So in iOS 17, is there any other way to get a SIM card-based country code?
Post not yet marked as solved
Hi all,
I've not found any articles about this change: iOS Camera app can no longer decode a QR code encoded with Shift_JIS.
Sample QR code:
Text: こんにちは
iOS 17.0.2: detected and decoded successfully
iOS 17.1: detected but couldn't decode the value
I also created a simple app for scanning QR codes (based on the tutorial at https://www.hackingwithswift.com/example-code/media/how-to-scan-a-qr-code)
and it has the same results.
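For what it's worth, the raw payload decodes cleanly with Foundation, which suggests the bytes themselves are valid Shift_JIS (the byte values below are the standard Shift_JIS encoding of こんにちは):

```swift
import Foundation

// "こんにちは" encoded as Shift_JIS (two bytes per hiragana character).
let shiftJISBytes: [UInt8] = [0x82, 0xB1, 0x82, 0xF1, 0x82, 0xC9, 0x82, 0xBF, 0x82, 0xCD]

// Where Foundation's Shift_JIS converter is available this round-trips the text,
// so the question is whether the iOS 17.1 camera pipeline still applies it.
let decoded = String(data: Data(shiftJISBytes), encoding: .shiftJIS)
```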
I cannot find a way to pass the corresponding string encoding to AVCaptureMetadataOutput.
Is this a bug in iOS 17.1 or an intended change?
PS: The same behavior persists in iOS 17.1.1.
Thanks
Post not yet marked as solved
Hi,
Is there a way I can find all the permissions which are granted/denied to an app in iOS?
Any particular API which can run through all the permissions at one place?
Post not yet marked as solved
It appears not, but maybe there is a setting buried somewhere that will allow it. Does anyone know whether this is true?
Post not yet marked as solved
I get a curl error 7 (couldn't connect) when connecting to APNs and cannot send push notifications.
We are sending 10,000 messages, in batches of 500 at 0.1-second intervals. Some of the notifications do reach the iPhone device.
Can anyone who has experienced something similar please advise?
Post not yet marked as solved
I was going through the eligibility criteria to participate and it said we had to be part of the Apple Developer Program. Does that mean that we have to pay in order to participate?
Post not yet marked as solved
In an app I am working on that runs on an iPad Air, I turn on the user-facing camera and place a live video feed of the user in an outlet called ImagePreviewOutlet. It works, but the aspect ratio is too narrow, leaving two white spaces on the left and right of the video. I want the aspect ratio to be wider so the video fills the whole of the iPad screen. Here is the code for the three methods that handle the video:
- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
// Convert CMSampleBufferRef to CIImage
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
// Calculate the fitting rect based on 4:3 aspect ratio
CGFloat targetAspectRatio = 4.0 / 3.0; // 4:3 aspect ratio
CGFloat targetWidth = CGRectGetHeight(self.ImagePreviewOutlet.bounds) * targetAspectRatio;
CGFloat targetX = (CGRectGetWidth(self.ImagePreviewOutlet.bounds) - targetWidth) / 2.0;
CGRect fittingRect = CGRectMake(targetX, 0, targetWidth, CGRectGetHeight(self.ImagePreviewOutlet.bounds));
// Apply filters (if needed)
// Resize the CIImage to fit the ImagePreviewOutlet
CIImage *resizedImage = [ciImage imageByApplyingTransform:CGAffineTransformMakeScale(fittingRect.size.width / CGRectGetWidth(ciImage.extent), fittingRect.size.height / CGRectGetHeight(ciImage.extent))];
// Render the CIImage into a CGImage
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:resizedImage fromRect:resizedImage.extent];
// Create a UIImage from the CGImage
UIImage *image = [UIImage imageWithCGImage:cgImage];
// Update the UI on the main thread
dispatch_async(dispatch_get_main_queue(), ^{
self.ImagePreviewOutlet.image = image; // Replace with your UI element
});
// Release resources
CGImageRelease(cgImage);
}
- (void)showImagePreviewOutlet
{
self.ImagePreviewOutlet.hidden = NO;
// Create a capture session
self.captureSession = [[AVCaptureSession alloc] init];
// Configure the capture device (use front camera)
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
mediaType:AVMediaTypeVideo
position:AVCaptureDevicePositionFront];
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:nil];
[self.captureSession addInput:input];
// Create a preview layer to display the live video
AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
previewLayer.frame = self.ImagePreviewOutlet.bounds;
[self.ImagePreviewOutlet.layer addSublayer:previewLayer];
// Start the capture session on a background thread
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[self.captureSession startRunning];
});
// Schedule a timer to stop the capture session after 10 seconds
[NSTimer scheduledTimerWithTimeInterval:10 target:self selector:@selector(hideImagePreviewOutlet) userInfo:nil repeats:NO];
}
- (void)setupCaptureSession
{
AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
mediaType:AVMediaTypeVideo
position:AVCaptureDevicePositionBack];
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (input) {
if ([captureSession canAddInput:input]) {
[captureSession addInput:input];
}
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([captureSession canAddOutput:videoOutput]) {
[captureSession addOutput:videoOutput];
}
[captureSession startRunning];
} else {
NSLog(@"Error setting up capture session: %@", error.localizedDescription);
}
}
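For comparison, an aspect-fill version of the fitting-rect math above (scale by the larger of the two ratios so the frame covers the whole target and the overflow is cropped) can be sketched independently of AVFoundation:

```swift
import Foundation

// Sketch: the rect a source frame should be drawn into so that it completely
// fills `target`, preserving aspect ratio; the overflow on the long axis is
// cropped instead of letterboxed.
func aspectFillRect(for source: CGSize, in target: CGRect) -> CGRect {
    let scale = max(target.size.width / source.width,
                    target.size.height / source.height)
    let size = CGSize(width: source.width * scale, height: source.height * scale)
    // Center the scaled frame so the cropped edges are symmetric.
    let origin = CGPoint(x: target.origin.x + (target.size.width - size.width) / 2,
                         y: target.origin.y + (target.size.height - size.height) / 2)
    return CGRect(origin: origin, size: size)
}
```

For the preview layer itself, setting previewLayer.videoGravity to AVLayerVideoGravityResizeAspectFill in showImagePreviewOutlet should achieve the same full-bleed result without transforming each frame by hand.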
I want the video to be wider.
Any ideas?
JR
Post not yet marked as solved
Hello
We have a React Native iOS app and we are having issues with password autofill. We have read some documentation on "associated domains" but can't find anything specific to our issue. We don't currently have a website with a login. Is that necessary to use password autofill?
Thanks for any help
Post not yet marked as solved
Hi guys,
I am trying to get my Unreal Engine-based iOS app running with OSC (https://de.wikipedia.org/wiki/Open_Sound_Control). OSC uses UDP to send network messages that can be used to interact with DAWs (like Ableton or Logic Pro). It works fine on macOS (and all other platforms, except the latest iOS). I can RECEIVE OSC messages within the iOS app, but can't SEND any.
Meanwhile I know this is related to the privacy system around the com.apple.developer.networking.multicast entitlement. I learned a lot from these posts: *https://developer.apple.com/forums/thread/663271 and https://forum.unity.com/threads/ios-14-5-can-not-send-udp-broadcast.1116352/.
So far I did the following:
Got the com.apple.developer.networking.multicast approval - yeah
Enabled it in my provisioning file
Changed my info.plist to include NSLocalNetworkUsageDescription and the Bonjour services
<key>NSLocalNetworkUsageDescription</key>
<string>Exchange OSC messages with nearby devices.</string>
<key>NSBonjourServices</key>
<array>
<string>_APP_NAME._tcp</string>
<string>_APP_NAME._udp</string>
</array>
Add entitlement to my Xcode project
Confirmed that, after build and signing app includes entitlement (as described in link *)
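For reference, the entitlement from the steps above should appear in the app's .entitlements file like this (key name as in the linked threads):

```xml
<key>com.apple.developer.networking.multicast</key>
<true/>
```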
But still, I never see a request for access to the local network, nor does the app SEND messages, which is kind of frustrating. It seems iOS never "sees" a local-network access that would trigger the permission prompt, which is weird, because in the log I can see a LogSockets warning about using IPv4 instead of IPv6:
LogSockets: Warning: Destination protocol of 'IPv4' does not match protocol: 'IPv6' for address: '192.168.172.32:1338'
Assumptions:
My OSC implementation uses a plugin from UE (which is based on C++) and does not use a Bonjour service. Do I still have to add the *._tcp and *._udp keys to the plist? I used the name of my app, but I think it should be the name of the actual service used, shouldn't it? How can I find that out? Is there a way to trace or log the services?
I can't see any blocking message for my socket connection in the logs; is there a way to do "deeper" traces?
Any other suggestions? I am really a bit stuck for ideas to get this running.
All the best,
Max
Post not yet marked as solved
I was reading through this post:
https://developer.apple.com/forums/thread/718583
I've been able to reproduce this behavior by double-clicking a DMG in the Finder while the Mac is Offline. I checked the Notarization status of the app via spctl and it shows "Notarized Developer ID". So sure enough, Quinn's comment about Gatekeeper "ingesting" the notarization ticket stapled to the DMG and automatically applying it to the app inside is 100% spot-on.
However, I can't seem to get the same behavior to happen when mounting the DMG via hdiutil in Terminal. While Offline, I do a:
hdiutil attach /path/to/my/dmg.dmg
and then
spctl -a -t exec -vvv /Volumes/path/to/my/mounted/dmg/myapp.app
After the spctl I'm seeing
/Volumes/path/to/my/mounted/dmg/myapp.app: rejected
source=Unnotarized Developer ID
origin=Developer ID Application: My Developer Creds (XXXXXXXXXX)
Is there a way to get Gatekeeper to "ingest" the notarization ticket stapled to the DMG when using hdiutil while Offline?
Note 1: If I use hdiutil while online, everything works as expected.
Note 2: I'm testing all this via a VM of macOS 12.7.1, if that makes any difference.
Thanks!
Post not yet marked as solved
After I updated to Monterey 12.7.1, scrolling with the mouse and the trackpad stopped working. I can still go up and down with the arrow keys. The trackpad then works for some minutes and stops working again.
I can't even work in the menus without scrolling.
I am working on an iMac and never had an issue before.
APPLE PLEASE HELP WITH AN UPDATE!!!!!
Post not yet marked as solved
Hey everyone,
I'm facing the following issue:
I'm using OneSignal as a push notification service, and I want to create two projects: staging (mainly to test some backend notifications) and production.
For production, I can attach the .p8 file and then we're good.
For staging, if I attach the same .p8 file, I would be in trouble, because I could send notifications to real users even from the staging environment.
How should I proceed in this situation?
Post not yet marked as solved
Hi,
I need to control standard and non-standard entity components over time.
For example, I want to change the opacity of a few entities over time with a timer.
To do that, I added an Opacity component to the entities whose opacity I want to change, created a system, and registered it.
The system fires its update method, and inside it I am able to change the opacity, but after a few seconds it stops firing.
Once I move the window, the update method fires again for a few seconds, then stops.
Any idea why?
Any idea what to change in order to have it run continuously?
If that is by design, how can I access components at any time so I can change them whenever I need to?
I am using Windows, not Volumes or Immersive Spaces.
Post not yet marked as solved
After updating to iOS 17 and Xcode 15 I can't use my 64 GB iPhone 11 as a run target.
Xcode says that the device capacity is unknown, but when I installed Xcode 14 the device was recognised correctly.
I tried installing the Xcode 15 beta, the macOS 14 beta, and the iOS 17 beta; nothing worked, and I rolled back to the latest stable versions. I also tried hard-resetting my iPhone, unpairing the device, and clearing the Xcode cache.
Can someone please help with this issue?
Post not yet marked as solved
We're building one of our iPad apps for "Apple Vision Pro (Designed for iPad)" simulator and need to have certain different build settings for that simulator variant, compared to the default iOS simulator variant.
It seems both variants use the iphonesimulator SDK. And in the Xcode build settings, the "Any iOS Simulator SDK" variant of any setting is what is also used, even when building for "Apple Vision Pro (Designed for iPad)".
We tried adding a variant of the "Any iOS Simulator SDK" setting by editing the project file directly, using some possible variant names we could think of. The added variant does show in the Xcode UI, but it also shows as "Any iOS Simulator SDK". And when building for "Apple Vision Pro (Designed for iPad)", Xcode still uses the original variant, as if it doesn't see that the added variant matches the build destination.
Example, where we want to exclude arm64 for iOS simulator builds, but not for "Apple Vision Pro (Designed for iPad)" builds.
Original "Any iOS Simulator SDK" variant:
"EXCLUDED_ARCHS[sdk=iphonesimulator*]" = "arm64";
Added variant:
"EXCLUDED_ARCHS[sdk=iphonesimulator*][variant=Apple Vision Pro]" = "";
We have tried using variant names such as 'xr', 'xors', 'vision', 'visionos' and of course 'Apple Vision Pro' to no avail. The variant shows in the UI but the build doesn't use it.
Does anyone know if there is a variant (or other property or other way of distinguishing such variant) that we can use that Xcode will recognize and use when building for "Apple Vision Pro (Designed for iPad)"?
Thanks.
Post not yet marked as solved
We have code inside our network extension to create NWConnection with destination to loopback address 127.0.0.1 with a specific port number that another process is listening to. This method has been working fine until we test it on macOS 14.2 Beta (23C5030f) where we observed the following error:
(Network) [com.apple.network:connection] nw_connection_copy_connected_local_endpoint_block_invoke [C42] Connection has no local endpoint
Accessing connection.currentPath.localEndpoint then returns nil, and this eventually leads to connection failure.
This error is only observed with connections to the loopback address; an NWConnection created with a destination to a real address (e.g., on the en0 interface) does not have any issue.
That said, it is not observed with all connections to the loopback address; some NWConnection objects still contain the localEndpoint.
Here is an example of all log messages we get for a specific connection:
(Network) [com.apple.network:connection] [C42 D9F43B3D-6832-4581-9B3B-12F6F5C7C408 127.0.0.1:49154 tcp, attribution: developer, context: Default Network Context (private), proc: 0D165C5B-CDFE-3F43-BC52-1DFCD61739E6, prefer no proxy] start
(Network) [com.apple.network:connection] [C42 127.0.0.1:49154 initial parent-flow ((null))] event: path:start @0.000s
(Network) [com.apple.network:connection] [C42 127.0.0.1:49154 waiting parent-flow (satisfied (Path is satisfied), viable, interface: lo0)] event: path:satisfied @0.000s, uuid: 6655EA53-47F9-4B16-85D6-7B81FA0C360E
(Network) [com.apple.network:connection] [C42 127.0.0.1:49154 in_progress socket-flow (satisfied (Path is satisfied), viable, interface: lo0)] event: flow:start_connect @0.000s
(Network) [com.apple.network:connection] nw_connection_report_state_with_handler_on_nw_queue [C42] reporting state preparing
(Network) [com.apple.network:connection] nw_socket_handle_socket_event [C42:1] Socket received CONNECTED event
(Network) [com.apple.network:connection] nw_flow_connected [C42 127.0.0.1:49154 in_progress socket-flow (satisfied (Path is satisfied), viable, interface: lo0)] Output protocol connected (socket)
(Network) [com.apple.network:connection] [C42 127.0.0.1:49154 ready socket-flow (satisfied (Path is satisfied), viable, interface: lo0)] event: flow:finish_connect @0.000s
(Network) [com.apple.network:connection] nw_connection_report_state_with_handler_on_nw_queue [C42] reporting state ready
(Network) [com.apple.network:connection] [C42 D9F43B3D-6832-4581-9B3B-12F6F5C7C408 127.0.0.1:49154 tcp, prefer no proxy, attribution: developer] cancel
(Network) [com.apple.network:connection] nw_connection_copy_connected_local_endpoint_block_invoke [C42] Connection has no local endpoint
(Network) [com.apple.network:connection] [C42 D9F43B3D-6832-4581-9B3B-12F6F5C7C408 127.0.0.1:49154 tcp, prefer no proxy, attribution: developer] cancelled
[C42 6655EA53-47F9-4B16-85D6-7B81FA0C360E <NULL><->127.0.0.1:49154]
Connected Path: satisfied (Path is satisfied), interface: lo0
Privacy Stance: Not Eligible
Duration: 0.001s, TCP @0.000s took 0.000s
bytes in/out: 0/0, packets in/out: 0/0, rtt: 0.001s, retransmitted bytes: 0, out-of-order bytes: 0
ecn packets sent/acked/marked/lost: 0/0/0/0
Questions:
Why is the loopback address special here, given that the issue is only observed with connections to the loopback address?
What should we do in terms of updating our code inside the network extension to get this working on macOS 14.2?
Hi everyone,
I wanted to create a Table and make its rows sortable. For lack of better ideas on how to do this, I followed Apple's example here: https://developer.apple.com/documentation/swiftui/table
Here's the code in question:
@State private var sortOrder = [KeyPathComparator(\Person.givenName)]

var body: some View {
    Table(people, sortOrder: $sortOrder) {
        TableColumn("Given Name", value: \.givenName)
        TableColumn("Family Name", value: \.familyName)
        TableColumn("E-Mail address", value: \.emailAddress)
    }
    .onChange(of: sortOrder) {
        people.sort(using: $0)
    }
}
}
To my amazement Xcode shows this message:
'onChange(of:perform:)' was deprecated in macOS 14.0: Use onChange with a two or zero parameter action closure instead.
I find this message rather annoying. Do you have any ideas how to fix this?
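The deprecation message points at the newer overloads. A sketch of the two-parameter form, rebuilt around the same table (the closure now receives the old value first and the new value second; names are taken from the snippet above):

```swift
#if canImport(SwiftUI)
import SwiftUI

@available(iOS 17.0, macOS 14.0, *)
struct PeopleTable: View {
    struct Person: Identifiable {
        let id = UUID()
        var givenName: String
        var familyName: String
        var emailAddress: String
    }

    @State private var people: [Person] = []
    @State private var sortOrder = [KeyPathComparator(\Person.givenName)]

    var body: some View {
        Table(people, sortOrder: $sortOrder) {
            TableColumn("Given Name", value: \.givenName)
            TableColumn("Family Name", value: \.familyName)
            TableColumn("E-Mail address", value: \.emailAddress)
        }
        // Two-parameter closure: old value first, new value second.
        // The deprecated .onChange(of:perform:) passed only the new value.
        .onChange(of: sortOrder) { _, newOrder in
            people.sort(using: newOrder)
        }
    }
}
#endif
```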
Post not yet marked as solved
Say I have a function that receives a UIImage as an argument, performs the logic inside it, and then returns it updated.
So I came up with something like this:
func processImage(image: UIImage?) -> UIImage? {
    if let image = image, let cgImage = image.cgImage {
        let width = cgImage.width
        let height = cgImage.height
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        if let context = CGContext(data: nil,
                                   width: width,
                                   height: height,
                                   bitsPerComponent: 8,
                                   bytesPerRow: 4 * width,
                                   space: colorSpace,
                                   bitmapInfo: bitmapInfo.rawValue) {
            context.draw(cgImage, in: CGRect(x: 0, y: 0, width: CGFloat(width), height: CGFloat(height)))
            if let data = context.data {
                let buffer = data.bindMemory(to: UInt8.self, capacity: width * height * 4)
                for i in 0..<(width * height) {
                    let pixelIndex = i * 4
                    let red = buffer[pixelIndex]
                    let green = buffer[pixelIndex + 1]
                    let blue = buffer[pixelIndex + 2]
                    if isSystemGrayPixel(red: red, green: green, blue: blue) {
                        // Convert systemGray to systemPink
                        buffer[pixelIndex] = 255     // R component
                        buffer[pixelIndex + 1] = 182 // G component
                        buffer[pixelIndex + 2] = 193 // B component
                    }
                }
                if let modifiedCGImage = context.makeImage() {
                    let processedImage = UIImage(cgImage: modifiedCGImage)
                    return processedImage
                }
            }
        }
    }
    return image
}
Inside it I call a helper function that conditionally checks whether the pixel matches .systemGray and, if so, changes it:
func isSystemGrayPixel(red: UInt8, green: UInt8, blue: UInt8) -> Bool {
    let systemGrayColor = UIColor.systemBlue
    let ciSystemGrayColor = CIColor(color: systemGrayColor)
    let tolerance: CGFloat = 10.0 / 255.0
    let redDiff = abs(ciSystemGrayColor.red - CGFloat(red) / 255.0)
    let greenDiff = abs(ciSystemGrayColor.green - CGFloat(green) / 255.0)
    let blueDiff = abs(ciSystemGrayColor.blue - CGFloat(blue) / 255.0)
    return redDiff <= tolerance && greenDiff <= tolerance && blueDiff <= tolerance
}
When I try to save it with saveCanvas, the console shows that the entry is saved, but when I try to retrieve it later I get nil.
This is my saveCanvas, to serve as a reference:
@objc func saveCanvas(_ sender: Any) {
    guard let canvas = Canvas(name: "", canvas: mainImageView, numOfPages: 0) else {
        return
    }
    var savedImageView2 = UIImageView()
    savedImageView2.image = mainImageView.image?.copy() as? UIImage
    let updatedImage = processImage(image: savedImageView2.image)
    canvas.canvas.image = updatedImage
    // array that stores UIImageView, globally defined
    canvasArray.append(canvas)
    if canvasArray.count > 0 {
        canvasArray.forEach { canvas in
            print("\(canvas.canvas == savedImageView)")
            print("clicked button \(canvasArray)")
        }
    }
}
I am expecting to retrieve each element of canvasArray when I call it later in another function (which worked fine before I created the processImage function).
Like I said, the purpose of processImage is to check whether my UIImage contains .systemGray and, if so, return it updated, as defined in my isSystemGrayPixel function.
Is there anything you would suggest I do differently in processImage to make it work as expected?
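One part worth isolating and testing on its own is the tolerance comparison. A minimal, UIKit-free sketch (the target components here are hypothetical placeholders, not the real systemGray values):

```swift
import Foundation

// Does an 8-bit RGB pixel match a target color (components in 0...1)
// within a per-channel tolerance?
func pixelMatches(red: UInt8, green: UInt8, blue: UInt8,
                  target: (r: Double, g: Double, b: Double),
                  tolerance: Double = 10.0 / 255.0) -> Bool {
    abs(target.r - Double(red) / 255.0) <= tolerance &&
        abs(target.g - Double(green) / 255.0) <= tolerance &&
        abs(target.b - Double(blue) / 255.0) <= tolerance
}
```

Note that in the isSystemGrayPixel above, systemGrayColor is assigned UIColor.systemBlue, so the tolerance check runs against blue rather than gray; that mismatch alone would keep gray pixels from ever matching.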
Post not yet marked as solved
I'm trying to finish the migration process from an individual to a company account, and now I've been asked to enter the Tax ID number from Brazil (CNPJ) for my company. But the form rejects it, saying that I cannot use that ID.
The ID is correct. I have already opened a support ticket, but they asked me to post the question on the forum. The case number is 102143162870.
This is the only step missing in the migration process. Can anyone help me?
Post not yet marked as solved
Why can't I select only the packages that I need in Xcode 15 when I add dependencies with Swift Package Manager? For example, I want to install Firebase, but I don't need all of the dependencies that come with it. There is no checkmark next to the package name, so I can't deselect it.
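When a package is added, it is the package's products (libraries) that get linked, not the whole package: in a Package.swift manifest this is explicit, since a target names only the products it needs. A hedged fragment (the package URL is real; the target name and version are placeholders):

```swift
// Fragment of a hypothetical Package.swift: only the products named under
// the target's dependencies are linked, not every library in the package.
dependencies: [
    .package(url: "https://github.com/firebase/firebase-ios-sdk", from: "10.0.0"),
],
targets: [
    .target(
        name: "MyApp",
        dependencies: [
            .product(name: "FirebaseAnalytics", package: "firebase-ios-sdk"),
        ]
    ),
]
```

In the Xcode UI, the equivalent choice appears on the "Choose Package Products" sheet after the package resolves: the package entry itself has no checkbox, but each product's "Add to Target" can be set to None.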