I’m working on a personal prototype where I’m building a selfie phone case that has a custom screen on the back. The iPhone connects to the case via USB-C, and I want to display the live camera feed (or a processed version) on that screen.
The idea is to capture frames in the iOS app, convert them to either RGB565, RGB888, or JPEG, and then send them over USB-C to my own microcontroller-based board (MIMX8MM6CVTKZAA), which would push the data to a small display (SPI or parallel interface).
I understand that Apple restricts USB access in iOS, so I’m trying to clarify:
Can a native iOS app (for personal use only, not App Store) send raw data over USB-C to a custom accessory?
I’m not MFi-certified. I just want to send pixel/frame data to a microcontroller (e.g. ESP32, RP2040, STM32) over a raw USB or serial protocol.
Does iOS support communicating with any USB class (e.g. CDC, HID, bulk transfer) from within a native app?
I’m very keen to use USB-C for data transfer, as I’ve used Bluetooth before and it is very laggy. If raw USB-C data transfer isn’t possible, are Wi-Fi or BLE the only real alternatives for sending image data to my screen?
Could I potentially put a USB-to-HDMI adapter onto my PCB and mirror the screen that way?
I’ve read mixed things about needing MFi for ExternalAccessory and limitations around raw USB access in iOS, so I’d love clarity from anyone who has done something similar or worked with custom iOS hardware.
Thanks in advance!
I was originally just trying to mirror the screen onto a generic display. I’m fairly new to electronics, so I apologise in advance.