Uncorrelated commented on The iPhone 15 Pro’s Depth Maps   tech.marksblogg.com/apple... · Posted by u/marklit
heliographe · 3 months ago
> but apparently recent iPhones capture them for standard photos as well.

Yes, they will capture them from the main photo mode if there’s a subject (human or pet) in the scene.

> I made an app that used the depth maps and portrait effects mattes from Portraits for some creative filters. It was pretty fun, but it's no longer available

What was your app called? Is there any video of it available anywhere? Would be curious to see it!

I also made a little tool, Matte Viewer, as part of my photo tool series - but it’s just for viewing/exporting them, no effects bundled:

https://apps.apple.com/us/app/matte-viewer/id6476831058

Uncorrelated · 3 months ago
I'm sorry for neglecting to respond until now. The app was called Portrait Effects Studio and later Portrait Effects Playground; I took it down because it didn't meet my quality standards. I don't have any public videos anymore, but it supported background replacement and filters like duotone, outline, difference-of-Gaussians, etc., all applied based on depth or the portrait effects matte. I can send you a TestFlight link if you're curious.

I looked at your apps, and it turns out I'm already familiar with some, like 65x24. I had to laugh -- internally, anyway -- at the unfortunate one-star review you received on Matte Viewer from a user who didn't appear to understand the purpose of the app.

One that really surprised me was Trichromy, because I independently came up with and prototyped the same concept! Even more surprisingly, there's at least one other such app on the App Store -- and here I thought I was being so creative. I tried Trichromy; it's quite elegant, and fast.

Actually, I feel we have a similar spirit in terms of our approach to creative photography, though your development skills apparently surpass mine. I'm impressed by the polish on your websites, too. Cheers.

Uncorrelated commented on The iPhone 15 Pro’s Depth Maps   tech.marksblogg.com/apple... · Posted by u/marklit
Uncorrelated · 3 months ago
Other commenters here are correct that the LiDAR is too low-resolution to be used as the primary source for the depth maps. In fact, iPhones use four-ish methods that I know of to capture depth data, depending on the model and camera used. Traditionally these depth maps were only captured for Portrait photos, but apparently recent iPhones capture them for standard photos as well.

1. The original method uses two cameras on the back, taking a picture from both simultaneously and using parallax to construct a depth map, similar to human vision. This was introduced on the iPhone 7 Plus, the first iPhone with two rear cameras (a 1x main camera and a 2x telephoto camera). Since the depth map depends on comparing the two images, it's naturally limited to the field of view of the narrower lens.

2. A second method was later used on the iPhone XR, which has only a single rear camera: focus pixels on the sensor roughly gauge depth. The raw result is low-res and imprecise, so it's refined using machine learning. See: https://www.lux.camera/iphone-xr-a-deep-dive-into-depth/

3. An extension of this method was used on an iPhone SE that didn't even have focus pixels, producing depth maps purely based on machine learning. As you would expect, such depth maps have the least correlation to reality, and the system could be fooled by taking a picture of a picture. See: https://www.lux.camera/iphone-se-the-one-eyed-king/

4. The fourth method is used for selfies on iPhones with Face ID; it uses the TrueDepth camera's 3D scanning to produce a depth map. You can see this with the selfie in the article, which has a noticeably fuzzier, lower-res look.

You can also see some other auxiliary images in the article, which use white to indicate the human subject, glasses, hair, and skin. Apple calls the subject mask a portrait effects matte and the hair/skin/glasses masks semantic segmentation mattes; all of them are produced using machine learning.

I made an app that used the depth maps and portrait effects mattes from Portraits for some creative filters. It was pretty fun, but it's no longer available. There are a lot of novel artistic possibilities for depth maps.
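If anyone wants to poke at these themselves: ImageIO exposes both the depth map and the mattes as auxiliary data on the photo file. Here's a rough, untested Swift sketch; the keys and initializers are the ImageIO/AVFoundation ones as I remember them, so treat it as a starting point rather than gospel.

    import AVFoundation
    import ImageIO

    // Rough sketch: pull the depth map and portrait effects matte out of
    // a Portrait photo on disk. Error handling kept minimal on purpose.
    func readAuxiliaryImages(from url: URL) throws {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return }

        // Depth map, if the photo carries one. (Some photos store disparity
        // instead; try kCGImageAuxiliaryDataTypeDisparity if this is nil.)
        if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
            source, 0, kCGImageAuxiliaryDataTypeDepth) as? [AnyHashable: Any] {
            let depth = try AVDepthData(fromDictionaryRepresentation: info)
            print("depth map:", CVPixelBufferGetWidth(depth.depthDataMap), "x",
                  CVPixelBufferGetHeight(depth.depthDataMap))
        }

        // The white-on-black subject matte from the article.
        if let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
            source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any] {
            let matte = try AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
            print("matte:", CVPixelBufferGetWidth(matte.mattingImage), "x",
                  CVPixelBufferGetHeight(matte.mattingImage))
        }

        // The hair/skin/glasses masks live under the
        // kCGImageAuxiliaryDataTypeSemanticSegmentation*Matte keys.
    }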

Uncorrelated commented on What is HDR, anyway?   lux.camera/what-is-hdr/... · Posted by u/_kush
springhalo · 4 months ago
I see that it says it stores images "in an HDR format by default" but keeps referencing JPEG output. Are you using JPEG-XT? There aren't a lot of before-and-after comparisons, so it's hard to know how much it's taking out. I figure those would probably hurt the reputation of the app, considering its purpose is to un-pop photos, but I'm in the boat of not really being sure whether I actually like the pop or not. Is there Live Photo support, or is that something that you shouldn't expect from an artist-focused product?
Uncorrelated · 4 months ago
It's a JPEG + gain map format where the gain map is stored in the metadata. Same thing, as far as I can tell, that Halide is now using. It's what the industry is moving towards; it means that images display well on both SDR and HDR displays. I don't know what JPEG-XT is, aside from what I just skimmed on the Wikipedia page.
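For the curious, the gain map is easy to spot at the file level; ImageIO reports it as auxiliary data. A minimal sketch, assuming the kCGImageAuxiliaryDataTypeHDRGainMap key (which I believe is what Apple uses for these):

    import ImageIO

    // Sketch: check whether a JPEG/HEIC carries an Apple-style HDR gain map.
    // The SDR image decodes normally; the gain map rides along as auxiliary data.
    func hasGainMap(_ url: URL) -> Bool {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else {
            return false
        }
        return CGImageSourceCopyAuxiliaryDataInfoAtIndex(
            source, 0, kCGImageAuxiliaryDataTypeHDRGainMap) != nil
    }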

Not having before-and-after comparisons is mostly down to my being concerned about whether that would pass App Review; the guidelines indicate that the App Store images are supposed to be screenshots of the app, and I'm already pushing that rule with the example images for filters. I'm not sure a hubristic "here's how much better my photos are than Apple's" image would go over well. Maybe in my next update? I should at least have some comparisons on my website, but I've been bad at keeping that updated.

There's no Live Photo support, though I've been thinking about it. The reason is that my current iPhone 14 Pro Max does not support Live Photos while shooting in 48-megapixel mode; the capture process takes too long. I'd have to come up with a compromise such as only having video up to the moment of capture. That doesn't prevent me from implementing it for other iPhones/cameras/resolutions, but I don't like having features unevenly available.

Uncorrelated commented on What is HDR, anyway?   lux.camera/what-is-hdr/... · Posted by u/_kush
hatsunearu · 4 months ago
Is there any workflow that can output HDR photos (like the real HDR kind, with metadata to tell the display to go into HDR mode) for photos shot with a mirrorless and not an iPhone?
Uncorrelated · 4 months ago
Yes. For example, Lightroom and Camera Raw support HDR editing and export from RAW images, and Adobe published a good rundown on the feature when they introduced it.

https://blog.adobe.com/en/publish/2023/10/10/hdr-explained

Greg Benz Photography maintains a list of software here:

https://gregbenzphotography.com/hdr-display-photo-software/

I'm not sure what FOSS options there are; it's difficult to search for given that "HDR" can mean three or four different things in common usage.

Uncorrelated commented on What is HDR, anyway?   lux.camera/what-is-hdr/... · Posted by u/_kush
Uncorrelated · 4 months ago
I find the default HDR (as in gain map) presentation of iPhone photos to look rather garish, rendering highlights too bright and distracting from the content of the images. The solution I came up with for my own camera app was to roll off and lower the highlights in the gain map, which results in final images that I find way more pleasing. This seems to be somewhat similar to what Halide is introducing with their "Standard" option for HDR.
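Conceptually it's something like the curve below -- not the exact function from my app, just a sketch of the idea: gains below a knee pass through untouched, and everything above gets compressed smoothly toward a ceiling.

    import Foundation

    // Conceptual highlight roll-off for gain map values, where `gain` is
    // linear headroom (1.0 = SDR white, higher = brighter than SDR white).
    // Below the knee, values pass through; above it, tanh compresses them
    // smoothly so they never exceed `ceiling`. The slope is continuous at
    // the knee, so there's no visible band where the curve kicks in.
    func rollOff(_ gain: Double, knee: Double = 2.0, ceiling: Double = 4.0) -> Double {
        guard gain > knee else { return gain }
        let range = ceiling - knee
        return knee + range * tanh((gain - knee) / range)
    }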

Hopefully HN allows me to share an App Store link... this app works best on Pro iPhones, which support ProRAW, although I do some clever stuff on non-Pro iPhones to get a more natural look.

https://apps.apple.com/us/app/unpro-camera/id6535677796

Uncorrelated commented on Little Sisyphus A physics-based platformer for the NES   pubby.games/sisyphus.html... · Posted by u/colinprince
Uncorrelated · 7 months ago
I played some of this after I read the NESFab page posted about a week ago. It's an impressive NES game for any length of time spent on development, let alone a month. Now that I know that it's from the creator of NESFab, the polish makes sense -- obviously the creator is intimately familiar with both the hardware and their own development tools. Compliments must also be paid to the art and appropriately Sisyphean music.

I gave up at 35 souls.

Uncorrelated commented on Nikon reveals a lens that captures wide and telephoto images simultaneously   digitalcameraworld.com/ca... · Posted by u/giuliomagnifico
usrusr · 8 months ago
On an only slightly related note, I'd be happy if the same were available on smartphones, in software: my mobile photography is of the school "take a lot and discard almost as many," and having to choose between the different lens/sensor pairs ahead of the snap is entirely alien to that process. So the camera software is forever set to the main lens, and all the other ones are just dead weight in my pocket (and stuff manufacturers don't allow me to not buy when I need a new phone, preferably one with a good main camera).

I think I understand that the processor would not be able to read out the sensors at the same time, but time-multiplexed bracketing has been done before; it really shouldn't be too hard or weird to apply that concept to multiple sensors? (Some sensors with integrated memory might even be able to do concurrent capture/deferred readout?)

Uncorrelated · 8 months ago
iPhones can do this. They support taking photos simultaneously from the two or three cameras on the back; the cameras are hardware-synchronized and automatically match their settings to provide similar outputs. The catch is you need a third-party app to access it, and you'll end up with two or three separate photos per shot, which you'll have to manage yourself. You also won't get manual controls over white balance, focus, or ISO, and you can't shoot in RAW or ProRAW.

There are probably a good number of camera apps that support this mode; two I know of are ProCam 8 and Camera M.
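If you'd rather roll your own, the AVFoundation route (as best I recall it) is a virtual device plus constituent photo delivery. A sketch, with the session plumbing and error handling omitted:

    import AVFoundation

    // Sketch: hardware-synchronized photos from all back cameras at once.
    // `device` should be a virtual device such as .builtInDualCamera or
    // .builtInTripleCamera; its constituents are the physical cameras.
    func captureFromAllBackCameras(output: AVCapturePhotoOutput,
                                   device: AVCaptureDevice,
                                   delegate: AVCapturePhotoCaptureDelegate) {
        guard output.isVirtualDeviceConstituentPhotoDeliverySupported else { return }
        output.isVirtualDeviceConstituentPhotoDeliveryEnabled = true

        let settings = AVCapturePhotoSettings()
        // One synchronized photo per physical camera.
        settings.virtualDeviceConstituentPhotoDeliveryEnabledDevices = device.constituentDevices

        // The delegate's didFinishProcessingPhoto fires once per camera;
        // AVCapturePhoto.sourceDeviceType says which lens each came from.
        output.capturePhoto(with: settings, delegate: delegate)
    }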

Uncorrelated commented on T * sin (t)' ≈ Ornamented Christmas Tree (2013)   community.wolfram.com/c/p... · Posted by u/ryeguy_24
layer8 · 9 months ago
The function is neither a drill nor a Christmas tree, but similar to how it happens to look (≈) like a Christmas tree, it also happens to look like a drill. This is what I wanted to point out. It’s a multipurpose function.
Uncorrelated · 9 months ago
block_dagger was making a pun based on the sense of drill as a training exercise. A similar joke went over the heads of nearly everyone on a recent episode of Taskmaster:

https://www.youtube.com/watch?v=6PJkA3o_Im0

Uncorrelated commented on How JPEG XL compares to other image codecs   cloudinary.com/blog/how_j... · Posted by u/bentocorp
Uncorrelated · a year ago
Articles about the merits of JPEG XL come up with some regularity on Hacker News, as if to ask, "why aren't we all using this yet?"

This one has a section on animation and cinemagraphs, saying that video formats like AV1 and HEVC are better suited, which makes sense. Here's my somewhat off-topic question: is there a video format that requires players to support looping, the way GIF does? GIF is a pretty shoddy format for video compared to a modern codec, but if a GIF loops, you can expect it to loop seamlessly in any decent viewer.

With videos it seems you have to hope that the video player has an option to loop, and oftentimes there's a brief delay at the end of the video before playback resumes at the beginning. It would be nice if there were a video format that included seamless looping as part of the spec -- but as far as I can tell, there isn't one. Why not? Is it just assumed that anyone who wants looping video will configure their player to do it?
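The closest thing I know of, at least on Apple platforms, is solving it player-side: AVPlayerLooper exists precisely because naive play-to-end-then-seek has that gap. A minimal sketch:

    import AVFoundation

    // Sketch: gapless looping as a player feature rather than a container
    // feature. AVPlayerLooper queues the next pass ahead of time so there's
    // no end-of-file hiccup. The path here is just a placeholder.
    let url = URL(fileURLWithPath: "/path/to/clip.mp4")
    let player = AVQueuePlayer()
    let looper = AVPlayerLooper(player: player, templateItem: AVPlayerItem(url: url))
    player.play()
    // Keep `looper` alive for as long as playback should loop.

That still doesn't answer why no container format bakes the looping intent into the file itself, though.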

u/Uncorrelated

Karma: 87 · Cake day: July 13, 2024
About
Software engineer living in Portland. I make art, apps, and videos. Theoretically.