The title doesn’t do it justice - everything with images quickly adds up.
Doing 120 fps video at 4K so that any chosen frame looks amazing without artifacts is really quite an achievement.
The microphones were actually more interesting to me: getting lavalier-like performance from tiny mics in the phone that are physically far from the person being recorded is seriously clever.
Getting this to work some of the time is already an achievement but I think people underestimate how much work goes into making it work across all different scenarios.
Mirrorless cameras nowadays can push as much as 8K60 (twice the data of 4K120), every frame at full quality. All that with better optics and sensors, and less "perceptual testing", thus less of an overprocessed look.
And they cost at least twice as much, and they're rarely with you all the time.
I am a bit annoyed by the "AI/processing" part of cellphones; I would love to go back to a dedicated camera, but I can't justify it anymore.
Can that mirrorless camera make a phone call, send a text, edit the photo it takes, browse the web, upload the picture to an app, run other apps, or do any of the many, many other things a phone does that a single-function dedicated camera cannot?
What trade-offs do you accept for that mirrorless camera to do all of the same things while still taking an above-decent-looking image for the vast majority of its users, without all of those fancy lenses, fixed or interchangeable? As the saying goes, the best camera is the one you have with you. Only snooty photo/video types care about your comment. Most videos shot by phone users are only ever viewed on that device, or on other people's similar devices through whatever app they were shared in. To even consider them comparable is just not an honest take in the slightest. I say this as someone with several DSLRs and a couple of cinema camera bodies in the next room.
> 38% of people said that better cameras are a main motivation for buying a new phone
This strikes me as just a reflection of the ad campaigns. Apple's ads for every new iPhone promote "better cameras" almost exclusively, so it's not surprising that's what people would say. With every new phone being just an incremental upgrade, hyping up the camera is the only way to get people to drop $1,000 on a new one. Most of these 38% won't be able to tell the difference between a phone pic taken today and one taken 5 or more years ago.
Apple at least has certainly emphasized photography (and video) a lot.
But that actually seems to be a very reasoned response to consumers asking themselves "Why should I upgrade my phone?" And over at least some timescale--maybe not every model--a better camera actually seems like a pretty reasonable answer for people who care.
Wouldn't it make more sense to assume the opposite? That the ad campaigns are a reflection of what product details generate the most ongoing interest in the product? Otherwise you're left with the premise that the ad creates the interest in which case the topic (camera power) is largely coincidental.
Better camera is not just the optical quality or even processing though.
Older phones get slower, either because of a dying battery (which could be replaced, but many of these people are on subsidized phones, so upgrading is baked into their plan) or because of software bloat slowly creeping in.
When camera startup, shutter lag, or post-processing starts to take a second or two, moving to a newer phone is the obvious option.
> Most of these 38% won’t be able to tell the difference between a phone pic taken today and one taken 5 or more years ago.
That is an arbitrary statement that comes from your biases, not observation. Maybe in the best lighting some 5-year-old photos may be as good as the latest flagship's.
Do less in terms of what exactly? Because I have photos from old digital cameras from nearly 20 years ago that look drastically better and more detailed than Android or iPhone photos from several years back. I would be surprised if the quality is down. In terms of features the quality is key.
Theory 1: People treasure memories and have been let down by pictures that don’t age well from cheap film cameras, then early digital cameras, then phone cameras. Phones have gotten good enough to replace separate devices for most people, but the cameras are still not as versatile. People will spend money for incremental improvements because the payoff will last decades and benefit their children. Apple invests heavily in camera tech to satisfy this need, and in marketing to communicate the benefits.
Theory 2: sheeple buy whatever marketing tells them to, those rubes don’t see any real benefit.
I just don’t see how anyone who enjoys even casual photography could go for theory 2.
If phone cameras hadn’t changed over the past 10 years, most people would still be happy with the quality of photos taken. Most photos are never printed out or viewed on any device other than their phone. They may be poorly lit, composed badly, and be blurry, but they still serve their primary purpose for most people, which is to relive a memory. That function is the same whether the sensor is 2 or 48 megapixels.
Newer tech of course looks better, and people can definitely tell the difference when comparing them side-by-side. But there’s also an element of the reality distortion field when it comes to convincing people that they need to upgrade from a two year old phone to the latest release just because of the camera upgrades.
The 808, while it had a great high-resolution sensor, processed stills at 5MP after pixel binning, compared to the 48/24/12MP of the iPhones. It did have a non-binned mode, but again only for stills.
The majority of the post is actually about video and the comparison is a lot more dramatic there.
A 4K frame from a modern phone, like the iPhone 16 Pro, is ~8MP.
The 808 could “only” do 1080p at 30fps (62M pixels a second). Compare that to 4K at 120fps (995M pixels a second). The 808's 62M is a far cry from the 1Bn per second you’re claiming. Impressive for the time, though.
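For reference, the pixel-rate arithmetic in this subthread is just width × height × frames per second; a quick sketch (standard FHD/UHD/8K frame dimensions assumed):

```python
# Pixel throughput = width * height * frames per second.
def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

fhd_30 = pixels_per_second(1920, 1080, 30)     # Nokia 808 video: 1080p30
uhd_120 = pixels_per_second(3840, 2160, 120)   # modern iPhone: 4K120
uhd8k_60 = pixels_per_second(7680, 4320, 60)   # mirrorless: 8K60

print(f"1080p30: {fhd_30 / 1e6:.0f}M px/s")    # ~62M
print(f"4K120:   {uhd_120 / 1e6:.0f}M px/s")   # ~995M
print(f"8K60:    {uhd8k_60 / 1e6:.0f}M px/s")  # ~1991M
```

This also shows where the "8K60 is twice 4K120" claim upthread comes from: doubling both dimensions quadruples the pixels, and halving the frame rate brings it back to 2x.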
There’s a significant difference here.
The sensor on the 808 was amazing. The real meat of the tech here is the image-processing pipeline that can take everything coming off the sensor and process it further.
The issue is not what Nokia did with it, but the presence of the functionality.
The binning and other features were enabled (poorly or not isn’t part of the discussion) by a chip that could do about 1 billion pixels per second; that’s the point. Apple doing it 12 years later and hyping it is what I was commenting on.
Even so, the "camera first" Nokia 808 PureView took notably worse photos (DxOMark Mobile Photo score: 60) than the iPhone 7 (score: 86) and other flagship phones (Samsung Galaxy S6 Edge score: 86) of that time.
Unless Apple invented it, no they didn't, and you're wrong...
The cult of Apple kind of says everything that's wrong with the Valley to me. This isn't magicians at work with secret spells; this is just cutting-edge tech at the point of becoming a commodity. Making something appealing to idiots has drawbacks (less repairability, less durability, etc.), and now even Apple is slowly reverting this under government pressure...
If iOS has a majority market share in the US, wouldn’t anyone who is not using an iPhone be part of the “cult”?
And I bet if I drop my phone, it would be a lot easier to drive to one of the five Apple Stores in the metro area to get it repaired than it would be for you to get your camera repaired.
That’s not even considering the fact that it could survive a drop better than your camera and is water resistant.
One thing about the iPhone microphone. I had a beautiful day here the other day: some light drizzle, the sea waves rocking nicely, and an empty beach. It was a great sound and a great feeling. I wanted to record it, but I only had my iPhone, as I was just walking on the beach. Neither the iPhone's default recording app nor any of the others I tried could capture the ocean waves or the sound of the drizzle. I had previously (intentionally) set out to the beach with the intent to capture audio, and did so with my laptop plus an external mic. My conclusion is that, unfortunately, that is the way it has to be, and it's not possible to capture audio on a whim the way it is with photos.
The problem is that you're trying to record relatively quiet sounds that >99% of people would consider undesirable background noise, so many consumer devices will default to filtering out those sounds and do an excellent job. It would be quite easy for Apple to add a "nature sounds" mode to the recording app and work their computational magic in reverse, but I'm not sure that idea would occur to anyone in a design meeting or make it through review.
If you do want to record those kinds of sounds, the term of art is "field recording" and there's tons of good information available on how to do it well.
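As a toy illustration of the default filtering described above, here is a minimal, hypothetical noise gate (nothing like Apple's actual pipeline, which does spectral noise suppression): blocks of audio whose RMS level falls below a threshold are simply muted, which is exactly what happens to quiet drizzle and distant surf.

```python
import math

def noise_gate(samples, block=256, threshold_rms=0.02):
    """Crude noise gate: zero out blocks quieter than a threshold.

    Quiet, steady sounds (drizzle, distant waves) sit below the
    threshold and get muted wholesale -- the "background noise"
    a voice-oriented recorder is designed to discard.
    """
    out = []
    for i in range(0, len(samples), block):
        chunk = samples[i:i + block]
        rms = math.sqrt(sum(s * s for s in chunk) / len(chunk))
        out.extend(chunk if rms >= threshold_rms else [0.0] * len(chunk))
    return out
```

A hypothetical "nature sounds" mode would essentially lower or remove that threshold instead of tuning it for speech.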
Yes, I understand. Currently what I do is use my laptop, a T14s, set it to power efficiency (so it barely turns on the fans), run Audacity, and use an old headset that has a microphone and a somehow very long cable; the actual rubber on the headset is long gone and I never replaced it. It serves me well but requires a backpack, and it's a weird setup, but it works. I occasionally make games, and it's usually a bit hard to find nice, chill nature and environment sounds that have good quality, are calm, and are royalty free (and cheap or free), so I end up capturing these sounds myself.
The experience I described was with the 14, but I also tried something similar on the 15 Pro and had the same experience.
For now my approach is to use an external microphone with a longer cable and a notebook, and this works, but it would be nice to make this work with the iPhone.
The reality is journalists don't really have the knowledge to explain any specifics at all, so you just get this fluff. But hey, at least there are pretty pictures.
Is that the reason, or just a convenient side effect? The last two models had to move the lenses around slightly so they could capture whatever they called their stereo AR/VR-type acquisition, so the pupil distances worked correctly. After that, there are only so many ways to arrange 2 or 3 lenses.
That's not the market they're going after, even with all the ads and promotion. It's more an aspirational positioning than a hardcore one.
For instance, Xiaomi is a lot more serious about the photo part (and they also sell a lot less; this model was only noticed in photography circles).
Packaging. A phone is absolutely stuffed with components, most of the market highly values thinness, so every mm^3 is jealously fought over. Putting the camera in the middle might mean making the phone bigger, having to shrink the battery, or putting antennas in less-than-ideal places. It's design trade-offs all the way down.
Another reason that comes to mind is the camera bump. If you put the camera in the middle then it’d kinda rest on that on a flat surface and wobble. With the camera in a corner it sits on the camera bump + the opposite corner which is a bit more stable and less obvious.
If we are playing with phone camera form factors, I vote for thinking outside the phone. Remove the camera entirely from the phone, and put it in a separate cylindrical device; a smaller incarnation of Apple's ancient iSight webcam. Stuff in better optics, put controls on the cylinder, and allow viewing the image from my watch when taking a photo/video. So I can leave the phone at home...
Most users would want to use the screen of their smartphone as a viewfinder. That either means using both hands or requiring a way to attach the thing to the smartphone.
Also, to do the image processing, that cylinder would need to have most of the processing power of the iPhone and, thus, a fairly large battery.
> why not put the camera in the middle of the phone?
This is one of those things that doesn't seem like it should matter, but it does. If the lens is mounted in the exact center of the body, the images come out looking unbalanced. To produce balanced images, you have to offset the lens. Even very expensive pro mirrorless bodies are offset; that is, if you look directly down the center of the lens, you'll notice there is more camera body sticking out on one side than the other.
This is called the chirality of the optical path and it is surprisingly difficult to predict analytically. Companies will typically design the optical path, prototype it, and mount it on a jig to precisely measure the chirality. From this, they design the body with the proper offset.
Chirality is more noticeable the smaller the sensor and the shorter the lens. So on smart phones, which put tiny sensors behind wide-angle lenses, they have to get the offset just right. This explains why the lenses are in slightly different places on the body every time Apple updates their cameras.
I vouched for this comment just so I could reply to it. The entire thing is so fascinatingly, unbelievably, OBVIOUSLY, violently incorrect in every single way, yet it doesn’t feel like straight GPT output.
It was also posted by a person with a huge karma. I want to understand what happened.
Every mirrorless body has a center marking for the middle of the sensor, so the camera can be mounted exactly centered on a tripod. It's actually important for photos to be exactly on axis if you want to do panoramas or stitching.
The only reason the body is not exactly symmetrical is engineering and ergonomics. Many point and shoots of the past in fact had the lens exactly in the center.
And "chirality of the optical path" is not anything related to this, in fact the term is not usable in this context at all.
Photos on my wife's iPhone look slightly deformed. The people don't look like themselves; their faces are just slightly off. I'm blaming the AI inside the Photos app on the iPhone, but I'm not sure.
People's faces look perfect on my mediocre Android, though.
I'm never going to buy or use an iPhone. Even the questionable advantage which was supposed to be the iPhone's camera is fake.
This is probably a combination of the lens corrections, the pretty awful auto white balance (warmth), the terrible oversharpening and also a bit of True Tone. Portrait mode also wrecks a lot of photos due to the crappy emulation of DOF. Output is clever but shit.
Due to the general flatness of the lens there is a lot of distortion around the edges which is digitally removed after the photo is taken. This isn't 100% perfect and causes some rather uncanny looks in some of the photos. You can use this for artistic effect but it looks crap mostly. Generally if you're using a proper camera there's a big chunk of glass in front of it so the main part of a portrait is well outside the distorted edges of the frame so it's not noticeable. Even new cameras use minimal lens corrections in body as well to eliminate this.
As for the white balance, Apple never seem to get this right. The colours are always slightly too orange / warm and vivid and never quite match reality in experience. You can crank the warmth down a bit after in photos.app to kill some of it.
Oversharpening - everything is too sharp. This makes the image pop out but nothing more. It's a terrible curse on smartphones. Not much you can do about it. Even shooting ProRAW on mine is oversharpened.
If you turn the True Tone feature off in Settings > Display, it looks a bit better as well. That feature seems to completely mung viewing any photos later, sometimes giving them an over-blue tone.
Urgh all this is why I bought a mirrorless. Smartphones are really not very good. Even good ones (mine is a 15 Pro). Mine gets mostly used to take photos of an AirBnB when I leave it now or where the car got parked.
My bonus pet peeve about portrait mode is that the internet is full of portrait mode photos, which means AI gets trained on portrait mode photos, which means AI generates pictures that look like portrait mode. Garbage in, garbage out.
I wonder how much that wire mesh floor distorts the recording result? I guess it must be insignificant, since the walls, ceiling and floor absorb almost all reflected sound waves?
Sound waves are really long. 20kHz is about the limit of human hearing, so the shortest wave we can hear is about 1.7cm; the longest wave we can hear, at about 20Hz, is about 17 meters. A suitably designed mesh will be effectively transparent across that frequency range.
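Those figures follow from wavelength = speed of sound / frequency, with c ≈ 343 m/s in air:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def wavelength_m(freq_hz: float) -> float:
    """Wavelength in meters for a given frequency in air."""
    return SPEED_OF_SOUND / freq_hz

print(f"20 kHz: {wavelength_m(20_000) * 100:.1f} cm")  # ~1.7 cm
print(f"20 Hz:  {wavelength_m(20):.1f} m")             # ~17 m
```

A mesh with openings much smaller than the shortest audible wavelength scatters essentially nothing in the audible band.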
Nobody is claiming a world record from Apple …
https://en.wikipedia.org/wiki/Nokia_808_PureView
Can't see what your point is.
This is really about hiding the fact that Siri records your conversations without your permission.
e.g.: https://www.mi.com/global/product/xiaomi-11t-pro-120w-xiaomi...
Because when you pick it up, your hand is covering up the middle of the phone. However the top quarter is unobstructed.
> And why do the cameras keep moving around?
Easy way to tell models apart.
https://www.mi.com/global/product/xiaomi-14-ultra/
Low end model https://www.dpreview.com/products/sony/compacts/sony_dscqx10
High end model https://www.dpreview.com/reviews/sony-cybershot-dsc-qx100