Most of the iPhone 13 and 13 mini’s upgrades center on photography and video. Apple has improved both rear sensors here, and the ultra-wide lens should let in more light than before. Unfortunately, many of the notable additions depend on the A15 chipset, which means things like Cinematic Mode won’t come to older iPhones.
For example, the faster image signal processor (ISP) on the chip means night photos don’t take as long to capture. I didn’t have to hold the iPhone 13 still for as many seconds as the iPhone 12 when shooting a candle in a very dark room. The difference was only about a second, which sounds insignificant, but it can feel like forever when you’re struggling to stay motionless.
The ultra-wide photos I took with the new phone were actually darker than the iPhone 12’s, but they were generally better exposed. Buildings against the night sky had cleaner lines, less noise and a more neutral tone than on the iPhone 12. However, Google’s Night Sight on the Pixel 5 still delivers more detail in the shadows, and I prefer the cooler images it produces.
I’ve previously preferred the Pixel’s photos because Apple’s had a yellow tint. But with the iPhone 13s, Apple is introducing Photographic Styles, a way to better suit users’ individual preferences. It lets you choose from one of five profiles: Standard, Rich Contrast, Vibrant, Warm and Cool, which differ in contrast levels and color temperature.
You can also customize these modes to your liking. But at their original settings, my favorite style was Vibrant. Unlike filters, it felt more like a set-and-forget thing — nice for people like me who have never been fans of Apple’s standard treatment. Overall, the iPhone 13 took colorful and sharp photos, but compared to Google’s images, they were unnecessarily bright, with obvious HDR effects.
In addition to the hardware and software enhancements I’ve already mentioned, the company has also updated its HDR algorithm to better render each person in the scene. It also worked to improve video quality, promising better dynamic range, detail and highlights. Plus, you can now record in Dolby Vision at 4K resolution at up to 60 frames per second.
But the most intriguing new video feature (and probably the most intriguing of all the camera updates) is Cinematic Mode. Using the A15 chip’s neural engine, the iPhone 13 can create a portrait mode-like effect in your footage, keeping your selected subjects in focus while blurring the rest of the scene. You can tap parts of the viewfinder to change focus points while shooting, or let the iPhone decide for you by analyzing who and what is in the scene.
On its own, Apple’s system is pretty smart. The iPhone 13 did very well at identifying faces (both human and dog) in my shots, with yellow or white boxes indicating possible subjects to focus on. As my subjects turned toward and away from the camera, they came into and out of focus, respectively. But when I tried to exercise more control and adjust the focal point myself, the system struggled. Sometimes my intended subject stayed blurry even after I tapped on their box. Other times, the iPhone lost track of the person I chose after they walked behind an obstruction, though that’s a forgivable situation.
When it worked as expected, Cinematic Mode produced a pleasant effect that gave videos a professional feel. But at the default intensity, the blur looked odd or artificial. The outline of my colleague’s head was sharp against the softened background, and I had to push the f-stop to its highest setting (f/16) to get a more natural look.
It’s worth noting that Cinematic Mode only works at 1080p and 30 frames per second, even if you’ve set the camera to a higher quality.
Cinematic Mode is also available via the 12-megapixel selfie camera, which supports Photographic Styles as well, and both features were just as effective through the front sensor as the rear.
Gallery: iPhone 13 Camera Sample Photos | 16 photos
In low light, the iPhone 13 took selfies that were slightly blurry compared to the Pixel 5 and Galaxy S21, but in good lighting, Apple’s camera produced photos just as sharp as the competition’s. Its shots had a more neutral tone than the other two, with more accurate white balance (though Samsung came pretty close).
I covered most of the changes coming via iOS 15 when I tested the beta, including things like Focus modes and SharePlay. Focus modes, which let you set custom home pages and notification profiles based on your location or time of day, remain one of the most useful new features on any smartphone platform in recent years. SharePlay, meanwhile, won’t arrive until a later release.
This time around, Apple shows you what’s new each time you open an updated app like Photos or Tips, such as Memories set to music from the company’s library. Safari has also been redesigned (with some adjustments made during the beta period), which has largely made it easier to browse and organize your tabs.
I’ve never been a big Safari user, preferring Chrome for its convenience, but it’s nice to see Apple update the interface for easier one-handed navigation. Chrome and Safari are pretty similar on iOS, though unfortunately Google still keeps its address and search bar at the top of the screen. If you prefer, you can also revert to the traditional layout in Safari.
Other notable iOS updates include Live Text in Photos, which makes it easier to find specific photos from Spotlight search. The Maps and Weather apps also got updates, while Shared With You in Messages makes it a little easier to find things you and your friends have been talking about. However, since most of these features will come to older iPhones, iOS 15 is unlikely to influence your decision to upgrade this year.
We’ll have a more in-depth look at Apple’s latest operating system soon, but for now, I’m happy with the level of control iOS 15 offers and look forward to testing a stable version of SharePlay.