The iPhone 16 was introduced at an Apple event on Monday, September 9. The device comes with a number of new features, including improved camera controls, a new ultra-wide sensor that can capture up to 2.6 times more light, and an action button that lets you set shortcuts to open specific apps or tasks.
But the biggest addition is Apple Intelligence, and in particular its new visual intelligence feature, which can identify objects and help you interact with the world around you.
As part of the new camera features, there is a new Camera Control button on the side of the device that lets the user not only take a photo, but also adjust the zoom and switch between modes and settings. The button can also scan the real world to identify objects, similar to how Google Lens works. To use the feature, users simply tap and hold the button, then point the phone’s camera at whatever they’re curious about.
For example, per CNET, you can point the camera at a dog and your iPhone will identify the breed for you. Visual intelligence also integrates with other iPhone features: scan a poster promoting an event, and details from the poster, such as the time, date, and location, can be added directly to your calendar. Similarly, you can scan an object and then search Google to buy it.
“Camera control provides instant and easy access to the camera with just one click. It lets you adjust various camera features with a swipe of your finger,” the company said during the event. “Its convenient design keeps it accessible at all times.”
The iPhone 16, 16 Plus, 16 Pro, and 16 Pro Max will be available for pre-order on Friday, September 13, and will go on sale September 20. The new models also come in a range of new colors, including black, white, ultramarine, teal, and pink.