iPhone 12 may be Apple’s first quad-camera phone

Apple isn’t exactly in a rush when it comes to redesigning products. The company has taken a methodical, calculated approach to shaping the look and feel of its products over the years, so much so that it has often drawn criticism for it, with many feeling that Apple plays it too safe.

With the iPhone 12, however, Apple is rumored to be moving away from the softer contours of present models in favor of a sharper, boxier aesthetic that channels the design sensibilities of the iPhone 4 and 5. Actually, Apple has already tried something similar with the new iPad Pro, which has decidedly sharper edges than previous models. The MacBook, too, has shifted to a similar design, so it would make sense for the iPhone to join them. After all, Apple is all about consistency.

According to a recent rumor, Apple may be experimenting with an iPhone prototype that has no notch but thicker bezels. The idea is that the company would miniaturize the TrueDepth Face ID system, earpiece, and front-facing camera so that they fit inside the top bezel of the phone. Of course, this would mean thicker bezels all around, but it may be a worthwhile trade-off, should it ever come to pass.

Apple has been investing heavily in AR in recent years. With ARKit established as a capable platform for mobile augmented reality, and with work on the Apple Glasses reportedly moving along at a healthy pace, we weren’t the least bit surprised when Apple launched the new iPad Pro with a LiDAR scanner on the back. At this point, we are practically expecting the iPhone 12 Pro and Pro Max to also be equipped with LiDAR scanners.

Also see: https://www.apple.com/in/iphone-se/

A light detection and ranging (LiDAR) scanner can measure depth far more accurately than a regular camera, which makes it a natural fit for augmented reality. The scanner emits pulses of light, near-infrared in the case of mobile devices, which bounce off the surrounding environment and return to a sensor. The phone then measures the time each pulse takes to make the round trip, building a more accurate depth map of the scene than a regular camera could.
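
To make the time-of-flight idea concrete, here is a minimal sketch of the underlying distance calculation, written in Swift. It is illustrative only and not Apple’s implementation; the function name and the sample round-trip time are made up for the example.

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Converts a measured round-trip time (in seconds) into a distance (in meters).
/// The pulse travels out to the object and back, so the one-way depth is half
/// the total path length.
func depth(fromRoundTripTime t: Double) -> Double {
    return speedOfLight * t / 2.0
}

// Example: a pulse that returns after roughly 13.3 nanoseconds corresponds to
// an object about 2 meters away (illustrative value).
let roundTrip = 13.3e-9 // seconds
print(String(format: "Estimated depth: %.2f m", depth(fromRoundTripTime: roundTrip)))
```

Repeating this measurement across a grid of points is what turns individual round-trip times into the kind of depth map AR apps can use.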
