Apple just slashed its iPhone trade-in prices – sell your phone here instead

If you're looking to swap your old iPhone for a spanking new model like the iPhone 14, Apple's official Trade In program is one of the simplest ways to do it. But thanks to some recent downgrades to its trade-in values, it's also now one of the worst value trade-in services out there.

As spotted by MacRumors, some iPhone models are now worth significantly less on trade-in than they were last week. The worst hit, the iPhone 13 Pro series, saw its trade-in value drop by $80 in the US, with similar reductions in the UK. The iPhone 13's trade-in value also went down by $50.

While Apple hasn't cut its valuations for every iPhone, its full list of trade-in prices underlines that there is a cost to the convenience of offloading your old phone directly through the manufacturer. Apple's trade-ins aren't necessarily recycled – it says that if your phone is "in good shape", it'll "help it go to a new owner". And the price of doing that is factored into the prices it offers for old iPhones.

So what exactly are your alternatives? While the best choice depends on which iPhone you're looking to get, the table below shows there are some financial benefits in shopping around for trade-in offers.

The good news is that, compared to Android phones, iPhones generally hold their value better – a US trade-in comparison site said that a used iPhone lost 68.8% of its value last year, compared to 84.2% for Samsung and 89.5% for Google.
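To put those depreciation figures into dollars, here's a rough back-of-the-envelope calculation. The $799 launch price is purely illustrative and isn't tied to any particular model:

```python
# Illustrative resale-value math using the one-year depreciation figures
# quoted above: 68.8% for iPhone, 84.2% for Samsung, 89.5% for Google.
def resale_value(launch_price: float, depreciation_pct: float) -> float:
    """Estimated resale value after one year of depreciation."""
    return round(launch_price * (1 - depreciation_pct / 100), 2)

# Hypothetical $799 launch price for each brand's flagship:
iphone = resale_value(799, 68.8)
samsung = resale_value(799, 84.2)
google = resale_value(799, 89.5)
```

On those numbers, the iPhone would fetch roughly $249 after a year, versus about $126 for the Samsung and $84 for the Google phone.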

If you have a particularly old iPhone that's no longer working properly, you may still prefer to trade it in through Apple safe in the knowledge that it'll go to one of the manufacturer's approved recycling partners. But if you're looking to get a solid discount off your next phone, or perhaps even a free Apple Watch, these are the places to consider.

1. Check your phone carrier

If you bought your current iPhone through a network operator and are looking to stay with them, there are some impressive trade-in offers available that may well trump going through a reseller or selling privately.

These typically give you account credit or a promotion card to put towards a new one, or offer free recycling. Here are some links to the major ones in the US and UK.

US network trade-in schemes

UK network phone trade-in schemes

Not everyone will be on a pricey unlimited plan and be looking to buy a brand-new iPhone 14, but if you are there are some impressive trade-in deals like the one below.

Impressively, Verizon is offering a free Apple Watch 7 and an extra $200 off an iPad when you trade an iPhone in, on top of the usual rebate. For more deals like that one, check out our guide to the best iPhone deals.

Apple iPhone 14: up to $800 off with trade, plus free Apple Watch at Verizon
Verizon's iPhone deals offer the usual trade-in rebate of up to $800 off on the iPhone 14 this week - a great promotion, but nothing too special. What's really sweetening the deal is that the carrier is also offering a free Apple Watch 7 and an additional $200 off an iPad as a bonus promotion on top of the trade-in, which is absolutely awesome value. While you'll still need that pricey unlimited plan to take part here, grabbing some freebies on top of the usual device saving is a great option.

2. Use trade-in comparison sites

If you don't have time to trawl through every mobile reseller, trade-in comparison sites can give you a quick temperature check on what your iPhone could fetch at the many reselling rivals to Apple's program.

A laptop screen showing the Sell My Mobile website

(Image credit: Future)

In the US, Flipsy is a handy place to see how much your phone is worth and offers free shipping, while in the UK some of the best options are Sell My Mobile and Compare and Recycle.

You might be able to find a better price for your iPhone by going directly to the resellers rather than via a comparison site, but they're a good way to quickly check how much more your iPhone might be able to fetch elsewhere. 

3. Go direct to trade-in and recycling services

Another bonus of being an iPhone owner who's looking for a trade-in deal is that most of the major third-party sites have a strong focus on Apple, alongside Samsung. So where should you get your online quotes?

In the US, ItsWorthMore is a reliable and long-standing place to sell your iPhone (along with tablets and laptops). ecoATM (which now owns Gazelle) is also a simple option, while USell will happily quote you for a broken or damaged iPhone that may not qualify for other trade-in programs.

A laptop screen showing an iPhone 13 valuation on the It's Worth More site

(Image credit: Future)

If you're in the UK, Carphone Warehouse offers competitive trade-in prices (as you can see in the table above), while CeX will give you valuations for credit in its physical stores. 

For recycling options, check out EcoATM and Best Buy (in the US) or Fonebank (in the UK). Some networks in the UK have also set up schemes for you to donate your old iPhone to someone in need – see Three's Reconnected and Vodafone's Great British Tech Appeal for good examples of those.

4. Sell privately

Naturally, selling an old iPhone yourself brings a higher potential price ceiling, but also the most hassle. Avoiding the latter is a big part of the appeal of trade-in services like the ones above.

Still, if you're happy to field dozens of potential questions from interested buyers in order to squeeze the most value from that phone, selling privately remains a profitable alternative to Apple's Trade In program.

A laptop screen showing an iPhone 13 Pro listing on eBay

(Image credit: Future)

The two big guns remain eBay and Facebook Marketplace, due to their vast audiences and secure payment setups. When should you sell? While it can leave you with a tricky period between parting with your old phone and getting a new one, the best time to flog an old iPhone is typically August (at least a month before Apple usually announces new models), with the worst being late September.

Whichever way you decide to go, make sure you read our excellent tips on how to sell or trade in your old smartphone, which we gathered after an insightful chat with the experts at Backmarket. 

Posted in Uncategorised

Got an old iPhone or iPad? You need to install this security update now

If you own an iPhone 5S, iPhone 6 or iPhone 6 Plus, you're probably no longer in the habit of giving it software updates – but a new security update from Apple is an essential install if you want to keep your device secure.

A newly discovered vulnerability, which also affects some iPads (the iPad Air, iPad mini 2 and iPad mini 3), was recently picked up by Google's Threat Analysis Group, potentially allowing attackers to trick affected users into visiting "maliciously crafted web content". 

As a result, Apple has released an iOS 12.5.7 firmware update for those six affected devices, which owners should install now. If you don't have automatic updates enabled, just go to the Settings app then General > Software Update.

Apple says that it's "aware of a report that this issue may have been actively exploited against versions of iOS released before iOS 15.1", so it's definitely a good idea to install the update.  

Given that the iPhone 5S is now nine years old, it's impressive to see Apple stretching its security updates back that far. Neither the iPhone 5S, iPhone 6 nor iPhone 6 Plus can run iOS 13, which is why those phones in particular have been given updates.

Feeling secure

An iPhone on a grey background showing a software update screen

(Image credit: Apple)

Apple has led the way when it comes to providing firmware or security updates to older phones – last year, the seven-year-old iPhone 6S got iOS 15, and the five-year-old iPhone 8 is supported by the latest iOS 16.

But despite having a comparatively poor reputation for firmware support, Android manufacturers have also been boosting their credentials here. Last year, OnePlus said that "select models" of its phones would get four major Android updates and five years of security patches – the same as Samsung's support for the Galaxy S22 and other handsets.

And Google is slowly improving in this regard, with its most recent Pixel phones (including the Pixel 7 series) getting a promised five years of security updates, if only three years of OS updates. Android 14 is also expected to block the installation of apps that target outdated versions of the operating system.

While that news has caused something of an outcry from hardcore Android fans who see it as impinging on Android's open nature, these moves (and in particular, long-term security updates) will definitely be appreciated by most smartphone owners, who'd rather simply stay clear of malware or security threats.

Google’s incoming AirTag rival could be an Android moment for Bluetooth trackers

Google is strongly rumored to be launching its own Bluetooth location tracker to rival Apple's AirTag and Tile this year – and it could take the little object-finding tools to the next level, for good and bad.

The reliable leaker Kuba Wojciechowski has unearthed a lot of evidence for the Google tracker, codenamed Grogu, suggesting that it's both real and could arrive at Google I/O 2023. Like AirTags, it apparently has an onboard speaker for emitting sounds from a lost device, and packs both UWB (ultra-wideband) and Bluetooth Low Energy connectivity.

But the feature that could make Google's device an Android-sized moment for trackers is its upcoming version of Apple's 'Find My' network. Google's tracker is expected to support Fast Pair, an existing Android feature that helps you find nearby Bluetooth devices. And with Google apparently bringing support for trackers and locator tags to Fast Pair, another giant object-tracking ecosystem could be imminent.

Google's 'Finder Network' (as separate rumors have branded it) could really take Bluetooth trackers mainstream. If your Apple AirTag is out of Bluetooth range, you can still get location info thanks to the 'Find My' network, which anonymously uses the Bluetooth connections of fellow Apple users to look for other trackers. But AirTags are naturally tied into Apple devices, whereas Google's system could be open to both global Android users and third-party manufacturers.

Wojciechowski claims that Google is working with several chipset makers "to bring support for the new Fast Pair-based technology to their products". Combine multiple manufacturers making an Android version of AirTags with the several billion Android devices out there, and you have a potentially huge Bluetooth tracking network.

Android devices do currently have a 'Find My Device' function, but this is restricted to remotely encrypting lost devices or finding their last known location based on your own connection. A crowd-sourced 'Finder Network' would take that to another level entirely, potentially making it much easier to find devices – but also opening up the same possibilities for misuse that have dogged Apple's AirTags.
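To make the idea concrete, here's a minimal sketch of how a crowd-sourced finder network operates, with the data flow reduced to its bare bones. This is emphatically not Google's (or Apple's) actual protocol – real systems use rotating identifiers and end-to-end encryption so that neither the reporting phones nor the server learn whose tag is where – but it shows the crowd-sourcing principle:

```python
# Toy model of a crowd-sourced finder network: a tag broadcasts an ID,
# bystanders' phones anonymously report where they heard it, and the
# owner later queries those sightings. All names here are hypothetical.
from dataclasses import dataclass
from typing import Dict, List, Optional

@dataclass
class Sighting:
    beacon_id: str
    lat: float
    lon: float

class FinderNetwork:
    def __init__(self) -> None:
        self._sightings: Dict[str, List[Sighting]] = {}

    def report(self, beacon_id: str, lat: float, lon: float) -> None:
        # A bystander's phone uploads where it heard the beacon.
        self._sightings.setdefault(beacon_id, []).append(
            Sighting(beacon_id, lat, lon))

    def locate(self, beacon_id: str) -> Optional[Sighting]:
        # The owner asks for the most recent sighting of their tag.
        reports = self._sightings.get(beacon_id)
        return reports[-1] if reports else None

network = FinderNetwork()
network.report("tag-1234", 51.5072, -0.1276)  # logged by a passing phone
last_seen = network.locate("tag-1234")
```

The crucial property of the real systems is that the "report" step happens silently on millions of bystanders' devices, which is what makes out-of-range tracking possible at all.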

Another potential boost for Google's system is that your phone or tablet apparently won't need to have UWB connectivity to work with the so-called 'Finder Network'. According to Kuba Wojciechowski, Bluetooth Low Energy will be enough – and that would help open it up to current devices, rather than just new ones.


Analysis: The end of losing things?

A concept render of a hand holding a phone for a Google tracking device

A concept for a Google equivalent to AirTag (above) was recently posted on Yanko Design. We're not sure about the name, though. (Image credit: Obi Fidler / Yanko Design)

Google is clearly working on more than just a rival to AirTags or Tile – it appears to be building its own vast 'Find My' network, with Android as the foundation. And that could be a big deal if you're prone to losing expensive tech or prized possessions.

At last year's Google I/O 2022 conference, the company revealed that there were over three billion active Android devices in the world. With that level of reach, the 'Finder Network' (or whatever it's ultimately called) could make Bluetooth trackers an almost essential purchase for the absent-minded.

But it could raise lots of new privacy concerns, too. Apple has done a lot to address issues with so-called 'AirTag stalking', where the trackers have been used to follow people without their knowledge. You'll now get an alert if an AirTag has been following you for a while or if you're near an unknown AirTag.

This system doesn't work brilliantly for everyone, though, with the equivalent app for Android requiring you to check manually for AirTags rather than scanning away in the background. These are the kinds of issues that Google's trackers and network will need to iron out, particularly if they're anonymously linked to a good proportion of those three billion global Android devices.

With third-party manufacturers also apparently invited to Google's tracker party, though, the potential is huge – and the next year could see the AirTag and Tile concept taken to vast new scale.

Apple’s AI audiobooks are a long way from killing off human narrators

If you're an audiobooks fan, Apple has just given you a taste of the future by launching its first batch of AI-narrated books. But while the move is a fascinating one with big implications, the narrators' robotic tones show that much-loved human voices will be around for some time yet.

You can find the AI-voiced audiobooks, which use text-to-speech technology, in the Apple Books app by searching for 'AI narration'. This brings up a list of romance or fiction books (both free and paid-for) that come with the description "narrated by Apple Books". 

Apple Books offers two types of AI voice – a soprano called Madison and a baritone called Jackson – both of which have an American accent and currently speak only in English. You can get a taste of what they sound like by tapping the 'preview' button under one of the Apple Books-narrated titles.

A tablet screen showing Apple's AI audiobook narrators

(Image credit: Apple)

Right now, there's undoubtedly a robotic, artificial quality to both of Apple's AI voices. You won't be mistaking them for the warm, expressive tones of popular narrators like Stephen Fry or Julia Whelan anytime soon. But while the uncanny valley remains a tough obstacle for AI narrators to cross, they're certainly on a fast-track to our ears.

Naturally, Apple says its AI voices have been developed to make audiobooks "more accessible to all". But they also make the multi-billion dollar audiobook industry more accessible to the tech giant. And the new Apple Books feature is just the start of a fierce battle with the likes of Amazon and Spotify for our audiobook-loving ears.

Voicing concerns

For now, we'll mostly see AI narrators restricted to books from smaller independent publishers. This could spark an explosion in the number of audiobooks available to readers on all devices, as digital narration opens up a new market to publishers and authors who'd previously been unable to afford the leap from print to audio.

But pushback from larger publishers and voice actors could also slow down the rise of robo-narrators. Amazon's Kindle e-readers officially lost their text-to-speech powers several years ago, even if there are workarounds in the accessibility menus. That decision was at least in part down to copyright issues and audiobooks being legally considered as distinct pieces of art.

The Amazon-owned Audible has also written at length about which narrators suit different kinds of books and how publishers choose the right ones. Its blog says that "the most important aspect when it comes to audiobooks is that the voice matches the tone and genre of the book". This is somewhat difficult to achieve if, like Apple Books, you only have two voices.

An iPhone showing a list of titles in Apple Books

(Image credit: Apple)

Audible also says that "experienced voice actors are able to differentiate pretty easily between vocal characteristics by playing with pitch, intonation, volume and accents". And this is certainly where AI voice actors need to do some intense vocal training, and perhaps also take some night classes in emotional mirroring.

But the new Apple Books feature is clearly just the start of an inevitable boom in AI voice technology. And the really big moment for audiobooks might be when AI can, rather than robotically reading a script, convincingly impersonate a famous voice actor – a leap that may not be too far off, based on recent showcases from the likes of Amazon. 

Fake empire

After all, deepfakes aren't just restricted to frighteningly convincing videos of a synthetic Morgan Freeman or Tom Cruise – AI-powered voice tech is also developing fast.

Viral web apps like Uberduck let you generate speech in the voices of past presidents or cartoon characters, while last year Amazon showed off a mildly terrifying new Alexa skill that could read out The Wizard of Oz to a child in the voice of their grandmother.

With the likes of Google WaveNet also pushing the tech forward, AI voices are only going to get more convincing. For now, the barriers to widespread adoption in audiobooks will likely be legal and ethical, rather than technological. But Apple Books' artificial narrators are the sound of our synthetic future – and in the not-too-distant future, famous voice narrators will likely copyright and license their own voices, too.

For now, the evidence in Apple Books suggests these AI voices are currently best-suited to non-fiction and factual works, rather than emotive storytelling. For novels, just like movies, we're still some way off artificial actors being able to convincingly tug our heart-strings without ruining the suspense with a robotic inflection or flat note. But a torrent of AI audiobooks is definitely coming, regardless.

Samsung’s Flex Hybrid concept stretches foldable phones to their limit

What's better than a tablet that can fold in half to become your phone? According to Samsung, it's a foldable whose screen can also slide out to give you even more visual real estate. But several practical limitations mean the Flex Hybrid concept, which Samsung is showing off at CES 2023, will remain closer to the whiteboard than our pockets for a while yet.

The Flex Hybrid is very much a prototype 'smart mobile device'. It's the first one we've seen that combines two types of innovative screen tech – the foldable displays that we already know (and mostly love) in phones such as the Samsung Galaxy Z Fold 4, and newer 'slidable' screens that Samsung has been flaunting at various trade shows for the past year.

The idea is that you'll unfold the left-hand side of the Flex Hybrid, much like the existing Fold series. But if you want an even larger screen, you'll be able to stretch it from a 10.5-inch 4:3 display to a 12.4-inch screen with 16:10 ratio by pulling out the rest of the display from the right-hand side, to give you a proper tablet experience from a pocketable device.

The Samsung Flex Hybrid foldable phone on a blue background

The Flex Hybrid (above) can be unfolded from the left-hand side, and expanded with a sliding component from the right (Image credit: Samsung)

In theory, the Flex Hybrid is the kind of do-everything mobile device that we could only dream of a few years ago. But there's good reason to believe that we'll need to continue dreaming about it for a few years yet.

Firstly, the idea is clearly still at the early concept stage, as Samsung hasn't revealed any specs for the display, or hinted at which products could potentially use the tech. Also, this time last year, at CES 2022, Samsung demoed its Flex G and S displays, which could fold twice in two different directions, but which have yet to be fully realized as consumer-ready products.

Fold that thought

The Samsung Flex Duet concept being held at CES 2023

(Image credit: Samsung)

The most likely scenario is that Samsung will need to polish its respective display technologies before they can be combined in one do-it-all tablet. On the slidables front, Samsung will be demoing two larger 17-inch prototypes with sliding screens for laptops at CES 2023.

These will be called the Flex Slidable Solo, whose screen can be expanded in one direction, and the Flex Slidable Duet (pictured above), which can be stretched in both directions to create a 17.3-inch display (from a 13-14-inch device) for gaming or watching movies.

Folding displays still need a lot of refinement, though. While we're big fans of the Samsung Galaxy Z Fold 4, its crease is still visible down the middle of the screen – and run your finger across the display and you'll feel the crease, too. It isn't hugely noticeable when you're watching videos, but this is still an area for improvement before Samsung moves onto sliding foldables.

The other hurdle is that foldables remain premium products that, in the current financial climate, are hard for most tech fans to justify. The starting price of the Galaxy Z Fold 4, for example, is $1,799 / £1,649 / AU$2,499. Adding a slidable component to that display – however exciting that would be – will only make that price tag even more prohibitive.

Factor in the question marks around the long-term durability of foldables, and the current limitations of Android tablet apps, and you can see that it'll definitely be a few years before a Flex Hybrid becomes a viable product. But we'll certainly be first in the queue to try one out – and we'll be doing our best to hunt one down at CES 2023, if it indeed exists.

Google Pixel 7 and Pixel 7 Pro: the 7 most exciting new camera features

The arrival of new Google Pixel phones is always a big moment for point-and-shoot snapping – and so it's proved again with the launch of the Pixel 7 and Pixel 7 Pro.

While the new flagships don't have a headline moment quite as big as the Pixel 3's introduction of 'Night Sight', they do bring a combination of exciting hardware and software upgrades that could fire them into the upper echelons of our best camera phones guide.

The basic hardware recipes of both the Pixel 7 and Pixel 7 Pro aren't radically different from their predecessors. Both have the same 50MP main cameras and 12MP ultra-wides, with the Pixel 7 Pro bringing an extra 48MP telephoto lens with 5x optical zoom powers.

But under the hood, Google's new Tensor G2 processor powers some fancy computational photography features, including Photo Unblur and a new Cinematic Blur mode that looks suspiciously similar to Apple's Cinematic Mode.

So what are the two phones' most exciting photographic features? We've ranked the ones we're most looking forward to testing here – starting with that cheat mode for all our snapping mistakes, Photo Unblur...

1. Photo Unblur (Pixel 7 and Pixel 7 Pro)

Luckily, every one of our photos is perfectly crisp and never contains any mistakes (okay, that's a lie), but if your library is dotted with blurry clangers then Google's Photo Unblur trick could be a welcome godsend.

Initially only available in the Google Photos app on the Pixel 7 and Pixel 7 Pro (although we suspect it'll come to other phones soon), Photo Unblur is a development of Google's existing de-noise and sharpening tools and should nicely complement the Face Unblur trick that arrived last year on the Pixel 6 series.

Unlike Face Unblur, Photo Unblur is designed to be used retroactively on existing pics rather than in the moment of capture. While it can't work miracles on disastrous snapping incidents, the early demos show an impressive ability to rescue shots that have been sullied by slow shutter speeds, focusing issues, or mild hand-shake. And it'll work on photos taken on any camera, too.

2. Macro Focus (Pixel 7 Pro)

It's far from the first phone with a dedicated macro mode, but the addition of autofocus to the Pixel 7 Pro's upgraded ultra-wide lens is a big deal for fans of Google smartphones.

Our US Mobiles Editor Philip Berne explained why macro was the Pixel 7 Pro feature he was most excited about before the phone's launch. And Google granted his wish with a mode that should match the close-up shots possible on rivals like the iPhone 14 Pro.

A cat's eye, water droplets on a leaf and a human eye

(Image credit: Google)

It isn't yet clear what software trickery Google has brought to this mode, but it promises to let you focus on objects from as close as 3cm away. Macro Focus will also kick in automatically when you move close to a subject, switching from the main camera to the ultra-wide. 
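The automatic hand-off presumably boils down to a distance check inside the camera app. As a purely hypothetical sketch (Google hasn't published the real switching logic, and the threshold below is an assumption):

```python
# Hypothetical lens-switching logic for a macro mode: below a cut-off
# distance, hand off from the main camera to the autofocus ultra-wide.
# The 3cm figure is Google's quoted minimum focus distance; using it as
# the switch threshold is an assumption for illustration.
MACRO_SWITCH_CM = 3.0

def pick_lens(subject_distance_cm: float) -> str:
    """Choose which camera should handle the shot."""
    if subject_distance_cm <= MACRO_SWITCH_CM:
        return "ultrawide-macro"
    return "main"
```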

It's one mode we're very much looking forward to taking for a spin (watch out, spiders). In the meantime, you can check out some sample shots in this Google Photos gallery. 

3. Improved Super-Res Zoom (Pixel 7 Pro)

Zoom promises to be one of the biggest improvements on the Pixel 7 and Pixel 7 Pro. The Pro model now has 5x optical zoom (rather than the Pixel 6 Pro's 4x zoom), but the more interesting improvement is the software trickery available on both models.

Just like the iPhone 14 Pro, both phones can crop into their 50MP resolution for an effective 2x zoom at a 12.5MP resolution, thanks to some added noise processing. But a more useful improvement is likely to be the processing that takes place in between the Pixel 7 Pro's native focal lengths.
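The arithmetic behind that 2x crop mode is simple enough to sketch. Taking the central half of the sensor in each dimension halves the field of view and quarters the pixel count (the sensor dimensions below are an approximation for a 50MP chip, not official figures):

```python
# Centered-crop zoom: halving the frame in each dimension gives a 2x
# zoom with a quarter of the pixels. 8160x6144 (~50.1MP) approximates
# a 50MP sensor's full readout.
SENSOR_W, SENSOR_H = 8160, 6144

def crop_zoom(width: int, height: int, zoom: float):
    """Output resolution of a centered crop at the given zoom factor."""
    return int(width / zoom), int(height / zoom)

w, h = crop_zoom(SENSOR_W, SENSOR_H, 2.0)
megapixels = w * h / 1e6  # lands right around the quoted 12.5MP
```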

Two tennis players on a court

(Image credit: Google)

Previously, in-between settings like 3x or 4x zoom have been covered by fairly rudimentary digital zoom. But Google claims that the Pixel 7 Pro can fill in some extra details using its 5x telephoto camera, which should create far more consistent results throughout that zoom range (in theory at least). That's definitely something we're looking forward to trying out.

4. Cinematic Blur mode (Pixel 7 and Pixel 7 Pro)

Apple's Cinematic mode brought simulated background blur, like the kind you'll find in portrait mode photos, to video last year on the iPhone 13 Pro. It's still early days for the technology, but Google has now jumped into the computational pool party with its take on fake video bokeh.

The problem these modes are trying to solve is that smartphone cameras have too large a depth-of-field to deliver the kind of blur that makes videos shot with dedicated cameras look, well, cinematic. 
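The depth-of-field gap is easy to quantify with the standard thin-lens approximation, DoF ≈ 2Ncu²/f², which holds while the subject is well inside the hyperfocal distance. The focal lengths, apertures and circles of confusion below are illustrative assumptions rather than official specs:

```python
# Comparing depth of field at a 2m subject distance: a phone main
# camera versus a full-frame camera with a fast 50mm lens. Formula:
# DoF ~= 2 * N * c * u^2 / f^2, with everything in millimetres.
def depth_of_field_mm(f_mm: float, aperture_n: float,
                      coc_mm: float, subject_mm: float) -> float:
    return 2 * aperture_n * coc_mm * subject_mm ** 2 / f_mm ** 2

phone = depth_of_field_mm(f_mm=6.9, aperture_n=1.85,
                          coc_mm=0.008, subject_mm=2000)
full_frame = depth_of_field_mm(f_mm=50, aperture_n=1.8,
                               coc_mm=0.029, subject_mm=2000)
```

On these assumed numbers, the phone keeps around 2.5m of the scene in focus at a 2m subject distance, while the full-frame camera keeps only about 17cm sharp – which is why the phone has to fake the blur in software.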

It's a tough nut to crack because every single frame needs to be processed to look like it was shot with a bright prime lens – and based on Google's demo above, the Pixel 7 series hasn't made any huge leaps forward.

The fall-off from subject to background still looks a bit artificial and heavy-handed, but it could certainly be a handy mode for the odd cut scene. We'll be sticking to the best vlogging cameras for a little while yet, though.

5. Improved Night Sight (Pixel 7 and Pixel 7 Pro)

Google's 'Night Sight' mode was a revelation when it arrived on the Pixel 3 back in 2018. Rather than using the traditional long-exposure method to expose dark scenes, it let you shoot them handheld thanks to its staggering ability to instantly re-assemble the best bits from a burst of frames.

The mode has steadily improved over the years, but its issue has always been the motion blur created if anything in your scene dares to move an inch during the burst sequence. Well, Google is promising that this problem has, if not been solved, at least improved on the Pixel 7 and Pixel 7 Pro.

A tree under a night sky and a woman leaning against a dark wall

(Image credit: Google)

This is because its machine learning techniques allow a reduction in noise, which in turn means each frame can use a shutter speed that's half as long as before. The result? In theory, far fewer issues with motion blur ruining your cityscapes and night-time portraits.
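The noise-versus-shutter trade-off is easy to demonstrate with a toy burst merge: averaging N aligned frames cuts random noise by roughly √N, which is what buys each individual frame a shorter exposure. This is a drastic simplification of Night Sight's real align-and-merge pipeline:

```python
# Toy burst merging: averaging aligned noisy frames suppresses random
# noise, so each frame can afford a shorter (less blur-prone) shutter.
import random

def merge_burst(frames):
    """Average a burst of equally sized frames (flat lists of pixels)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

random.seed(0)
TRUE_PIXEL = 100.0
# Eight short-exposure frames of the same flat grey patch, each noisy:
frames = [[TRUE_PIXEL + random.gauss(0, 10) for _ in range(64)]
          for _ in range(8)]
merged = merge_burst(frames)
```

The merged frame sits far closer to the true pixel value than any single noisy frame does.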

6. Guided Frame (Pixel 7 and Pixel 7 Pro)

An impressive example of an AI accessibility feature, Guided Frame is designed to help people who are blind or have low vision take selfies more easily on the Pixel 7 and Pixel 7 Pro.

When you open the front-facing camera and hold it to your face, the feature's voice will tell you where to position the phone to compose the shot, nudging you in the right direction before letting you know when you've got the money shot.

Two women smiling in a bar

Google's Real Tone feature (above) promises to deliver even more accurate skin tones in your photos. (Image credit: Google)

You'll get prompts like "move your phone slightly right and up", while a count-down lets you know when the shot is about to be taken. Hopefully, it'll spur other manufacturers to make equivalent modes.

Google has also boosted its Real Tone feature on the new Pixels to make sure every subject's skin tone is accurate and well-exposed in your photos. With the feature tested on over 10,000 portraits and refined in collaboration with Diversify Photo, it should now be much improved.

7. Improved selfie camera (Pixel 7)

Photographers may scoff at the selfie camera, but it's one of the most frequently used lenses on smartphones. The Pixel 7 now has an improved version that should be a decent step up from its predecessor.

The Google Pixel 7 phone on a yellow background

(Image credit: Google)

The Pixel 7 now has the same 10.8MP sensor (with f/2.2 aperture) that you'll find on the Pixel 7 Pro and 6 Pro. This means it has an ultra-wide 20mm focal length, which is handy for squeezing multiple people into the frame. You can also use it to shoot 4K/60p video.

It still only has fixed focus, but should be a more useful tool for when you need a social media mug shot or quick video for your YouTube channel.

What is Photo Unblur? How Google’s new magic trick fixes your old photos

Google has been a computational photography pioneer over the past few years, and its latest trick, called 'Photo Unblur', could be one of its most impressive so far. A Google Photos feature that'll initially be exclusive to the Pixel 7 and Pixel 7 Pro, it promises to rescue your new and old snaps from blurry oblivion. 

Photo Unblur is an expansion of 'Face Unblur', which arrived last year on the Pixel 6 and Pixel 6 Pro. The latter has quickly become one of Google's most popular computational photography features since the company unveiled 'Night Sight' on the Pixel 3 back in 2018. But it's also quite different from Photo Unblur, which means the two will act as complementary modes for varying situations. 

Both features use machine learning to improve your pictures, but Photo Unblur is designed to improve the shots you've already taken on any camera. Face Unblur, meanwhile, is a pre-emptive mode that uses the power of Google's Tensor chip to detect when someone is moving too quickly in your scene. It then automatically takes two photos, which are then combined to give you a well-exposed, sharp snap.
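In spirit, the Face Unblur compositing step looks something like the sketch below: keep the well-exposed frame, but paste in the face pixels from the sharper, shorter-exposure frame. The real pipeline's alignment and blending are far more sophisticated; this is purely illustrative:

```python
# Hypothetical two-frame composite in the style of Face Unblur: take
# the sharp frame's pixels for the face region, and the well-exposed
# frame's pixels everywhere else. Frames are lists of rows of pixels.
def combine_frames(exposed, sharp, face_region):
    """face_region is a set of (row, col) coordinates to replace."""
    out = [row[:] for row in exposed]  # copy the well-exposed frame
    for r, c in face_region:
        out[r][c] = sharp[r][c]        # drop in the sharp face pixels
    return out
```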

So how exactly does Google's new Photo Unblur mode work without being fed multiple snaps of the same scene? Google hasn't fully expanded on its inner workings yet, but we can get a good idea by looking at where it's come from.

How does Photo Unblur work?

Photo Unblur hasn't arrived completely out of the blue – while Google hasn't yet expanded on its inner workings, it's likely built on some existing features we've seen in the Google Photos app. And that means it could ultimately be available on devices beyond the Pixel 7 and Pixel 7 Pro.

In 2021, the Google AI Blog described the tech behind two new Google Photos features called 'Denoise' and 'Sharpen'. These arrived to help you boost photos that were shot in tricky conditions, or with older phones that had noisy sensors or ancient optics. And these likely form the basis of Photo Unblur.

Photo editors have long had sliders to help you adjust noise and sharpening, but Google's new tech is much smarter than those. For starters, it analyzes your whole image to work out the levels of noise and blur down to a pixel level, regardless of which camera the photos were taken on.

A diagram showing how Google's denoise algorithm works

In 2021, Google revealed how its algorithms create noise maps to support efficient image improvements. (Image credit: Google)

This crucial step allows the noise reduction and de-blurring to occur at a more granular level than older techniques, which makes them less processor-intensive – and that, in turn, makes them ideal for running on-device or in the cloud. Once Google has analyzed your image, it can apply its slightly counter-intuitive methods for reducing blur and noise.

These are counter-intuitive because they involve pushing your photo in the seemingly 'wrong' direction before bringing it back to an improvement on the original. To reduce noise, Google merges noisy pixels (effectively downsampling the image) while regenerating finer detail. The sharpening works in a similar fashion, with Google's algorithms re-blurring the image several times in an efficient, phone-friendly process.
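Google hasn't published the code behind these techniques, but the 'sharpen by re-blurring' idea can be sketched in a few lines. If B is a blur operator, its inverse can be approximated by a short polynomial built from repeated applications of B itself, which is far cheaper than a true deconvolution. The sketch below is our own minimal illustration of that principle, not Google's implementation; the function name, the Gaussian blur stand-in and the three-term series are all assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def polynomial_sharpen(image, sigma=1.5):
    """Rough 'sharpen by re-blurring' sketch (not Google's actual code).

    Treating a Gaussian blur B as the degradation, its inverse is
    approximated by the truncated series 3*I - 3*B(I) + B(B(I)),
    so deblurring only requires applying the cheap blur twice.
    """
    b1 = gaussian_filter(image, sigma)   # B(I): first re-blur
    b2 = gaussian_filter(b1, sigma)      # B(B(I)): second re-blur
    return np.clip(3 * image - 3 * b1 + b2, 0.0, 1.0)
```

Because the series boosts exactly the frequencies the blur attenuates, a blurred edge comes back steeper without ever directly inverting a filter – which is what makes this kind of approach phone-friendly.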

Two photos of a girl smiling

Last year's 'Denoise' and 'Sharpen' tools in the Google Photos editor were able to achieve results like this. The new Photo Unblur is based on similar tech. (Image credit: Google)

So how does Photo Unblur build on these techniques? Right now, we don't know the specifics, but a year is a long time in machine learning – and some of Google's examples during the Pixel 7 launch certainly looked impressive.

The image below, for example, has been impressively cleaned up from its almost unusable origins, which appear to have been caused by light movement and an excessively slow shutter speed.

Because Photo Unblur doesn't work with two images of the same scene, like Face Unblur, it may struggle to be quite as powerful as that older feature, particularly for issues caused by movement. But we're looking forward to taking it for a spin on our old snaps when the Pixel 7 and Pixel 7 Pro launch.

How do you use Photo Unblur?

Google again hasn't revealed the specifics of how you'll use Photo Unblur on the Pixel 7 and Pixel 7 Pro yet. But it has said that in "just a few taps" you'll be able to remove blur and visual noise, in a process that sounds just as straightforward as last year's Magic Eraser (for removing unwanted objects).

This process will take place in the Google Photos app, with Photo Unblur initially only being available on the Pixel 7 and Pixel 7 Pro. But we're expecting the tech to eventually come to all devices running the Google Photos app.

While Photo Unblur isn't quite as automated as Face Unblur, which works during the photo-taking process on phones from the Pixel 6 series onwards, it does look like another very simple example of computational photography improving our snaps. Including the old ones we'd written off.

It looks likely that the two modes will be complementary, with Face Unblur kicking in (on supported devices) before you take a photo, and Photo Unblur being useful for old snaps taken on any camera. We'll be taking Photo Unblur for a spin very soon and will update this article with all of our findings. 


These award-winning iPhone photos show what you can do with your older model

You might understandably be pining for an iPhone 14, but the iPhone Photography Awards 2022 has just landed to prove you don't really need that rumored 48MP camera to take incredible snaps.

The annual competition, which runs independently from Apple but is now in its 15th year, has just announced its impressive winners' list. And it's by no means dominated by the latest iPhones, with the winners stretching all the way back to the iPhone 6S Plus from 2015.

We've done some tallying up and 44% of the winners were actually taken on models from the iPhone 11 series or earlier. That said, the most well-represented phone by far is the iPhone 12 Pro Max, which was behind 23% of the competition's award-winning shots.

iPhone 12 Pro Max

The iPhone 12 Pro Max, from 2020, was the most well-represented model among this year's iPhone Photography Awards award winners. (Image credit: Apple)

The rules of the IPPA 2022 awards state that photos "should not be altered in any desktop image processing program such as Photoshop", so how is there such a varied range of styles across the categories, from "abstract" to "travel"?

That's because the rules do allow you to "use any iOS apps", which means some of the best photo editing apps and the best camera apps are almost certainly behind some of the shots you can see in our gallery below. That said, many photos are also likely straight "out of camera" and, collectively, the set shows what's possible with iPhone cameras, whichever model you have.


Analysis: Find a subject and nail the basics

Two iPhones showing portrait photos of men

Our guide on how to take portrait photos with your iPhone contains some pro tips on getting shots like the ones above. (Image credit: Future)

Other than underlining that you don't need the latest iPhone to take great photos, the lesson from this year's iPhone Photography Awards is that you only need two things for a great snap: an interesting subject and an understanding of the photographic basics.

You'll notice that effects like Portrait mode and filters are notably absent from the photos below. Instead, the winners show a familiarity with the main rules of composition, an eye for good light, and an openness to finding new angles on familiar subjects.

We'd wager that most winning entries were shot using the iPhone's main camera, rather than the telephoto or super-wide. The IPPA's rules do state that "iPhone add-on lenses can be used", so it's possible that some may have used some of the best iPhone lenses for some extra reach. But in the main, a little light editing is all that's likely been required, given the arresting subjects.

If you fancy making an entry for next year's competition, check out our guides on how to take professional portrait photos with your iPhone and how to take epic landscape photos with your iPhone. For now, though, here's a gallery of this year's winners of the iPhone Photography Awards (use our navigation bar on the left to jump to your favorite category).

Overall winner

A soldier talking to a young boy in front of ruins

'Photographer of the Year Grand Prize' winner. By Antonio Denti (Italy). Location: Mosul, Iraq. Shot on iPhone 11. (Image credit: Antonio Denti / IPPAWARDS)

Abstract

An abstract photo of a basketball court

'First Place - Abstract' winner. By Marcello Raggini (San Marino). Shot on iPhone 11. (Image credit: Marcello Raggini / IPPAWARDS)

Animals

Two cows in a doorway

'First Place - Animals' winner. By Pier Luigi Dodi (Italy). Shot on iPhone 11 Pro Max. (Image credit: Pier Luigi Dodi / IPPAWARDS)

Architecture

The shadow of a tall building over a city

'First Place - Architecture' winner. By Kaustav Sarkar (India). Location: Empire State, New York. Shot on iPhone 12 Pro. (Image credit: Kaustav Sarkar / IPPAWARDS)

Children

A child with orange hair holding a balloon

'First Place - Children' winner. By Huapeng-Zhao (China). Shot on iPhone 13 Pro Max. (Image credit: Huapeng-Zhao / IPPAWARDS)

City Life

An aerial view of a large motorway junction

'First Place - City Life' winner. By Yongmei Wang (China). Shot on iPhone 12 Pro Max. (Image credit: Yongmei Wang / IPPAWARDS)

Environment

A large chimney emitting smoke

'First Place - Environment' winner. By Yang Li (China). Location: Hegang, Heilongjiang Province. Shot on iPhone 11 Pro Max. (Image credit: Yang Li / IPPAWARDS)

Landscape

A road going through a misty forest

'First Place - Landscape' winner. By Linda Repasky (USA). Location: Ware, Massachusetts. Shot on iPhone 13 Pro. (Image credit: Linda Repasky / IPPAWARDS)

Lifestyle

Underwater shot of a boy swimming

'First Place - Lifestyle' winner. By Laila Bakker (Netherlands). Shot on iPhone 11 Pro Max. (Image credit: Laila Bakker / IPPAWARDS)

Nature

View through some silver trees with yellow leaves

'First Place - Nature' winner. By Andrea Buchanan (USA). Location: Utah. Shot on iPhone 12 Pro Max. (Image credit: Andrea Buchanan / IPPAWARDS)

Portrait

A man leaning against a black and white wall

'First Place - Portrait' winner. By Arevik Martirosyan (USA). Shot on iPhone 12 Pro Max. (Image credit: Arevik Martirosyan / IPPAWARDS)

Sunset

A hot air balloon in front of a sunset

'First Place - Sunset' winner. By Leping Cheng (China). Location: Xiamen, China. Shot on iPhone 12 Pro Max. (Image credit: Leping Cheng / IPPAWARDS)

Travel

Three people's legs off the back of a boat

'First Place - Travel' winner. By Marina Klutse (USA). Location: Caño de la Guasa, Colombia. Shot on iPhone 11 Pro. (Image credit: Marina Klutse / IPPAWARDS)

Apple has quietly built an automated Photoshop into iOS 16

The dazzling new iPhone lock screen designs in iOS 16 may have grabbed all the headlines at WWDC 2022, but behind them lies a new feature that's also a highly unusual one for Apple – Photoshop-style editing skills.

Apple's AI tools have traditionally been focused on helping you take great iPhone photos, rather than edit them. But a new 'Visual Look Up' feature, which you'll be able to find in the Photos app and across iOS 16, lets you tap on a photo's subject (for example, a dog), then lift it out of the snap to be pasted somewhere else, like in Messages.

That may not sound too spectacular, but the unnamed feature – which has echoes of Google's 'Magic Eraser' for Pixel phones – will be a significant addition to iPhones when it lands in the software update later this year. Apple usually leaves these kinds of tricks to the best photo editing apps, but it's now dabbling with automated Photoshop skills.

Just a few years ago, cutting out a complex subject in a photo was the preserve of Photoshop nerds. But Apple says its Visual Look Up feature, which also automatically serves up info on the subject you tap on, is based on advanced machine learning models.

Simply lifting a French Bulldog from a photo's background is, Apple says, powered by a model and neural engine that performs 40 billion operations in milliseconds. This means it'll only be supported by the iPhone XS (and later models). 

Beyond the Photos app, the feature will apparently also work in Quick Look, which lets you quickly preview images in apps. There are also echoes of it in iOS 16's new customizable lock screens, which can automatically place elements of a photo in front of your iPhone's clock for a more modern look.

Right now, the feature is limited to letting you quickly cut out and paste subjects in photos, but Apple clearly has an appetite for building Photoshop-style tools into its iPhones. And iOS 16 could just be the start of its battle with the likes of Adobe and Google when it comes to letting you quickly tweak and edit your photos. 


Analysis: The AI-powered editing race heats up

A phone screen showing Google's Magic Eraser tool

(Image credit: Google)

Photoshop and Lightroom will always be popular among pro photographers and keen hobbyists, but we're starting to see tech giants bake automated equivalents of Adobe's most popular tools into their operating systems. 

Just last month Google announced that its Magic Eraser tool, available on Pixel phones, now lets you change the color of objects in your photos with just one tap. This new feature joined the tool's existing ability to remove unwanted objects or people from your photos.

Apple hasn't quite gone that far with Visual Look Up's new feature, which is more like Photoshop's 'Select Subject' tool than Google's take on the healing brush. But the iOS 16 upgrade is a significant one in the context of the wider race to build the ultimate mobile editing skills for point-and-shoot photographers.

There's no reason why Apple couldn't extend the concept to let you, for example, select and replace a drab sky with a more dramatic one. This 'Sky Replacement' feature is one we've recently seen come to Photoshop and other AI desktop photo editors, and today's smartphones certainly have the processing power to pull it off.

Of course, Adobe won't stand idly by and let Apple and Google eat its editing lunch, even if Apple appears to be coming at it sideways. By baking these technologies into core features like the new iOS 16 lock screen, Apple makes them part of not just an iPhone stock app, but the core OS. That's trouble for Adobe, but good news for anyone who doesn't want to learn, or pay for, Photoshop.


10 things you might have missed from WWDC 2022

The WWDC 2022 keynote is over and it's delivered some big, exciting software changes to our Apple gadgets, plus some new hardware in the form of the MacBook Air (M2, 2022).

But hidden among the headliners were some support acts that you may have missed in the noise. Now the dust's settled on WWDC 2022, it's time to shine a light on those unsung announcements that could well matter more to you than Apple's bigger reveals.

Here are our ten favorite things that went a little under the radar during Apple's long presentation, but didn't escape our trusty searchlight.

1. You'll be able to unsend disastrous iPhone Messages

An iPhone showing a Message

(Image credit: Apple)

Recently experienced the head-in-hands regret of sending a text to the wrong person in your iPhone's Messages app? Apple can't help you now, but it will in the future thanks to some handy Messages updates in iOS 16.

When iOS 16 arrives later this year, you'll be able to edit or unsend a message after you've tapped that fateful arrow icon. You'll need to be fairly quick, as the option will only be available for up to 15 minutes after you've sent a message, but it could be a lifeline. After all, we pretty much use Gmail's 'undo send' feature on a daily basis.

2. Apple has killed the Apple Watch Series 3

Two Apple Watches on an orange background

(Image credit: Apple)

In a slightly controversial move, Apple's new watchOS 9 software will only be supported by Apple Watch Series 4 models or later – which means it's effectively killed the Apple Watch Series 3, a product that's still currently on sale.

The Series 3 came out in 2017, so it's certainly showing its age. But it is one of the Watches that supports Apple Fitness and, until now, represented an affordable way to get the Apple Watch experience. Not anymore, though – that honor falls to the Apple Watch SE, as we'd highly recommend avoiding the Series 3 given its surprising lack of support for watchOS 9.

3. Your iPhone's lock screen could actually become useful

An iPhone showing a photo of a woman

(Image credit: Apple)

If you often stare with dismay at how bland your iPhone lock screen is, fear not – Apple is finally planning to let you jazz it up in iOS 16, which will be out in public beta from July. Apple's "biggest update ever to the lock screen" will let you customize the clock's font and add Apple Watch-style widgets.

These widgets will give you glanceable info like calendar events and the weather, and you'll also be able to create several custom lock screens that you can swipe between. Android owners may roll their eyes at this 'innovation', but it's a big new piece of personalization for the iPhone.

4. iPhones will soon double as Mac webcams

A MacBook showing Apple's Continuity Camera feature

(Image credit: Apple)

The built-in webcams on Apple's Macs have often felt like afterthoughts, but Apple has just announced a workaround – you'll soon be able to use your iPhone as a wireless Mac webcam instead. The new Continuity Camera feature in macOS Ventura, which is coming in late 2022, will automatically detect your iPhone and let you use it as a camera for any macOS video conferencing app without any cables.

That might sound like a slightly underwhelming stopgap, but Apple's built some pretty cool features into Continuity Camera. Alongside the usual Portrait Mode and Center Stage, for blurring the background and keeping you in the middle of the frame, there's Studio Light (think a virtual ring light) and a Desk View that somehow creates an overhead view of your desk from the iPhone's ultra-wide camera.

5. Your Apple Watch will quietly judge how efficiently you run

A man running with an Apple Watch

(Image credit: Apple)

The Apple Watch has always lacked some of the more advanced fitness-tracking features found in some of its rivals, but the incoming watchOS 9 goes some way towards fixing that. Beyond some new watch faces and improved notifications, one of its most interesting updates is its new running form metrics.

Using some machine learning wizardry, watchOS 9 will be able to gather data on three running metrics – vertical oscillation, stride length and ground contact time – to help you improve your efficiency and inch towards some PBs. All of these will be served up in watchOS 9 (supported by the Apple Watch Series 4 and above) in some handy new Workout Views. We'll reluctantly accept its advice if it powers us to fun run victory.

6. Apple quietly announced new Apple TV treats

The Apple TV 4K on an orange background

(Image credit: Apple)

It looked like WWDC 2022 had left Apple TV owners completely empty-handed when it came to software gifts, but buried in the small print of some developer notes were details about tvOS 16.

Unfortunately, the update looks like the Apple equivalent of a new pair of socks, with minor details about improvements to cross-device connectivity and boosted smart home support. Still, these may turn out to be more exciting than they look once developers get stuck into tvOS 16, with one improvement describing second-screen info on your iPhone or iPad as you watch your main TV. All very promising, as long as Apple is more committed to the Apple TV platform than it appears.

7. The Apple Home app has finally been rebuilt 

An iPhone showing the Apple Home app

(Image credit: Apple)

Yes, Apple is supporting the new smart home standard Matter, which we saw talked up at Google I/O 2022, but perhaps more important is that it's finally redesigned the underwhelming Home app in iOS 16.

This update, which apparently rebuilds the Home app from the ground up, looks like a much more modern (and usable) hub for your smart home tech. There are categories at the top for things like lights, security and climate, and below that is live info for all of them, including feeds from up to four security cameras. Now all we need is smart home tech that lives up to this glossy promise.

8. CarPlay wants to take over your entire dashboard

A car dashboard showing the next generation Apple CarPlay

(Image credit: Apple)

Apple's CarPlay software is a fine way to level up your car's infotainment experience, but its ambitions apparently don't stop there – at WWDC 2022, Apple demoed a next-gen version of CarPlay that takes over the entire dashboard, including features like the speedometer, temperature controls and more.

Like an in-car version of your iPhone's homescreen, you'll be able to customize the look of different gauges and choose widgets to give you glanceable info for the weather or music. The downside? It won't be here till later in 2023, and we suspect some car makers might be a bit resistant to its dashboard takeover ambitions.

9. iPhones will scan your ears for better AirPods sound

A logo for Apple's personalized spatial audio tech

(Image credit: Apple)

Apple's iPhones can certainly take a mean photo, but their front cameras will also soon be used for a slightly more unusual purpose – scanning your ears. Why? So they can create Personalized Spatial Audio for a "more precise and immersive listening experience", of course.

The scanning will be done by Apple's TrueDepth camera, which means you'll need an iPhone X or later to take advantage of the feature. And while Apple hasn't yet stated which headphones will support the 3D effect, we reckon the AirPods Pro, AirPods 3 and AirPods Max are prime candidates given Apple's familiarity with their inner workings. The only question is, how protective are you about your ear data?

10. The end of passwords is nigh

A screen showing Apple's Passkeys

(Image credit: Apple)

Some welcome news if your online security is still built on the dubious foundations of that one 'unbreakable' password you came up with in 2006 – Apple has joined Google and Microsoft in supporting the password-less authentication standard detailed by the FIDO Alliance, giving a Safari demo of its 'passkeys' at WWDC 2022.

Rather than using passwords, the idea is that you'll use your iPhone's more modern identification features like Touch ID or Face ID to log into your favorite web services. According to Apple, Passkeys can't be phished as they never leave your devices, or leaked as they're not stored on a web server. Does the end of passwords also mean the end of password managers, though?


Next-gen Apple CarPlay wants to take over your car’s entire dashboard


Apple has just given motorists an exciting early peek at the next generation of Apple CarPlay – and it takes the car infotainment experience to a new level.

Like Android Auto, CarPlay already makes it easier to access apps like Maps, Phone and Messages via your car's touchscreen dashboard. But the next version will see CarPlay integrate with your car's whole instrument cluster, including the speedometer, temperature controls and more.

This means Apple would effectively create the entire UI for your car, with CarPlay apparently adapting to any kind of in-car screen shape or layout, as well as providing content for multiple screens. 

The most exciting thing for CarPlay fans is the level of customizability in the new software. You'll be able to choose different gauge cluster designs, pick your own widgets, and get glanceable info for things like the weather and the music you're playing.

WWDC 2022

(Image credit: Apple)

Naturally, you'll also be able to choose to have Apple Maps show you where to go in the behind-the-wheel screen, although this looked a tad distracting in Apple's early demos of next-gen CarPlay.  

The downside? The new Apple CarPlay isn't ready just yet, with Apple saying that more information will be shared soon ahead of a launch in 2023 — and that “vehicles will start to be announced late next year”.

Analysis: Will car makers be on board?

WWDC 2022

(Image credit: Apple)

Apple says that automakers are “excited” about the new version of CarPlay, but that's unlikely to be a universal reaction given the extent to which it takes over a car's entire UI.

While the next-gen CarPlay looks very polished and pretty, particularly with its customizable colors, it effectively shunts the car manufacturers to one side when it comes to dashboard design.

It also isn't yet clear whether it will work with today's CarPlay-compatible vehicles. Still, we're hopeful that there will be some element of backwards compatibility and that more developers will join the best Apple CarPlay apps.

CarPlay apps currently number in the dozens rather than the hundreds, but hopefully this new version of the software – which first landed over six years ago – might prompt more developers to join the likes of Spotify and WhatsApp in making CarPlay-specific versions.


Smartphones will kill off the DSLR within three years, says Sony

Smartphone cameras and DSLRs have been moving in opposite directions for the past few years, and image quality from phones will finally trump that of their single-lens reflex rivals by 2024, according to Sony.

As reported by Nikkei Japan, the President and CEO of Sony Semiconductor Solutions (SSS), Terushi Shimizu, told a business briefing that "we expect that still images [from smartphones] will exceed the image quality of single-lens reflex cameras within the next few years".

Some fascinating slides presented during the briefing were even more specific, with one slide showing that, according to Sony, "still images are expected to exceed ILC [interchangeable lens camera] image quality" sometime during 2024.

Those are two slightly different claims, with 'ILCs' also including today's mirrorless cameras, alongside the older DSLR tech that most camera manufacturers are now largely abandoning. 

But the broader conclusion remains – far from hitting a tech ceiling, smartphones are expected to continue their imaging evolution and, for most people, make standalone cameras redundant. 

A laptop screen showing a Sony slide on the future of mobile imaging

A slide from Sony's 'Imaging & Sensing Solutions' briefing, taken from the full presentation. (Image credit: Sony Semiconductor Solutions Corporation)

So what tech will drive this continued rise of the best phone cameras? Sony points to a few factors, including “quantum saturation” and improvements to "AI processing". Interestingly, Sony also expects the sensor size in "high-end model" phones to double by 2024.

The larger pixels on these sensors will, it says, allow phone makers to apply multi-frame processing that "realizes a new imaging experience", including improved Super HDR modes and zooms that combine folded optics (as on the Sony Xperia 1 IV) with AI algorithms. 

Sony also highlighted the development of its 'two-layer transistor pixel technology', which we heard about last year and which promises to drastically improve the dynamic range of phone cameras and help reduce low-light noise.

Similar advances are coming for video too, according to Sony's presentation, with the higher read-out speeds of next-gen sensors supporting 8K video, multi-frame processing (including video HDR) and a general realization of "AI processing for video". In other words, computational video techniques like Apple's Cinematic Mode.

While it isn't unusual for Sony to make bold predictions about a sector it's heavily invested in, there does seem to be substance behind its predictions for the continued evolution of phone cameras at the expense of DSLRs and mirrorless cameras. 

And that's significant for all smartphones, because according to Statista, Sony has 42% of the global image sensor market for phones, while teardowns of the iPhone 13 Pro Max show that it uses three Sony IMX 7-series sensors.


Analysis: Phones continue their meteoric rise

A diagram showing Sony's two-layer transistor pixel technology

(Image credit: Sony Semiconductor Solutions Corporation)

Predictions about the demise of DSLR cameras are nothing new – without saying anything explicitly, Canon and Nikon have both admitted that DSLRs are a legacy format by discontinuing some models, such as the Nikon D3500, without replacing them. But Sony's latest statements highlight that phone cameras still have a long way to go before they hit their tech ceiling.

The biggest advances in recent years have come in multi-frame processing, otherwise known as computational photography. But Sony was understandably keen to stress the role that new hardware will play in lifting phone cameras to new photographic heights.

Its prediction that the sensor sizes in high-end phones will double by 2024 is slightly surprising, given that this is limited by factors like lenses. For example, the Sony Xperia Pro-I became Sony's first phone to have a 1-inch sensor last year, but its lens wasn't able to project a large enough image circle to cover the whole of that sensor, so it could only take 12MP photos rather than the native 20MP resolution.

Perhaps more significant is Sony's new stacked CMOS sensor with two-layer transistor pixels, which effectively exposes each pixel to twice as much light as a standard sensor. This sounds like a hardware advance that computational algorithms could really get their teeth into in order to boost dynamic range and noise performance.
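To put a rough number on that 'twice as much light' claim, here's a back-of-the-envelope sketch. This is our own arithmetic, not something from Sony's briefing, and it assumes shot-noise-limited capture; the function names are ours.

```python
import math

# Back-of-the-envelope only: assumes shot-noise-limited capture, where
# noise scales with sqrt(photon count N), so SNR = N / sqrt(N) = sqrt(N).
def snr_gain_db(light_multiplier):
    # Doubling N multiplies SNR by sqrt(2): 20*log10(sqrt(2)) ~= 3.01 dB
    return 20 * math.log10(math.sqrt(light_multiplier))

def extra_stops(light_multiplier):
    # Each doubling of saturation capacity adds one stop of headroom
    return math.log2(light_multiplier)

print(round(snr_gain_db(2), 2), "dB")  # ~3.01 dB
print(extra_stops(2), "stop(s)")       # 1.0 stop
```

In other words, doubling the light a pixel can capture buys roughly a 3dB SNR improvement in the shadows and one extra stop of highlight headroom, before any computational processing is layered on top.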

But given how good the latest phones are at photography, the most noticeable advances over the next few years are likely to be in video. Sony's presentation highlighted this with references to multi-frame processing and its Edge AI platform, which promises to boost both video performance and support for augmented reality apps.

While DSLRs and mirrorless cameras will always have an audience among hobbyists and pros due to their handling, creative control, viewfinders and single-shot image quality, the kinds of advances outlined in Sony's presentation show that the next few years are going to be a particularly exciting time for phone cameras.


Google’s Magic Eraser is turning into a hassle-free Photoshop

The Google Pixel 6a may have been a relatively unsung part of Google I/O 2022, but one of its new camera features shows that Google clearly has Photoshop-rivaling ambitions.

We've previously seen its nifty Magic Eraser tool, which lets you quickly remove unwanted people or objects from a photo, on the Google Pixel 6. But now Google has announced that the tool is getting a new feature that'll let you change the color of objects in your photos with a tap.

The example Google gave at Google I/O 2022 was a beach photo with a garish, green icebox in the background. Rather than removing the object and ruining your composition, the enhanced Magic Eraser instead made the object's color and shading blend in naturally with the whole scene.

This might seem like a minor update, but it means that one tool now lets you make some pretty major photo edits – ones that a few years ago would have involved dabbling with masks and eyedropper tools – with one tap. And that means Magic Eraser, and its cousins Face Unblur and Motion Mode, are rapidly turning into Photoshop for people who don't like, or need, the real Photoshop.

The updated Magic Eraser tool will also no doubt spark some 'photography vs digital art' debates, ones that prospective Pixel 6a owners likely won't care about. For traditionalists, the line between the two is crossed when you start adding light or elements to a scene that weren't there at the point of capture – removing objects is one thing, but letting AI and its digital paint brush loose on your snaps is quite another.

But what Google is doing is clearly aimed at the point-and-shoot crowd. Magic Eraser is a next-generation healing brush, one that outstrips rivals like Snapseed and Photoshop – and that 'healing' now extends to the color palette of your photos.

Google vs Adobe

A phone screen showing Google's Magic Eraser tool

(Image credit: Google)

This means Photoshop is in something of an arms race with built-in tools like Google's Magic Eraser, which explains why Adobe recently hired the driving force behind Google's Pixel phones, Marc Levoy.

In a fascinating recent chat with Adobe's own Life blog, Levoy revealed that Adobe is working on a "universal camera app" that'll have some of the computational "sorcery" that we saw in those early Pixel phones.

But Adobe is actually taking the opposite approach to Google's Magic Eraser. Levoy said that while his role at Google was to "democratize good photography", his goal at Adobe is instead to "democratize creative photography". And that means "marrying pro controls to computational photography image processing pipelines".

Google's enhanced Magic Eraser, by contrast, falls firmly into that "democratizing good photography" camp, and it's something the tool is becoming increasingly adept at. Keen photographers often spend hours pondering the color palette of a scene or waiting for an opportune moment, but with Magic Eraser you'll soon be able to achieve a similar effect with a tap. And that's likely just the start of its talents.

Which weapon is mightier, the Magic Eraser or Photoshop? It depends which side of the photographic fence you're on, but there's no doubt that Google is winning the point-and-shoot side of the battle.


Google Lens’s new mode makes shopping dangerously easy

Google Lens already has lots of fancy AR tricks up its sleeve, but it'll soon get what might be its most useful feature yet, called Scene Exploration.

At Google I/O 2022, Google previewed the new feature, which it says acts as a 'Ctrl+F' shortcut for finding things in the world in front of you. Hold your phone's camera up to a scene, and Google Lens will soon be able to overlay useful information on top of products to help you make quick choices.

Google's example demo of the feature was shelves of candy bars, which Lens overlaid with information including not just the type of chocolate (for example, dark) but also each bar's customer rating.

In theory, this Google Lens feature could be super-powerful and a big time-saver, particularly for shopping. And Google says it runs on some smart real-time tech, including knowledge graphs that crunch together multiple streams of info to give you local tips.

The downside? Scene Exploration doesn't yet have a release date, with Google saying it's coming "in the future", with no precise timescale. This means it could be one to file next to Google Lens's earliest promises, which took a few years to mature. But it doesn't look like a huge leap from Lens's existing shopping tools, so we're hoping to see the first signs of it sometime this year. 


Analysis: one of AR's most useful tricks so far

A knowledge graph used by Google Lens

(Image credit: Google)

There's no doubt that Scene Exploration mode has massive potential for shopping, with old-school browsing in shops likely to increasingly take place from behind a phone screen – or perhaps ultimately, smart glasses.

But Google says it also has more benevolent applications. The feature could apparently help conservationists identify plant species that are in danger of extinction, or give volunteers a handy way to sort through donations.

Either way, it certainly looks like a powerful and intuitive development of another Lens feature that Google announced at I/O 2022, called Multi-Search. This allows you to combine an image search with a keyword to help you find obscure products or objects without needing to know their name.

Multi-Search arrived in Google Search last month (check in the Search app on Android or iOS), and you'll soon be able to use a more specific version called 'Near Me'. Google's example was taking a photo of a certain dish, and then being able to search local restaurants that serve that particular food.

You could argue that these kinds of features are turning us all into idiots, helplessly reliant on the crutch of Google's powerful Lens and Search tech. But features like Scene Exploration and Multi-Search do look like some of the most useful examples of AR we've seen, and their versatility should prove a boon for all kinds of users.

Now all we have to do is wait to see how long they take to fully materialize on Google Lens.


These powerful MagSafe mounts turn your iPhone into a GoPro

One of the big appeals of GoPros has always been the versatility of their mounting options, but some new iPhone MagSafe accessories have landed to bring some of that flexibility to your iOS smartphone.

The accessory maker Moment made some of the first MagSafe tripod mounts for the iPhone 12, back at the dawn of Apple's magnetic system. But now the company has made its most powerful add-ons yet, which include extra-strong magnets to keep your iPhone secure.

Like GoPro's Mod accessories, in particular its Media Mod, the Moment Mobile Filmmaker Cage ($99, or around £75 / AU$135) is designed to help you add various accessories – including microphones and lights – to your iPhone, while bringing extra stability to complement Apple's already impressive image stabilization.

It's the first video cage with MagSafe compatibility (which is built into the iPhone 12 and iPhone 13 series) and comes with a proprietary magnet array called (M)Force that's apparently significantly stronger than the standard MagSafe connection. 

Alongside two cold-shoe mounts for your mics or lights, the Cage is studded with 1/4-inch mounts (28 of them in total), so you can mount it on mini tripods in various ways. There are also four 3/8-inch threads that are compatible with accessories like Arca Swiss tripod plates.

With the ability to stand flat on a surface without a tripod and built-in cable management, it looks like the ideal MagSafe accessory for vloggers and YouTubers – particularly if you already have some of Moment's lenses and filters, and don't have the budget for a dedicated setup from the likes of GoPro or Sony.

Joining the Filmmaker Cage are a couple of other new MagSafe mounts – the Strap Anywhere Mount ($39, around £30 / AU$55) and Stick-on Adapter ($9.99, around £8 / AU$13). The latter is a 3M sticker accessory that makes any iPhone compatible with Moment's (M)Force mounts, even models that lack MagSafe wireless charging.

And the Strap Anywhere is much like GoPro's Tube Mount, letting you add a magnetic phone mount to fitness equipment like Peloton bikes, ellipticals or pretty much any tube-shaped surface. It looks particularly handy for more casual filmmaking or an easy way to FaceTime while you're mid-workout.


Analysis: The mobile filmmaking wars heat up

An iPhone attached to the Moment Strap Anywhere Mount

(Image credit: Moment)

The latest iPhones are already powerful filmmaking tools, but MagSafe accessories like these from Moment are improving their versatility – even pushing them towards being a rival to GoPro, albeit with slightly more fragility.

The Filmmaker Cage looks especially useful for capturing B-roll footage, or for turning your iPhone into a main camera for vlogging or YouTubing. The MagSafe component is more about usability and convenience than improving your videos, but if it makes you shoot more movies then that's a good thing. If you don't trust the magnets, it's also compatible with traditional phone clamps thanks to its 1/4-inch threads.

Moment's other MagSafe mounts also make it easier to attach your iPhone to objects for more interesting compositions and angles than you'd otherwise get from nervously holding a slippery rectangle.

So while iPhones have arguably slipped behind rivals like the Sony Xperia 1 III when it comes to pro-friendly filmmaking features, accessories like these boost their appeal for videographers, particularly when combined with some of Moment's lenses.

With the iPhone 14 now strongly rumored to arrive with a 48MP camera capable of 8K video, it seems the smartphone battle against the best vlogging cameras from the likes of GoPro, DJI and Sony is about to heat up.
