Behind the Scenes: the story behind new features in Adobe Photoshop & Lightroom
This year, at Adobe's Max conference, the company announced several new AI features coming to Photoshop, Lightroom, and Adobe Camera Raw. We talked to some of the managers and engineers behind these products to learn how those features came about and to get a sense of what the future holds for Adobe's photo editing suite.
Lightroom
The Quick Actions UI gives you easy access to a variety of subject-specific edits.
One of the major new features for Lightroom Web and Mobile is called Quick Actions. It's a panel that lets you easily adjust various parts of your image, giving you different sliders and suggestions based on what type of subject it detects.
"It really started with a multi-year investment into masking," said Rob Christensen, director of product management for Adobe Lightroom and Adobe Camera Raw. "We had to make sure that masking was amazing. And so for multiple years, our R&D teams and our design teams came up with an experience that was outstanding. So once we had masking in place, and you could identify a subject, hair, lips, teeth, all of that, we realized, well, let's pair that up now with edits, and we'll call them Adaptive Presets."
Quick Actions essentially serve to make that work more visible and accessible. "With Quick Actions, what you're selecting in many cases are just Adaptive Presets that are relevant to that specific photo," Christensen said. "We're building from masking, Adaptive Presets, now Quick Actions. And it's all coming together now into a unified experience – that was our vision years ago, and now it's coming to life."
Christensen said that Adobe actually quietly launched the feature on the web a few months ago. "We didn't make a lot of noise around it, but customers have been using it on the web. Part of the reason why we brought it to Web first is it's just easier. We could get some additional feedback, we could do more experimentation; the web is very easy to iterate on."
"Part of the reason we also brought it to mobile is it's really designed for the mobile user, where they want to get to a quick result," Christensen said. "They don't necessarily want to go through all the different panels. In a mobile UI, a lot of things are hidden – but what if we could surface all of these advanced capabilities for mobile users to get to an edit? A bit of a goal over the last six months that's connected with Quick Actions is how do we help users capture, import, and share an amazing photo in under 60 seconds?"
At the moment, it's unclear if the feature will be coming to the dedicated desktop apps. "We're definitely looking at and listening to customer feedback. And so far, I think there's a lot of excitement, especially from desktop users. But we're not making any official announcements at this time."
The selection tool for Generative Remove has also been improved. Image: Adobe
Generative Remove, which lets you use AI to erase objects from a scene, is also now generally available across all versions of Lightroom. It's the type of thing you could easily do if you opened an image in Photoshop, but now you don't have to leave Lightroom.
"The way we think about what we're building with Lightroom is it's purpose-built for photographers," said Christensen. "So if they have a specific use case that is important for photography, we will look at bringing that into Lightroom. Distraction removal is a great example of an area that makes sense for photography. That's how a lot of customers are using generative remove today."
Finally, for Lightroom Classic devotees worried about any plans to completely replace the older-school version of the app with the new cloud-based Lightroom, Christensen seemed to offer some reassurance. "As it stands right now, we're continuing to innovate on both surfaces. We have a lot of customers on both that love the unique benefits."
Adobe Camera Raw
Left: Adobe Color. Right: Adobe Adaptive. Image: Adobe
One of the most compelling photography-related features announced at Max is the new Adobe Adaptive profile for ACR. It's meant to give you a better starting point for your own edits than older profiles like Adobe Color.
"One of the things that makes Adobe Adaptive unique is the fact that it's a lot more image content aware," said Eric Chan, a Senior Principal Scientist on the Adobe Camera Raw team. "In the past we would look at basic properties in the histogram and other attributes of the image. But with AI models now, we have a lot more semantic information about whether there's a person in it, whether there's a sky in it, etc."
That awareness helps it make base-level adjustments, giving you a better starting point to put your own edits on top. "It can do things like fix skies, fix backlit portraits, it can do things nicely with faces, and it can control a lot more attributes of the image than our previous profiles," Chan said.
You can control how intense the Adobe Adaptive look is using the 'Amount' slider.
Unlike pressing the 'Auto' button on other profiles, Adobe Adaptive doesn't change the sliders for parameters like exposure, contrast, highlights, etc.; those are still set to 0, allowing you room to do your own edits. "I think the other unique aspect is that there's an amount slider that's underneath the profile itself," said Chan. "You can do a quick edit. Like, I like what it's doing, but maybe it's too much, let's go to 80%, or maybe you want to go beyond, like 150%. But then there's the finer-granularity control, things like color panels that you can combine with that."
The company's also bringing its Generative AI features to ACR, including Generative Remove and Generative Expand, which lets you "go beyond the boundaries of your photo using the power of AI." In other words, you ask it to make your picture wider or taller, and it will try to fill in the space in a reasonable way. Any changes you make in ACR will also apply to the AI-generated portion of your picture, and the program will add a Content Credentials tag to the image, marking it as containing AI-generated content.
Generative Expand essentially lets you 'crop out,' filling the expanded frame with AI imagery.
Those are interesting features to see in Adobe Camera Raw since, as the name implies, the program has previously been dedicated to adjusting the data your camera captured. Editing content using AI or other tools has been the domain of Photoshop and, to a lesser extent, Lightroom, which has had the Generative Remove feature for a while.
We asked what the thinking was behind adding Generative AI to ACR and Christensen said: "With Lightroom and ACR we're trying to ensure that photographers can observe that moment as best they can. When we talk to customers, they feel it's unfortunate if they have 90% of an amazing photo, but it's just that 10% that is not how they remember the scene. Maybe because they couldn't get the camera at the right spot at the right time." He also reiterated that using the generative AI features was completely optional.
The line about making images according to people's memories isn't new; in fact, it's very similar to how phone manufacturers like Samsung and Google are talking about their generative AI features – it's just a bit odd to hear it in reference to an app dedicated to Raw photography. However, Christensen says there's a line between what you can do in ACR, and what you can do in Photoshop. "We are not introducing capabilities like Generative Fill, where you can say 'add an elephant flying from the sky with an umbrella.' That doesn't capture the moment; that's creativity."
Photoshop
This year, Adobe made several of its generative AI tools in Photoshop generally available and added a new "Distraction Removal" tool that can automatically remove wires, cables, and people from images. Removing wires can be done with a single click, while the people mode gives you the chance to refine the selection in case it selected people you still want in the picture, or didn't select people you want to get rid of.
The 'People' mode of the Find Distractions feature lets you decide which subjects you want to keep and add more subjects to remove.
According to Stephen Nielson, Senior Product Manager for Photoshop, Adobe plans to add an additional mode to the Distraction Removal tool to handle distractions other than people, cables and wires. "The way we've approached this is, first, the most popular thing that people want to remove from a photo is people. So tourists or people in the background or whatever," he said. "And so the categories that we're working on are first: people. Second: cables and wires because they're a pretty specific thing. And then there's a category of basically everything else."
Nielson says the 'everything else' category will work like the people one, where Photoshop will select what it thinks are distractions but let you add to or remove from the selection before hitting the remove button.
It's quite challenging to come up with a single model that can detect all sorts of distractions
Adobe's not currently announcing when that feature will roll out, as it's still in the process of building the model. "It's quite challenging to come up with a single model that can detect all sorts of distractions, whether it's somebody's shoulder that's in the image, or a garbage can, or a pile of leaves, or a random bicycle. It could be anything, right?"
According to Nielson, the training process involved a lot of human work. "We actually give pictures to people and say, 'Which objects are distracting?' You do that enough times, and you can train a model to say, 'Hey, this is what people usually say is distracting,'" he said. "That's not the only kind of data that's included in our training data set, but a lot of it is, like, hey, somebody's gone through and annotated data to suggest which objects are distracting."
If you want to use the Remove tool without generative AI, you can.
Like many features in Photoshop, Distraction Removal can take advantage of Adobe's generative AI, though it's not 100% reliant on it. "It actually can either use Content-Aware Fill or Generative Fill technology," said Nielson. "We've built an automatic selector that will, based on what you've selected and are trying to remove, automatically choose either Content-Aware Fill or Generative Fill, depending on which one's best."
Adobe has also added a drop-down menu that lets you manually select whether you want any part of the Remove tool, including the Distraction Removal feature, to use Generative AI or Content-Aware Fill. Nielson, however, recommended leaving it on auto. "Content-Aware Fill is better for areas with similar textures, areas where there's lots of noise, or higher resolution images. Whereas Generative Fill is really good at details, which Content-Aware Fill just isn't good at. So there's a good use case for both, and the auto selector we have allows the algorithm to choose which one's going to be best."
We think generative technology is huge, but it's not the answer for everything
Nielson thinks Generative AI will play a big part in future Photoshop features, but it won't be the only way the company improves the program. "There's still a lot of areas where we think generative technology is going to dramatically simplify things that were previously tedious and time-consuming inside Photoshop and give you more time to be creative."
The company showed off one such example at its Sneaks presentation, which showcases tech demos that may or may not actually make it into Adobe products in the future. The demo, nicknamed 'Perfect Blend,' automatically matches lighting and color between objects you're photoshopping into a background.
"But there's also going to be a lot of other non-gen AI improvements that we want to put into Photoshop," Nielson said. "Just making the application run smoother, faster, be more efficient, speed up workflows with non-genitive technology. We think generative technology is huge, but it's not the answer for everything. So, there's a lot of other things that we are planning just to make the app better."