Melbourne continued to serve up some globally warmed weather over the weekend, so we put the kid in the car and went to Cherry Hill Orchards. It was lovely, and a good reason to dust off my Fuji X-T3 and take some photos of our two-year-old running among some pleasant cherry trees.
As expected, quite a few Melburnians had the same idea. This made getting clean shots of my daughter difficult: there was almost always something or someone messing up the frame (how dare they).
One particular shot I liked contained many distracting elements: people picnicking in the background, another kid wandering aimlessly, etc.
I haven't done any professional photography work for quite some time, but Affinity Photo was my previous weapon of choice when it came to editing. It contains 95% of what Adobe Photoshop offers for a very reasonable one-off purchase. So I fired it up and got to work with the clone tool. I quickly realised this was going to be difficult: there were not enough clean sample pixels to clone over the unwanted subjects.
Now, if I really wanted to clean this image up the traditional way, I could have opened other files to find more clean samples of grass to use when cloning, but that approach brings its own challenges in matching lighting, perspective and depth of field. Too much work.
Then I remembered seeing a video somewhere of Photoshop's new generative AI tools. I found it and watched it again, and it was impressive. In theory, this digital witchcraft could do exactly what I wanted with a few clicks.
I'd already used my free trial time with Adobe, so I'd have to pay to give it a spin. I was pleasantly surprised to find you can get Photoshop and Lightroom for $14 AUD per month. Pretty fair, actually.
I installed all of Adobe's Creative Cloud bloat and then, finally, Photoshop. I opened up the above image, did a few crude lasso selections, clicked "Generative fill" on the floating toolbar, then, ignoring the text prompt, clicked "Generate".
A progress bar appeared as the image was sent off to Adobe's billion-dollar army of Nvidia GPUs.
A few seconds later, everything I wanted gone was gone, seamlessly replaced by synthetic pixels that looked very convincing at first glance.
The final image:
In the end it wasn't click-and-done: I did multiple generative fills, cleaned some things up manually with the clone tool, and did a bit of colour adjusting.
I'm very impressed with how Adobe has integrated AI into Photoshop. They've also taken the high road when it comes to copyright issues: unlike others, Adobe only trains its AI model on images it is legally allowed to use.
I think this feature alone makes Photoshop the obvious choice once again when it comes to professional editing. No one else has anything like it. It reduces hours of tedious work to a few seconds, with results almost certainly better than I could have achieved by hand.
Many are worried about how this will impact the photography industry. I'm just a hobbyist now, but as you can see in the layers panel above, there was still plenty of work to do to get the image into its final form. Work that required some knowledge and skill.
I'm looking forward to seeing how this evolves, but even now, at $14 a month, I can't see why a photographer wouldn't want this tool in their hands.