Same, except for list comprehensions in Python, where I prefer single-character var names.
nicky7
I was taking the comment thread (about how dangerous this could be in photographic evidence) a step further by imagining a hypothetical techno-dystopian future where corporate-controlled AI alters photos to make them look better, but in reality creates a back door through which incriminating evidence can be fabricated.
edit: since it wasn't obvious to readers, this is a hypothetical of a techno-dystopian future...
Imagine taking a selfie only to see an image of you holding a knife. But there is no knife in your hand. Another snap. The same image displays on the screen, but now there's a person of particular importance in the background. You turn your head, but you're all alone. Nobody is around. You're starting to freak out. Are you being pranked? Maybe your phone has been hacked. Another shutter sound effect, and you see an image of yourself standing over a victim. You frantically open your camera's gallery, thinking your eyes are fooling you, but the photos are the same. And they've been sent to the cloud. Deleting isn't allowed; the AI detected felonious imagery. You've been reported to multiple agencies. You are alone. There is no knife in your hand.
I feel like they're being disingenuous. Lots of whataboutisms, moving goalposts, and ignoring the issues that got us to needing right-to-repair laws in the first place, namely Apple and John Deere and all the copycats, but also the goal of reducing e-waste.
🤣