You are scrolling through your feed, and you stop. It’s a photo of a politician doing something wild, or maybe just a landscape that looks a little too perfect. You pause. Is this real?
Two years ago, you could tell instantly. The faces looked like melted wax. Today? Midjourney v6 and Flux are terrifyingly good. They get the lighting right. They get the skin texture right. They can even do text now (mostly).
But they aren't perfect. If you know where to look, the cracks still show. I’ve spent the last few months staring at thousands of generated images to build our detection tools, and I’ve noticed patterns that machines just can't seem to shake.
Here is a practical checklist to help you spot the fakes without needing a degree in computer science.
1. The Hands (Still the biggest tell)
Yes, AI has gotten better at hands. We don't see seven fingers as often as we used to. But look closer.
Real hands have tension. When you grip a coffee cup, your knuckles turn white, and your skin presses against the ceramic. AI hands often look like they are "floating" near the object rather than holding it. Also, check the fingernails. AI tends to make them look like perfect, unblemished plastic, or sometimes they merge into the finger itself.
The Check: Count the fingers, yes. But also ask: "Is that hand actually applying force?"
2. Text and Background Signs
AI used to output alien gibberish. Now, it can spell "STOP" on a stop sign. But it struggles with the boring stuff in the background.
Look at street signs, store logos, or newspapers in the background of the image. AI focuses most of its capacity on the main subject (the person). The background is often an afterthought. You’ll often see letters that look like a mix of Cyrillic and English, or a Starbucks logo whose siren has two heads.
The Check: Zoom in on the boring background text. If it looks like a dream-state alphabet, it’s fake.
3. The "AI Glaze"
This is harder to describe, but once you see it, you can't unsee it. AI images often have a specific sheen. Everything looks slightly wet or made of high-quality plastic.
In photography, skin absorbs and reflects light in complex ways (subsurface scattering). AI approximates this but often dials up the contrast and smoothness to 11. Everyone looks like they just had a professional facial and stepped out of a humid room.
The Check: Does the skin look like skin, or does it look like a high-end video game character?
4. Logic Failures
Physics is hard for models. They understand pixels, not gravity.
I saw a generated photo recently of a beautiful library. It looked perfect until I checked the shadows. The light was coming from a window on the left, but the shadow of the chair was also falling to the left, back toward the light source. That’s physically impossible.
Also, look for accessories. Earrings that don't match. Eyeglasses where one lens is a slightly different shape than the other. Or a belt that disappears into a shirt and never comes out the other side.
The Check: Follow the lines. Do the shadows make sense? Do the accessories connect?
Using a Tool When Your Eyes Fail
Sometimes, the image is just too good, or the resolution is too low to count fingers. That’s where you need a second opinion.
We built the AI Image Detector for this exact moment. You upload the image, and instead of just saying "Fake" or "Real," it analyzes the noise patterns and compression artifacts that are invisible to the human eye. It gives you a probability score, which helps you verify your gut feeling.
It’s not magic, but it catches the subtle mathematical traces that generators leave behind.
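If you’re curious what "noise patterns" even means here, you can poke at the idea yourself. A crude version is to subtract a blurred copy of the image from the original and look at what’s left: real camera sensors leave fairly uniform high-frequency noise, while generators often produce residuals that are unnaturally clean or oddly structured. Below is a toy Python/NumPy sketch of that residual check (an illustration of the general idea, not our detector’s actual method):

```python
import numpy as np

def noise_residual_score(img: np.ndarray) -> float:
    """Score the high-frequency noise left after removing a 3x3 box blur.

    `img` is a 2-D grayscale array of floats in [0, 255]. A higher score
    means more sensor-like noise; suspiciously low or patchy residuals
    are one (weak) hint that an image may be generated.
    """
    h, w = img.shape
    # Edge-pad by one pixel so the blur is defined at the borders.
    padded = np.pad(img, 1, mode="edge")
    # 3x3 box blur: average the nine shifted neighbourhood views.
    blurred = sum(
        padded[dy:dy + h, dx:dx + w]
        for dy in range(3) for dx in range(3)
    ) / 9.0
    residual = img - blurred
    return float(residual.std())
```

A perfectly flat image scores 0.0, and a photo straight off a sensor scores noticeably higher. Real forensic tools go much further (per-block JPEG statistics, learned noise fingerprints), but the subtraction trick is the core intuition.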
When this checklist won't help
I want to be honest about the limits here.
- Heavily Edited Real Photos: A real photo with heavy Photoshop filters can trigger your "fake" radar. Heavily airbrushed magazine covers often look like AI.
- Low Resolution: If an image is 300x300 pixels, it’s blurry. You can't check fingernails or background text. In these cases, it’s best to remain skeptical and look for the source rather than analyzing the pixels.
FAQ
Can AI generate perfect text now?
It's getting there. Models like Ideogram are built specifically for text, but they still slip up on long sentences or background details.
Does this work for video?
Video is a whole different beast (Sora, Kling). The "glaze" texture is usually more obvious in video, along with weird morphing when objects move behind other objects.
Why do AI images look so shiny?
It’s likely a bias in the training data. The models are trained on high-quality, aesthetically pleasing images, so they tend to over-smooth and over-saturate everything to make it look "good."
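That over-saturation is easy to quantify. Here is a minimal sketch (a hypothetical helper, pure NumPy) that computes the mean HSV-style saturation of an RGB image; comparing this number across a batch of real photos and generated ones is one way to see the bias for yourself:

```python
import numpy as np

def mean_saturation(rgb: np.ndarray) -> float:
    """Average HSV-style saturation of an H x W x 3 RGB array in [0, 1].

    Saturation per pixel is (max - min) / max over the three channels,
    defined as 0 where the pixel is black.
    """
    mx = rgb.max(axis=2)
    mn = rgb.min(axis=2)
    # Avoid division by zero on black pixels.
    safe_mx = np.where(mx > 0, mx, 1.0)
    sat = np.where(mx > 0, (mx - mn) / safe_mx, 0.0)
    return float(sat.mean())
```

A pure gray image scores 0.0 and a pure primary color scores 1.0; in my (anecdotal) experience, generated "photos" tend to cluster toward the high end compared with unedited camera output.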
Conclusion
We are entering an era where "seeing is believing" is a dangerous mindset. You don't need to be paranoid, but you do need to be observant.
Next time a photo makes you feel a strong emotion—anger, awe, or shock—take five seconds. Check the hands. Check the background. Check the shadows. If it feels off, it probably is.