Hacker News

I took a photo of my cat inside my house, with nothing visible outside except the sky, stripped the EXIF, and it STILL managed to get within a few hundred metres of my location - just by inferring from my interior design and the layout of my house.

I’m sure there was an element of luck involved but it was still eerie.




Not sure if this is true, but people have pointed out that it uses data from your past conversations to make a guess.


It’s true. Obviously I can’t post proof without doxxing myself, but I understand the skepticism - I’m not sure I’d believe it if I hadn’t seen it myself.

I have no memories stored, and in any case it shouldn’t know where I live exactly. The reasoning output didn’t suggest it was relying on any other chat history or information outside the image, but obviously you can’t fully trust it either.


Yeah, I had to turn off chat history after I spotted it doing that.


Also wondering if, as another commenter mentioned, it might be estimating your location purely by network means (e.g. your IP address).


It absolutely does that - o3 knows your current location based on IP address etc. This means for a fair test you need to use a photo taken nowhere near your current vicinity - that's why I added examples for Madagascar and Buenos Aires at the end of my post: https://simonwillison.net/2025/Apr/26/o3-photo-locations/#up...


And of course make sure you turn off geotagging in the EXIF :)

But really, if Google Street View data (or similar) is part of the training dataset, it's entirely expected that it would have this capability.
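For anyone wanting to verify the geotag point above before uploading a photo: EXIF metadata (including GPS coordinates) lives in a JPEG's APP1 segment, so dropping that segment removes the location data. This is a minimal stdlib-only sketch for illustration; in practice you'd more likely reach for `exiftool` or a library like Pillow, and this assumes a well-formed JPEG with no EXIF you want to keep.

```python
def strip_exif(jpeg: bytes) -> bytes:
    """Drop APP1 (EXIF, incl. GPS tags) segments from a JPEG byte string."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            # Unexpected non-marker byte; copy the remainder verbatim
            out += jpeg[i:]
            break
        marker = jpeg[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows
            out += jpeg[i:]
            break
        # Segment length field counts itself (2 bytes) but not the marker
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if marker != 0xE1:  # keep everything except APP1 (where EXIF lives)
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    return bytes(out)
```

This also strips non-GPS EXIF (camera model, timestamps), which for this use case is arguably a feature.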


Thanks! Looks like I missed that last part somehow :)



