Astrophotography mode takes a 4-minute Night Sight exposure
I live in London, so it’s very hard to get away from light pollution. This was taken in Suffolk on my Pixel 7 Pro
That’s amazing for being in a city
oh, sick
Pretty good for a phone! I’m guessing it’s doing some kind of stacking process? I think there would be some streaking with a single 4 min exposure, but your stars look nice and sharp
I’m assuming there is some sort of stacking process involved in the software as well, but that’s just how it came out. It actually provided me with a one-second video of the process too, where you can watch the stars move a little. But the picture I posted is exactly as I got it off the camera; I posted it from the same phone I took it with.
Edit: here’s the video; I converted it to a GIF in the original orientation.
It seems to have improved since the Pixel 8. I started doing astrophotography with my phone and I’d always get star trails. Haven’t tried again since getting my Seestar lol, I love that thing
Yes, IIRC it’s 4 minutes’ worth of ~15-second exposures that get stacked together. Most smartphones from within the last 5 years should be able to do this too. I know on my Pixel 7 you just have to put the camera in Night mode and leave it still for a few seconds to trigger the astrophoto mode.
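If you’re curious what that stacking looks like in practice, here’s a minimal sketch of the align-and-average idea in Python with OpenCV. This is just the general technique, not Google’s actual pipeline — the file names, frame count, and translation-only alignment are all assumptions for illustration:

```python
# Rough sketch of "many short exposures, align, average".
# NOT Google's actual pipeline; file names/parameters are made up.
import cv2
import numpy as np

# e.g. 16 frames of ~15 s each ≈ the 4-minute astro capture
frames = [cv2.imread(f"frame_{i:02d}.png").astype(np.float32) for i in range(16)]

ref_gray = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
acc = frames[0].copy()

for frame in frames[1:]:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Estimate the translation of this frame relative to the reference
    (dx, dy), _ = cv2.phaseCorrelate(ref_gray, gray)
    # Shift the frame back onto the reference before accumulating
    m = np.float32([[1, 0, -dx], [0, 1, -dy]])
    h, w = gray.shape
    acc += cv2.warpAffine(frame, m, (w, h))

# Averaging N aligned frames cuts read/shot noise by roughly sqrt(N)
stacked = (acc / len(frames)).clip(0, 255).astype(np.uint8)
cv2.imwrite("stacked.png", stacked)
```

Note this only corrects for translation, which would also explain the “crops in to overlay” guess above — anything that rotates or stays fixed (like trees) would pick up artifacts at the edges.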
It probably crops in to overlay all images perfectly. I’m guessing if the trees were lit they’d be blurry.
The trees do have artifacts around them
Oh shit they do. Implementation confirmed.
So far I’m not impressed with the northern lights photography on my new Pixel 9 Pro XL. I’m coming from a Xiaomi 12, and the photos were better than with the Pixel. They come out rather dark.
Did you let the astrophotography mode go through? Make sure the moon on the phone turns into an icon with a few stars. You’ll know you’ve done it correctly if it starts a timer that runs up to four minutes.
Astrophotography on, Night mode.
HDR+ on?
I suppose you can’t turn HDR+ off on stock Pixel, or am I wrong? I’ll try GCam; maybe it will produce better results.
> you can’t turn HDR+ off on stock Pixel, or am I wrong?
Guess not. You can turn off Ultra HDR though.
You need to use a special long exposure mode
Some phones just insert stock photos
😦 Is that true?
Samsung GS22+ owner here. I’ve watched in realtime as a blurry blob moon photo I just took was replaced with a clean recognizable “moon” in my gallery, a few seconds after taking it.
It wouldn’t surprise me at all if this has gotten more advanced in recent years. Identify a few key stars as reference, and then paint in a pretty Milky Way from pre-defined images, while pretending to collect a timelapse.
Samsung themselves admitted to their moon pictures being faked a while ago, which makes me think these photos could also be faked in the way you’re describing (especially with all the effort Google has been putting into their AI).
Well of course their moon pictures are faked, the moon is fake
If the moon was made of cheese, which cheese variety would it be?
Wensleydale!
The stinkiest roquefort you’ve ever smelled. That’s why they put it so far away, because of how stinky it is
That’s awful. I want my pictures to be real.
It’s called post-processing. Generally it takes multiple frames and combines them, or pulls in other camera data.
In theory, if this were true, you should be able to place a light where the moon is, point the camera at it, and have it replaced with a moon. But that doesn’t happen.
I had to dig through my phone to find these photos when I got home from work, but they were taken literally seconds apart with my phone at full optical + digital zoom. The detail in the second is absolutely absurd, while the first is what I typically see on my screen when I’ve tried to replicate it since.
I reckon it’s just processing a raw capture. The only difference is exposure, which, yeah, phones could easily post-process.
I’m familiar with post-processing.
To be clear, the assumption is that the algorithms the phone is using to determine you were trying to take a photo of the moon are “smart” enough to identify it as a photo of a night sky focused on the moon, rather than a light bulb. I’m not sure how you’d set up a light of the correct brightness at infinity focus to test this, though.
ETA: I’ve never seen this post processing happen so starkly with anything other than a photo of the moon, so it sticks out pretty hard. And I take a lot of photos at work of things that are tough to capture clearly.
No.
k, gr8
This has been debunked
What about dmention7’s comment?
You got a better picture with your phone than I have ever got with my dslr setup and stacking software. I don’t know whether to applaud you or hate you lol.
This is not much better than what I can do with a single frame under the right conditions and with a little post tweaking. Canon Rebel T6s, an f/2.8 lens in the 20-35mm range, steady mount, remote (cabled) shutter, 45-60 s exposure, ISO 1600, and most importantly, Bortle 3 skies or better. You also get more star density by pointing at the galactic plane, specifically towards the center: the Teapot asterism, or Scorpius. Then play with the photo in Photoshop or GIMP.
Again, I cannot stress this enough: dark skies are the most important part. If the sky doesn’t wow you by eye, your picture isn’t going to wow you either. Check a light pollution map, and consider that cities 50+ miles away will still have light domes over them if you want anything clear near the horizon.
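For picking an exposure length, the classic “500 rule” gives one ballpark for how long you can expose untracked before stars visibly trail — I’m adding it here as a rough rule of thumb, not gospel; you can push past it (like the 45-60 s above) if you tolerate slight trailing:

```python
# "500 rule": rough max untracked exposure before stars visibly trail.
# Approximation only; pixel pitch and declination also matter.
def max_exposure_s(focal_length_mm: float, crop_factor: float = 1.0) -> float:
    return 500.0 / (focal_length_mm * crop_factor)

# APS-C body (crop factor ~1.6) with a 20 mm lens:
print(max_exposure_s(20, 1.6))  # ~15.6 s
# Same lens on full frame:
print(max_exposure_s(20, 1.0))  # 25 s
```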
I guarantee it wouldn’t have been this good if I weren’t in an area with no light pollution lol
Might be a stupid question, but is this mode available on GrapheneOS?
It is, with the Google Camera app. I just had it automatically trigger when shooting the night sky last night.
Seems not. Maybe GCam from F-Droid combined with the Google Camera app does.
No idea, it’s in the Pixel Camera app though
I have a 9A and didn’t know I could do this. Thanks.
Go to Night Sight on the camera and change the shutter setting to Astro, and it’ll take a 4-minute exposure
That worked great! Cool photo!
I’ve tried this with a DSLR and ended up getting light streaks, so I appreciate the wizardry involved in that mode
If I remember correctly, it avoids the streaks by taking several long-exposure photos and then combining them.
That would still create streaks; the software also needs to rotate the pictures a specific amount to keep that from happening. My guess is that it’s using the GPS location to some extent, in concert with the long exposures.
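For rough numbers on that rotation — back-of-the-envelope sidereal-rate arithmetic, not Google’s actual algorithm; where the celestial pole lands in your frame depends on latitude and pointing, which is plausibly where the GPS would come in:

```python
# The star field turns about the celestial pole at the sidereal rate,
# so sub-exposures taken over 4 minutes are rotated relative to each
# other. Generic arithmetic; the 15 s / 2000 px figures are assumptions.
import math

SIDEREAL_DAY_S = 86164.1             # seconds per full sky rotation
DEG_PER_S = 360.0 / SIDEREAL_DAY_S   # ~0.00418 deg/s

per_frame_deg = DEG_PER_S * 15.0     # one ~15 s sub-exposure
total_deg = DEG_PER_S * 240.0        # whole 4-minute capture
print(f"rotation per frame:  {per_frame_deg:.3f} deg")  # ~0.063 deg
print(f"rotation over 4 min: {total_deg:.3f} deg")      # ~1.003 deg

# Drift of a star 2000 px from the pole's position in the image:
drift_px = 2000 * math.radians(per_frame_deg)
print(f"edge drift per frame: {drift_px:.1f} px")       # ~2.2 px
```

So translation-only alignment works per frame (a couple of pixels of residual rotation at the edges), but over the full capture the field turns about a degree, which would explain artifacts on fixed foreground like the trees.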
I took the photo late in the evening, about 11:40pm local time.
I assume I didn’t catch any satellites because the sun was too far below the horizon to illuminate any of the near-Earth ones (Starlink), and since I was 30 miles from the nearest town in the Upper Peninsula of Michigan, I didn’t catch any city lights or low-flying aircraft.
TIL