Google has a new camera mode on its Pixel phones called Night Sight, for handling low-light conditions. The short version is that instead of taking a long exposure, it takes a series of short exposures and stacks them to avoid motion blur from hand movements (plus a lot of additional processing). The long version is fascinating.
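The core idea — that averaging many short exposures suppresses noise without the motion blur of one long exposure — can be sketched in a few lines. This is a toy illustration, not Google's pipeline (the real thing also aligns frames, handles motion, and does far more processing):

```python
# Toy sketch of burst stacking: pixel-wise mean of several short exposures.
def stack_frames(frames):
    """Average equally sized frames (each a flat list of pixel values)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# Three noisy "exposures" of the same 4-pixel scene:
burst = [
    [10, 52, 100, 201],
    [12, 48, 102, 199],
    [11, 50, 98, 200],
]
print(stack_frames(burst))  # → [11.0, 50.0, 100.0, 200.0]
```

Random sensor noise averages toward zero across the burst, which is why the stacked result looks like a cleaner long exposure.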

My Pixel 2 already takes better low-light photos than I would have expected, but I couldn’t wait to try the new feature once I learned about it. When the updated Camera app finally hit the Play Store, I downloaded it right away.

Two high-rise office buildings at night.

This one is adjusted slightly to keep the colors from washing out. It actually isn’t the best example, since it turns out the nighttime city scene already has enough light for the existing HDR+ mode. I’ll have to try it on some darker scenes, but it’s still pretty cool. After the cut I’m posting a version taken with the phone’s regular mode, along with the unaltered Night Sight photo. You can see they’re pretty close, but the Night Sight version picks up a bit more of the color, and it’s a little brighter.


I’ve had the “Google Assistant” on my phone for a few weeks now. Since I don’t use the always-on voice activation, this means it’s pushing extra notifications based on what it thinks I want/can use. Fortunately it doesn’t do audio alerts, so it’s a lot less intrusive than it could be. I figured I’d give it a try and see if it turned out to be useful (or creepy).

The alerts I’ve gotten fall into the following categories:

  • Estimated commute time based on current traffic. This would be more useful if it wasn’t based on the freeway, which I never use to get to or from work because it’s such a pain. Though on a trip to San Francisco, it popped up transit delays, which would have been helpful if I’d actually been going anywhere beyond walking distance that day.
  • Weather changes. This is kind of useful, but I have a widget to show the same info.
  • Hours and offers for stores I have just left. At least three times, I’ve walked around a grocery store or Target for 30-40 minutes, consulting my shopping list on my phone all the way, and it has popped up with info when I load up the car. What’s the point of that?
  • Potentially useful information for someplace nearby, but that I’m not going to. I work near an airport, and it’s repeatedly sent me the terminal layout. (The one time I actually went to the airport, I was already on a shuttle to the remote terminal — which isn’t on that map, incidentally — before it sent me that one. Better than the return trip, though, when it sent me a map after I’d boarded the plane.) Once it tried to help me with a mall restaurant while I was at a different restaurant. Another time it sent me info about a hotel I had driven past.
  • News articles about Trump. As if they’re hard to find. No further info in the notice, just “There is a new article about President Trump.” (Or something along those lines — I don’t recall the exact phrasing.)
  • Premiere dates for two TV shows that I watch. This one impressed me, since it correctly picked out two of three returning shows that I watch, and has not tried to plug anything else. It makes me wonder what it’s mined to figure that out, but I’m impressed it caught the nuance of which two DC/CW shows I watch. (OK, Flash is easy, but it somehow figured out that I was interested in Supergirl but not Arrow or Legends of Tomorrow.)

At this point, I think the experiment has run its course. The only category that’s been consistently useful is the TV premiere schedule… and that only comes up a couple of times a year.

Google Photos is overcompensating for the Daylight Saving Time switch on yesterday’s pictures: photos taken at 6:00pm are labeled as 7:00pm. Everything from this summer and early fall (which might as well have been summer) is off too, at least in the app (the website shows the correct times in PDT), which at least makes more sense than if it had only been one day. The weird thing is that it’s not even showing the time in today’s timezone by mistake. It’s adjusting it the other way.

My best guess: There used to be a DST bug in Google Photos, but they adjusted for it. Now the bug’s been fixed, but the adjustment is still there.
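That guess can be sketched in a few lines. The dates and offsets below are illustrative, and the leftover “+1 hour workaround” is purely my hypothesis about what the app might be doing:

```python
from datetime import datetime, timedelta, timezone

# Pacific Daylight Time as a fixed offset (UTC-7), for illustration.
PDT = timezone(timedelta(hours=-7))

# A photo taken at 6:00pm PDT on July 14 is stored as 1:00am UTC July 15.
taken_utc = datetime(2018, 7, 15, 1, 0, tzinfo=timezone.utc)

# Correct rendering: convert back to the zone the photo was taken in.
correct = taken_utc.astimezone(PDT)  # 6:00pm

# Hypothetical stale workaround: an extra hour once added to cancel a
# DST bug. With the underlying bug fixed, the adjustment now pushes
# every timestamp an hour late -- 6:00pm displays as 7:00pm.
buggy = correct + timedelta(hours=1)

print(correct.hour, buggy.hour)  # → 18 19
```

Which would explain why the error goes the “wrong” direction: it isn’t a timezone conversion mistake at all, just a correction that outlived the bug it was correcting.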

Obligatory XKCD link:

Google Photos has been sending me its usual “Rediscover this day!” collages from Comic-Con 2013. On Tuesday it sent me a collage built from July 18, and on Thursday it sent me a collage built from July 20, marking Thursday and Saturday of the event.

Wait, what about Friday?

Well, here’s the interesting thing: Friday was the day I spent the afternoon in the emergency room.

I’d like to think someone programmed the algorithm to skip photos tagged for hospitals. Imagine if it was my wife’s photo collection, and I hadn’t made it — “Rediscover this day” would have been rather cruel in that case.

The more likely explanation is that I don’t have very many photos from that day for it to pick from, at least not in that account. On my computer I have 19 photos from my phone, 19 from my camera, and 5 from Katie’s phone. Of those, I picked 21 for my Flickr album. In Google Photos, I only have six from that day, all from my phone — and they don’t include the ER wristband shot.

My best guess is that I cleared most of them from my phone sometime before I started using the cloud backup feature… but I can’t figure out why I would have kept the photos that are there, and not some of the photos that aren’t.

Here are the photos that are on there:

It’s an odd collection, isn’t it? The three individual frames from the GoT protester collage are in there, so maybe the collage app saved extra copies of the sources, and Photos found the folder. I can see deliberately keeping the view of the blimp from the convention floor (literally the floor), but if I’d done that, I don’t know why I wouldn’t have saved the wristband shot as well. And why save the Duplo Enterprise, but not the photos of my son playing at the LEGO booth?

Phone cameras and cloud storage are supposed to augment our memory. But sometimes the context is just missing.

Google has released the first taste of what will become a larger Google+ API for third-party applications built on their social network. So far, all you can do is authenticate, retrieve someone’s public profile, and read their public activities. That doesn’t sound like much, does it?

Well, here are some ideas I came up with over lunch:

  • Add Google+ activity to a lifestream.
  • Allow someone to comment on your blog using their Google+ identity.
  • Create a map of someone’s movements based on public check-ins.
  • Analyze posting frequency & times.
  • Analyze most popular posts based on reshares, +1s, replies (basically: add Google+ to Klout [Update: That was fast!])
  • Associate a person with other profiles you might have from other social networks, based on their profile URLs.
  • Build a list of people who work at an organization and speak a particular language.
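Some of these are trivial to prototype once you’ve fetched the data. Here’s a minimal sketch of the posting-frequency idea, assuming you already have the `published` timestamps from someone’s public activity feed (the sample data below is invented):

```python
from collections import Counter
from datetime import datetime

# Invented timestamps standing in for a fetched public activity feed.
published = [
    "2011-09-15T17:03:00Z",
    "2011-09-15T21:40:00Z",
    "2011-09-16T17:12:00Z",
    "2011-09-17T02:55:00Z",
]

posts = [datetime.strptime(p, "%Y-%m-%dT%H:%M:%SZ") for p in published]

by_hour = Counter(p.hour for p in posts)               # when do they post?
by_weekday = Counter(p.strftime("%A") for p in posts)  # which days?

print(by_hour.most_common(1))  # → [(17, 2)]
```

Swap the invented list for real API results and the same two counters give you an hour-of-day and day-of-week posting profile.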

Of course, it’ll really start taking off when they enable write access and the link-sharing and cross-posting services can get in on the act.

So, how about you? What else do you think can be done with the limited API released today?