Does the Galaxy Camera tell a bleak story for the future of Android smartphone cameras?

Taylor Martin
 from  Concord, NC
| August 31, 2012

Two days ago in Berlin, Germany, Samsung announced two show-stopping products: the Galaxy Note II, the successor to its strangely popular and gigantic phablet, and an Android-powered hybrid point-and-shoot camera called the Galaxy Camera.

The allure of the Galaxy Note II is obvious. It's a productivity-focused, pocket-sized tablet that doubles as a smartphone. The S Pen, while it could definitely be considered a gimmick, gives the Note II a leg up over its smaller brethren, such as the Galaxy S III. It's a powerful device, and the large battery should keep it ticking all day long. In short, the Note II shows that Samsung can march to the beat of its own drum and isn't afraid to test different waters.

But I simply cannot wrap my head around the other device, nor can I comprehend why some people find it to be so … fascinating. The Galaxy Camera is definitely not the first Android-powered point-and-shoot camera. Panasonic beat Sammy to the punch with the LUMIX Phone. And I'm sure Panasonic wasn't the first company with such an idea, either.

That said, I couldn't care less about who was the first to think of using Android to power a dedicated camera. Instead, I'm more concerned with the quality of images the Galaxy Camera can produce, its price point, and the fact that Samsung is doing all it can to delay the death of the point-and-shoot camera.

Earlier today, Android Central posted a gallery of full-resolution images taken with the Galaxy Camera. At thumbnail size, the images aren't so bad. Even at my MacBook's native display resolution (1440 by 900 pixels), the images are okay – nothing to be utterly impressed or repulsed by. But at 100 percent zoom, full-size, the images show their true colors. To be frank, they're horrendous, especially for a dedicated camera.

Despite the larger sensor and decent lens (F2.8, 23mm), the Galaxy Camera yields images that are hardly any better than those captured with my HTC One X. (And to be completely honest, I'm only impressed with the One X camera on rare occasion.) The major difference is the output size of the pictures – the Galaxy Camera yields pictures that are just shy of double the resolution of those taken with a standard 8-megapixel smartphone camera.

But why waste money on the Galaxy Camera if the quality is not significantly better? And why carry around a second device dedicated to taking photos if its photos are not distinguishable from ones taken with a smartphone?

Some will argue that the Galaxy Camera is capable of taking pictures that are just fine. For all intents and purposes, it probably is. And anyone serious about taking pictures shouldn't worry about a run-of-the-mill point-and-shoot camera anyway. But it doesn't take a trained eye to see the lack of detail and significant level of noise in this picture, even at a reduced size. And the obvious compression of the defocused area in this picture is enough to send a chill down my spine. Don't even get me started on this one.

"Okay, Taylor. We get it. You don't like the Galaxy Camera. What's your point?"

I promise I have one. Cross my heart.

The Galaxy Camera has everything we thought would make the cameras embedded in Android smartphones better: a larger sensor, better optics, optical zoom. Yet I can't say I would prefer taking pictures with the Galaxy Camera over something a little slimmer (and already connected) like my One X, let alone the iPhone 4S. It's just as susceptible to atrocious noise in low light as your average smartphone, and even in the best conditions I fear it can't produce much better results.

In all of this, however, there is a common denominator: Android. I'm no expert, I'm not a photographer and I'm not a software engineer who focuses on image-processing software. If I were to guess, though, I would say the answer to all of this lies somewhere in the Android source code. The chipsets used could also be the culprit. But I would put my money on Android and a compression algorithm gone awry being the issue.

And that could spell a bleak future for cameras in Android devices for the foreseeable future. While Apple has struck gold with the iPhone 4 and 4S cameras (lest we forget, they're among the most popular cameras used to take pictures uploaded to Flickr) and Windows Phone is on the verge of being graced with Nokia's PureView technology, a dedicated camera with a large sensor and a decent lens still isn't enough to produce a half-decent picture using Android.

Call me a hater, a skeptic or whatever you want. But after having used upwards of 70 Android devices, I have yet to be consistently impressed by a single Android phone's camera. I haven't had actual hands-on time with the 808 PureView, but samples taken in my colleagues' reviews were impressive, to say the least. And though it's still susceptible to some bad photos at times, I am blown away by the quality of the iPhone 4S shooter time and time again. I use it to take 95 percent of the pictures I upload to my networks and have next to no qualms with it.

The Galaxy Camera convinces me that hardware isn't necessarily the issue with Android cameras, especially considering the Galaxy S III and iPhone 4S share the same image sensor yet are capable of producing vastly different quality images. (Correction: the Galaxy S III and iPhone 4S do not share the same sensor. Somehow I missed that memo and have been spreading lies.) But hey, I've been wrong before.

What say you, folks? Does the Galaxy Camera come off as a must-have point-and-shoot camera to you? Or do you feel your smartphone's camera is just fine? Do you think the quality of images the Galaxy Camera produces, despite its improved hardware, could tell a dismal story for cameras in Android smartphones to come?