Your average smartphone camera is around 12 megapixels. This image of Shanghai is 195 gigapixels. One megapixel equals one million pixels; one gigapixel equals one billion pixels.
Put another way, this image is 16,250 times larger than a photo you can take with your smartphone.
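As a quick sanity check on that ratio (a sketch in Python, assuming exactly 12 megapixels and 195 gigapixels):

```python
# How many phone photos fit inside one BigPixel image?
bigpixel_pixels = 195_000_000_000  # 195 gigapixels
phone_pixels = 12_000_000          # 12 megapixels

ratio = bigpixel_pixels / phone_pixels
print(f"{ratio:,.0f}x")  # 16,250x
```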
Let’s see what that feels like through the BigPixel viewer, looking at the rooftop of a building a few miles away.
How about people on the street?
Coordinated outfits or a glitch in the matrix? You decide.
For a sense of how invasive this technology could be for privacy, try the BigPixel experience on a beach.
You might have noticed that the images look a little weird. That’s because every part of the image stays in focus at every zoom level. It appears to use technology similar to what Lytro built to refocus an image after it has been taken.
The focal point of the image looks like it gets altered and then the image gets stitched back together. There are some minor bugs that create ghosts in the images.
Maybe… but not for a while; there are still a few technical problems to solve. This takes what we used to think of as big data to an epic-data scale.
The reason big new things sneak by incumbents is that the next big thing always starts out being dismissed as a “toy.” — Chris Dixon
I remember back in 2004 hearing about a Stanford research project that was attempting to take a photo of every address. I thought it was an impossible project. There was no way they could possibly photograph every street address… right? Not to mention it would cost a fortune to store all that data.
The sheer scale of doing it for a country, let alone the world, was mind-boggling. Then, in 2007, Google Street View launched.
Let’s assume no image compression and that 1 pixel takes up 1 byte. Storing 195 gigapixels (195,000,000,000 pixels) would then take 195 gigabytes. So if you wanted to take 1 image a second, that amounts to 16,848,000 GB / day, or 6,149,520,000 GB / year.
1,000 gigabytes = 1 terabyte
1,000 terabytes = 1 petabyte
1,000 petabytes = 1 exabyte
So that’s 6.1 exabytes a year for one camera that takes 1 image a second.
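The storage math above can be sketched in a few lines of Python (assuming, as above, 1 byte per pixel, no compression, and a 365-day year):

```python
# Back-of-envelope storage math for a 195-gigapixel camera
# capturing 1 image per second.
PIXELS_PER_IMAGE = 195_000_000_000  # 195 gigapixels
BYTES_PER_PIXEL = 1                 # assumed: uncompressed, 1 byte/pixel
SECONDS_PER_DAY = 86_400
DAYS_PER_YEAR = 365

gb_per_image = PIXELS_PER_IMAGE * BYTES_PER_PIXEL / 1_000_000_000  # 195 GB
gb_per_day = gb_per_image * SECONDS_PER_DAY    # 16,848,000 GB / day
gb_per_year = gb_per_day * DAYS_PER_YEAR       # 6,149,520,000 GB / year
eb_per_year = gb_per_year / 1_000_000_000      # ~6.1 exabytes / year

print(f"{gb_per_day:,.0f} GB/day, {eb_per_year:.1f} EB/year")
```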
Is that a lot?
Well, let’s say you are lucky enough to have a Google Fiber connection with 1 Gbit/s upload speeds. In theory, that single image would take 1,560 seconds, or 26 minutes, to upload.
A single BigPixel image would take 26 minutes to upload on a Google Fiber connection.
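Here is that upload time worked out (a sketch assuming a perfectly sustained 1 Gbit/s link with no protocol overhead):

```python
# Time to upload one uncompressed 195 GB image over a 1 Gbit/s link.
image_bytes = 195_000_000_000        # 195 GB at 1 byte/pixel
link_bits_per_second = 1_000_000_000  # 1 Gbit/s upload

seconds = image_bytes * 8 / link_bits_per_second  # bytes -> bits
print(f"{seconds:,.0f} s = {seconds / 60:.0f} minutes")  # 1,560 s = 26 minutes
```

And that is for a single frame; at 1 image per second the camera produces data 1,560 times faster than this link can ship it.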
Transferring the data is going to require faster internet. A weekly Amazon Snowmobile collection would also work; Snowmobile is a 45-foot-long ruggedized shipping container for getting data into AWS.
Extracting value from the data is something more resources can be thrown at with parallelisation of the work. I wouldn’t want to pick up that AWS bill though.
The way the data gets stored will likely bring up a whole new set of epic-data problems.
There are lots of technical challenges ahead, but I would put money on them getting solved.