
Here if anyone has questions on the implementation or such.


It's great you listen. So I'll try.

1. Scaling down in linear colorspace is essential. One example is [1]: [2] was downscaled in sRGB and [3] in linear light. There are some canary images too [4]. (A rough sketch of a linear-light downscale follows the links below.)

2. Plain bicubic filtering is not good anymore. EWA (Elliptical Weighted Averaging) filtering by Nicolas Robidoux produces much better results [5].

3. Using default JPEG quantization tables at quality 75 is not good anymore; that's what people are referring to as horrible compression. MozJPEG [6] is a much better alternative (a small snippet follows this list). With edge detection and quality assessment, it's even better.

4. You have to realize that 8-bit wide-gamut photographs will show noticeable banding on sRGB devices. Here's my attempt [7] to reveal the issue using sRGB as a wider gamut colorspace.
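
If you only want the MozJPEG part, the simplest route is to shell out to its cjpeg binary rather than wrestle with libjpeg bindings. A rough sketch in Python, assuming mozjpeg's cjpeg is on the PATH and with made-up file names:

    import subprocess
    from PIL import Image

    # cjpeg reads PPM, so dump the pixels there first
    Image.open("photo_small.png").convert("RGB").save("photo_small.ppm")

    # mozjpeg's cjpeg applies trellis quantization instead of the stock tables
    subprocess.run(
        ["cjpeg", "-quality", "80", "-outfile", "photo_small.jpg", "photo_small.ppm"],
        check=True,
    )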

[1] https://unsplash.com/photos/UyUvM0xcqMA

[2] https://cloud.githubusercontent.com/assets/107935/13997633/a...

[3] https://cloud.githubusercontent.com/assets/107935/13997660/b...

[4] https://cloud.githubusercontent.com/assets/72159/11488537/3d...

[5] http://www.imagemagick.org/Usage/filter/nicolas/

[6] https://github.com/mozilla/mozjpeg

[7] https://twitter.com/vmdanilov/status/745321798309412865
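
And for point 1, a minimal sketch of a gamma-correct downscale with Pillow and NumPy. The file names are placeholders, the sRGB transfer function is written out by hand, and plain Lanczos stands in for the EWA resampling from point 2, so treat it as an illustration rather than a drop-in implementation:

    import numpy as np
    from PIL import Image

    def srgb_to_linear(s):
        # inverse sRGB transfer function, s in [0, 1]
        return np.where(s <= 0.04045, s / 12.92, ((s + 0.055) / 1.055) ** 2.4)

    def linear_to_srgb(l):
        # forward sRGB transfer function, l in [0, 1]
        return np.where(l <= 0.0031308, l * 12.92, 1.055 * l ** (1 / 2.4) - 0.055)

    def resize_linear(img, size, resample=Image.LANCZOS):
        # decode 8-bit sRGB to linear light, resize there, re-encode to sRGB
        srgb = np.asarray(img.convert("RGB"), np.float64) / 255.0
        linear = srgb_to_linear(srgb)
        # Pillow has no float RGB mode, so resize each channel as mode "F"
        chans = [Image.fromarray(linear[..., c].astype(np.float32), "F").resize(size, resample)
                 for c in range(3)]
        out = linear_to_srgb(np.clip(np.dstack([np.asarray(c) for c in chans]), 0.0, 1.0))
        return Image.fromarray((out * 255.0 + 0.5).astype(np.uint8), "RGB")

    small = resize_linear(Image.open("photo.jpg"), (1080, 720))
    small.save("photo_small.jpg", quality=85)

The difference is most visible on high-contrast, high-frequency detail; the canary images in [4] make it obvious.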


I thought there was something familiar about that name Robidoux … I see he's the same one who did some crowdfunded work on GIMP's new scaling methods: http://libregraphicsworld.org/blog/entry/advanced-samplers-f...


"When we started this project, none of us at IG were deep experts in color."

This is pretty astonishing to me. I always thought applying smartly designed color filters to pictures was basically Instagram's entire value proposition. In particular, I would have thought that designing filters to emulate the look and feel of various old films would have taken some fairly involved imaging knowledge. How did Instagram get so far without color experts?


In the "How I Built This" podcast for Instagram, Kevin Systrom specifically says the filters were created to take the relatively low-quality photos early smartphone cameras were capable of and making them look like photographer-quality photos. Filters were about taking average photos and making them "pop" but not necessarily by virtue of having deep domain knowledge of color.


I was never under that impression. I always assumed the filters were just created by a designer playing around with various effects until they looked nice.


Because it isn't complicated or novel to make compressed 8-bit jpegs have color filters. There are tools for the job and they've been around for a long time.

Working in a different color space than standard requires a little bit of familiarity and finesse that modifying 8-bit jpegs for consumption on the internet did not require.

Many photographers and printers are familiar with this dilemma in a variety of circumstances, where cameras capture images in a different color space and at a higher bit depth than any display technology, or the human eye, can fully reproduce.


I'm sure the comment you're replying to wasn't thinking of the algorithm that applies a filter to a jpeg, but the process by which that filter is created in the first place. The assumption being that there's some sort of theory to colour that allows you to systematically improve the aesthetic qualities of images.

As an analogy, think of the value music theory (e.g. https://en.wikipedia.org/wiki/Scale_(music)#Harmonic_content) has for composition.


The creative process isn't novel. Most mobile apps, including Instagram, don't even offer layer masking, which more robust pre-existing desktop tools (and some other mobile apps) do, so the 'technical interestingness' is limited to begin with.


Instagram's value proposition is that other, mostly young, people use it.


Instagram's value prop was, and is, mobile.


Bit of a jump in topic, but I'm kind of curious: I don't use Instagram myself, but I'm sure it resizes images for online viewing, to save bandwidth and such. Does it do so with correct gamma [0]? That's something many image-related applications get wrong.

[0] http://blog.johnnovak.net/2016/09/21/what-every-coder-should...


If it's using Pillow for the resizing, probably not. I've not looked at Pillow specifically, but PIL certainly wasn't very good.


Pillow doesn't do anything with gamma by default, nor does it require that color management be compiled in.

(I'm a maintainer of pillow)
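
For what it's worth, when a Pillow build does have the optional littlecms support (ImageCms), you can do the color management yourself. A rough sketch, assuming a JPEG that actually carries an embedded wide-gamut profile and with made-up file names:

    import io
    from PIL import Image, ImageCms

    im = Image.open("wide_gamut.jpg")
    icc_bytes = im.info.get("icc_profile")  # embedded profile, if the file has one

    if icc_bytes:
        src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))  # e.g. Display P3
        dst = ImageCms.createProfile("sRGB")
        # convert the pixel data so unmanaged viewers still show sane colors
        im = ImageCms.profileToProfile(im, src, dst, outputMode="RGB")
        im.save("srgb.jpg", quality=85,
                icc_profile=ImageCms.ImageCmsProfile(dst).tobytes())
    else:
        # no embedded profile: most software will just assume sRGB
        im.save("srgb.jpg", quality=85)

Skip that step and a wide-gamut image pushed through an unmanaged resize-and-resave pipeline tends to come out looking flat and desaturated on ordinary sRGB screens.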


That would be a bit sad, given that they went through all the trouble of getting wide colour.


They don't store full resolution images:

Square image: 1080px in width by 1080px in height

Vertical image: 1080px in width by 1350px in height

Horizontal image: 1080px in width by 566px in height


Not necessarily on topic - but given your code samples are written in Objective-C, how much of your codebase is still in Objective-C and what's your opinion on porting it over to Swift?


Good q. All of it is still Objective-C and C++; there are a few blockers to starting a move to Swift, including the relatively large amount of custom build and development tooling that we and FB have in place.


Hi Mike, since Instagram is listed in the React Native Showcase, could you tell us where are you using it? Thanks in advance.

https://facebook.github.io/react-native/showcase.html


It's getting used more and more in our app. A few recent examples are the "Promote Post" UI (for business accounts that want to promote a post from inside Instagram), the Saved Posts feature, and the comment moderation tools we now provide around comment filtering.


I'd like to know this, too.


The Swift compiler is probably too slow for projects as large as Instagram.


Peculiar that this is being downvoted - compiler speed, reliability, and support for incremental builds are all issues with large Swift projects. I've even seen people go as far as splitting their project into frameworks in order to bring build times down from as long as 10 minutes and restore incremental compilation.


The Swift compiler is used in many large projects.

And you probably overestimate how large Instagram, the app, is.


I know there's not much love for the API these days, but is there (or will there be) a way to access wide color images from the API? iPads and MacBook Pros also have wide color screens nowadays, so it would make sense to display them for specific use-cases in third party clients.


Great article. Question not directly related to your writeup: in your casual usage of a phone/photo app with the new, wider color space, do you notice a difference in your experience of using the app/looking at images? Or, in other words, does the wider color space feel at all different?


Is Wide Color preserved when processing video?


Photos only. Apple's APIs only capture in Wide Color when shooting in photo mode, and their documentation only recommends using wide color/Display P3 for images.


Our iOS dev hunkered down today and may have come up with a decent solution to your EAGLView issue. You can find the write-up on medium: https://medium.com/imgly/bringing-wide-color-to-photoeditor-....


Your link somehow got truncated. Here’s the full thing: https://medium.com/imgly/bringing-wide-color-to-photoeditor-...


Is this why iPhone photos of red roses, or other vibrant flowers, look poor? I love the idea of a photo room - is it hand painted?


Yes, they look poor if you view them in software environments that aren't color managed (which shouldn't happen on Apple devices). See this sample image: https://code.google.com/p/android/issues/detail?id=225281


Do you test with Adobe RGB (which many prosumer cameras can output) or only P3 (which AFAIK only Apple devices output)?


Just P3, though the wider gamut available in our graphics operations should benefit photos brought in using Adobe RGB too since iOS is fully color managed.


What is the pixel format of your OpenGL surfaces?


Maybe a stupid question, but how do I save an image as a Display P3 JPEG from Photoshop? I want to play with this color standard.



