The old adage “a picture is worth a thousand words” is very apt for Twitter: a single photo can express what otherwise might require many Tweets. Photos help capture whatever we’re up to: kids’ birthday parties, having fun with our friends, the world we see when we travel.
Like so many of you, lots of us here at Twitter really love sharing filtered photos in our tweets. As we got into doing it more often, we began to wonder if we could make that experience better, easier and faster. After all, the now-familiar process for tweeting a filtered photo has required a few steps:
1. Take the photo (with an app)
2. Filter the photo (probably another app)
3. Finally, tweet it!
Constantly needing to switch apps takes time, and results in frustration and wasted photo opportunities. So we challenged ourselves to make the experience as fast and simple as possible. We wanted everyone to be able to easily tweet photos that are beautiful, timeless, and meaningful.
With last week’s photo filters release, we think we accomplished that on the latest versions of Twitter for Android and Twitter for iPhone. Now we’d like to tell you a little more about what went on behind the scenes in order to develop this new photo filtering experience.
It’s all about the filters
Our guiding principle: to create filters that amplify what you want to express, and to help that expression stand the test of time. We began with research, user stories, and sketches. We designed and tested multiple iterations of the photo-taking experience, and relied heavily on user research to make decisions about everything from filter nomenclature and iconography to the overall flow. We refined and distilled until we felt we had the experience right.
We spent many hours poring over the design of the filters. Since every photo is different, we did our analyses across a wide range of photos including portraits, scenery, indoor, outdoor and low-light shots. We also calibrated details ranging from color shifts, saturation, and contrast, to the shape and blend of the vignettes before handing the specifications over to Aviary, a company specializing in photo editing. They applied their expertise to build the algorithms that matched our filter specs.
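To make those calibration knobs concrete, here is a small, illustrative sketch of how adjustments like a saturation shift, a contrast curve, and a radial vignette might be applied to a photo's pixels. It is not Aviary's implementation (the production filters are far more detailed and run on the GPU); the class, parameter names, and values are hypothetical, written in plain Java for readability.

    // Illustrative only: a few of the kinds of per-pixel adjustments a filter
    // spec can describe. The parameter values below are hypothetical.
    import java.awt.image.BufferedImage;

    public class FilterSketch {

        // Hypothetical filter parameters, as a design spec might express them.
        static final float SATURATION = 1.2f; // >1 boosts color, <1 mutes it
        static final float CONTRAST   = 1.1f; // >1 steepens the tone curve
        static final float VIGNETTE   = 0.4f; // strength of the edge darkening

        public static BufferedImage apply(BufferedImage src) {
            int w = src.getWidth(), h = src.getHeight();
            BufferedImage out = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
            float cx = w / 2f, cy = h / 2f;
            float maxDist = (float) Math.sqrt(cx * cx + cy * cy);

            for (int y = 0; y < h; y++) {
                for (int x = 0; x < w; x++) {
                    int rgb = src.getRGB(x, y);
                    float r = ((rgb >> 16) & 0xFF) / 255f;
                    float g = ((rgb >> 8) & 0xFF) / 255f;
                    float b = (rgb & 0xFF) / 255f;

                    // Saturation: push each channel away from the pixel's luminance.
                    float luma = 0.299f * r + 0.587f * g + 0.114f * b;
                    r = luma + (r - luma) * SATURATION;
                    g = luma + (g - luma) * SATURATION;
                    b = luma + (b - luma) * SATURATION;

                    // Contrast: steepen the tone curve around mid-gray.
                    r = (r - 0.5f) * CONTRAST + 0.5f;
                    g = (g - 0.5f) * CONTRAST + 0.5f;
                    b = (b - 0.5f) * CONTRAST + 0.5f;

                    // Vignette: darken pixels in proportion to distance from the center.
                    float dist = (float) Math.hypot(x - cx, y - cy) / maxDist;
                    float shade = 1f - VIGNETTE * dist * dist;
                    r *= shade; g *= shade; b *= shade;

                    out.setRGB(x, y, (clamp(r) << 16) | (clamp(g) << 8) | clamp(b));
                }
            }
            return out;
        }

        private static int clamp(float v) {
            return Math.max(0, Math.min(255, Math.round(v * 255f)));
        }
    }

The production filters express the same kinds of decisions (how much to shift color, how steep to make the tone curve, how the vignette is shaped and blended), just in far greater detail and as GPU-accelerated code.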
Make it fast!
Our new photo filtering system is a tight integration of Aviary's cross-platform GPU-accelerated photo filtering technology with our own user interface and visual specifications for filters. Implementing this new UI presented some unique engineering challenges. The main one was creating an experience that feels instant and seamless while working within the memory and processing-speed constraints of the wide range of devices our apps support.
To make our new filtering experience work, our implementation keeps up to four full-screen photo contexts in memory at once: three full-screen versions of the image for when you're swiping through photos (the one you're currently looking at, plus the next one to the right and to the left), and a fourth that holds nine small versions of the photo for the grid view. Every time you apply or remove a crop or magic enhance, we update the small images in the grid view to reflect those changes, so it's always up to date.
Without those extra contexts, you could see a lag when scrolling between photos; but mobile phones just don't have a lot of memory to spare. If we weren't careful about when and how we set up these chunks of memory, the app could run out of memory and crash. So we worked closely with Aviary's engineering team to strike a balance that would work well across many use cases.
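As a rough sketch of that budget, here is one way such a cache could be organized: at most three full-screen renders (previous, current, next) plus one grid context holding nine thumbnails, with the thumbnails re-rendered whenever crop or magic enhance changes. The Renderer and Image types, and the class itself, are hypothetical stand-ins rather than Twitter's or Aviary's actual APIs.

    // Hypothetical sketch of the memory budget described above; not the real implementation.
    import java.util.ArrayList;
    import java.util.List;

    public class FilteredPhotoCache {

        /** Stand-in for whatever produces a filtered image at a given size. */
        public interface Renderer {
            Image render(int filterIndex, int width, int height, boolean cropped, boolean enhanced);
        }

        /** Stand-in for a decoded, filtered bitmap held in memory. */
        public interface Image {
            void recycle(); // release the underlying pixel buffer
        }

        private final Renderer renderer;
        private final int screenW, screenH;
        private static final int GRID_CELLS = 9;

        // Three full-screen contexts: index 0 = previous, 1 = current, 2 = next.
        private final Image[] fullScreen = new Image[3];
        // The fourth context: nine small previews shown in the grid view.
        private final List<Image> grid = new ArrayList<>();

        private boolean cropped, enhanced;

        public FilteredPhotoCache(Renderer renderer, int screenW, int screenH) {
            this.renderer = renderer;
            this.screenW = screenW;
            this.screenH = screenH;
        }

        /** Called when the user swipes; only the neighbors stay resident (bounds handling omitted). */
        public void showFilter(int filterIndex) {
            releaseFullScreen();
            fullScreen[0] = renderer.render(filterIndex - 1, screenW, screenH, cropped, enhanced);
            fullScreen[1] = renderer.render(filterIndex,     screenW, screenH, cropped, enhanced);
            fullScreen[2] = renderer.render(filterIndex + 1, screenW, screenH, cropped, enhanced);
        }

        /** Called when crop or magic enhance is toggled; refreshes the grid thumbnails. */
        public void setAdjustments(boolean cropped, boolean enhanced) {
            this.cropped = cropped;
            this.enhanced = enhanced;
            for (Image thumb : grid) {
                thumb.recycle();
            }
            grid.clear();
            int thumbW = screenW / 3, thumbH = screenH / 3;
            for (int i = 0; i < GRID_CELLS; i++) {
                grid.add(renderer.render(i, thumbW, thumbH, cropped, enhanced));
            }
        }

        /** Free the full-screen contexts, for example when the system signals memory pressure. */
        public void releaseFullScreen() {
            for (int i = 0; i < fullScreen.length; i++) {
                if (fullScreen[i] != null) {
                    fullScreen[i].recycle();
                    fullScreen[i] = null;
                }
            }
        }
    }

In the real apps, a budget like this has to be tuned per device; deciding when each context is allocated, at what size, and when it can safely be released is the kind of balance we worked out with Aviary's team.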
Test and test some more
As soon as engineering kicked off, we rolled out this new feature internally so that we could work out the kinks and sand down the rough spots in the experience. At first the team tested it; then we opened it up to all employees to get lots of feedback. We also engaged people outside the company for user research. All of this was vital to get a good sense of which aspects of the UI would resonate and which wouldn't.
After much testing and feedback, we designed an experience in which you can quickly and easily choose between different filtering options, displayed side by side and in a grid. Auto-enhancement and cropping are both a single tap away in an easy-to-use interface.
Finally, a collaborative team of engineers, designers and product managers was able to ship a set of filters wrapped in a seamless UI that anyone with our Android or iPhone app can enjoy. Over time, we want our filters to evolve so that sharing and connecting become even more delightful. It feels great to be able to share it with all of you at last.
Posted by @ryfar
Tweet Composer Team