An extensive data collection, curation and visualisation effort, Singapore in Colour was an interactive story that invited readers to explore the neighbourhoods of Singapore in a new light. We wanted to challenge the idea of Singapore being a grey concrete jungle by highlighting the vibrancy and colour that locals may sometimes overlook.

Done in celebration of National Day, the project reflected snapshots from all around the island and encouraged readers to explore the colours that animated each neighbourhood. It was a great opportunity for us to use the size of our team to build an original dataset, work with our colleagues on the photo desk and tackle a challenging approach to visualising data.

Research

In the initial pitch, the idea was to look at the colours of Singapore’s iconic landmarks like the Merlion, Marina Bay Sands and Gardens by the Bay. By day, they may be muted, but by dusk they come alive in vivid displays of light and colour. Would we be able to analyse these displays and say, for example, that the Supertrees emanate a magenta glow 70 per cent of the time?

Initial data sources for archive photos

We began looking for data sources. Flickr and Google Maps were two external options, and we planned to access them through APIs (Application Programming Interfaces), the means by which two programs communicate with each other. We also spent time combing through our own archives of photos by ST photographers. But there were a number of challenges.

A good photo, we found, was one that featured an identifiable scene of interest, a few distinguishable colours and adequate lighting. We found heaps of selfies and random photos mixed into the public image sets, and that was before getting into the constraints of API limits and licensing permissions. None of these sources quite fit our criteria, so we decided to take our own photos.

Getting started

There was no shortage of directions to explore with colour analysis, we found. This blog post by Nicholas Rougeux, analysing the colours of New Yorker magazine covers, provided some nice insight into this world and was an early inspiration for this project.

First prototype of the town map concept

The first iterations were built upon work that Alexandra Khoo, digital graphics designer Joseph Ricafort and digital graphics journalist Stephanie Adeline had done at a workshop in the master’s programme in visual tools to empower citizens, organised by Fundacio UdG and the ViT Foundation. At this stage, we were refining how to extract colours from photos and which APIs worked best.

Colour breakdown of the neighbourhoods in Singapore

Some APIs could show an image’s colour breakdown by percentage, while others simply displayed a palette of n colours. We ran into issues with one API that fixated heavily on the shadows in the image, giving us much more grey than we wanted, and ignored the colours that were present. Each algorithm had a different method of extracting colour, and we ultimately went with colorthief.

Colorthief is a JavaScript library created by GitHub user lokesh that grabs a colour palette from an image. We used it to extract the five most prominent colours in each image.
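As an illustration, extracting a palette from a single photo with colorthief's Node API might look like the sketch below; the photo path is a hypothetical example, not our actual folder structure.

```js
// Sketch: pull the five most prominent colours from one photo using
// colorthief's Node API. The file path is purely illustrative.
const ColorThief = require('colorthief');

ColorThief.getPalette('./photos/bedok/market.jpg', 5)
  .then(palette => {
    // palette is an array of five [r, g, b] triplets,
    // ordered roughly by how dominant each colour is in the image
    console.log(palette); // e.g. [[201, 82, 65], [34, 98, 140], ...]
  })
  .catch(err => console.error(err));
```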

Implementation

Photos

We dispatched more than a dozen members of the graphics team to cover 45 planning areas around the island with the instruction to “look for colour.” This returned nearly 2,500 photos, which we filtered down to 2,000: a difficult task, as we had to decide which photos did each neighbourhood justice in both character and colour.

One of the photos taken by a Straits Times photographer.

In addition to our own efforts, we enlisted the help of our talented friends on the photo desk (spoiler alert: their photos turned out much nicer than ours). Their photos were used in the introduction to the interactive and served as a beautiful collection of images to help tell this story. Our smartphone photos, on the other hand, served as the raw material for our data analysis. Together, the two sets gave the piece both a structured narrative showcasing Singapore’s neighbourhoods and an unstructured, explorable dataset.

Data

Building the dataset was an iterative process. We wrote a script to parse all the photos, organised into folders by area, and output a JSON file containing an array of five colours per photo as RGB (Red, Green, Blue) values. At certain points we also needed HSL (Hue, Saturation, Lightness) values, and we wanted to record which area and which photo each colour came from. The best part about building this data from scratch was that we could continuously reshape it to our specifications.
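A simplified sketch of what that build step might look like is below, assuming one folder per planning area; the folder layout, record fields and output filename are illustrative rather than the published schema.

```js
// Sketch: walk one folder per planning area, extract a five-colour palette
// from each photo, convert it to HSL and write everything out as JSON.
// Paths, field names and the output file are assumptions for illustration.
const fs = require('fs');
const path = require('path');
const ColorThief = require('colorthief');

// Standard RGB -> HSL conversion, returning hue in degrees and
// saturation/lightness as percentages.
function rgbToHsl(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b);
  const l = (max + min) / 2;
  let h = 0, s = 0;
  if (max !== min) {
    const d = max - min;
    s = l > 0.5 ? d / (2 - max - min) : d / (max + min);
    if (max === r) h = (g - b) / d + (g < b ? 6 : 0);
    else if (max === g) h = (b - r) / d + 2;
    else h = (r - g) / d + 4;
    h /= 6;
  }
  return [Math.round(h * 360), Math.round(s * 100), Math.round(l * 100)];
}

async function buildDataset(rootDir) {
  const records = [];
  for (const area of fs.readdirSync(rootDir)) {
    const areaDir = path.join(rootDir, area);
    if (!fs.statSync(areaDir).isDirectory()) continue;
    for (const file of fs.readdirSync(areaDir)) {
      const rgb = await ColorThief.getPalette(path.join(areaDir, file), 5);
      records.push({
        area,                               // planning area the photo was taken in
        photo: file,                        // source image filename
        rgb,                                // five [r, g, b] triplets
        hsl: rgb.map(c => rgbToHsl(...c)),  // the same colours as [h, s, l]
      });
    }
  }
  fs.writeFileSync('colours.json', JSON.stringify(records, null, 2));
}

buildDataset('./photos').catch(console.error);
```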

Folder containing data on colours, image details and district

Development

The challenge with visualising 10,000 colours was rendering each data point in a way that could be grouped, sorted and animated while maintaining performance, especially on mobile devices. We used Three.js’s InstancedMesh to render each colour as an instance and position it according to hue, saturation or planning area grouping.
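As a rough sketch of that approach, each colour becomes one instance of a shared plane geometry, so the whole cloud draws in a single call; the geometry, material and layout callback here are placeholders rather than our production code.

```js
// Sketch: render thousands of colour swatches as a single InstancedMesh.
// `colours` and `positionFor` stand in for the real dataset and the
// hue/saturation/planning-area layout logic.
import * as THREE from 'three';

function buildSwatches(colours, positionFor) {
  const geometry = new THREE.PlaneGeometry(1, 1);
  const material = new THREE.MeshBasicMaterial();
  const mesh = new THREE.InstancedMesh(geometry, material, colours.length);
  const dummy = new THREE.Object3D();

  colours.forEach((c, i) => {
    const { x, y, z } = positionFor(c, i);   // where this swatch sits in the current grouping
    dummy.position.set(x, y, z);
    dummy.updateMatrix();
    mesh.setMatrixAt(i, dummy.matrix);       // per-instance transform
    mesh.setColorAt(i, new THREE.Color(      // per-instance colour
      c.rgb[0] / 255, c.rgb[1] / 255, c.rgb[2] / 255));
  });

  mesh.instanceMatrix.needsUpdate = true;
  if (mesh.instanceColor) mesh.instanceColor.needsUpdate = true;
  return mesh;
}
```

Re-sorting the swatches then comes down to writing new matrices into the same instance buffers and flagging them for update, which keeps the grouping animations cheap.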

Demo of colour grouping

For the colour explorer interface near the bottom, we used D3, a JavaScript library, to render the colours and the layout. The colour-isolated view beside the photo was rendered onto a canvas using a filter that looped through the image’s pixels, preserved only those whose hue fell within a certain distance of the target colour, and rendered all other pixels as grey.
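A minimal sketch of such a filter is below; the hue threshold and the luminance weights used for the grey fallback are assumptions rather than the exact values in the piece.

```js
// Sketch: keep pixels whose hue is close to the target colour and
// desaturate everything else. Threshold and grey weights are illustrative.
function rgbToHue(r, g, b) {
  r /= 255; g /= 255; b /= 255;
  const max = Math.max(r, g, b), min = Math.min(r, g, b);
  if (max === min) return 0;                 // achromatic pixel
  const d = max - min;
  let h;
  if (max === r) h = (g - b) / d + (g < b ? 6 : 0);
  else if (max === g) h = (b - r) / d + 2;
  else h = (r - g) / d + 4;
  return h * 60;                             // hue in degrees, 0-360
}

function isolateColour(ctx, width, height, targetHue, threshold = 20) {
  const imageData = ctx.getImageData(0, 0, width, height);
  const px = imageData.data;                 // flat RGBA array
  for (let i = 0; i < px.length; i += 4) {
    const hue = rgbToHue(px[i], px[i + 1], px[i + 2]);
    // hue is circular, so take the shorter way around the colour wheel
    const dist = Math.min(Math.abs(hue - targetHue), 360 - Math.abs(hue - targetHue));
    if (dist > threshold) {
      // outside the band: replace the pixel with its luminance as grey
      const grey = 0.299 * px[i] + 0.587 * px[i + 1] + 0.114 * px[i + 2];
      px[i] = px[i + 1] = px[i + 2] = grey;
    }
  }
  ctx.putImageData(imageData, 0, 0);
}
```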

Colour interface feature demonstration

Takeaways

This piece presented a challenge both to our team and to readers. It pushed us to take on a full-scale data collection effort and visualise rich data in an approachable and effective manner. Readers responded positively, with some expressing how they had taken for granted the colour that exists around them.

Colour analysis as a field is full of opportunities for exploration, both in presentation (the types of visualisation) and in data sources (where the colours originate). We hope to see more stories in the future that investigate colour where we don’t expect it.

Read it here.