Samsung Galaxy S21 Camera: Color Point Focus Mode + Single Take

product design / image processing / computer vision / computational photography

PROJECT SCOPE: During my internship at Samsung Design and Innovation Center (SDIC), I was given the project of designing and prototyping different ways Samsung could integrate computer vision research into their mobile phones. SDIC was primarily a design studio with UX and Industrial Design teams, and they were excited to collaborate with someone with a machine learning background to innovate on their products.

My prototypes ran locally to demonstrate the user interactions and technical feasibility of different features.

Beyond working on the camera, I also contributed to early-stage user journeys for the Galaxy Bud Live, in-store services, and high-end fashion-tech collaborations.

MY CONTRIBUTIONS: I created the first working versions and designs for Live Focus “Color Point”, Single Take, Custom Filters, and Auto Framing, which eventually shipped in the final Galaxy S21 and Galaxy Z Fold2 phones.

I implemented new computational photography and computer vision algorithms from research papers, analyzing tradeoffs between computational resources and user experience.

I presented my work to the Director of SDIC and Executive VP of Samsung Federico Casalegno.

TOOLS: OpenCV, Python, TensorFlow, C++, Halide, React, Kotlin, Android Studio

COLLABORATORS: Self-guided; advised by Nathan Folkman, with advisory calls with Samsung’s mobile camera team and SDIC’s UX Design Team

Timeline: 3 months, Summer 2019

Process Overview

Phase 1: Research and Craft User Personas

Before jumping into designing, I spent my first 1.5 weeks building my foundations in the mobile phone consumer market, computer vision research, and computational photography research. I skimmed papers from top computer vision and human-computer interaction conferences such as SIGGRAPH, CVPR, NeurIPS, CHI, and UIST. I also drew inspiration from classes I had taken at MIT: Advances in Computer Vision; Digital and Computational Photography, taught by Frédo Durand; Vision in Art and Neuroscience; and Advanced Product Design. Finally, I drew inspiration from my peers’ social media and mobile camera usage, as well as my work as a professional photographer.


Phase 2: Set Vision and Brainstorm Use Cases

Phase 3: Create Low-Fidelity Technical Prototypes

Phase 4: Test and Refine


How might we redesign the mobile camera to best capture our memories and tell stories?

During my Summer 2019 internship at the Samsung Design and Innovation Center (SDIC), I was given an open-ended project to design and implement new features for the Galaxy S21 camera, with a focus on exploring ways Samsung could integrate the newest computer vision and AI research into their phones.

SDIC was primarily a design studio, so it was exciting to contribute a technical perspective to the problems they were solving. While my project was done independently, I was supported with ideas and conversations by my manager, the mobile camera team in South Korea, and other UX designers in the office.

Beyond working on the camera, I also contributed to early-stage user journeys for the Galaxy Bud Live, in-store service design, and high-end fashion-tech collaborations.



I independently designed the UX and prototyped the first working version of the Portrait Mode Live Focus "Color Point" feature in the Galaxy S21, which detects the subject using AI and renders the background in black and white. I originated this idea and built the prototype on my own, integrating AI models for saliency detection and image segmentation from research papers. Drawing from my experience as a professional fashion photographer, the original idea was inspired by helping creators and small businesses create better product photos using AI. I tested my first prototype on snacks in the office, with clean initial results.

From Initial Designs to Technical Prototypes to Final Released Samsung Galaxy S21 Features

Custom Filter

Create and customize your own personal photo filters using existing pictures, including those you have taken using the Camera app. You can now personalize your shots to match your own aesthetic and color tone preferences, both before and after taking pictures.

I made the first versions of Custom Filter by using chrominance histogram matching.
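The project materials don't include the implementation details, but the core of histogram matching is a cumulative-distribution lookup table built per chrominance channel. The sketch below is a minimal, pure-NumPy illustration of that idea for a single 8-bit channel; `match_channel` and the 256-level range are my assumptions, not Samsung's code.

```python
import numpy as np

def match_channel(source, reference, bins=256):
    """Remap one 8-bit channel so its histogram matches a reference channel.

    Classic CDF matching: for each source intensity level, find the
    reference level whose cumulative distribution value is closest.
    In a Custom Filter-style pipeline this would be applied to the
    chrominance channels (e.g. a/b in Lab) of the target photo, using
    the user's chosen picture as the reference.
    """
    src_hist, _ = np.histogram(source.ravel(), bins=bins, range=(0, 256))
    ref_hist, _ = np.histogram(reference.ravel(), bins=bins, range=(0, 256))
    src_cdf = np.cumsum(src_hist).astype(np.float64)
    src_cdf /= src_cdf[-1]
    ref_cdf = np.cumsum(ref_hist).astype(np.float64)
    ref_cdf /= ref_cdf[-1]
    # Lookup table: source level -> reference level with nearest CDF value
    lut = np.searchsorted(ref_cdf, src_cdf).clip(0, 255).astype(np.uint8)
    return lut[source]
```

Applying this only to chrominance (leaving luminance untouched) transfers the reference picture's color tone without changing the photo's exposure or contrast.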

Live Focus: Color Point / Backdrop

Automatically create Instagram-ready studio style portraits using the Live Focus Mode in the camera. “Color Point” automatically recognizes the subject and makes the background grayscale. “Backdrop” detects your clothing color for a vibrant background.

I made the first prototypes of Color Point using high-resolution saliency models to create masks around the subject, and manipulating the background.
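Once a saliency model produces a subject mask, the compositing step is straightforward: keep the subject's colors and desaturate everything else. A minimal sketch of that blend, assuming an RGB image and a (possibly soft) subject mask as inputs; the function name and the BT.601 luma weights are my illustrative choices:

```python
import numpy as np

def color_point(image, mask):
    """Keep the masked subject in color; render the background in grayscale.

    image: HxWx3 uint8 RGB array.
    mask:  HxW array in [0, 1], where 1 marks the subject (a soft mask
           from a saliency/segmentation model blends edges smoothly).
    """
    # Luminance via ITU-R BT.601 weights, broadcast back to 3 channels
    gray = image @ np.array([0.299, 0.587, 0.114])
    gray3 = np.repeat(gray[..., None], 3, axis=2)
    m = mask[..., None].astype(np.float64)
    out = m * image + (1.0 - m) * gray3  # per-pixel blend by the mask
    return out.clip(0, 255).astype(np.uint8)
```

In practice the quality of the effect is dominated by the mask, which is why high-resolution saliency models mattered more than the compositing itself.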

Auto Framing

Create hands free cinematic videos with Auto Framing Video Mode, which creates the best video based on motion saliency, subject detection, and object tracking.

Single Take

Your own personal content creator: just tap once and you get a wide selection of Instagram-ready photos and uniquely edited videos to choose from, automatically created with AI from videos up to 15 seconds long.

I created prototypes that generated "best shots" based on changes in sound and optical flow, with integrated auto-cropping, video stabilization, and object tracking.
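The scoring logic can be sketched very simply: compute a motion-energy signal over the frames and look for spikes in how that signal changes, which tend to mark candidate moments. The version below uses frame differencing as a stand-in for optical flow to stay dependency-free; the function names, the differencing proxy, and the top-k selection are my simplifications of the approach described above:

```python
import numpy as np

def score_frames(frames):
    """Score frames by the change in motion energy between neighbors.

    frames: list of HxW grayscale arrays. Uses mean absolute frame
    difference as a cheap proxy for optical flow magnitude; spikes in
    this signal suggest candidate "best shot" moments.
    """
    motion = [0.0]
    for prev, cur in zip(frames, frames[1:]):
        diff = np.abs(cur.astype(np.float64) - prev.astype(np.float64))
        motion.append(float(diff.mean()))
    motion = np.array(motion)
    # Sudden changes in motion energy (start/end of an action) score high
    return np.abs(np.diff(motion, prepend=motion[0]))

def pick_best_shots(frames, k=3):
    """Return indices of the k highest-scoring frames."""
    scores = score_frames(frames)
    return list(np.argsort(scores)[::-1][:k])
```

A real pipeline would combine this motion cue with audio-change detection and run auto-cropping and stabilization on the selected clips, as described above.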


Portrait Live Focus Mode

After prototyping the first working versions of these features, I developed the system design roadmap for them, analyzing tradeoffs between UX, mobile computational performance, and the limitations of data for training AI models.

Finally, I contributed to developing user journeys for audio experiences for the Galaxy Buds, as well as in-store customer experiences.

https://www.samsung.com/us/smartphones/galaxy-s21-5g/camera/

Erica Yuen