Samsung Galaxy S21 Camera: Color Point Focus Mode + Single Take
product design / image processing / computer vision / computational photography
PROJECT SCOPE: During my internship at the Samsung Design and Innovation Center (SDIC), I was tasked with designing and prototyping ways Samsung could integrate computer vision research into its mobile phones. SDIC was primarily a design studio with UX and Industrial Design teams, and they were excited to collaborate with someone with a machine learning background to help innovate their products.
My prototypes ran locally to demonstrate the user interactions and technical feasibility of different features.
Beyond working on the camera, I also contributed to early-stage user journeys for the Galaxy Bud Live, in-store services, and high-end fashion-tech collaborations.
MY CONTRIBUTIONS: I created the first designs and working versions of Live Focus “Color Point”, Single Take, Custom Filters, and Auto Framing, features that eventually shipped in the final Galaxy S21 and Galaxy Z Fold2 phones.
I implemented new computational photography and computer vision algorithms from research papers, analyzing tradeoffs between computational resources and user experience.
I presented my work to the Director of SDIC and Executive VP of Samsung Federico Casalegno.
TOOLS: OpenCV, Python, TensorFlow, C++, Halide, React, Kotlin, Android Studio
COLLABORATORS: Self-guided; advised by Nathan Folkman, with advisory calls with Samsung’s mobile camera team and SDIC’s UX Design Team
Timeline: 3 months, Summer 2019
Process Overview
Phase 1: Research and Craft User Personas
Before jumping into designing, I spent my first 1.5 weeks building a foundation in the mobile phone consumer market, computer vision research, and computational photography research. I skimmed papers from top computer vision and human-computer interaction conferences such as SIGGRAPH, CVPR, NeurIPS, CHI, and UIST. I also drew on classes I had taken at MIT: Advances in Computer Vision; Digital and Computational Photography, taught by Frédo Durand; Vision in Art and Neuroscience; and Advanced Product Design. Finally, I took cues from my peers’ social media and mobile camera usage, as well as my own work as a professional photographer.
Phase 2: Set Vision and Brainstorm Use Cases
Phase 3: Create Low-Fidelity Technical Prototypes
Phase 4: Test and Refine
How might we redesign the mobile camera to best capture our memories and tell stories?
During my Summer 2019 internship at the Samsung Design and Innovation Center (SDIC), I was given an open-ended project to design and implement new features for the Galaxy S21 camera, with a focus on exploring how Samsung could integrate the newest computer vision and AI research into its phones.
SDIC was primarily a design studio, so it was exciting to contribute a technical perspective to the problems they were solving. While my project was done independently, I was supported with ideas and conversations by my manager, the mobile camera team in South Korea, and other UX designers in the office.
I independently designed the UX for and prototyped the first working version of the Portrait Mode Live Focus "Color Point" feature in the Galaxy S21, which detects the subject using AI and renders the background in black and white. I conceived the idea and built the prototype myself, integrating saliency detection and image segmentation models from research papers. Drawing from my experience as a professional fashion photographer, the original concept was inspired by helping creators and small businesses take better product photos with AI. I tested my first prototype on snacks in the office, with clean initial results.
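Samsung's production segmentation models are proprietary, but the compositing step at the heart of Color Point can be sketched in a few lines of NumPy. Here the `mask` argument stands in for the output of a saliency or segmentation model, and `color_point` is a name chosen for illustration:

```python
import numpy as np

def color_point(image, mask):
    """Keep color where the subject mask is 1; desaturate the rest.

    image: (H, W, 3) float array in [0, 1]
    mask:  (H, W) float array in [0, 1], e.g. a segmentation model output
    """
    # Rec. 601 luma weights give the grayscale background
    gray = image @ np.array([0.299, 0.587, 0.114])
    gray = np.repeat(gray[..., None], 3, axis=-1)
    m = mask[..., None]  # broadcast the mask over the color channels
    # soft blend, so feathered mask edges transition smoothly
    return m * image + (1.0 - m) * gray
```

Because the blend is soft rather than a hard cutout, a model that emits fuzzy probabilities at the subject boundary still produces a natural-looking edge.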
From Initial Designs to Technical Prototypes to Final Released Samsung Galaxy S21 Features
Portrait Live Focus Mode
After prototyping the first working versions of these features, I developed a system design roadmap for them, analyzing tradeoffs between UX, mobile computational performance, and the limitations of data available for training AI models.
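A minimal sketch of the kind of measurement behind that tradeoff analysis, assuming a hypothetical `toy_model` callable standing in for an on-device segmentation network: timing inference at several input resolutions makes the resolution-versus-latency tradeoff concrete.

```python
import time
import numpy as np

def mean_latency_ms(fn, image, runs=10):
    """Average wall-clock latency of fn(image) over several runs."""
    fn(image)  # warm-up run, excluded from timing
    start = time.perf_counter()
    for _ in range(runs):
        fn(image)
    return (time.perf_counter() - start) / runs * 1000.0

def toy_model(img):
    # Stand-in for a segmentation network: a cheap per-pixel reduction.
    return img.mean(axis=-1)

if __name__ == "__main__":
    for side in (256, 512, 1024):
        frame = np.zeros((side, side, 3), dtype=np.float32)
        print(f"{side}x{side}: {mean_latency_ms(toy_model, frame):.2f} ms")
```

On a phone, the same harness would wrap the real model call, letting you pick the largest input resolution that still keeps the camera preview responsive.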
Finally, I contributed to user journeys for audio experiences on the Galaxy Buds, as well as in-store customer experiences.
https://www.samsung.com/us/smartphones/galaxy-s21-5g/camera/