Jul 2017 - Present
Bixby - Say Less, Do More
Bixby is a digital voice assistant that helps you complete both simple and complex tasks through voice or text interaction. What sets Bixby apart from other assistants are its powerful learning capabilities – applying your preferences to make smart choices for you – and its open development platform, which lets developers create new experiences on a wide range of devices, from phones to fridges.
Over the course of my time on this project I worked on many things, but my main focus was collaborating with various groups of designers from Samsung to define the Bixby Views design system. The components of this library were designed with scale in mind, able to be used on small hand-held screens, wearables, appliances, TVs, and any other device Samsung ships. To simplify development constraints and ensure consistency, we built the component library using the Atomic Design method, with four categories of UI elements: Atoms, for icons, color, typography, images, etc. – Molecules, for form fields, buttons, cards, cells, etc. – Organisms, for pickers, carousels, galleries, etc. – and templated layouts for the various Moments that make up our Conversation Model.
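The four-category hierarchy above can be sketched as a simple type model – each level composed from the one below it. This is only an illustration of the Atomic Design structure; the type and component names here are hypothetical and are not the actual Bixby Views library.

```typescript
// Hypothetical sketch of the Atomic Design hierarchy; names are illustrative,
// not the real Bixby Views API.

// Atoms: the smallest visual primitives.
type Atom = { kind: "icon" | "color" | "typography" | "image"; name: string };

// Molecules compose atoms into usable controls.
interface Molecule {
  kind: "form-field" | "button" | "card" | "cell";
  atoms: Atom[];
}

// Organisms compose molecules into larger patterns.
interface Organism {
  kind: "picker" | "carousel" | "gallery";
  molecules: Molecule[];
}

// A templated layout binds organisms to a Moment in the Conversation Model.
interface MomentTemplate {
  moment: string;
  organisms: Organism[];
}

const confirmButton: Molecule = {
  kind: "button",
  atoms: [
    { kind: "icon", name: "check" },
    { kind: "typography", name: "label" },
  ],
};

const resultCarousel: Organism = { kind: "carousel", molecules: [confirmButton] };

const resultMoment: MomentTemplate = { moment: "result", organisms: [resultCarousel] };

console.log(resultMoment.organisms[0].molecules[0].atoms.length); // 2
```

Because every level only references the level beneath it, a small set of atoms can be reused across many molecules and organisms, which is what keeps the library small as it scales across devices.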
These assets were meticulously maintained in a Sketch UI kit that I published and distributed to the larger design organization at Samsung. Along with maintaining the source of truth for the product assets, I also wrote the detailed interaction and development guidelines behind them. These guidelines served as the documentation foundation for the Bixby Developer Guides.
The Bixby Platform
Bixby operates on a platform developed by Viv Labs, where I work as a Senior UX Designer. I joined Viv in July 2017 and have worked on a number of projects to completely redesign Bixby and ship it to millions of devices worldwide.
Designing a platform product requires conceptual solutions built on a framework, creating an experience that is consistent across devices – with screens, without screens, and across touch, voice, and text interfaces. In everything we do, we consider two things: how would the experience we're designing be carried out in a natural conversation, and how do we distill the fundamentals of that experience into patterns that can be integrated into our platform?
When I joined Viv, Bixby was already a product on the Samsung Galaxy S8. The product hadn't lived up to its vision, so our newly-acquired team had an opportunity to redefine the next generation of the product.
Viv had already begun establishing its foundation for how conversational experiences should work long before being acquired. We took that foundation, applied it to the use cases we'd be designing for Bixby, and worked to create a concept that could handle the large scale of services it would serve. We worked for months to put together a vision and pitched it to Samsung's design team at a workshop in December. To our excitement, they liked our vision and agreed to adopt it as the foundation for Bixby 2.0, built on these principles:
What Bixby is
Clarity and brevity
Bixby is about getting straight to the point and delivering what you need, when you need it. Interactions with Bixby should be as natural as speaking with your most knowledgeable friend.
What Bixby is
Continuity and context
Conversations with Bixby are like a choreographed back-and-forth. Each exchange builds on every moment in the conversation, understanding when to apply context.
What Bixby is
Bixby presents well-informed suggestions to help users decide; it doesn’t just return search engine results. It learns what users prefer, so each interaction informs a larger narrative.
What Bixby is
Bixby enables users to get things done quickly. It rewards specificity, but still offers help along the way to the action.
What Bixby is not
Search engines have shaped the way people seek information, returning lots of results to sift through. In that interaction paradigm, people must invest significant time and energy to put two and two together and get things done. Bixby provides only the most valuable results and options, helping users spend their own time more wisely.
What Bixby is not
Messaging is a great way to interact with friends, family, and co-workers, and one often goes through the message thread to find important communication. However, keeping a history of the conversation with an assistant serves no purpose.
What Bixby is not
Bixby is not a voice command feature. It does not aim to allow users to control, navigate or manipulate apps with voice.
What Bixby is not
Apps have been great, feature-packed tools at our fingertips since the dawn of mobile. However, assistance is not about exposing all of an app's features through UI. Instead, Bixby focuses more on the information and less on all the knobs and dials.
Defining the experience of the future
We held an intensive two-week workshop with a core group of stakeholders and some of Samsung's best talent to lay the foundation for the new Bixby, making key decisions about the Conversation Model and the visual experience.
I focused my work on two areas of the product: the Conversation Model – the basic structure of the Bixby platform, made up of specific moments that carry a conversation – and systemizing the pages within that model into classes of elements that would let developers build consistent experiences on the platform. This project turned into Bixby Views, the design system behind Capsules, which I would work on for the next year.
Yeah, our TV is crooked.
Spreading the vision
Once the artifacts of our thoughts and discussions were collected and catalogued, the presentations saved, and the laptops closed, it was time to pitch the vision to the broader Samsung organization. The decks made their way through the thoroughfare of executives and decision-makers, and the vision was adopted. The gears were set in motion, and the real work was about to begin. We were tasked with building the next generation of Bixby and preparing to launch it with the Galaxy Note 9 in September 2018.
On a particularly wet and cold morning in Seoul, South Korea, I joined some of my team members and our lead on stage in front of hundreds of UX designers, product designers, writers, illustrators, and animators to showcase the model for Bixby 2.0.
From solo shops to mega corps, anyone with a vision can pick up our easy-to-use developer tool and build voice and touch experiences for Bixby. Since anyone can build anything, we needed to put in place a system that could handle anything.
We started with a series of must-work use cases that were cornerstones of the Bixby experience, creating flows for ordering coffee and Uber rides, finding and playing media, looking up points of interest, and communicating through phone calls and text messages. We then categorized the kinds of experiences other developers would likely build and, based on those categories, created interaction frameworks and UI component patterns.
Companies like Apple and Google have robust design system frameworks that address problems similar to ours, and Atomic Design addresses a myriad of system and scaling issues – a proven framework for our system's needs. With these models in mind, I guided the team to establish a framework of common interfaces made up of repeated building blocks, following an Atomic Design structure. This lets us develop a smaller library of scaling components that are reused within other components in the system, ensuring consistency in Bixby's look and feel, its interaction paradigms, and the overall experience, while lowering the cost of developing and maintaining the system.
Toward the end of the first version of the design system, we restructured and repackaged the components into a simpler model that was easier to understand. This attention to detail was showcased at the 2018 Samsung Developer Conference (SDC) in San Francisco, along with the entirely redesigned Bixby product, which shipped on all new Note 9 devices.