seen + learned

Visual design tips for app developers – consistency in the big picture: beyond where to place the button

Posted: Thursday, June 26, 2014 | Posted by Tania Schlatter

In the book Visual Usability, we introduce what we call the "meta-principles" – three principles that are fundamental to the success of user interfaces. Consistency, hierarchy, and personality are not new to UI design. Treated as primary considerations, however, they provide a structure for assessing and making decisions about the large and small visual characteristics of a UI design. Keeping these principles in mind while designing helps focus thinking on how visual characteristics affect what people see and how they interpret it, from both the functional and aesthetic standpoints.

Interfaces have a big job to do. They need to provide information on multiple levels – convey content, communicate interactivity, and provide feedback. Before they can do that, they need to help someone know, or reinforce expectations of, what they are looking at. In Visual Usability, we go into detail about how visual characteristics of layout, type, color, imagery, and controls affect perception of consistency. Our goal is to raise awareness of these details and their effects. We want to help teams understand how people perceive visual characteristics, and make conscious decisions about them.

This post begins to address consistency in the big picture – as a principle for considering the approach to an interface rather than its details – a strategic plane above where to place a button. From this perspective, consistency helps frame thinking about the UX, and align the UX with how it is represented through the UI.

Internal consistency
Humans are wired to identify patterns in what we see and assign meaning to them. When establishing consistency within an application, lack of attention to consistent placement, color, styling, imagery (including symbols and icons), and typography can indicate difference to users, breaking the patterns they've perceived, and making them "stop and think."


This version of an older application had three separate ways of closing a window, forcing users to remember different paradigms every time they interacted with it.

Making sure a UI is internally consistent – that all the close buttons are styled similarly and placed similarly within a screen and from screen to screen within an application, for example – is familiar territory. Making sure that the patterns you choose to adopt and make consistent are in line with your users’ expectations is also familiar territory. What is different is calling out and thinking about the role consistency plays in affecting perception at the UI framework level.

External consistency
When Google created and launched Google Docs, the interface adopted symbols and patterns familiar to people who use desktop word-processing programs. This was a smart move strategically; consistent symbols reinforced similarities between the desktop and cloud word processing experiences, making Google Docs seem familiar and approachable. This helped Google transition users to word processing in the cloud. Consistency across platforms or applications is external consistency.



Google Docs UI, top; Microsoft Word 2010, bottom.

750words.com is also an application used for writing, but it has a different goal, and targets different users. Its tag line is "Private, unfiltered, spontaneous, daily," and the site focuses on people who want to write regularly. The UI provides only essential visual information – where to type – and the number of words written. The screen shows optional visual feedback for how often people are writing. UI elements and design characteristics do not say "word processing," but still encourage people to write. The decision to design the UI framework to be inconsistent with word processing applications helps the app show how it is different from – and better suited to – helping people write regularly.

750words.com only indicates where to write, and supports writing regularly.

Consistency and novelty
Advertising and promotional media in general rely on inconsistency – or novelty, as it's known in communication theory – to get people's attention. Novelty can be a differentiator, and it can succeed as an approach when it's authentic and effectively reflects a difference that people find valuable. 750words.com takes a novel approach to writing online, and the choices made in its interface reflect that. Novelty can also make visual information more memorable, as several studies, including this October 2013 one [PDF], have found.

While there’s nothing wrong with using platform standards as the initial foundation for a design, relying on them for the final design inevitably results in an app that lacks unique visual appeal. An app designed this way may reach market more quickly than one that isn’t, and its interface, consistent with other apps, is easy to pick up and use. But without key visual differentiators to distinguish it from competitors and help audiences build affinity, it risks falling behind the pack by failing to stand out.

For example, compare OpenMBTA (below left), an app for Boston’s public transportation system, to Grocery Gadget (below right), a grocery shopping app:

OpenMBTA, left; Grocery Gadget, right.

The apps serve completely different purposes, but because both rely almost entirely on iOS 6 visual conventions, they look so similar that they might as well be the same app. Even the custom “prepare/shop” toggle at the top of the Grocery Gadget screen isn’t enough to visually distinguish it from OpenMBTA.

By contrast, public transportation app CG Transit, while basing some of its look and feel on iOS 7 flat design standards, applies simple visual upgrades that push it beyond the base set of rules. Color bullet points connect a list of stops, a gradated map gives a high-level overview of the subway line’s path through the city, and a side menu combines icons and color highlights; together, these touches create a visual appeal that makes CG Transit feel fresher and friendlier than OpenMBTA.

CG Transit subway line screen, left; CG Transit side menu, right.

Striking a balance between consistency and novelty is at the heart of designing an effective and desirable user experience and interface. Consistency is one of the key visual principles behind what people notice and why; we hope you'll reconsider it, be aware of it, and use it when making decisions about how to approach and design visual interfaces.

Deborah A. Levinson also contributed to writing this blog post.

Using semantic differentials to evaluate visual design preferences

Posted: Monday, June 9, 2014 | Posted by Debby Levinson

We recently completed a redesign for LoveToKnow, a site that provides advice on everything from beauty tips and pet health to travel recommendations and party-planning. LoveToKnow wanted a clean, clear design that welcomed users while also feeling believable and authoritative. The new designs had to reveal the site’s wealth of content without being overwhelming, encouraging people to spend more time browsing relevant articles, and ultimately, turning to LoveToKnow as a trusted source for help with all kinds of topics.

LoveToKnow wanted to test designs before finalizing them for launch. A/B testing was our first choice, but wasn’t an option for technical reasons. How could we confirm whether users preferred the new look and feel?

Visual design preferences are by definition subjective. However, tools like semantic differentials and Likert scales give people the terms they need to describe subjective opinions by choosing from a range of possible answers between two opposing statements. With this data, researchers can quantify what otherwise seems unquantifiable.

Sample semantic differential scales
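To give a sense of how such responses become quantifiable, here is a minimal sketch (not from the LoveToKnow study; the scale names echo the post, but the ratings and point range are illustrative) that averages 7-point semantic differential ratings per scale:

```python
from statistics import mean

# Hypothetical 7-point ratings: 1 = closest to the positive anchor
# (e.g. "credible"), 7 = closest to the negative anchor.
responses = {
    "credible / not believable": [2, 1, 3, 2, 2],
    "inviting / unappealing":    [3, 2, 2, 4, 3],
    "helpful / waste of time":   [1, 2, 2, 1, 3],
}

def summarize(data):
    """Return the mean rating for each scale (lower = more positive)."""
    return {scale: mean(ratings) for scale, ratings in data.items()}

for scale, avg in summarize(responses).items():
    print(f"{scale}: {avg:.2f}")
```

Comparing these means across design variants (old vs. new page, or the three home page options) is what lets otherwise subjective preferences be ranked.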

Designing the test
LoveToKnow had questions they were hoping testing would help answer. We knew who our audience was: primarily middle- to upper-income women with some college education and moderate internet experience. We then broke the tests into three groups: one that set a baseline by evaluating the current designs before the new ones, and two that only saw the new designs, albeit in different orders in case viewing order affected preference.
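Counterbalancing like this can be sketched as a simple round-robin assignment of participants to conditions. This is an illustration only (the group labels and participant IDs are hypothetical, not from the actual study):

```python
# Three conditions: a baseline group sees the old designs first,
# and two groups see only the new designs, in different orders.
GROUPS = [
    "baseline: old designs, then new",
    "new designs, order A",
    "new designs, order B",
]

def assign(participants):
    """Rotate participants through the three groups in order."""
    return {pid: GROUPS[i % len(GROUPS)] for i, pid in enumerate(participants)}

print(assign(["p1", "p2", "p3", "p4"]))
```

Rotating assignment keeps group sizes balanced and lets order effects be checked by comparing the two new-design groups against each other.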

We began by setting up a scenario so that test participants would all approach the designs with the same mindset: "Imagine you have a puppy, and are wondering how much bigger it will get. You type 'is my dog full-grown?' into a search engine, and click through to this page." Beginning with a scenario familiar to the dog owners we recruited immediately made the article page designs relevant, and a more realistic scenario is more likely to yield honest results.

Once users had the right mindset, we asked open-ended questions to gather general impressions of the page and website: what did people think the page offered? What did they think the site offered? What would they click on?

We also provided three semantic differential scales:

  • credible / not believable
  • inviting / unappealing
  • helpful / waste of time

Finally, we provided the same scales for three home page design options to help us choose a winner.

Building the test
Designing the questions, as it happened, was the easy part. The much trickier part was creating a test that would help us get answers through remote testing. We’ve had good experiences with usertesting.com before and believed we could make it work here, too; it was just a matter of linking our questions and designs to guarantee that users would proceed in a straight line instead of accidentally meandering off into the woods.

We settled on a split-screen approach: the questions would be coded in SurveyMonkey and show up in a left-hand frame. In a larger, right-hand frame, we’d display the page designs so that people would have them to refer to as they answered the questions. We also set things up so that clicking anywhere on a design would take people to the next one to review.

Over and over through our dry runs, though, we found that people got off-track immediately. They couldn’t help but click on what looked like a real web page to them, and no amount of written instructions stopped that basic behavior. (A real-life reminder of the usability truism that people often ignore written instructions!)

What did eventually curb the clicks was presenting test participants with a thumbnail of the appropriate page design in the survey, and asking them to confirm they were looking at the right design before answering questions. The colorful image broke up the survey’s wall of text and grabbed enough attention for people to stop, read, and understand.

split-screen test image

After that, it was just a matter of watching test videos and analyzing our SurveyMonkey data, which the site helpfully provides as charts and free text.

The final results
Some of what we discovered:

  • The new article page design was slightly more likely to encourage exploring the site. Participants found both the old and new pages credible; design had little to no effect on perception of credibility, and there was scant difference in appeal between the two.
  • However, the new design unquestionably helped people understand that LoveToKnow provided information far beyond just dog content, an important improvement.
  • Results for all three home pages were largely positive and equal, but our second design had an edge over the others, particularly in terms of credibility.

Big data UX, UI and IA

Posted: Friday, June 6, 2014 | Posted by Debby Levinson

For the past three and a half years, Nimble Partners has worked with a big data company to design two suites of web and mobile-friendly applications investigating multiple aspects of the labor market, including demand, supply, job posting, job search, and career exploration. The applications serve a wide user base: job seekers of all ages and backgrounds, the state and local government officials assisting them, public- and private-sector employees researching labor market opportunities, and others.

We worked directly with customers, product management, and developers to redesign some applications, while building others entirely from scratch. Our work included both the customer-facing user experience as well as the tools administrators rely on daily to help job seekers and manage the applications. Customers were delighted with the results: one said we’d “nailed Jell-O to a tree.”

We provided:

  • User experience conceptualization and design
  • Information architecture
  • User interface design
  • Visual design
  • User testing

We can't show this work publicly. Contact us for access to samples via our demo site.