Card Content Optimization

Context | Challenges | Breakpoints | Concept | Feedback | Insights | Design


Context

The Socrata Performance Management suite of tools is used by many government agencies to track the progress of investment and policy decisions against strategic goals. One way information on the Socrata platform reaches citizens is through public-facing, customer-created webpages that tell a story through data, combining content cards, data visualizations, data tables, photos or videos, and text.

What is performance management? Performance management is the process of structuring data to measure progress against a government agency’s goals; this process drives investment and policy decisions to achieve specific outcomes within government agencies at every level, whether city, county, state, or federal.

As the lead product designer for this product area, my team and I were tasked with optimizing the content card to better fit our customers’ needs within our redesigned Performance Suite, which now enabled advanced collaboration across agencies’ data programs. These content cards serve as quick snapshots on dashboards, sharing progress on performance measures with citizens and the general public. They contain important, at-a-glance information about the performance measure: the calculated data value, the measure’s units, the title, targets, and status.

Older versions of dashboard content cards


Challenges

Some of the constraints that accompanied this work included:

  • A history of demanding customer needs, moving at a pace often misaligned with typical agile software development.

  • The cards were embedded in Perspectives, a lightweight, CMS-like publishing tool built on an outdated Angular tech stack inside a larger React ecosystem.

  • A business need to add a smaller-width content block to accommodate multiple cards placed in a dashboard format. This was required for feature parity so customers could migrate to the newly revamped Performance Management product.

  • Broken responsive breakpoints based on outdated device widths.

  • Multiple layers of users to consider: the general public or citizens (consumers), plus multiple content creators such as analysts, program managers, and leaders.

Personas developed from prior research of different consumers and creators of the Performance Management suite.


Breakpoints

Going into this project, I had a good sense that the publishing tool’s breakpoints were a mess. As I combed through the CSS with an engineer, we discovered there were actually five breakpoints (three official, two questionable), none of them typical widths; they were based on older mobile devices.


I also audited the entire Socrata platform (outside of my product team’s area) and did a quick deep dive into industry standards. Breakpoints were either inconsistent or missing across different pieces of the platform.

As the lead for establishing the design pattern system at Socrata, I paired with a peer designer to evaluate these older breakpoints against the current needs of the platform and the product team’s overall future direction. We worked together to determine the best path forward across areas of the platform.

  • As a first step, it was imperative to move all existing platform pages with breakpoints to a consistent standard of defined breakpoints (a rough sketch of this idea follows this list). Once that was done, a fluid, device-agnostic grid could be achieved at the component level across the platform to improve scalability and efficiency.

  • I created an internal wiki that captured where we were at that point in time, especially since appetite for breakpoint work across development and design was not high. Other teams were not yet ready for, or receptive to, the effort required to implement this standardization across the entire platform, so building organizational support would take a lot of time.
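
To make the proposal concrete, it helped to show what a single consolidated set of breakpoints could look like in the stylesheet. The snippet below is only a sketch: the widths are common industry values used as placeholders and the .content-card class name is hypothetical, not the values or selectors we actually shipped.

    /* Hypothetical consolidated breakpoints; widths and class names are placeholders. */
    .content-card {
      padding: 8px;                        /* mobile-first default */
    }

    @media (min-width: 768px) {
      .content-card { padding: 12px; }     /* tablet and up */
    }

    @media (min-width: 1024px) {
      .content-card { padding: 16px; }     /* desktop and up */
    }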


Concept

Based on consistent customer feedback and observed customer behavior over time, we knew that candidates for content on these performance measure cards included:

  1. Performance measure title

  2. Calculated data value

  3. Unit label for data value

  4. Status color and text label

  5. Date range (new)

  6. Target numbers detail (new)

  7. Link to performance measure

  8. “Ended” flag when a measure ends at a specified date

  9. Trendline of data visualization (new)

As I explored the old design with the above pieces of content, I realized it would be important to determine how much information was enough to be useful to our customers. Wireframing three content design concepts and soliciting feedback from existing customers would help us move forward (see below for the low-fidelity wireframes… and yes, that is Comic Sans**).


Quick and Dirty Feedback

To determine which pieces of content resonated most with customers, I pulled together a qualitative questionnaire. From past research, we knew customers would frequently tell us that all the information was important. So we gave them three non-interactive, rough content card wireframes in a dashboard configuration as fodder for feedback. We asked 11 key people from our performance customers what information was working for them, what wasn’t, what was missing, and why. We also had participants rank the types of content found on these dashboard examples in order of preference.

** Previous customer feedback had yielded strong opinions about specific elements of the visual design that were out of scope for this project. For the purposes of this questionnaire, I pared down the design’s look with a simplistic font and removed elements that distracted from the content itself.


Insights

  • For months, we had been told by very vocal customers that the trendline would be a popular component of this card, and yet it was ranked quite low. But as we read through qualitative answers to the questions, we noticed that people had difficulty with the trendline and didn’t like it. While trendlines were interesting, a lot of context about the data itself was missing in such a small area. Other contributing factors included the simple visual look of the wireframes, an openness to misinterpretation without looking at the detailed data visualization, and one-item-too-many on the content card. The trendline was removed from consideration.

  • Targets were unexpectedly ranked lower in preference overall, though when they were missing from the design, customers noticed and voiced their concern. The target was helpful and crucial for the display of the performance measure, but lacked clarity (“[Target] [#] [Date or quarter]”). Should it display if the target date has passed/ended? Should it be adjusted depending on the breakpoint and the corresponding card size? We needed to consider other ways of displaying and formatting the target information.

  • As expected, the three main pieces of information on a performance tile (title, big number, and unit label) took the top three spots in the rankings. These would remain a high priority for our customers.

  • The character limit on card titles was not meeting our customers’ needs. We needed to lengthen it while still enforcing a limit, since customers tended to write lengthy descriptions of the performance measure in this field despite the limited real estate.

  • In the end, the details of the title, status color and label, target value, “ended” flag, and the date range would need to flex and adapt to the real estate allowed, depending on the breakpoint. Displaying all information as desired on every breakpoint would not be possible.

  • There was user confusion around the role of the content card: does it make sense to convey as much information as possible on the card, or to invite the user to explore the underlying data? The product needed to have an opinion and encourage “click to learn more” behavior by linking to the measure page itself, with its calculated dataset, background and context, and detailed information. The “view measure” link would continue to be crucial to accomplishing this.

  • One interesting caveat of these results was that participants experienced these options as webpage visitors, not as creators, the role they were more accustomed to (see personas above). Seeing the concepts in “view mode” possibly lent the feedback an interesting angle: participants were standing in the shoes of their own customers. With more time for testing, it would have been interesting to explore these multiple layers of perspective further.


Design

Once the content pieces for the card were determined, I pair-designed heavily with a lead engineer, focusing on the tool’s new 6x1 content block. We also cleaned up the responsive breakpoints based on the new standard I had spearheaded, refactored code, and fixed bugs along the way. Core to this challenge was ensuring that the pieces of content most important to customers could flex within the cards at each of the new breakpoints. We prototyped directly in code, then carefully quality-tested every possible configuration all the way through implementation.
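
To illustrate how card content can give way to the core pieces at narrower widths, here is a rough sketch. The class names and pixel values are hypothetical placeholders, not the shipped implementation, which lived in the Perspectives codebase and keyed off the block’s column configuration as well as the viewport breakpoints.

    /* Hypothetical card classes and placeholder widths, for illustration only. */
    .measure-card .card-title {
      overflow: hidden;          /* truncate long titles rather than letting them crowd the value */
      text-overflow: ellipsis;
      white-space: nowrap;
    }

    @media (max-width: 480px) {
      /* At the narrowest widths, secondary details yield to the title,
         calculated value, and unit label. */
      .measure-card .target-detail,
      .measure-card .date-range {
        display: none;
      }
    }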

In-progress, messy prototyping of a few breakpoints and the new 6x1 content block.

Creator view: 6x1 content block

Card - old and new

Shortly after these changes rolled out, several customers began generating massive numbers of performance measures. They were taking advantage of the new card features, with counts numbering in the hundreds, to create dashboards and initiative-specific webpages that shared progress on their measures.

Live example: Pierce County, WA
