Case study

Atlassian Community

Atlassian · Question asking · Support UX

Overview

A question without context isn't a question. It's a delay.

When Atlassian customers posted support questions to the Community, the answers they needed were buried under rounds of follow-up. The problem wasn't the community — it was the information that never made it into the question to begin with.

I inherited a defined brief for this project and shaped the approach significantly — running my own customer interviews, designing and iterating through three rounds of prototyping, and leading the team through a six-week sprint from ambiguity to a confident, shipped solution.

The problem

87% of questions were answered by other customers. Context was everything.

The Community wasn't just a forum — it was the primary support channel for several Atlassian products. When a question landed without the context an answerer needed, the cost wasn't just a slower response. It was a slower path to resolution for the customer who asked.

  • Critical context — cloud vs. on-premise installation, product edition, admin vs. end-user role — was never captured by the question form because the fields simply didn't exist.
  • The result was a slow, frustrating loop: answerers asked clarifying questions, authors came back to provide details, time elapsed, resolution was delayed.
  • Atlassian had made Community the preferred support channel for select products. A broken question-asking flow was a direct threat to that strategy.

The insight

The real pain wasn't the back-and-forth. It was “others can't see what I see.”

I conducted customer interviews and ran a cross-disciplinary journey mapping session with product and engineering. Four pain points surfaced — but one was distinctly more urgent than the rest.

Customers weren't struggling to phrase their question. They were struggling to describe their setup — and the form didn't help them do it. The answerer and the asker were looking at two different products without knowing it.

What customers said

They couldn't show the community their environment. They lacked the vocabulary to describe it accurately, and the form gave them no scaffolding to work from.

What we decided to solve

Pre-populate every question with the customer's actual Atlassian product context — making their setup visible to the community without requiring them to know what to share.

The hardest design decision

What seems obvious inside the company means nothing to the customer.

Internally, the cloud URL felt like the clearest possible signal that a customer was on the cloud product — unambiguous to everyone at Atlassian. To customers, it was just a string of text they'd never thought about before. Every version of the UI that relied on internal Atlassian logic failed in testing.

The core challenge was translating the cloud vs. server distinction — something deeply meaningful to the product's function — into a UI that customers could navigate confidently without needing to understand what a cloud URL was or why it mattered.

Craft

Each prototype taught us something we couldn't have known going in.

We ran three rounds of usability testing, with the design of each round informed directly by what went wrong in the previous one. The progression mattered as much as the end state — the final pattern only worked because the earlier ones had failed in specific, legible ways.

  1. Passive visibility with a lock icon

    We surfaced context as read-only with a lock, signaling it was included but not editable until later. In testing, many participants still didn't understand what would be visible when they posted.

  2. Aggressive upfront visibility controls

    We tried exposing every visibility toggle before composition to see if fear of exposure was the blocker. It increased cognitive load without improving completion — the form felt like compliance, not support.

  3. Visibility settings moved lower in the form

    Letting authors draft the question first and confirm visibility afterward matched the mental model they described in interviews — and produced clearer, more confident decisions in the final round.

Delivery

Six weeks. One brief. A shipped product with evidence behind it.

Discovery, definition, three rounds of prototyping, usability testing, and a six-week sprint — each phase tightened scope against what we were learning from customers, not just what was easiest to build.

The final experience let customers select their Atlassian product, control exactly what context was visible to the community, and post a question with their full environment pre-populated.

Where I spent my time: Design lead · Customer interviews · Journey mapping · Rapid prototyping · Usability testing

Impact

Fewer drop-offs. More questions successfully published.

Two of the four success metrics we set at the start of the project moved meaningfully at launch. The experience was doing what we designed it to do — getting more questions into the community with less friction.

  • Abandonment rate: 40% → 33% — 7 points recovered
  • Successful publishes: 60% → 67% — 7-point lift post-launch
  • Sprint length: 6 weeks, discovery to ship