First Rail

Expert review and gap analysis to elevate digital experience

Strategic research to elevate mobile UX for a group of train companies

  • Aligned on strategic competitors and key criteria to evaluate high-value journeys.

  • Adopted an agile mindset to deliver an interactive ‘mega-matrix’ of 70 insights to inform a roadmap of digital experience improvements for all train operators in the group.

Client

First Rail

Competencies

Heuristics, Accessibility, Gap analysis

Industry

Transport

Duration

8 weeks

App Screen with open sidebar

Context

  • The UK rail network is formed of separate franchises. First Rail is the holding company for a number of regional Train Operating Companies (TOCs), accounting for 24% of the UK rail market.

  • In recent years, rail delivery and customer confidence had been affected by industrial action and disruption.

  • As a result, the rail industry was under pressure from central government to improve performance, the provision of disruption information, and access to customer support.


The challenge

  • Each of the First Rail TOCs used their own limited budget to meet the needs of regional customers. Individually, TOCs struggled to innovate.

  • The Web Product Manager for First Rail wanted a piece of ‘meta research’ to identify best practice among First Rail TOCs and the travel sector, in order to elevate the digital customer experience for all of the group's customers.


Objectives

  • Conduct an expert review and comparative analysis to understand:

    • What features or content are other organisations using to support key user journeys?

    • Where are the opportunities for First Rail TOCs to make substantial improvements to digital customer experience?


  • Gain an understanding of the general level of service and awareness of customer expectations in the sector.

  • Evaluate selected First Rail operators and strategic competitors to identify gaps in UX performance, focusing on mobile-first web design relative to the app experience.

  • Generate insights to create a roadmap of coordinated improvements.


Constraints

  • The scope and timeline were limited by the client's need to change technical agencies and re-platform before a contract deadline.

  • Requirements and flexibility in the booking flow were limited by the functionality of the third-party booking engine.

  • Implementation of the new design system and the build itself would be handed off to the client’s technical agency.


Users

  • Key stakeholders: Web Product Manager and Business Analyst, responsible for group budget and strategy.

  • Digital teams: Responsible for digital delivery for individual TOCs.


My role

  • As Senior Designer, I led workshops and day-to-day activities in collaboration with an early-stage designer, and with oversight from the UX Director.

  • The Client Success Manager and Project Manager helped manage the engagement.


Approach & research activities

  • Alignment workshop

  • Competitor evaluation and gap analysis

  • Presentation of findings

Whiteboard showing synthesis of findings

Capturing stakeholders' concerns and theming them to define the focus for research

Kick-off workshop defined the problem space

  • I facilitated a workshop with stakeholders from across the First Rail group to align on three key journeys and six strategic competitors, and to get questions and expectations out in the open.


    Key user journeys:

    • Ticket purchasing / checkout (before)

    • Finding disruption information (during)

    • Customer service / self service interactions (after)


  • Thematic analysis of stakeholder input resolved into 27 research objectives to guide evaluation. In hindsight, it would have been useful to prioritise these to focus effort later.

  • Stakeholder availability impacted the project timeline and participation. As a team, we agreed a tight schedule of check-ins and asynchronous input to minimise further disruption.

Workshop agenda and review of notes
Thematic analysis of workshop comments

Refining the approach to evaluation, and highlighting best practice for mobile-first and accessible design

Piloting the approach for efficiency and value

  • Evaluation would be extensive, so the work would be divided. I adopted an agile mindset to pilot our process and assess the value of the output.

  • The review of the first competitor created time pressure and a need to streamline evaluation. Variation across platforms also highlighted the need for a holistic view of journeys by competitor.

  • A client check-in helped to further refine the objectives and criteria, and to align on format and level of detail.

Draft evaluation with notes for discussion

Refining the outputs of evaluation for efficient analysis

Iterating on the solution

  • As research progressed, regular check-ins served to evolve the format of the deliverable.

  • Research questions provided an entry point to evaluation at the competitor level. However, product teams would need to cross-reference the 'meta' gap analysis and compare summaries. Clear formatting made criteria and insights easier to scan.


“This is amazing, it’s really easy to get around.”

Business Analyst, First Rail

The project folder. Hyperlinks enabled users to navigate from the matrix to specific screen references.

Outcome: Deliverables that delight!

  • The gap analysis was delivered as a Figjam ‘mega-matrix’: an interactive grid highlighting best- and worst-in-class examples, along with insights and recommendations to elevate the digital experience for all First Rail franchises. Each cell linked to the individual evaluations.


“There’s enough here to keep us going for a year.”

Web Product Manager, First Rail

  • The format would enable stakeholders to navigate the findings and share inspiration with colleagues outside the delivery teams across the business.

  • By piloting the playback internally, I was able to refine my storytelling and verify that the structure and method of navigation were intuitive.

Learnings and takeaways

Evaluating apps doubled the work but not the insights

  • As suspected, the quality of native apps varied (possibly due to differences in outsourcing and investment). However, where an app offering was mature, the accompanying mobile web offering tended to provide similar inspiration for the roadmap, so effort could have been better prioritised.


Disruption disrupts

  • You can’t evaluate disruption if there is none, so we leveraged stakeholders’ attention to flag unplanned disruption, allowing us to double back to observe and evaluate.


Unmet needs can be missed by competitor analysis

  • The project focused on competitor analysis to drive improvements. Without data from real customers, we advised on the risk of investing effort in low-value changes without validation. It may have been more prudent to adjust the scope to include some generative research with customers.


Be Agile more often

  • After the pilot we worked in a somewhat linear way, which made it effortful to re-engage with and summarise each of the research questions and criteria at the end. In hindsight, it would have been easier to work laterally: evaluating all competitors against a selected criterion and summarising as we went.


Make time for ambiguity

  • The scope of the objective and expected output made for an interesting project, but also made it difficult to work efficiently. We might have benefited from more time at the beginning to define the criteria, workflow and format for the research.