Using iterative possibility research sprints to redevelop our cinema websites
Find out how the User Experience (UX) team is conducting research to redevelop the BFI’s cinema websites.

Introducing the BFI’s UX notes
We are a small in-house user experience (UX) team at BFI – two interaction designers (one currently on maternity leave), two content designers, and one user researcher – working behind the scenes to support the organisation’s digital transformation.
This is the first of what we intend to be an ongoing series of ‘UX notes’ on this blog. Each entry will offer a glimpse into how we approach research, design, content and strategy in our digital work. These notes won’t always be polished, but they will give a flavour of our work.
Why we chose the iterative possibility research sprints methodology
Last year, when we began developing the digital programme experience for What’s On at BFI cinemas, we conducted in-depth interviews to understand our audiences. We gained a better picture of who they are, as well as their goals, needs, frustrations, and excitement when planning a BFI cinema outing.
We prioritised the insights from these interviews and brainstormed ideas for our next steps, but we reached a point where we struggled to choose one idea to work on and test. We didn’t want to invest too much time developing a solution that might turn out not to be the most useful.
That’s when we turned to iterative possibility sprints – a flexible user research method for exploring possible solutions. In just two weeks (in other words, one sprint), the team puts together a rough version of an idea and explores it with a small sample of participants.
If you’ve heard of design sprints – a method developed at Google Ventures to quickly test product ideas – this is a similar approach, but with a stronger focus on learning through research. Instead of creating polished prototypes, we use these sprints to explore early ideas and see how users respond.
The elements are simple:
- keep prototyping light – designers shouldn’t spend too much time on each idea
- run quick, low-cost tests to see how users react to new concepts
- iterate and run another sprint based on user feedback
This method helped us stay agile and user-focused without committing too early to a single solution. It felt right for the stage we had reached in our project.
What we explored, sprint by sprint
We ran four iterative possibility sprints, gradually evolving from basic visual concepts to a partially clickable mobile prototype in which only some buttons and screens worked. After each sprint, we refined our designs based on user feedback.
We included two user profiles to ensure diverse perspectives:
- Frequent bookers: people who already visit BFI cinemas and are familiar with our programme
- Cultural goers: a broader group who occasionally go to the cinema and are less familiar with BFI
Some sprints included participants from both user groups; others focused on just one. Generally, we recruited seven participants per user group per sprint.
Sprint 1: What’s in a programme? Exploring curated and filter views
From our interviews, we learnt that even frequent visitors struggled to grasp the breadth of our programme. We also learnt that navigation on the current site felt long and confusing, making it hard to find the right content at the right time.
So we decided to test two ways of browsing the programme:
- A curated view, which grouped films by theme or season in visual carousels
- A filter view, which displayed all films and events by date, with additional options to refine the selection using filters
We wanted to understand how each view helped with film discovery and planning, and how these views felt compared to the way users typically browse and choose films.

Users found both views helpful: the filter view supported planning, while the curated view inspired discovery. However, scrolling through both was tiring, and many worried about missing out.
Showcards also caught users’ attention:

These cards include details about each film, such as dates and duration. Participants found them helpful, but wanted more information, like the film synopsis and screening times. So we decided to focus on that next.
Sprint 2: Hitting the content sweet spot between XS and XL film showcards
After each sprint, we conducted collaborative workshops to prioritise findings and shape the next sprint.
At this stage, a key priority was finding the right balance of when and how much information to show about each film or event – enough to support decision-making without overwhelming users.
One of our designers came up with an interesting way to explore this. The idea was to test two ‘extreme’ showcard styles:
- Extra-small (XS) cards with basic information
- Extra-large (XL) cards packed with all possible film details
Both could be expanded and collapsed. As in Sprint 1, we kept the designs simple to focus on the concept rather than visuals:

Users preferred the XL collapsible cards – compact by default, expandable for more detail. We also learnt that clarity in language matters: terms such as NFT (short for National Film Theatre) in screening room names, or labels like ‘intro’ and ‘discussion with programmers’, confused less-familiar users (those in the cultural goers user group).
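As an aside for technically minded readers, here is a minimal sketch of how a collapsible showcard like the ones we tested might be modelled. It is illustrative only: the field names and functions are assumptions made for this example, not taken from our prototype or codebase.

```typescript
// A minimal, illustrative model of a collapsible film showcard.
// All names and fields here are assumptions made for this sketch,
// not taken from the BFI prototype.

interface Showcard {
  title: string;
  dates: string;              // e.g. '12 to 18 June'
  duration: string;           // e.g. '1h 52m'
  synopsis?: string;          // richer detail, hidden while collapsed
  screeningTimes?: string[];  // richer detail, hidden while collapsed
  expanded: boolean;          // compact by default
}

// Toggle a card between its compact and expanded states.
function toggleShowcard(card: Showcard): Showcard {
  return { ...card, expanded: !card.expanded };
}

// Return only the pieces of information that should be visible
// in the card's current state.
function visibleFields(card: Showcard): string[] {
  const basics = [card.title, card.dates, card.duration];
  if (!card.expanded) {
    return basics;
  }
  return [
    ...basics,
    ...(card.synopsis ? [card.synopsis] : []),
    ...(card.screeningTimes ?? []),
  ];
}
```

The idea the sketch captures is simply that a card holds all of its information but exposes only the basics until a user chooses to expand it.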
Sprint 3: Navigating the What’s On page with a more refined prototype
Following Sprints 1 and 2, we felt we were ready to test a more refined prototype with a fuller journey – still not fully interactive, but much closer to a real experience. We conducted sessions with both user groups and focused on the What’s On page, but also included the homepage and individual film pages in the prototype:

Our concepts worked overall. Users could find films and understand key information. They relied mainly on date and time filters to select films, while often overlooking other filters (such as seasons, format or special events).
But there were still blockers.
Many quickly scanned the What’s On page but didn’t spend much time on it. A few clicked through to the homepage, but didn’t scroll it fully, highlighting a key challenge: how the structure and order of information can limit discovery. We know we need to keep this in mind during future testing.
We also saw that while BFI’s unique programme is a strength, it can feel intimidating to those less familiar with film culture. We need to make it more accessible to everyone.
Sprint 4: Checking smaller screen sizes
After Sprint 3, we shifted our focus to mobile. We know many users plan cinema visits on their phones, and smaller screens pose a challenge for presenting our complex programme.
We adapted the Sprint 3 journey for mobile and tested it:

Overall, participants were able to use our mobile prototype to select films and felt satisfied with the key information provided on showcards and film pages. They tended to rely more on the What’s On page and spent less time on the homepage, which requires more scrolling. This suggests we may need to explore other ways to showcase the variety of our programme.
We also found that the difference between the IMAX and Southbank venues wasn’t clear to everyone on the mobile version: some were unsure whether IMAX is a screen at BFI Southbank or a separate venue.
What we learnt from the process
This was our first time using iterative possibility sprints, and here’s what stood out:
What worked well:
- This method suited the early phase of the project, where we needed to explore different ideas before investing too much in one solution
- It allowed us to build concepts gradually and base decisions on user feedback
- It helped us stay agile and keep momentum
- Collaborative workshops after each sprint kept the team aligned
What to watch out for:
- If you test more than one user group, the data analysis can take longer and may slow down the project, but more people could get involved in the analysis
- Testing mobile prototypes can be a bit of a challenge if you don’t have the right tool, so make sure you run technical tests with participants
- As designs become more detailed, partially clickable prototypes can feel limiting – you can still get useful feedback, but it may be a good moment to move on and use another research method
Next steps in research for BFI cinemas
From all these sprints, we know there are things we still need to address, such as users’ understanding of BFI terminology and the distinction between the two venues. We will keep exploring these aspects in our next phase, where we’ll launch a private beta and invite a cohort of users to test it over a longer period, possibly using a diary study. This will help us understand how behaviour evolves as users get familiar with the site.
Stay tuned for upcoming updates on our research, design and content journey!
Olivia Gerber is a User Researcher in the Technology and Digital Transformation team at the BFI. You can follow her on LinkedIn.
If you’re interested in supporting the BFI’s digital work as a sponsor or donor, please get in touch with us at philanthropy@bfi.org.uk
The BFI would like to thank the BFI Trust, The Uggla Family Foundation, Esmée Fairbairn Foundation and Bloomberg Philanthropy for supporting elements of our digital transformation.