
Product design & research

Brandwatch

Applying research & design processes to a SaaS data science team and their AI models

Senior product designer and researcher

Permanent (2021-present)

Background

Brandwatch is a leading social analytics platform that allows businesses and their analysts to gain insights from consumers' social media activity. These insights are used by (typically global, enterprise) businesses to help plan and measure their marketing strategies and product innovations. Brandwatch provides services to clients such as Meta, Google, Bank of America, GSK, Unilever and Disney.

Problem

I joined Brandwatch's AI team, who focus on turning raw data into insights so that expert analysts can work faster and non-experts can gain value with less specialist skill. Designing and delivering data science, AI and ML features had previously been a monolithic effort: engineering teams were often embarking without research and spending months siloed without any user feedback. It wasn't uncommon for long projects to fail or for unfit features to be released. These wasted efforts meant that competitors were moving ahead and our clients were inevitably churning, making work very difficult for our sales and CSM teams.

Who I worked with

  • C-suite

  • SVP & VP product, research, engineering and data science

  • PhD level data scientists and engineers

  • Product directors and owners

  • Lead researchers & UI designers

  • Customer support and sales teams

Solution

Having worked with Lean and user-centred design processes for years, my goal was to apply them to our team so that we had confidence we were spending our valuable technical effort in the right direction. We needed to get a better understanding of our users and show them a variety of solutions before focusing on one. Key activities to achieve this were:

  • Generative research and discovery phases to inform strategy

  • Team and stakeholder ideation sessions to create and test multiple concepts

  • Prototyping with real data to gain accurate user feedback


This mindset change took several months of internal conversations and relationship building to establish. It required earning the trust of several directors and a team of 40 engineers and data scientists, as well as a promise to guide and support them through the new process.

Outcome

The team have:

  • Stopped working in development or data science silos without user or team feedback
  • Scoped technical explorations so that they are time-capped and goal-oriented
  • Understood that a variety of solutions must be explored before committing, and that design and research can support this via concept testing
  • Understood that we need to fail fast, using quickly built, data-driven prototypes to achieve this
  • Attended numerous research sessions, whether generative (understanding user needs and pains) or evaluative (seeing their prototypes being used)

“Ben's skillset is unique: he functions equally well as a senior UX researcher and a senior product designer, and that means he can shepherd a project from the early discovery / problem definition phases all the way through delivery and evaluation. I worked with him in my capacity as product director for a team of about 35 engineers and data scientists building AI-oriented SaaS products, and I relied on him not only as an individual contributor for specific projects, but also as a leader who introduced and improved product development process for the entire team…

…He has a better understanding of the user experience side of AI products than anyone else that I've worked with”

Paul Siegel - Product Director

"Lean UX manifesto": Educating the team on product and design principles that underpin the new process I implemented

"Google Labs and user needs in AI design": Educating the team on product and design principles that underpin the new process I implemented

Google labs: testing with data and design fidelity. Industry concepts that underpinned our new process

Presenting to our team on faking features to elicit user feedback

Our core user problems which I synthesised from generative research

Mapping out our new process: the discovery and exploration phase

Mapping out our new process: technical feasibility assessment and prototyping the solution

Mapping out our new process: design supporting throughout development then researching in beta and live

Key activities / projects

Rapid prototyping with data science teams and their models

Getting feedback on analytics and research products relies almost entirely on users seeing their own data in the app. Traditional flat concepts only go so far in garnering useful information, so we realised very quickly that we needed to prototype with data; however, achieving this in the platform was a slow process that didn't lend itself to innovation. Our solution came in the form of a data science 'sandbox' where we could code quickly, away from the production codebase, whilst using client data. After ideating in workshops we prototyped concepts in this sandbox and tested them with users. This sped the process up by months and shifted the workload away from overburdened engineers and towards our less constrained data scientists.
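
To give a flavour of how lightweight this way of working is, here is a minimal sketch of the kind of throwaway analysis a sandbox prototype could be built around. It is illustrative only: the CSV export, the column names and the aggregation are assumptions for the example, not Brandwatch's actual schema, sandbox or tooling.

```python
# Illustrative sandbox-style prototype script (assumed file and column names).
import pandas as pd

# A small export of client mentions pulled into the sandbox
mentions = pd.read_csv("sample_mentions.csv", parse_dates=["published_at"])

# Quick aggregation: daily mention volume per topic, enough to wire into a
# clickable prototype and put in front of users within days rather than months
daily_volume = (
    mentions
    .groupby([pd.Grouper(key="published_at", freq="D"), "topic"])
    .size()
    .rename("mentions")
    .reset_index()
)

print(daily_volume.head())
```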

Trend detection and analysis feature

AI and ML features help speed up users' time-to-insight and are powerful features for sales teams. Our data science team created an algorithm that identified shifts in activity within certain topics or communities and presented these to users, so that they could spot opportunities to market and develop around those trends. Being first to market around a trend is a key advantage.
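
For readers unfamiliar with this class of feature, the sketch below shows one generic way a "shift in activity" can be flagged, using a rolling z-score over daily topic volume. It is purely illustrative of the concept and is not the team's actual model; the function name, window and threshold are assumptions for the example.

```python
# Generic spike detection on a daily volume series (illustrative only).
import pandas as pd


def detect_spikes(daily_volume: pd.Series,
                  window: int = 14,
                  threshold: float = 3.0) -> pd.Series:
    """Flag days where topic volume rises well above its recent baseline."""
    # Baseline behaviour over the preceding `window` days, shifted so the
    # current day never contributes to its own baseline
    baseline = daily_volume.rolling(window, min_periods=window).mean().shift(1)
    spread = daily_volume.rolling(window, min_periods=window).std().shift(1)

    # How unusual is today's volume relative to that baseline?
    z_score = (daily_volume - baseline) / spread
    return z_score > threshold
```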

After researching to establish user needs, I worked with a designer to run ideation workshops with a tribe of 40+ people, who created concepts for the algorithm. We tested the algorithm with user data to validate its accuracy and plan iterations. I am currently planning and executing a closed beta research study to assess how the model needs to be refined before it is rolled out to the entire client base.

The Figma prototype helped inform interactions and was used for handover to engineers

Our high fidelity prototype, in the live product and connected to user data

Team review of sparkline features to discuss technical possibilities and constraints

An earlier iteration of the prototype

User feedback report on our early concepts

Alternative concept exploration

Plotting a user journey from generative research

Ideation workshops where teams sketch ideas, present and critique (hybrid remote and in-person)

User problem statements supported by snippets from customer research. "How might we" statements help feed ideation sessions

User problems distilled from generative research. We used these to ideate from

A user journey from generative research

Generative research report - presenting key themes to our teams to ideate with

Building relationships with client facing teams

To further understand our users in the context of the sales process and account support, I used my research skills to interview our client-facing staff. For me, understanding the business side of a project is critical to ensuring that a feature gains traction, whether that's challenging competitors or enabling renewals.

Quick delivery of critical features

Sometimes a thorough and lengthy process isn't appropriate. Product managers and directors have already spent time with customers, who have upvoted roadmap features, and we are clear on what needs delivering. Working quickly with PMs and engineers, we brought those features to life and released them (using a Figma design system made the job even quicker).

To seek user feedback we guerrilla tested with staff who use our products and with the users who had requested the feature. Both were quick and easy: we hopped on informal calls and noted key comments, a process that took less than a week.

Total engagement metrics had been an ongoing client request. We worked quickly to design its touchpoints in the product

Creating new options for users to view the sentiment of social media posts allowed for more accurate analysis

After acquisition, our new owners requested that we add risk-detection algorithms to the product

Comparing cohorts of social media authors allows users to understand each group's opinions of their brand

"Ben worked closely with me on the Brandwatch Engagement Score - a feature that delivered 8 weeks of continuous user growth and became the fastest adopted metric. This impacted ~$700k MRR receiving the highest adoption from our Enterprise user base."

Adam Brons-Smith - Senior Product Manager

Lessons learned

  • Take time to understand the whole business. Go beyond users and stakeholders and interview client-facing teams, which will help you fill in the blanks about why a project may be so important and what it needs to succeed.
  • Data scientists may look like engineers, but they actually work far further upstream, in the realm of new concepts. They are more like designers and likely need a process and mindset from that role.
  • To change a large team's process you need to listen to that team, earn their trust and then take responsibility for supporting their change.
  • When concepting, get architectural and systems engineers to review designs for technical feasibility and remove the chance of a project hitting those barriers later in implementation.

Get in touch

If you’d like to discuss any roles, projects or how research and design might solve your business problems, drop me an email.

Send me an email