Defragmenting a Data Team
Growing up as a Sr. Data Science Manager while the world went into isolation mode
A big thank you to all who have followed along so far as I’ve kicked off this blog! As the year winds down, I’m recounting a tale of how one small and mighty data team came together and saved Christmas, er, the relationship between two feisty protagonists: Product and Marketing. Grab yourself a mug of eggnog and enjoy!
On Friday afternoon, March 6, 2020, I took a look at my desk in the San Francisco office to see if I was forgetting anything. As more cases of a new virus were spreading in airports, a work trip to Toronto that had been scheduled for the following week got canceled, with the distributed team opting to have a remote session instead. Since I was on the west coast, it meant 8 a.m. meetings, so I intended to skip the commute and spend the week calling in from my apartment where I lived outside the city. With an emphasis that implied virtually, I said “‘see’ you next week” to some colleagues hanging around the kitchen, and took the train back home. It didn’t seem consequential then, but I still remember it clearly, because it would be the last time I walked out of the Mozilla office.
The virtual work week, and at that time, the vague idea of a global pandemic, were not the most preoccupying thoughts as I was sitting on the south-bound train that afternoon. I was in escrow for the purchase of my first house (stressssfuullll), and I was moving to a small town three hours south of the Bay Area. I had recently been promoted to Senior Manager by my director (yay!), but had also just learned that she was moving off to another project (ahhh noooo!), and that I would be reporting to a brand new Sr. Director of Data who was starting in April. I was also going to inherit a Marketing Analytics team of three people after the departure of their director, along with their freshly-promoted manager who had until recently been the team’s tech lead. More broadly, the Data Science and Data Engineering teams were going through a reorganization that was going to bring everyone under one reporting structure. Everything seemed like it was shifting and changing, but it created as many opportunities as it did challenges.
During the first few one-on-ones with the new Marketing Analytics manager reporting to me, it became clear that I didn’t really know much about how marketing at Mozilla worked. Until the teams merged, I had been (and continued to be) the manager of the Product Analytics team, and my stakeholders were all Product Managers. I was only tangentially familiar with the Marketing Analytics team’s work and was not close to the stakeholders or their needs at all.
I had experience managing managers, but this was the first time that I had not come up directly through that team’s management chain myself. While daunting, it turned out to be a blessing in disguise for two reasons: first, I was coming in with a healthy, fresh perspective; and more importantly, right from the start I had to fully rely on the manager now reporting to me, who was not only the expert in the marketing technology stack and the team’s analytics requests, but also had deep relationships with the people in the Marketing org. Instead of feeling directly responsible for knowing every detail of the team’s day-to-day work and what it took to keep the lights on, I was able to leave that responsibility to the new manager and focus on setting him and his team up for success.
The one thing I did know was that I needed to help create a strong vision for the team, and to change the status quo of the way they worked. The Marketing Analytics team was incredibly reactive to requests from their stakeholders, and some were on the verge of burnout. To make matters worse, a lot of the work they were doing wasn’t connected directly to business outcomes, and it was impossible to determine how much impact they were having for all of their effort. I began setting up meetings with key stakeholders to try to assess the real pain points they had, and how they relied on the Marketing Analytics team to help.
I learned that the main way Marketing was measuring success was by running A/B tests to maximize the click-through rate (CTR) of their marketing campaigns, with the intention of getting more users to the page where they could download the Firefox browser. Even they seemed dissatisfied with this limited definition of success, and the Head of Marketing asked me directly for help. From her perspective, the problem was that they had no idea whether these marketing campaigns actually “worked” from a real user acquisition standpoint, and if so, which audiences were the most valuable to target. I promised we would make solving this a priority and went back to the team to discuss strategy.
It appeared that the best way to measure campaign success was to see whether a user actually downloaded and used the Firefox browser, which would ultimately bring in revenue. Diving in, it didn’t take long to realize that the data used by the Marketing team (mainly from third-party sources) sat completely disconnected from the internal data we collected through the product. Following a user from a particular marketing campaign to the Firefox download page, and then watching to see if they installed the browser and used it, wasn’t just hard; it was literally impossible. And then we were supposed to layer an experimentation platform on top of it all to measure impact. Yikes.
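To make the gap concrete, here is a toy sketch of the kind of join we wished we could do: tying a marketing campaign touch to an eventual install and real usage. Everything below is hypothetical, the table and column names are invented for illustration, and the point is only that the two sides of the funnel would need a shared attribution key that simply didn’t exist at the time.

```python
import pandas as pd

# Hypothetical third-party campaign data: which campaign sent a visitor to the download page.
campaign_visits = pd.DataFrame({
    "attribution_token": ["a1", "a2", "a3"],
    "campaign": ["holiday_promo", "holiday_promo", "search_brand"],
})

# Hypothetical product telemetry: installs carrying that token, and whether they were
# still active a week later.
installs = pd.DataFrame({
    "attribution_token": ["a1", "a3"],
    "active_day_7": [True, False],
})

# If both sides shared an attribution token, campaign success could be measured in terms
# of real usage rather than click-through rate on the ad.
funnel = campaign_visits.merge(installs, on="attribution_token", how="left")
funnel["active_day_7"] = funnel["active_day_7"].fillna(False).astype(int)
conversion_by_campaign = funnel.groupby("campaign")["active_day_7"].mean()
print(conversion_by_campaign)
```

Without that shared key, every row on the marketing side dead-ends at the download page, which is exactly why the work described below had to happen.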
As these deeper problems started unfolding, it was clear that I was going to have to organize the team in a way that could effectively tackle them. Not only did the team need to be responsible for creating measurable success criteria for the growth of Firefox users, they were also uniquely positioned to shine a light on certain gaps in our data collection infrastructure that we otherwise didn’t know we had. We needed a mix of both product and marketing analytics support, as well as data engineering expertise, all with the same mission. I had one of my Product Data Scientists join the effort and get up to speed on the data coming from various user engagement surfaces in the product. Official reporting structures could (and would) change later if and when it all worked.
And we needed a team name! It wasn’t just Marketing Analytics anymore as the scope had grown and we were connecting the dots across the Marketing and Product divide. At the time, Mozilla had created their own cross-functional squads to drive “user engagement” throughout the acquisition funnel, and it was expected that we would be the data team supporting that effort.
And lo, the Data User Engagement Team (or “DUET”) was born, with the tagline: “Data enabling Product and Marketing to sing in perfect harmony.”
People loved the name, and as far as I know it’s still being used today.
We had a team, a name, and a mission; now we needed a team charter to help set the tactical expectations. So much went into creating it that it could be its own blog post, so for now I will summarize what we came up with. The team would be accountable for the following five responsibilities:
Audit the data sources from all of the user engagement surfaces, from acquisition to revenue, and document them (this helped us gain a shared understanding with stakeholders of where we were starting)
Audit the experimentation (A/B test) platforms used across the user journey, and document them (this was especially important as we used a lot of third-party tools to run A/B tests at the top of the acquisition funnel)
Develop success criteria for all of the user engagement surfaces we were trying to optimize
Make technical recommendations to the Data Platform and Software Engineering teams where there were gaps in the telemetry or experimentation infrastructure, and drive the conversation to build what we needed in order to measure the success criteria. This was a big paradigm shift from the way we previously worked, and it led to some great outcomes.
Finally, validate marketing campaigns by running A/B tests against the success criteria we developed (a toy sketch of what that comparison could look like follows this list)
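As an illustration of that last point, here is a minimal, hedged sketch of comparing two campaign variants on a downstream success criterion rather than on click-through rate alone. The numbers are made up, and “active at day 7” is just a stand-in for whatever success criteria a team like this would actually define.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical counts for two campaign variants:
# installs that were still active a week later, per variant
active_day_7 = [1_840, 2_010]
# total attributed installs, per variant
installs = [12_000, 12_100]

# Two-sample proportion test on the downstream metric, not on ad clicks.
stat, p_value = proportions_ztest(count=active_day_7, nobs=installs)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
```

The shift this captures is the charter in miniature: the unit of comparison moves from “who clicked” to “who ended up actually using the product.”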
As the team got to work, DUET’s manager and I went on our own campaign to evangelize the team and its philosophy throughout Mozilla. This was no longer a reactive team; instead, they were there to partner with both Product and Marketing stakeholders to build out the analytics capabilities needed to drive real growth.
Time passed, and I packed up and moved away from the Bay Area. Mozilla already had a healthy remote culture before the pandemic, and we settled into the new normal. Most of my work with the effort to build a combined data strategy for the Marketing and Product teams happened early on, namely setting up the charter and being there to help guide DUET towards a common vision. Making the gears turn and getting the tactical wins can be attributed to the manager of DUET, who took it, ran with it, and grew a lot in the process. That year was especially hard and it threw many challenges our way, but we had a winning team.
DUET went on to build out a more complete view of the user acquisition funnel into the product, and were able to create new and better success metrics that focused on real growth from marketing efforts. They even gained a seat at the table to drive the Mozilla growth strategy for products beyond Firefox, showing up as a partner with Product and Marketing leadership. It was truly a data miracle.
Happy holidays, and see you in 2022!
Opinions expressed here are my own and do not express the views or opinions of my past employers.