Growth Lessons Learned from LinkedIn
When LinkedIn acquired my startup Connected in 2011, Elliot Shmukler was the sponsor for the acquisition and I ended up reporting directly to him. At the time, his team was responsible not only for the core experience at LinkedIn (profile, connections, PYMK, search, and more) but also for the LinkedIn growth team. It turned out to be an incredibly fortuitous place for us to land in the organization, as both Ada Chen Rekhi and I learned an enormous amount about growth from working directly with Elliot, who helped scale LinkedIn from 20M to over 200M members. These invaluable lessons continue to shape how Ada and I think about driving growth in every endeavor.
I wanted to share just a few of these growth lessons learned.
Build your team's product intuition through broadly sharing test outcomes
While the literature on growth hacking and growth tactics has grown significantly over the years, the reality is that the best way to learn remains running tests on your own audience and broadly sharing the outcomes. The reason is that while there are a few generalizable guidelines for performant experiences, the vast majority of successful tactics are situational to the specific product category and target audience. So instead of reading up on what tests worked for others, the best approach is simply to run your own tests and share the learnings broadly within your organization, building up product intuition for what will perform well with your specific audience. It helps to adopt a standard template for sharing outcomes, including the metrics tracked over a standard time period, so results are comparable across tests at a glance. Keep a repository of these tests and look back at the results frequently so you can compare the magnitude of one test's impact against another's. Review test outcomes at a weekly meeting with everyone on the growth team to build that intuition across the team.
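As a rough illustration of what such a standard template might capture (the field names here are my own assumptions, not LinkedIn's actual schema), a repository entry could be as simple as:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TestOutcome:
    """One entry in a shared experiment-results repository.

    A fixed schema with a standard measurement window is what makes
    results comparable across tests at a glance.
    """
    name: str
    hypothesis: str
    start: date
    duration_days: int      # standard measurement window
    metric: str             # e.g. "signup conversion"
    control_rate: float
    variant_rate: float
    decision: str           # "ramp", "iterate", or "abandon"

    @property
    def lift(self) -> float:
        """Relative lift of the variant over control."""
        return (self.variant_rate - self.control_rate) / self.control_rate
```

With every test recorded in the same shape, sorting the repository by `lift` immediately shows which categories of tests have historically moved the needle most.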
Focus on optimizing for velocity of A/B tests
Given that there is no substitute for simply running a test to learn whether it will be successful, the very best growth teams optimize for the velocity of A/B tests they can run. Many things slow down testing velocity, and all of them can be optimized. The ability to divide your audience into orthogonal sub-segments ensures your tests don't interfere with one another while letting you run as many as possible simultaneously. Improving the throughput of test analysis reduces the time it takes to decide whether to ramp a test or move on to the next variant. Developing a backlog of upcoming tests, and regularly grooming it, ensures you are always prepared for the next test. Streamlining your test ideation and brainstorming process likewise speeds up getting good variants into the queue. The point is that the single most important thing a growth team can do is not pick the perfect variant to test, but simply increase the velocity of the tests it runs.
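One common way to get orthogonal sub-segments (an assumption on my part, not a description of LinkedIn's internals) is to salt a hash of the user ID with the experiment name, so each user's assignment in one test is statistically independent of their assignment in every other:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, num_buckets: int = 2) -> int:
    """Deterministically assign a user to a bucket for one experiment.

    Salting the hash with the experiment name decorrelates assignments
    across experiments, so many tests can run simultaneously on the
    same audience without interfering with each other's results.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % num_buckets
```

Because assignment is a pure function of `(experiment, user_id)`, no assignment table needs to be stored, and a user sees a consistent variant on every visit.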
Invest in your analytics framework for measuring tests
At LinkedIn, David Henke, our former SVP of Engineering/Operations, used to say, "you can't fix what you don't measure." It became a mantra throughout the organization and ensured we invested heavily in measuring all user behavior. This is incredibly important in growth, and building out your analytics and testing framework is a worthwhile investment. While I generally believe there are plenty of great off-the-shelf solutions that spare technology companies from building their own, growth still feels like the exception. The very best growth teams, at LinkedIn, Facebook, Zynga, and others, built incredibly sophisticated internal tools for automatically instrumenting and measuring A/B test outcomes. I saw these tools iterate significantly during my four years at LinkedIn, growing more sophisticated over time. Our current infrastructure, the XLNT platform, is best-in-class in the depth of test analysis it provides, even surfacing impacted variables you as the product owner may not have anticipated. Make it easy for all team stakeholders to see and share results as quickly and efficiently as possible to optimize for growth wins.
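A minimal sketch of where automatic instrumentation starts (hypothetical names, with no relation to XLNT's actual API) is emitting a structured exposure event whenever a user sees a variant, so downstream analysis can join assignments against outcome metrics without any per-test wiring:

```python
import json
import time

def log_exposure(user_id: str, experiment: str, variant: str, sink=print):
    """Emit a structured experiment-exposure event.

    `sink` stands in for whatever event pipeline you use (a message
    queue, a logging service, etc.); it is an assumption here, not a
    real API. Because every exposure shares one schema, measurement
    of any new test is automatic once the event is emitted.
    """
    event = {
        "type": "experiment_exposure",
        "user_id": user_id,
        "experiment": experiment,
        "variant": variant,
        "ts": time.time(),
    }
    sink(json.dumps(event))
    return event
```

The key design choice is that instrumentation is generic: adding a new A/B test requires no new analytics code, only a new `experiment` name.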
Get your hands dirty with directly analyzing the results
One of the things Elliot insisted on in the early days of the growth team was that all the product managers on his team run their own metrics. He didn't want to "outsource" the analysis to a business analyst, because he believed it was incredibly important for growth product managers to understand how to analyze A/B test results themselves, even to the point of calculating statistical significance. The point is not to do this all the time, but to understand the principles of analysis deeply enough to reason about test methodology and the limitations of testing approaches. It's too easy for teams today to plug in Optimizely and treat the test outcomes it presents as gospel, when in fact there is a lot of nuance underlying the analysis that must be understood before you can make sound decisions based on the presented results. So it's important to get your hands dirty and understand what's under the hood.
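For the common case of comparing two conversion rates, the significance calculation in question is typically a two-proportion z-test (my example, not necessarily the exact method LinkedIn used); a minimal sketch using only the standard library:

```python
from math import erf, sqrt

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and sample size for control.
    conv_b / n_b: conversions and sample size for the variant.
    Returns (z, p_value), using the pooled-proportion standard error.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value
```

Working through even one test like this by hand makes the limitations concrete: the approximation assumes independent samples and reasonably large counts, and peeking at results before the sample is complete inflates false positives.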
Re-test experiences as you have a shift in your audience mix
One of the interesting things we learned at LinkedIn is that the test variant that won in the past wasn't always the variant that would remain the most performant in the future. What we realized is that our audience mix was changing over the years, and so its reaction to our experiences changed too. As a result, we found it was worth re-running tests. For example, when we re-ran a test to optimize subject lines for international growth emails, we saw significant lifts from very different variants than a similar test had produced just a few years earlier.
Jun 22, 2015