Issue 58

The Emergence of Big Data in New Product Development

Laura Vaida
Product Owner @ Betfair
MANAGEMENT

Big data seems to be on everyone's mind nowadays. As a product person myself, I know that good product managers have always been data-driven. What we witness today, however, is an explosion of tools and methodologies that, depending on how they are used, can either make or break a future product. A growing number of product managers are struggling with this abundance of data, which often generates more "noise" than clarity in analysis and decision-making.

It's a known fact that 96% of all innovations fail to return their cost of capital (Deloitte). In this context, it is almost a no-brainer that smart companies should change the way they manage new product development. Several studies have shown that companies that manage to determine their customers' needs, and then innovate to meet them, are much more profitable overall than companies that are not customer-centric.

Previous literature has shown how firms benefit from involving their customers or from acquiring customer feedback and input. Such methods and techniques come from both marketing research and R&D. There are countless options for managers to get valuable customer insights, some more expensive than others. Big data is cheaper and faster than traditional market research methods, and while it probably won't completely replace them, it is definitely a driving force behind the current wave of new product development.

Below are some valuable data lessons I've learned while working for a startup in the field of digital marketing, along with some advice on how to make the most of big data:

1. Get all the data

Make sure you get all the data available out there, then store it in one place.

Once you've managed to pull all the data of interest via APIs or other methods, I would suggest storing it in a data warehouse solution such as Amazon Redshift. Redshift was a good fit for our project because it is a cloud-based, fully managed and hosted solution that scales as needed for large data volumes. It also helped that it is built on Postgres, making it easy to use and to integrate with other tools.
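To illustrate the idea of landing everything in one queryable place, here is a minimal Python sketch that uses an in-memory SQLite database as a stand-in for a warehouse such as Redshift. The table names and API payloads are invented for the example:

```python
import sqlite3

# Hypothetical payloads pulled from two different APIs (stand-ins for real sources).
crm_rows = [("c1", "Alice"), ("c2", "Bob")]
sales_rows = [("c1", "product_x", 19.99), ("c2", "product_x", 19.99)]

# Stand-in warehouse: an in-memory SQLite database. In production this would be
# a managed, Postgres-compatible warehouse such as Amazon Redshift.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id TEXT, name TEXT)")
conn.execute("CREATE TABLE sales (customer_id TEXT, product TEXT, amount REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", crm_rows)
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", sales_rows)
conn.commit()

# With everything in one place, cross-source questions become single SQL queries.
total = conn.execute(
    "SELECT COUNT(*) FROM sales s JOIN customers c ON c.id = s.customer_id"
).fetchone()[0]
```

The payoff of consolidation is the last query: once both sources live in one database, joining them is a one-liner instead of a cross-system export.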

2. Know your data

Understand each data source individually and think about interdependencies.

It is particularly useful to get a full grasp of the available fields in each table early on. Moreover, try to become aware of duplicate or similar fields coming from different tables, or of any other data peculiarities. Each data source is different, and it's very important to take all the time needed to understand it before going any further down the analysis path.
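One quick way to surface duplicate or similar fields is to compare the column listings of two tables programmatically. The sketch below uses invented column names and a deliberately simple substring heuristic for "similar":

```python
# Hypothetical column listings from two sources (names are illustrative).
orders_cols = {"order_id", "customer_id", "created_at", "amount"}
crm_cols = {"customer_id", "email", "signup_date", "amount_spent"}

# Exact duplicates: the same field name in both tables.
# Decide early which copy is canonical for your analysis.
shared = orders_cols & crm_cols

# Near-duplicates: fields whose names contain one another
# (e.g. "amount" vs "amount_spent") -- candidates for manual review.
def similar(a, b):
    return a != b and (a in b or b in a)

near = {(a, b) for a in orders_cols for b in crm_cols if similar(a, b)}
```

The output of such a scan is a review checklist, not a verdict: a human still has to decide whether two similarly named fields actually mean the same thing.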

3. Define your question

If you don’t know where you’re going, how can you get there?

You cannot make progress if you're not asking questions, and ideally you should be asking the right ones. By this step you should already have an idea of what you would like to discover about your customers or future product.

A basic example here would be: Are my customers’ purchases of product X influenced in any way by the weather outside?

4. Watch the right data

Pay attention to the right data sources!

This is easier said than done, but once it becomes clear what you are looking for, and you understand your data sources, it gets a lot easier to exclude the data that is not relevant.
For the example above, I already know that I should be looking for sales data for product X and then for weather data. All other data that is not relevant to our question should at this point be set aside.
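Continuing the weather example, the filtering step can be sketched in Python. The sales records and weather feed below are invented for illustration:

```python
# Hypothetical raw pull: many products, many fields. Only product X matters here.
sales = [
    {"date": "2019-07-01", "product": "X", "units": 30},
    {"date": "2019-07-01", "product": "Y", "units": 12},
    {"date": "2019-07-02", "product": "X", "units": 5},
]

# Hypothetical daily weather feed (temperature in degrees Celsius).
weather = {"2019-07-01": 28, "2019-07-02": 17}

# Keep only what the question needs: product X sales joined to temperature.
relevant = [
    {"date": s["date"], "units": s["units"], "temp_c": weather[s["date"]]}
    for s in sales
    if s["product"] == "X"
]
```

Everything about product Y is dropped before the analysis even starts, which is exactly the point of this step.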

5. Use the right tools

Step away from your comfort zone and the tool you are most comfortable with.

Every one of us has a favorite tool: be it Microsoft Excel, Access, Tableau, or some common or exotic SQL dialect, we are all drawn to one or another.

Do your homework and take the time to try as many tools as possible. Even if the cost is high in the short run, in the long run you will have a solid basis for any analysis and will be able to choose the tool that best fits each scenario. Not doing this is most likely a recipe for disaster. As much as I love Excel, I wouldn't recommend it for building powerful visualizations like those Tableau offers. The opposite is also true: there are certain tasks that Excel handles on the spot and that other, more advanced tools are not even designed for.

6. Reach a conclusion

No analysis is complete without a proper conclusion.

All the hard work you’ve done so far must be celebrated with a beautiful conclusion: you’ve asked the right question, looked at the right data and used the best analysis tools on the market.

To continue with our example, after some serious analysis we've learned that product X is purchased more when outside temperatures rise above 25 degrees Celsius.
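A simple way to back such a conclusion is to split the days at the 25-degree threshold and compare average sales. The daily figures below are invented for the sketch:

```python
# Hypothetical joined dataset from the earlier steps:
# one record per day, with units of product X sold and the day's temperature.
days = [
    {"temp_c": 28, "units": 40},
    {"temp_c": 31, "units": 35},
    {"temp_c": 19, "units": 12},
    {"temp_c": 22, "units": 15},
]

# Split the observations at the 25 C threshold suggested by the analysis.
hot = [d["units"] for d in days if d["temp_c"] > 25]
cool = [d["units"] for d in days if d["temp_c"] <= 25]

avg_hot = sum(hot) / len(hot)
avg_cool = sum(cool) / len(cool)
```

With real data you would want far more than four days, of course, but the shape of the check is the same: a clear gap between the two averages is what turns an impression into a conclusion.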

7. Start playing with your findings

What is a conclusion without putting it to work for you?

Once we've learned more about our product and our customers, it becomes vitally important to use these findings to try to improve our profitability. This can be done quickly through a variety of experiments, such as A/B testing, multivariate testing, multi-page (funnel) testing, etc.

Continuing with our example, we could run a generic Google AdWords campaign in parallel with a campaign that targets only customers from regions where temperatures are above 25 degrees Celsius. A higher ROI on the latter would not only validate our previous conclusion but also increase our revenue.
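To judge whether the targeted campaign's lift is more than noise, one common choice is a two-proportion z-test on the conversion rates. The campaign figures below are invented, and this is only a sketch of the statistics, not a prescription:

```python
import math

# Hypothetical campaign results: (conversions, impressions).
generic = (120, 10_000)   # untargeted AdWords campaign
targeted = (180, 10_000)  # shown only in regions above 25 C

p1 = generic[0] / generic[1]    # generic conversion rate
p2 = targeted[0] / targeted[1]  # targeted conversion rate

# Two-proportion z-test: pooled rate, standard error, then the z statistic.
pooled = (generic[0] + targeted[0]) / (generic[1] + targeted[1])
se = math.sqrt(pooled * (1 - pooled) * (1 / generic[1] + 1 / targeted[1]))
z = (p2 - p1) / se
```

A z above roughly 1.96 corresponds to significance at the usual 5% level, which is the point at which the lift stops looking like chance and starts supporting the weather hypothesis.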

To conclude, nowadays it's not enough to ask customers what they want. The most valuable and innovative products are the ones customers didn't know they wanted until they first saw them, and big data should be used to discover such unrecognized needs. According to IBM, 90% of all the data in the world today was generated in the last two years, and the total amount of data is expected to reach 35 zettabytes (ZB) by 2020. This is the era of "big data", and product people should all work towards embracing and leveraging it in the best possible way.

Bibliography:

  1. Thomas H. Davenport; Jeanne G. Harris (2007). Competing on Analytics: The New Science of Winning. Harvard Business School Press. ISBN 1-4221-0332-3.

  2. Thomas H. Davenport (2012). How Big Data Is Different.

  3. Yuanzhu Zhan, Kim Hua Tan, Yina Li, Ying Kei Tse (2016). Unlocking the Power of Big Data in New Product Development.

  4. Gil Press (2015). With Big Data Analytics, Signals Eliminates the Noise in New Product Development.

  5. Gil Press (2013). A Very Short History of Big Data.

  6. The Lazy Analyst's Guide to Amazon Redshift.
