The Problem with Big Data

By Gabe Smith
November 11, 2013

Big Data is all the rage these days, and it’s here to stay. In fact, data is only getting bigger, and given the cost of the hardware and software required for Big Data implementations, the tools available to use it are barely keeping up in a cost-effective manner. 90% of the world’s data has been created in the last 2 years. The amount of information flowing over the Internet has quintupled since 2009; it is predicted to reach 667 exabytes in 2013 and to grow to 4,148 exabytes by 2017.

So, what is the real problem with Big Data for B2B companies? In short, the problem is that it’s BIG: too big to visualize effectively, too big to sift through, too big to drill down into. It’s overwhelming, unwieldy, and it never stops growing. In fact, the growth of IP traffic driven by video and mobility is straining the internet itself; a variety of new technologies aimed at addressing it, such as Multi-Path TCP, have been introduced into production for the first time by Apple on the iPhone 5S.

There has been a huge leap forward in in-memory computing, driven by the price of RAM coming down substantially. This, combined with Massively Parallel Processing (MPP), has enabled solutions to problems that 10 years ago were reserved for those with access to huge supercomputers. SAP HANA is a good example here, and the things people are doing with it are pretty astounding.

So, how can you deal with this? Hire a team of analysts who are BI experts as well as subject matter experts and hope they can sift through it? Buy a huge amount of hardware and hire data scientists and developers to write algorithms that go through everything? That is the right path, but unless you are Google, Facebook, or Twitter, you probably don’t have that luxury.

Automated opportunity identification is the key to dealing with Big Data. This is why the areas of data science, machine learning, and predictive analytics are exploding.

SaaS applications, like the ones we’re developing at Vendavo, can go through your data and find opportunities for you. This is the future of enterprise applications: just as Netflix and Pandora run algorithms on your data to make suggestions for media, applications will apply the same sort of technique to Enterprise Big Data.
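To make the idea of automated opportunity identification concrete, here is a minimal toy sketch of what such an algorithm might look like: it flags transactions priced well below the typical price for their product. The function name, data shape, and 90% threshold are all hypothetical illustrations, not Vendavo’s actual method.

```python
# Toy "opportunity finder": flag deals priced well below the product's
# median price. Purely illustrative; real systems use far richer models.
from statistics import median

def find_price_opportunities(transactions, threshold=0.90):
    """Return transactions priced below `threshold` x the product's median price."""
    by_product = {}
    for t in transactions:
        by_product.setdefault(t["product"], []).append(t["price"])
    medians = {p: median(prices) for p, prices in by_product.items()}
    return [t for t in transactions
            if t["price"] < threshold * medians[t["product"]]]

deals = [
    {"product": "widget", "price": 100},
    {"product": "widget", "price": 98},
    {"product": "widget", "price": 70},  # well below the median of 98 -> flagged
]
print(find_price_opportunities(deals))  # -> [{'product': 'widget', 'price': 70}]
```

The point is not the specific rule but the pattern: software scans every record and surfaces the handful worth a human’s attention, instead of asking analysts to sift through everything.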

  • Analytics, big data, data, data science, enterprise solutions, predictive analytics, SaaS, SAP HANA

    Gabe Smith

    Gabe has 13 years of experience in sales, consulting, pricing, product and program management. He joined Vendavo in 2007 as a Principal Pricing Consultant, where he led solution definition to enable value for multinational corporations such as IBM, Seagate, Emerson and Praxair. In 2009, Gabe moved into Product Management, and has worked on analytics, visualization and collaboration, and written several whitepapers on price and margin management best practices. Most recently, he product managed the release of the Vendavo Best Practice Edition and the CRM Sales Negotiator. In 2013, Gabe moved into an Account Executive position at Vendavo. Prior to joining Vendavo, Gabe worked at Cisco for eight years as an Operations and Program Manager in Manufacturing, Sales and Channels; he was the worldwide ops lead for some of Cisco’s largest worldwide sales and pricing programs and applications.