By Saurav Rajbhandari

Big Data has come into the limelight recently, with large companies analyzing data to study customer behavior and develop new products. Companies have found uses for it in expanding their businesses as well as in strategizing their next moves. UPS, the courier service company, gathered data through sensors on all of its vehicles. That data triggered a major redesign of UPS drivers’ route structures. The project, called ORION (On-Road Integrated Optimization and Navigation), relied heavily on online map data to reconfigure a driver’s pickups and drop-offs in real time. It saved more than 8.4 million gallons of fuel by cutting 85 million miles off daily routes. The company estimates that saving just one daily mile per driver saves it $30 million!

Generally, most companies aim to enhance the customer experience. Value is added when data is contextualized, understood, and put to good use, which offers companies a competitive advantage. Dealing with Big Data has its challenges, however: the growth of both structured and unstructured data presents obstacles for companies.

Here are the recent trends in Big Data.

Number of Companies Using Big Data Increasing

The number of companies using Big Data is growing with time. A 2013 Gartner survey showed that 64 percent of companies had invested or were planning to invest in big data within the next two years. The same annual study conducted in 2015 showed that 75 percent of companies were investing or planning to invest in big data in the next two years. International Data Corporation (IDC) sees the big data technology and services market growing at a compound annual growth rate (CAGR) of 23.1% over the 2014-2019 forecast period, with annual spending reaching $48.6 billion in 2019.

Hiring Data Scientists

Data scientists gather, store, and refine data, and put it to use contextually according to a company’s needs. Without data scientists, companies would have little power to analyze raw data. Since the talent pool is scarce, there is fierce competition in the market to attract data scientists. A McKinsey study projected that by 2018 the U.S. alone could face a 50-60 percent gap between the supply of and demand for deep analytic talent. Companies have accordingly been known to offer starting salaries above $200,000 to lure in talent.

Three Major Big Data Submarkets

The IDC forecast predicts that the three major big data submarkets (infrastructure, software, and services) will all grow within the next five years. Infrastructure, which consists of computing, networking, storage infrastructure, and other datacenter infrastructure such as security, will grow at a 21.7% CAGR. Software, which consists of information management, discovery and analytics, and applications software, will grow at a CAGR of 26.2%. And services, which includes professional and support services for infrastructure and software, will grow at a CAGR of 22.7%. Infrastructure spending will account for roughly half of all spending throughout the forecast period.
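The growth figures above are all stated as CAGRs, which compound year over year. A minimal sketch of the arithmetic, using the cited rates and the $48.6 billion 2019 total (the 2014 base below is derived from those two figures, not taken directly from IDC):

```python
# Sketch: how a compound annual growth rate (CAGR) compounds over a forecast window.
# Rates and the 2019 total come from the IDC forecast cited in the text;
# the implied 2014 base is back-calculated, not an IDC-published figure.

def project(base, cagr, years):
    """Grow a base value at a constant annual rate for `years` years."""
    return base * (1 + cagr) ** years

def implied_base(final, cagr, years):
    """Back out the starting value that reaches `final` after `years` years."""
    return final / (1 + cagr) ** years

base_2014 = implied_base(48.6, 0.231, 5)   # overall market: 23.1% CAGR, 2014-2019
print(f"Implied 2014 spending: ${base_2014:.1f}B")  # ≈ $17.2B

for name, rate in [("infrastructure", 0.217),
                   ("software", 0.262),
                   ("services", 0.227)]:
    # Growth multiple for each submarket over the five-year forecast period.
    print(f"{name}: {(1 + rate) ** 5:.2f}x over 5 years")
```

Even a few points of difference in CAGR compound noticeably: at these rates, software spending roughly triples over the window while infrastructure and services a little more than double.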

Large companies have been investing in data center infrastructure. Facebook began construction on a 750,000-square-foot facility in Fort Worth, Texas in July 2015, investing almost $1 billion. Microsoft has spent $15 billion on data centers to date. The company has announced 32 Azure regions around the world, 24 of which are generally available today, more than any other major cloud provider. According to research by Synergy Research Group, in the third quarter of 2015 Microsoft was the number-one software maker, with almost a 70 percent share of the market. Hewlett Packard Enterprise (HPE) continues to gradually grow its share of the enterprise hardware segment and now owns 24 percent of the market. HPE recently acquired SGI, a company renowned in the ’80s and ’90s, for $275 million.

Looking at Conway Analytics data for companies in the three subsectors, there were a total of 1,494 projects from 2011 to 2015, of which 844 were expansions and 650 were new projects.

Figure 1

Both new and expansion projects trended downward in 2015. New projects dropped from 154 in 2014 to a mere 64 in 2015, and announcements of expansion projects fell from 186 in 2014 to 151 in 2015. IDC predicts that the number of data centers will decrease by 2017, a change fueled by mass migration from small, on-premise facilities run by internal IT teams to ‘mega data centers’ operated by large service providers. This may explain the recent decline in new and expansion projects among data center infrastructure providers.

Looking at expansion projects, Virginia leads with 138 announced projects, followed by Texas with 110 and Ohio with 76. Virginia is one of the top markets for data centers; 451 Research predicted in 2015 that Northern Virginia would overtake the New York City metro area as the biggest multi-tenant data center market. Texas has attracted substantial investment from data centers as well as service providers.

Map 1

In Virginia, the highest number of expansion projects was announced in 2011, with 66 projects; announcements declined from then on. In Texas, the peak came in 2012 with 36 projects, and in Ohio in 2015 with 17 projects.

In the new-projects category, Texas leads the way with 69 new projects, followed by California with 49 and Pennsylvania with 45.

Map 2

Texas had its highest number of new projects in 2014, with 23; its lowest annual total came in 2015, with only 3. California peaked in 2013 with 17 new projects. Pennsylvania had its highest number of new projects in 2011, and announcements there declined after that year.

Challenges for Data Organization

Gartner predicts that, through 2017, 60 percent of big data projects will fail to go beyond piloting and experimentation and will be abandoned. Increasing competition in the market ensures that only the top players will succeed in the long run. To address this, companies need to turn to developing cognitive data systems that use AI to gather, analyze, and deliver data efficiently, rather than depending on human teams.

A study of 1,800 senior business leaders at mid-sized companies in North America and Europe, conducted by PricewaterhouseCoopers (PwC) and Iron Mountain, found that 75 percent of business leaders from companies of all sizes, locations, and sectors feel they’re “making the most of their information assets.” In reality, only 4 percent are set up for success. Overall, 43 percent of companies surveyed “obtain little tangible benefit from their information,” while 23 percent “derive no benefit whatsoever.” The study points out that three-quarters of the organizations surveyed lack the skills and technology to use their data to gain an edge on competitors. Further, three out of four companies haven’t employed a data analyst, and among those that have, only one-quarter use these employees effectively. A sound data strategy is vital for gaining a competitive advantage.

Exceptional Cases of Big Data Use

A recent Forbes article focused on companies making good use of Big Data. Walmart, one of the largest retailers in the world, runs a Data Café, where timely analysis of real-time data is seen as key to driving business performance. As Walmart senior statistical analyst Naveen Peddamail states, “If you can’t get insights until you’ve analyzed your sales for a week or a month, then you’ve lost sales within that time. Our goal is always to get information to our business partners as fast as we can, so they can take action and cut down the turnaround time. It is proactive and reactive analytics.” The company once struggled to understand why sales of a particular produce item were unexpectedly declining. Once the data was in the hands of the Data Café analysts, they quickly established that the decline was directly attributable to a pricing error. The error was immediately rectified, and sales recovered within days.

In the manufacturing sector, Rolls-Royce puts Big Data to use in three key areas of its operations: design, manufacturing, and after-sales support. As Chief Scientific Officer Paul Stein puts it, the company has huge clusters of high-power computing that are used in the design process. Rolls-Royce generates tens of terabytes of data in each simulation of one of its jet engines, and uses sophisticated computer techniques to examine that massive dataset and visualize whether a particular product is well designed. The company says that adopting this Big Data-driven approach to diagnosing faults, correcting them, and preventing them from recurring has “significantly” reduced costs. Big Data has also enabled a new business model: Rolls-Royce now offers clients a service it calls Total Care, in which customers are charged per hour for the use of their engines, with all servicing costs underwritten by Rolls-Royce.

The Forbes article also highlights Apixio, a California-based cognitive computing firm that has successfully tackled challenges in the health sector, where the problem is not a lack of data but its unstructured nature: the many different formats and templates that healthcare providers use, and the numerous systems that house this information. To address this, Apixio devised a way for data to be analyzed at the individual level to create a patient data model.

Traditionally, to understand patient information, experts trained in reading charts and coding the information would have to read through the patient chart searching for documentation related to diseases and treatments. This laborious and expensive way of extracting information from patient records was prone to human error. Apixio, with its HCC Profiler, demonstrated that computers can enable coders to read two or three times as many charts per hour as with manual review alone.

When data is cleverly analyzed and strategically used, enterprises gain the ability to deal with errors properly, resulting in fewer losses, and can then plan new projects or expand their existing facilities. Big Data, with its use of advanced technology, has the potential to drive changes in the way we conduct business. Economic development professionals, in turn, can use big data to predict a company’s behavior, which can help economic development agencies lure projects to their communities.

i) NAICS CODES USED FOR ANALYSIS: 221111, 221112, 221113, 221114, 221115, 221116, 221117, 221118, 333242, 333413, 333414, 333415, 335121, 335129, 335311, 335312, 335313, 335314, 335911, 335912, 335929, 335931, 335932, 335991, 335999, 518210, 541513, 541512, 541511