How enterprises see big data analytics changing the competitive landscape next year

87% of enterprises believe big data analytics will redefine the competitive landscape of their industries within the next three years. 89% believe that companies that do not adopt a Big Data analytics strategy in the next year risk losing market share and momentum.
These and other key findings are from an Accenture and General Electric study published this month on how the combination of big data analytics and the Internet of Things (IoT) is redefining the competitive landscape of entire industries. Accenture and GE define the Industrial Internet as the use of sensors, software, machine-to-machine communication and other technologies to gather and analyze data from physical objects or other large data streams, and then use those analyses to manage operations and, in some cases, to offer new, value-added services.
Big data analytics now seen as essential for competitive growth
The Industrial Internet is projected to be worth $500bn in worldwide spending by 2020, taking into account hardware, software and services sales, according to Wikibon and previously published research from General Electric. This finding and others can be found on the home page of the Accenture and GE study: How the Industrial Internet is Changing the Competitive Landscape of Industries.
The study also shows that many enterprises are investing the majority of their time in analysis (36%), with just 13% using big data analytics to predict outcomes and only 16% using their analytics applications to optimize processes and strategies. Moving beyond analysis to predictive analytics and optimization is the upside potential that the majority of C-level respondents see as essential to staying competitive in their industries in the future.
A summary of results and the methodology used are downloadable in PDF form (free, no opt in) from this link: Industrial Internet Insights Report For 2015.
Key take-aways from the study include the following:
  • 73% of companies are already investing more than 20% of their overall technology budget on big data analytics, and just over two in ten are investing more than 30%. 76% of executives expect spending levels to increase. The following graphic illustrates these results:
Figure 1: Big data investments
  • Big data analytics has quickly become the highest priority for aviation (61%), wind (45%) and manufacturing (42%) companies. The following graphic shows the importance of big data analytics relative to other priorities in the enterprises surveyed:
Figure 2: Industry overview
  • 74% of enterprises say that their main competitors are already using big data analytics to successfully differentiate their competitive strengths with clients, the media, and investors. 93% of enterprises are seeing new competitors in their market using big data analytics as a key differentiation strategy. The single greatest risk enterprises see from not implementing a big data strategy is that competitors will gain market share at their expense. The following graphic compares the risks of not implementing a big data strategy:
Figure 3: Unable to implement
  • 65% of enterprises are focused on monitoring assets to identify operating issues for more proactive maintenance. 58% report having capabilities such as connecting equipment to collect operating data and analyzing the data to produce insights. The following graphic provides an overview of Big Data monitoring survey results:
Figure 4: Big data monitoring
  • Increasing profitability (60%), gaining a competitive advantage (57%) and improving environmental safety and emissions compliance (55%) are the three highest industry priorities according to the survey. The following table provides an analysis of the top business priorities by industry for the next three years with the shaded areas indicating the highest-ranked priorities by industry:
Figure 5: Industry priorities
  • The top three challenges enterprises face in implementing big data initiatives are: system barriers between departments that prevent the collection and correlation of data for maximum impact (36%); security concerns that affect enterprises’ ability to implement a wide-scale big data initiative (35%); and consolidating disparate data into a usable data store (29%). The following graphic provides an overview of the top three challenges organizations face in implementing big data initiatives:
Figure 6: Challenges for big data analytics

Ready or not, the mobile cloud era awaits you

We’re nearing the end of 2014 and most smart CEOs already know their IT transformation game plan for 2015 – more digital differentiation woven into the fabric of their essential operations.
Every enterprise is now a digital business, regardless of the industry. That’s why digital service innovators are in such high demand. Meanwhile, many of the more traditional IT process-oriented jobs will diminish in importance.

Are you evolving your IT support team’s roles and responsibilities as a result of these key trends? Forward-thinking CIOs and IT managers have already embraced business technology that automates some of the more tedious, routine system administration tasks, so that they can redirect their focus to higher-priority activities.

What’s considered a pressing requirement? Many believe that it’s attaining parity in the enterprise with the freemium consumer cloud offerings that have helped to fuel the so-called Shadow IT phenomenon.

Public cloud gains more converts

However, there’s been progress that’s worth revisiting. According to the latest worldwide market study by International Data Corporation (IDC), public cloud computing services spending for the enterprise will reach $56.6 billion in 2014 and grow to more than $127 billion in 2018. 

This forecast represents a five-year compound annual growth rate (CAGR) of 22.8 percent, which is estimated to be about six times the rate of growth for the overall global IT market. In 2018, based upon findings from the IDC study, public IT cloud services will account for more than half of worldwide software, server, and storage spending growth.
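As a back-of-the-envelope check (not part of the IDC study), the growth rate implied by those two endpoints alone can be computed as below; the slightly lower result suggests IDC's 22.8% figure is calculated from an earlier baseline year.

```python
# Rough sanity check of the quoted cloud spending figures; not IDC's own calculation.
start_2014 = 56.6   # public cloud services spending in 2014, $bn
end_2018 = 127.0    # forecast spending in 2018, $bn
years = 4           # 2014 -> 2018

# Compound annual growth rate implied by the two endpoints
cagr = (end_2018 / start_2014) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 22%, broadly in line with IDC's 22.8%
```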

"Over the next four to five years, IDC expects the community of developers to triple and to create a ten-fold increase in the number of new cloud-based solutions," said Frank Gens, senior vice president and chief analyst at IDC.

The ongoing adoption of what IDC calls cloud-first business strategies – by IT buyers implementing new digital services – is a major factor driving public enterprise cloud services growth.

IDC believes that the enterprise cloud services market is now entering an evolutionary phase. It will produce an explosion of new digital solutions and associated commercial value creation – built on top of the pervasive cloud computing infrastructure that’s being deployed across the globe.

These new applications and emerging use-cases will be created in vertically-focused platforms with their own innovation communities, which will help to reshape how companies operate their increasingly essential IT function. According to the IDC assessment, it will also transform how these companies compete within their primary industry.

Mr. Gens adds: "Many of these solutions will become more strategic than traditional IT has ever been."

IDC expects Software as a Service (SaaS) will continue to dominate public cloud services spending, accounting for 70 percent of 2014 expenditures. IDC says the second largest public cloud category will be Infrastructure as a Service (IaaS). They also predict that Platform as a Service (PaaS) and storage will be the fastest growing categories, driven by major increases in developer cloud services adoption and Big Data applications, respectively.

Next steps toward a digital nirvana

In time, I anticipate that we’ll see more multinational companies upgrade their legacy data centers and deploy private cloud solutions to meet their users’ needs – for a variety of different but equally compelling strategic business reasons. I expect public cloud to more frequently coexist with private cloud, in a multitude of combinations limited only by our own imagination.

Granted, there will be some technical constraints that need to be overcome – like making these cloud service permutations all work together in a frictionless manner. That being said, are you prepared with the right hybrid cloud management and orchestration solution in place? If not, start the due diligence process to select an appropriate solution. You’ll need time; choose wisely.

Of course, the now ubiquitous open-source software suites will play an instrumental role in enabling the transition to a hybrid cloud model. Certainly, the enthusiasm and momentum of the early adopters at the OpenStack Summit in Paris, France this week were very encouraging for the fast followers.

Besides preparing for a multi-cloud environment, I believe that the future outlook for many companies will likely include embracing a Mobile Cloud scenario, where the two most apparent enterprise technology trends morph together into a cohesive whole.

The combination of capable mobile devices and hybrid cloud computing services should provide an adaptive and flexible business technology foundation, so you’ll need to understand how they integrate into your existing IT infrastructure and legacy commercial applications.

Smartphones, tablets and personal productivity-oriented mobile applications (apps) are transforming how information is being accessed, used and shared in the enterprise. The savvy Line of Business leaders at progressive companies have already enabled employees to purchase mobile cloud apps for file syncing, and other requirements that may not have been met by the IT organization.

Some perceptive corporate IT leaders saw the mobile-first strategy gain traction in the marketplace, and immediately got involved with a proactive plan to build and support corporate-approved apps. The mobile application development platform providers, such as FeedHenry, offer the tools and services that together constitute the critical elements of a total solution.

These platforms enable an enterprise to design, develop, deploy, distribute and manage a portfolio of mobile apps running on a range of devices and addressing the requirements of diverse use-cases. Clearly, the best way for IT organizations to be relevant in the mobile cloud era is to get involved; preferably sooner, rather than later. So, what’s the status of your plan?

IBM announces deal with WPP, taps into greater big data and analytics

IBM has announced that communications services group WPP has extended its partnership with the tech giant, under which IBM will provide a service delivery and technology platform to run WPP’s operations in the cloud for $1.25bn (£797m).
The current agreement will last for seven years, with WPP able to expand the use of big data and analytics and deploy new products and services through Big Blue.
“As the world’s largest communications group, we are seeking to exploit IBM’s cloud computing expertise to allow us to innovate and add value to both the service and the product we deliver to clients across 111 countries,” said WPP Group CIO Robin Dargue.
This isn’t the only deal coming from Armonk towers in recent months, with multi-billion dollar deals agreed between IBM and ABN Amro, a Dutch bank, as well as German airline firm Lufthansa. And according to reports, this won’t be the last announcement either.
IBM’s push towards becoming the leader in cloud computing has been aggressive over the past 12 months; it committed $1bn (£638m) of investment to the space and rebranded as a cloud-first company back in March. Many of the big legacy tech firms – SAP, Oracle, Microsoft – have been moving this way as well, with similar results: struggling financials and job cuts.
The firm’s Q3 numbers showed revenue down 4% and operating net income down 18%, yet cloud revenue was up more than 50%. They’re numbers which make analysts and investors worry, but that is to be expected when shifting revenues from legacy software to the cloud.
GlobalFoundries recently picked up IBM’s chipmaking division, at a cost of $1.5bn to Big Blue, as a clear sign of where IBM CEO Ginni Rometty wants to move the company. As she explained on the analyst call: “The strategy’s correct, and now it’s our speed of execution that needs to continue to improve.”
IBM has also made strides to partner up with its perceived competition in recent months, announcing an agreement with SAP to help run its HANA Enterprise Cloud. The two companies, even though they’re enemies in the cloud vendor war, have partnered for more than 40 years. It’s certainly a trend – take Microsoft’s recent buddying up with Dropbox, even though Redmond has its own storage product in OneDrive.

Take a look inside SoftLayer's new UK data centre

Six months after its launch, IBM and SoftLayer opened the doors of their new UK data centre to the media, with IBM UK&I cloud leader Doug Clark and SoftLayer CTO Marc Jones in attendance to review 2014 and look forward to the year ahead.
The data centre in Chessington, run by Digital Realty, was a shipping warehouse until 2012, when it was converted. Each pod, which Jones pointed out has the same design in every SoftLayer data centre, holds 150 racks, 4,000 physical nodes, and a 10,000 ft² isolated zone. Alongside the usual mix of generators is a series of car batteries to share the load for the first few minutes while the generators spin up.
The compute nodes which power SAP HANA have as much as 1TB of memory
2014 has been an extremely busy year for both IBM and SoftLayer, with a mix of the familiar, such as the partnership with SAP's in-memory database HANA, and the unfamiliar. Firmly in the latter category are IBM's deals with Twitter and Docker, which Clark called "positively disruptive." Part of SAP HANA runs on SoftLayer, and Jones noted the compute nodes that power it have as much as one terabyte of memory each.
The event also featured customer testimonials from Gyrocom and GoCardless, although SoftLayer was keen to emphasise others, most notably "poster child" WhatsApp, which is run entirely on the SoftLayer infrastructure, as well as Cloudant, which before being acquired by IBM was a "major" SoftLayer customer.
For each customer, SoftLayer's expertise in bare metal servers was a key differentiator alongside, naturally, location. This publication has extensively covered the geography of newly built data centres, not just from a data sovereignty view, with European customers wanting their data to reside in European data houses, but also within the UK; greater connectivity and relatively similar latency mean vendors can choose their build sites a bit further away from London.
SoftLayer's expertise in bare metal servers was a key differentiator for all customers
GoCardless CTO Harry Marr explained how the startup was also looking at AWS, but eventually plumped for SoftLayer. Marr noted how GoCardless was "having real problems with the cloud" when initially attempting to scale, mainly due to multi-tenancy issues and the 'noisy neighbour'. As regular readers of this publication will remember, IBM secured a patent last year to solve that very issue, utilising software defined networking (SDN) to ensure virtual machines give consistent network performance.
Clark noted the importance of SoftLayer in IBM's strategy going forward, describing the IaaS provider in a slide as the "foundation of the IBM cloud portfolio", as well as explaining SoftLayer was a "fundamental" and "maybe dominant" piece of the hybrid cloud equation.
Another interesting nugget concerned developers: of the 18.2 million developers in the world, only 25% develop on the cloud. Clark expects both numbers to rise, with IBM predicting the overall figure will increase to 26 million in the coming years. Jones noted SoftLayer has an API, and a small team of API evangelists to aid customer integration.
"If anybody takes their eye off the ball, woe betide them in this new world," said Clark.
Elsewhere, it was confirmed that SoftLayer's CEO Lance Crosby has left IBM 20 months after his company was acquired. "We wish Lance Crosby the best as he takes a well deserved break before pursuing new endeavours," a statement from IBM read. Earlier this month it was confirmed that Robert LeBlanc, an IBM veteran, had moved into the role of head of cloud.
Disclaimer: Your correspondent's travel expenses for this story were paid for by IBM SoftLayer.

Database drama: Relational or NoSQL? How to find the best choice for you

Research conducted by Forrester and commissioned by EnterpriseDB (EDB) has found that more than two in five (42%) respondents are struggling to manage the NoSQL solutions deployed in their environments.
The study sheds more light on the NoSQL vs relational database decision facing companies looking to store data today. NoSQL providers argue their technology is pivotal for coping with growing demands on processing power, scale and speed; yet according to EDB, relational database providers are slowly clawing back ground and evolving to support new data capabilities.
Statistics from the report make for interesting reading:
  • Almost a third (30%) of respondents said data stored in NoSQL solutions was creating data silos
  • 36% said they want to link their unstructured data with their structured data most of the time
  • More than half (52%) said they were unable to prevent developers from deploying new apps on separate NoSQL databases
It’s worth noting here that the majority of the survey results point towards one database to solve all needs – and lo and behold, EDB is pushing out a solution which does exactly that. Yet the more interesting takeaway is the position of relational databases – traditionally the older, poorer relation to the NoSQL players – in the discussion.
Bob Wiederhold, CEO of NoSQL vendor Couchbase – which most recently worked to scale the hit Facebook game Cookie Jam – told this publication back in 2013 that NoSQL will “dominate and ultimately...cause Oracle, IBM, SAP and others to have a very difficult time.”
Yet EDB seems to disagree. “Relational databases – and Postgres especially – have responded to changing data demands and incorporated capabilities for managing unstructured data as well as traditional structured data types,” said EDB chief exec Ed Boyajian in a statement.
“Today’s applications are more demanding, and using multiple different database solutions to support them creates problems with usability, adds cost and complexity and poses greater risk for the enterprise,” he added.
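EDB's point about relational engines absorbing unstructured workloads is easiest to see with Postgres's JSON support, which lets schemaless documents sit next to conventional columns in the same table. Below is a minimal illustrative sketch using Python and the psycopg2 driver; the connection string, table and column names are hypothetical examples, not taken from the EDB study, and the jsonb type requires Postgres 9.4 or later (the older json type behaves similarly).

```python
# Sketch only: structured and unstructured data side by side in Postgres.
# Connection details, table and column names are hypothetical.
import json
import psycopg2

conn = psycopg2.connect("dbname=appdb user=appuser")
cur = conn.cursor()

# A conventional relational column next to a jsonb column for schemaless attributes
cur.execute("""
    CREATE TABLE IF NOT EXISTS customers (
        id         serial PRIMARY KEY,
        name       text NOT NULL,
        attributes jsonb
    )
""")

cur.execute(
    "INSERT INTO customers (name, attributes) VALUES (%s, %s::jsonb)",
    ("Acme Ltd", json.dumps({"segment": "retail", "tags": ["priority", "eu"]})),
)

# Query the unstructured part with ordinary SQL, no separate NoSQL store required
cur.execute("SELECT name FROM customers WHERE attributes ->> 'segment' = %s", ("retail",))
print(cur.fetchall())

conn.commit()
cur.close()
conn.close()
```

The design point is the one Boyajian makes above: one database handles both shapes of data, so there is no additional silo to manage.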
Elsewhere, SQL database management provider NuoDB has put out a relational database competitive analysis chart, comparing vendors across various capabilities including availability, programming languages and multi-tenancy, among others. Naturally, NuoDB puts itself firmly at the top of the pile, but it’s still an interesting examination, which can be seen here.

Faster still: Analysing big data analytics and the agile enterprise

By Mark Davis, Distinguished Big Data Engineer, Dell Software Group, Santa Clara, California
Big data technologies are increasingly considered an alternative to the data warehouse. Surveys of large corporations and organisations bear out the strong desire to incorporate big data management approaches as part of their competitive strategy.
But what is the value that these companies see? Faster decision making, more complete information, and greater agility in the face of competitive challenges. Traditional data warehousing involved complex steps to curate and schematise data, combined with expensive storage and access technologies. A complete plan had to work through archiving, governance, visualization, master data management, OLAP cubes, and a range of different user expectations and project stakeholders. Managing these projects through to success also required coping with rapidly changing technology options. The end result was often failure.
With the big data stack, some of these issues are pushed back or simplified. For example, the issue of schematizing and merging data sources need not be considered up front in many cases, but can be handled on demand. The concept of schema-on-read is based on a widely seen usage pattern for data that emerged from agile web startups. Log files from web servers needed to be merged with relational stores to provide predictive value about user “journeys” through the website. The log files could be left at rest in cheap storage on commodity servers, beefed up with software replication capabilities. Only when parts of the logs needed to be merged, or certain timeframes of access analyzed, did the data get touched.
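To make schema-on-read concrete, here is a minimal sketch in PySpark (chosen for illustration; the article does not prescribe a specific tool). The raw logs stay untouched in HDFS, and structure is imposed only when a question is asked of them; the file path and the space-delimited log format are assumptions for the example.

```python
# Illustrative schema-on-read sketch with PySpark; path and log format are assumed.
from pyspark import SparkContext

sc = SparkContext(appName="schema-on-read-example")

# The raw log files stay at rest in cheap, replicated storage (e.g. HDFS)
raw_logs = sc.textFile("hdfs:///logs/web/2014/*.log")

# A schema is applied only at read time: assume each line is
# "timestamp user_id url", split on the first two spaces.
def parse(line):
    timestamp, user_id, url = line.split(" ", 2)
    return (user_id, (timestamp, url))

# Only the slice of data we actually care about gets parsed and touched
checkout_journeys = (raw_logs
                     .filter(lambda line: "/checkout" in line)
                     .map(parse)
                     .groupByKey())

print(checkout_journeys.take(5))
```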
Distributing data processing on commodity hardware led to the obvious next step of moving parts of the data into memory or processing it as it streams through the system. This most recent evolution of the big data stack shares characteristics with high performance computing techniques that have increasingly ganged together processors across interconnect fabrics rather than using custom processors tied to large collections of RAM. The BDAS (Berkeley Data Analytics Stack) exemplifies this new world of analytical processing. BDAS is a combination of in-memory, distributed database technologies like Spark, streaming systems like Spark Streaming, a graph database that layers on top of Spark called GraphX, and machine learning components called MLBase. Together these tools sit on top of Hadoop, which provides a resilient, replicated storage layer combined with resource management.
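For the streaming side of that stack, the sketch below shows roughly what processing data "as it streams through the system" looks like in Spark Streaming; the socket source, port and 10-second batch interval are assumptions for illustration rather than anything prescribed by BDAS.

```python
# Minimal Spark Streaming sketch; the socket source and batch interval are assumptions.
from pyspark import SparkContext
from pyspark.streaming import StreamingContext

sc = SparkContext(appName="streaming-log-counts")
ssc = StreamingContext(sc, 10)  # 10-second micro-batches held in memory

# Treat each incoming line as a log record and count HTTP status codes per batch,
# assuming the status code is the last space-delimited field on the line.
lines = ssc.socketTextStream("localhost", 9999)
status_counts = (lines
                 .map(lambda line: (line.split(" ")[-1], 1))
                 .reduceByKey(lambda a, b: a + b))

status_counts.pprint()

ssc.start()
ssc.awaitTermination()
```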
What can we expect in the future? Data warehousing purists have watched these developments with a combination of interest and some degree of skepticism. The latter is because the problems and solutions that they have perfected through the years are not fully baked in the big data community. It seemed a bit like amateur hour.
But that is changing rapidly. Security and governance, for instance, have been weak parts of the big data story, but there are now security approaches ranging from Kerberos protocols permeating the stack to integrated REST APIs with authentication at the edges of the clustered resources. Governance is likewise improving, with projects growing out of the interplay between open source contributors and enterprises that want to explore the tooling. We will continue to see a rich evolution of the big data world until it looks more and more like traditional data warehousing, but perhaps with a lower cost of entry and increased accessibility for developers and business decision makers.
About the author:
Mark Davis founded one of the first big data analytics startups, Kitenga, which was acquired by Dell Software Group in 2012, where he now serves as a Distinguished Engineer. Mark led big data efforts as part of the IEEE Cloud Computing Initiative, serves on the executive committee of the Intercloud Testbed, and contributes to the IEEE Big Data Initiative.

Why there is still a culture of ‘hope’ and ‘fear’ around cloud and big data technologies

Is more sensitive data being kept in the cloud? According to the latest report from Vormetric, in association with analyst house Ovum, 60% of US IT decision makers and 54% of respondents globally say they store sensitive data in the cloud.
Yet cloud environments are seen as more of a risk to enterprise organisations (cited by 47% of respondents) than databases (37%) and file servers (29%), while cloud and big data concerns remain “genuine” and “deep rooted” according to the study, which surveyed over 800 IT decision makers worldwide.
The numbers revealed worrying findings about why organisations were moving data into the cloud; almost half (46%) of respondents expressed concerns over ‘market pressures’ forcing them to use cloud services. In terms of key changes to increase the use of cloud services, 55% wanted encryption of data with enterprise key control on their premises; 52% wanted encryption of their organisation’s data within the service provider’s infrastructure, and 52% wanted liability terms for a data breach.
The stakes are high if something goes wrong – which happens more often than you might think. Two in five (40%) organisations experienced a data breach or failed a compliance audit in the last year.
Increasingly, as sister publication Enterprise AppsTech has discovered, it’s the insider threat which is particularly worrying. Nine in 10 (89%) say they are at least ‘somewhat’ vulnerable to insider attacks, while respondents believe the most dangerous insiders in terms of data breaches are privileged users (55%), followed by contractors and service providers (45%) and business partners (43%).
Consequently, data breach protection is now the number one priority among IT departments, ahead of compliance. For Ovum, this trend is clearly visible in recent events – the Sony, Target, and Vodafone security incidents occurred even though each company was compliant at the time.
“The cloud and big data survey results demonstrate that there is both hope and fear when it comes to cloud and big data technologies,” said Andrew Kellett, Ovum lead analyst and report author.
“This fear can lead to slow implementation of these platforms, which stymies innovation and growth. But there are steps enterprises can take and changes providers can make that will increase adoption.”
You can take a look at the full report (email required) here.

Why big data’s big promises are finally within reach

By Adam Spearing, VP Platform EMEA, Salesforce
Let’s face it - until very recently big data has been a big letdown. Data warehouses and data analytics tools have historically proven difficult to design, build, and maintain. How much storage space will be necessary? How much data is there? What data management tools can the organisation afford and, just as important, what expertise is available in-house to build and run the data warehouse or data analytics platform?
InformationWeek recently outlined eight reasons why big data projects often fail. The article cited a survey from Gartner that found an astonishing 92% of organisations are stuck in neutral when it comes to their big data initiatives. Why? Because enterprises are spending a lot of money on big data technologies, or plan to, but don’t have the right skills or strategies in place to drive the initiatives forward.
That’s a shame because, when done right, good analytics can improve customer experience, drive sales, mitigate financial risks, and streamline business efficiencies.
As modern data analytics technologies are built on cloud infrastructure, they don’t require dedicated hardware or sophisticated applications
For a moment, consider how the typical data warehouse is run in today’s enterprise. Delivery of information from the data warehouse is often a two-step process. Those needing new business insight tools would first send a request to their analysts, who would then turn to their IT team and ask it to provision the necessary hardware and data platforms. Then those analysts or data scientists would perform all of the associated integration work and schema development. Remember, just building the most basic data warehouse can take eight months to a year.
If anyone in the organisation needed access to data or had questions they’d like answered, they’d always have to go to the gatekeepers — the data scientists.
Legacy data analytics tools have also fallen short when it comes to the promises made regarding so-called “big data solutions.” These tools have proven themselves to be difficult to use. They’re too reliant upon on-premise technologies, and they are not available where their insights are needed most: in the field, at customer locations, within the factory, before the big presentation, or wherever employees, executives, and teams happen to be when they need answers.
Fortunately, there have been so many advancements in enterprise technology in recent years that the impediments between the data that enterprises hold and the insight it can produce are disappearing.
Currently, three technological trends are driving this change: data analytics tools are now cloud native, they are mobile, and they are within reach to everyone.
When done right, good analytics can improve customer experience, drive sales, mitigate financial risks, and streamline business efficiencies
As modern data analytics technologies are built on cloud infrastructure, they don’t require dedicated hardware or sophisticated applications. They are always available as required and in the capacity they are needed.
Access to the data and being able to query it is no longer something that happens in the data center, or behind the corporate firewall. Data and the associated insight can be accessed securely from anywhere, which means comprehensive answers to the business’s most important and unexpected questions are available through mobile devices on engineering sites, in the manufacturing plant, or at a customer location.
Because modern analytics tools are based on cloud technologies and available on almost any device, the insight they provide is accessible to anyone. This is a profound change. Now, salespeople, business managers, designers, engineers — whoever is in need of answers — can query the data and get the actionable insight they need to succeed in their specific jobs, without having to send requests through an analyst.
Of course, these cloud native, mobile, and insight-accessible data analytics technologies are incredibly powerful. Essentially, they provide the power to consume billions and billions of rows of data in ways that were never possible before. And this data can be surveyed from anywhere, delivered on almost any mobile device, and comprehended by anyone. This intelligent collection, analysis, and mobile distribution of data analytics enables enterprises not only to stay a step or two ahead of the competition, but to leap far away from it.

Box unveils its iOS 8 bet, users able to access docs from iOS apps

Cloud storage provider Box has announced compatibility with iOS 8 on day one, including support for iOS App Extensions, which allow apps to “lend” functionality to each other.
This means users can access Box content from any other app on their iDevice, provided it supports the Extensions framework. Users can select a document on Box and open it in an editor, or work on Box content from a mobile project management tool, for instance.
“This gives you unprecedented freedom and control when it comes to your information, while maintaining the security you expect with Box and iOS,” wrote Aaron Levie, Box CEO, in a blog post.
“This is a huge milestone for interoperability, and we’re thrilled to support iOS 8 on day one to transform productivity and collaboration for individuals, businesses and industries,” he added.
With the release of iOS 8 yesterday, plenty of companies have announced their support. Enterprise mobility management (EMM) provider MobileIron noted in a blog that iOS 8 was “the first sign of Phase III of mobile enterprise computing.”
In other words, we’ve moved on from only being able to do wireless email, and we’ve moved on from the more recent trend of using standalone apps for working on mobile. Now it’s about workflow. It was a point Levie referenced to close his blog.
“We’re just at the beginning of the post-PC technology shift, with many more years of innovation to come,” he wrote. “What Apple introduced [yesterday] is crucial to the health and success of this transition.”
Box is evidently ramping up its collaboration tools. Box Notes, a collaborative note-taking tool, has recently been made available for both iOS and Android-powered devices.
You can find out more about Box’s iOS 8 integration here.

AWS is top enterprise cloud service – but beware the consumer threat

Amazon Web Services (AWS) is the most popular enterprise cloud service according to a report released today from Skyhigh Networks – but the research also fired a broadside at how companies are struggling to block consumer products.
The report, which is published quarterly and based on data from more than 1.6 million users, noted a huge disparity between the number of companies that say they block certain services and the number of employees actually blocked.
80% of companies say they block Dropbox, but only 21% of users are actually blocked
Take file sharing provider Dropbox. It’s a very popular service to block, with 80% of firms surveyed saying they nix it. Yet only 21% of users are blocked. It gets worse the further you look. Half of companies claim to block iCloud, yet the actual block rate is only 9%. For Netflix (40% and 4%) and Instagram (48% and 4%), it’s a similar story. Only Facebook has a good hit rate – 50% of companies say they block it, and 31% of users are actually blocked.
The remaining number – in Dropbox’s case, 59% of users – is what Skyhigh calls the “cloud enforcement gap.” This can come about in various ways: the cloud service provider offers a new URL which isn’t picked up on by the IT department, block rules aren’t enforced in certain geographies, or exemptions granted to particular groups get picked up on and shared around.
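The arithmetic behind that gap is simply the difference between the claimed and actual block rates; a tiny worked example using the Dropbox figures quoted above (a sketch of the calculation, not Skyhigh's methodology):

```python
# Cloud enforcement gap, using the Dropbox figures quoted above
claimed_block_rate = 0.80   # share of companies saying they block Dropbox
actual_block_rate = 0.21    # share of users actually blocked

enforcement_gap = claimed_block_rate - actual_block_rate
print(f"Enforcement gap: {enforcement_gap:.0%}")  # 59%, matching the report
```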
The enforcement gap is part of a prevalent trend. The number of cloud services used by the average company rose 23%, from 588 in Q1 2014 to 724 in Q3, yet almost three quarters (74.3%) of cloud services in use do not meet the requirements of the current EU Data Protection Directive.
In other words, it’s shadow IT. Facebook is the most popular service in this category, followed by Twitter, YouTube, LinkedIn and Pinterest.
It’s interesting to note the differences between consumer and enterprise in this instance. Box is the seventh most popular enterprise tool, yet Dropbox is positioned at #11 in consumer cloud services. Other storage providers, Apple’s iCloud (#13) and Google Drive (#14), also feature in the consumer list.
The average company uses an eye-watering 125 collaboration services
Ultimately, Dropbox is the most popular file sharing tool overall, ahead of Google Drive, Box and OneDrive. Office365 is the most popular collaboration tool, ahead of Gmail and WebEx, while Workday is number one for HR. The average company uses 37 different file sharing services, and an eye-watering 125 collaboration services.
The report also notes how the Pareto Principle, or the 80/20 rule, applies to cloud services, but in a much more extreme manner. 80% of data uploaded to the cloud goes to less than 1% of all services – Box (23%) is the most popular there, followed by Dropbox (11%), YouTube (9%), and Microsoft Office365 (7%).
In order, the 10 most popular enterprise cloud services are AWS, Microsoft Office365, Salesforce, Cisco WebEx, Concur, ServiceNow, Box, LivePerson, Zendesk and Yammer. The top 20 also includes services from enterprise giants such as BMC, Workday and GoToMeeting. Microsoft’s storage product OneDrive and NetSuite were new entries in this edition.
Read the full document here.