Cloud Computing, Predictive Analytics and Big Data - Grasping the Big Picture



Suddenly, analytics seems to be the new magic wand businesses can wave in the 21st century knowledge economy to outperform and outmanoeuvre the competition and race ahead. To rephrase an old adage, a manager these days is known by the metrics he keeps. Some even hold the view that the internet, and more so the rise of social media and its growing usage in enterprises large and small, has been the trigger for the adoption of analytics, and lately Big Data analytics, on a rapidly growing scale.

This is hardly the case, as a basic definition of analytics illustrates. Consider the definition below to understand what analytics has been about for ages.

Analytics is an essential skill for running a business successfully. Common applications of analytics include the study of business data using statistical analysis to discover and understand historical patterns and to use them to predict and improve future business performance. Analytics is an integral part of most businesses, and one does not need to be an analyst to do analysis. Most successful entrepreneurs and business managers have generally been analysts in their own right, even if the analysis they have been doing has largely been intuitive.
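To make that definition concrete, here is a minimal sketch of the kind of analysis described above: fitting a simple trend line to historical business data and using it to project the next period. The monthly sales figures are hypothetical, invented purely for illustration.

```python
# A minimal illustration of predictive analytics: fit a straight-line trend to
# historical monthly sales (hypothetical figures) and project the coming month.

# Twelve months of hypothetical sales, in thousands of dollars.
sales = [102, 108, 105, 112, 118, 121, 119, 127, 133, 130, 138, 144]
months = list(range(1, len(sales) + 1))

# Ordinary least-squares fit of: sales = intercept + slope * month
n = len(sales)
mean_x = sum(months) / n
mean_y = sum(sales) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, sales))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

# Use the historical pattern to predict the next period.
next_month = n + 1
forecast = intercept + slope * next_month
print(f"Trend: about {slope:.1f}k per month; forecast for month {next_month}: {forecast:.0f}k")
```

The specific technique is beside the point; a spreadsheet trendline does the same job. The point is that analytics means letting historical data, rather than intuition alone, drive the forward-looking decision.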

To understand the heightened buzz around Predictive Analytics and Big Data Analytics, let us consider several important underlying aspects.

Moore's Law Applied to the Generation of Data

The original Moore's Law stated that the number of transistors on an integrated circuit doubled approximately every two years. Something similar is now happening with data, whose volume is doubling at least as fast. Consider some of these facts:
If all seven billion people on earth were to join Twitter and tweet continually for a whole century, they would generate just about one zettabyte of data by the end of that century (the prefix zetta denotes the seventh power of 1000, so one zettabyte is one sextillion bytes, or a billion terabytes). In 2011 alone, almost double that amount of data was generated.
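As a rough sanity check on that comparison, using the article's own figures purely as order-of-magnitude arithmetic, the sketch below works out how many bytes per person per second such a century of tweeting would imply.

```python
# Order-of-magnitude check on the zettabyte comparison above.
ZB = 1000 ** 7                       # 1 zettabyte = 10**21 bytes
TB = 10 ** 12                        # 1 terabyte in bytes
print(f"1 ZB = {ZB // TB:,} TB")     # one billion terabytes

people = 7 * 10 ** 9                 # everyone on earth joins Twitter
seconds_in_century = 100 * 365.25 * 24 * 3600
rate = ZB / (people * seconds_in_century)
print(f"{rate:.0f} bytes per person per second")
# Roughly 45 bytes per person per second -- i.e. a short tweet every few
# seconds, sustained non-stop for a hundred years, to reach one zettabyte.
```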

It was projected sometime during 2013 that 90% of the world's data had been created in the preceding two years. Such an avalanche of data was unimaginable even a decade ago. Virtually every person on the planet is continually producing data simply by moving about, with behaviour captured by cameras, by card usage, or by the mere act of logging on to a desktop, laptop or mobile device and getting onto the net. For a single journey across the Atlantic Ocean, a four-engine jumbo jet can create 640 terabytes of data. Multiply that by the more than 25,000 flights flown each day, and you get a sense of the enormous amount of data being generated daily.
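Taking those aircraft figures at face value (and noting that not every one of the roughly 25,000 daily flights is a four-engine jumbo jet, so this is an illustrative upper bound rather than a measurement), the arithmetic looks like this:

```python
# Back-of-envelope total for the aircraft figures quoted above.
per_flight_tb = 640        # terabytes generated on one transatlantic flight
flights_per_day = 25_000   # flights flown each day

daily_tb = per_flight_tb * flights_per_day
print(f"~{daily_tb:,} TB per day")                  # 16,000,000 TB
print(f"~{daily_tb / 10**6:.0f} exabytes per day")  # ~16 EB per day
```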

This hitherto unheard-of volume of data, both structured and unstructured, is what is referred to as 'Big Data'.

The inaccuracy of most forecasts and projections to date



Forecasting has been around for a long, long time. In 1998, it was estimated to be a $200 billion industry (roughly $300 billion in today's terms), with the bulk of the money being made in business, economic and financial forecasting.

Perhaps the most ironic episode in the history of forecasting dates back to the late 1920s, when Irving Fisher, one of the most famous economists of his time, declared that 'stocks have reached what looks like a permanently high plateau' just two weeks before the great Wall Street crash of 1929.

Despite the advances in analytics and forecasting methods, even today over 70% of projections turn out to be wrong. While the causes are many, one of the prime reasons is said to be the limited volume of data that a statistical or analytical model could work on. With the right analytical tools and algorithms applied to the required volumes of data, meeting the standards of veracity, velocity and variety typical of Big Data analytics, this state of affairs is expected to change significantly. The new stars on the predictive analytics horizon would be a growing community of superforecasters.


Cloud Computing and The Internet of Things

Over the last several years, industry experts have been talking about 'The Cloud Era', 'The Era of Mobile Devices' and the 'Era of Advanced Analytics & Big Data'. Yet these are not disparate technologies; they complement each other. Evolutions in Social Media, Mobility and Cloud have spurred developments in the 'Internet of Things'. According to an IDC study, there will be 212 billion devices or things connected to networks by 2020. Taken together, these technologies are expected to provide a major fillip to innovation and new discoveries.

No leader or senior manager can afford to ignore these trends. Yet adoption within an organization means building new ecosystems and integrating specialist human resources. Our seminars are designed to give you an invaluable overview and understanding of what it would take to harness the tremendous power of these technologies and give your organization that crucial edge.
