Every four years on presidential election night, Americans tune into their televisions—or laptops or smartphones or tablets—to hear the media outlets “call” the states one by one as the polls close. It’s an exciting night, and the networks pull out all the stops to keep you on the edge of your seat for hours with dramatic “magic walls,” “decision desks,” and hologram correspondents. Who knows what’s in the works for election night 2016, but I’m sure it will be nail-bitingly inventive.
However, fast-forward a couple of election cycles past 2016 and I think we’re likely to see more subdued coverage. Why? Because big data statistical models—such as the one statistician Nate Silver used in the 2012 presidential election to accurately (and astonishingly) predict the winner of all 50 states and the District of Columbia—will be stealing the thunder. As these big data models become more powerful and more accurate and their predictions are consistently borne out, news organizations will need to find a new tone and context in which to report on increasingly foregone conclusions.
Of course, the game-changing power of big data extends well beyond its potential to disrupt the suspense of election night. Environmental activists are using big data to identify hotspots of deforestation activity; pharmacies are using big data to flag prescription drug abuse; and public health advocates are using it to predict disease outbreaks, to name a few examples of big data’s enormous impact on society.
Businesses across all industries are also looking to benefit from big data through analysis and monetization of their amassed treasure troves of data. In fact, according to some industry analysts, 30 percent of businesses will be monetizing their data by 2016.
But as Nate Silver explained in a 2013 interview with Fortune, successful analysis requires that a good set of rules for processing information (based on theory, practice, and past experience) be in place before the data is analyzed. In other words, you need clean, timely, contextual, and relevant data if you expect accurate analysis.
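To make “clean, timely, contextual, and relevant” concrete, here is a minimal sketch in Python of the kind of rule-based pre-analysis checks that idea implies. The field names, freshness window, and trusted-source list are all hypothetical, invented for illustration rather than drawn from Silver’s models or any particular platform.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical validation rules applied before any analysis runs.
# Field names ("source", "timestamp", "value") are illustrative only.
MAX_AGE = timedelta(hours=24)                   # "timely": reject stale records
TRUSTED_SOURCES = {"poll_feed", "sensor_net"}   # "contextual": known provenance

def is_analyzable(record: dict) -> bool:
    """Return True only if a record is clean, timely, and relevant."""
    # Clean: required fields present and non-null
    if any(record.get(f) is None for f in ("source", "timestamp", "value")):
        return False
    # Timely: recent enough to matter
    if datetime.now(timezone.utc) - record["timestamp"] > MAX_AGE:
        return False
    # Contextual/relevant: comes from a source we understand
    return record["source"] in TRUSTED_SOURCES

records = [
    {"source": "poll_feed", "timestamp": datetime.now(timezone.utc), "value": 0.52},
    {"source": "unknown", "timestamp": datetime.now(timezone.utc), "value": 0.48},
]
clean = [r for r in records if is_analyzable(r)]
print(f"{len(clean)} of {len(records)} records passed validation")
```

The point isn’t the specific rules; it’s that the rules exist, are written down, and run before the analysis does.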
The number of data sources is increasing exponentially, with vast volumes of data now being collected from new sources like wearables and the Internet of Things, and that brings me to my next point: big data is changing the nature of integration. The inherent challenges of big data—the merging of many disparate data sources and formats, the sheer volume of data, and the need to process it in real time, all while making sure it is “clean”—require an entirely new approach to integration. Gone are the days when you could rely on rigid integration platforms to integrate fairly static data.
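As a toy illustration of that first challenge—merging disparate sources and formats—the sketch below normalizes a JSON feed and a CSV feed onto one canonical schema. The feeds, field names, and `normalize` mapping are invented for the example; they stand in for the kind of per-source mapping an integration layer performs at much larger scale.

```python
import csv
import io
import json

# Two hypothetical feeds delivering the same logical data in different formats.
json_feed = '[{"device_id": "w-101", "heart_rate": 72}]'   # e.g., a wearable API
csv_feed = "sensor,bpm\nw-202,68\n"                        # e.g., a legacy export

def normalize(record: dict) -> dict:
    """Map source-specific field names onto one canonical schema (simplified)."""
    return {
        "device": record.get("device_id") or record.get("sensor"),
        "heart_rate": int(record.get("heart_rate") or record.get("bpm")),
    }

unified = [normalize(r) for r in json.loads(json_feed)]
unified += [normalize(r) for r in csv.DictReader(io.StringIO(csv_feed))]
print(unified)  # [{'device': 'w-101', 'heart_rate': 72}, {'device': 'w-202', 'heart_rate': 68}]
```

Multiply this by hundreds of sources, each changing its format on its own schedule, and the limits of rigid, hand-coded mappings become obvious.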
If your enterprise is looking to predict the future using big data—especially when the ability to successfully do so affects your company’s own future—you’ll need an integration solution (like our Liaison ALLOY™ Platform) engineered from the ground up to address the disruption of big data. It’ll need to be scalable to handle massive amounts of data, flexible to store and serve up data in whatever format the use case requires, nimble to respond rapidly to changes, and tightly integrated with data management functions to support streaming data processing. Are you ready?
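For a feel of what “streaming data processing” means in practice, here is a deliberately simplified Python sketch—not the ALLOY platform, and with the sensor feed stubbed out—that keeps a running aggregate as events arrive instead of buffering the whole data set in memory.

```python
import random

def sensor_stream(n: int):
    """Hypothetical unbounded event feed, stubbed here with random readings."""
    for _ in range(n):
        yield {"reading": random.gauss(70, 5)}

# Process events as they arrive, keeping only a running aggregate.
count, running_sum = 0, 0.0
for event in sensor_stream(10_000):
    count += 1
    running_sum += event["reading"]
    if count % 2_500 == 0:  # periodic output, as a live dashboard might show
        print(f"after {count} events, mean reading = {running_sum / count:.2f}")
```

The design choice worth noticing is that memory use stays constant no matter how long the stream runs—the property that lets a platform keep up with data that never stops arriving.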
Big data is one of three data trends Liaison recently highlighted as integration game changers. Download our brief titled The Game Changers of Integration: Inside & Out to discover two other notable game changers impacting your integration operations, and how to prepare for them.