From the July 01, 2012 issue of Futures Magazine

Big Data: Manage it, don’t drown in it

Wikipedia describes Big Data as “data sets that grow so large and complex that they become awkward to work with using on-hand database management tools. Difficulties include capture, storage, search, sharing, analytics and visualizing.”

Making decisions based on too much information that is not properly managed and classified can be just as dangerous as making decisions based on too little. This is the challenge of Big Data.

Big Data has become a common buzz phrase in the trading industry, and the implications for market professionals, from the individual trader to large firms and trading desks, are real and significant. Big Data confronts us with several questions: How do we tackle collection and distribution? How do we take data soup and manage it in such a way that it is informative and offers cues to act? When profitable opportunities rely on market inefficiencies, and when these inefficiencies may appear only for moments, how do we equip ourselves to manage these processes fast enough to seize them? Big Data no longer rings hollow as a buzz phrase; it is a concrete challenge that must be addressed. Before we dig into ways to navigate this new world, let's look at the evolution of data.

•   Data past:  In recent history, traders managed market data with pen and graph paper. Data granularity was confined to open, high, low and close prices, and data management was restricted by physical capacity. When parsing price data into an analyzable and actionable format was a manual process, the scale of manageable data was limited. Data delivery was slower, which constrained the amount received in a given period of time — over the past 35 years, delivery has progressed from floppy disk to satellite to low-bandwidth internet connectivity. The seemingly finite nature of data in the recent past makes the current information explosion appear overwhelming.

•   Data present:  The futures industry now finds itself in a situation of ballooning data volume and low-latency, or even ultra-low-latency, access. As with nearly all things technological, we have progressed rapidly in an impressively short time period. Trading decisions now hinge on a huge quantity of market information. Electronic exchanges offer quick access to highly granular data. Tick data is updated in microsecond time frames. Real-time price action for similar or related instruments listed on different exchanges is visible on a single trading screen. In addition to the speed and complexity of structured data, factor in unstructured data such as real-time news from traditional sources and non-traditional sources like Twitter, which increasingly provides news before it is news. It is possible to add sentiment gauging on social media platforms to the trader’s or firm’s ordinary workflow. Now consider the number of symbols in that workflow and multiply accordingly. Thus our future: Big Data.
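To make the sentiment-gauging idea above concrete, here is a minimal sketch of scoring a stream of social-media messages per symbol. The word lexicon, the sample messages and the symbols are invented for illustration; a real workflow would use a proper language model and a live feed, not a hand-built word list.

```python
# Hypothetical word lists for a toy sentiment gauge (not a real lexicon).
POSITIVE = {"rally", "breakout", "bullish", "surge"}
NEGATIVE = {"selloff", "bearish", "plunge", "default"}

def score_message(text):
    """Return +1 per positive word and -1 per negative word in the message."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def sentiment_by_symbol(messages):
    """Aggregate message scores per symbol from (symbol, text) pairs."""
    totals = {}
    for symbol, text in messages:
        totals[symbol] = totals.get(symbol, 0) + score_message(text)
    return totals

# Invented sample feed: each message is already tagged with a symbol.
feed = [
    ("ES", "bullish breakout on strong volume"),
    ("ES", "watch for a selloff at resistance"),
    ("CL", "crude plunge continues with bearish tone"),
]
print(sentiment_by_symbol(feed))  # {'ES': 1, 'CL': -2}
```

Multiplying this by hundreds of symbols and thousands of messages per second is precisely where the "Big Data" scale problem appears.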

•   Data future:  These factors distill to two core Big Data challenges: 1) effectively and efficiently collecting and distributing data, and 2) reliably and intelligently analyzing the data and executing on it.
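The two challenges above can be sketched as a toy two-stage pipeline: a collection stage that routes ticks into per-symbol buffers, and an analysis stage that turns those buffers into a decision. The symbols, prices and the three-tick moving-average rule are hypothetical illustrations, not a trading method from the article.

```python
from collections import defaultdict, deque

def collect(raw_ticks):
    """Stage 1 (collect/distribute): route each (symbol, price) tick
    into a per-symbol buffer holding only the most recent three ticks."""
    buffers = defaultdict(lambda: deque(maxlen=3))
    for symbol, price in raw_ticks:
        buffers[symbol].append(price)
    return buffers

def analyze_and_execute(buffers):
    """Stage 2 (analyze/execute): signal 'buy' when the latest price
    exceeds the mean of the buffered ticks, else 'hold'."""
    signals = {}
    for symbol, prices in buffers.items():
        mean = sum(prices) / len(prices)
        signals[symbol] = "buy" if prices[-1] > mean else "hold"
    return signals

# Invented sample ticks for two symbols.
ticks = [("ES", 100.0), ("ES", 101.0), ("ES", 103.0),
         ("CL", 90.0), ("CL", 89.0)]
print(analyze_and_execute(collect(ticks)))  # {'ES': 'buy', 'CL': 'hold'}
```

The design point is the separation itself: the collection stage can be scaled and tuned for latency independently of the analysis stage, which is tuned for decision quality.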
