5 Ways Big Data Are Fundamentally Changing Information Systems



A lot has been said and written lately about whether the Big Data revolution is for real or just one more hype that will die down as the tech world moves on to the next fad.

In my opinion, Big Data is a game-changing revolution that will fundamentally change how information is collected, stored, managed and consumed, thereby transforming the way we work, live and play.

Given below are five reasons why Big Data will change information systems and corporate IT:

1. Move away from traditional RDBMS:

Ever since electronic storage and processing of data began as a centralized corporate function (remember the good old EDP, or Electronic Data Processing, Department!), the Relational Database Management System, or RDBMS for short, has been fundamental to most computerized corporate information systems. Even today, most information systems such as ERP or CRM are supported by an RDBMS.

This is about to change in a big way, thanks to the three Vs of Big Data: Data Volume, Data Variety and Data Velocity. Traditional data storage and retrieval methods such as RDBMS are no longer going to work, necessitating NoSQL (short for “Not only SQL”) databases instead. Unlike an RDBMS, which places data inside well-defined structures or tables using metadata, NoSQL is designed to capture all data without categorizing and parsing it upon entry into the system. This will fundamentally change the architecture of corporate information systems.
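The contrast can be sketched in a few lines of Python. This is a minimal illustration only, not a production pattern: the table, record names and the `ingest` helper are hypothetical, with Python's built-in `sqlite3` standing in for an RDBMS and a plain list of dicts standing in for a schema-less document store.

```python
import sqlite3

# RDBMS style: data must fit a predefined table structure before it can be stored.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, city TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Alice', 'Dallas')")

# NoSQL style: each record is captured as-is, with no upfront schema.
document_store = []

def ingest(record):
    """Store any JSON-like record without parsing it into columns first."""
    document_store.append(record)

ingest({"id": 2, "name": "Bob", "tweets": ["loving the product!"]})
ingest({"sensor": "thermostat-7", "reading": 71.5})  # different shape, same store

print(len(document_store))  # both heterogeneous records were accepted
```

Note that the second `ingest` call would fail against the fixed `customers` table, but the document store accepts it without any schema change.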

2. Unstructured data handling capability:

The capability of handling both structured and unstructured data is another important way information systems are going to change fundamentally thanks to Big Data. As noted above, Big Data has three defining attributes – Data Volume, Data Variety and Data Velocity – and together they constitute a comprehensive definition of Big Data.

Data Variety implies that Big Data is not just about text or numbers (alphanumeric fields), but also unstructured data such as free-form text, images, audio and video. Information systems in the future will have to be designed with the capability of handling both structured and unstructured data.
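A minimal sketch of what such a dual-path design might look like. The `handle` function and its routing logic are hypothetical: it simply treats a dict of fields as structured data and falls back to tokenizing anything else as an unstructured text blob.

```python
def handle(item):
    """Route structured and unstructured data through a single ingestion path."""
    if isinstance(item, dict):
        # Structured: field names are known in advance and can map to columns.
        return {"kind": "structured", "fields": sorted(item)}
    # Unstructured: fall back to simple tokenization for later analysis.
    return {"kind": "unstructured", "tokens": item.split()}

print(handle({"order_id": 42, "amount": 19.99}))
print(handle("Great service, will buy again!"))
```

The point is architectural: one pipeline accepts both kinds of input, rather than rejecting whatever does not fit a relational schema.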

3. Real Time Data Processing:

Given the Velocity, or speed, with which Big Data is being generated, information systems in the future will require the capability of processing massive volumes of data in real time. Even “near real time”, a phrase often used with the current generation of information systems, is not good enough.

A good example of real-time data processing is the ability to process social media or sensor data as they are being generated and take necessary action immediately, such as responding to a tweet or Facebook post. Batch processing, nightly or weekly updates and even near-real-time processing fall short because of the high Data Velocity of Big Data.
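The pattern described above can be sketched as follows. This is an illustrative toy, assuming a hypothetical event feed and a stand-in `respond` action: each event is acted on the moment it arrives, rather than being accumulated for a nightly batch window.

```python
responses = []

def respond(event):
    """Stand-in for an immediate action, e.g. replying to a tweet."""
    responses.append("replied to " + event["user"])

def process_stream(events):
    # In production this loop would consume a live social-media or sensor feed.
    for event in events:
        if "complaint" in event["text"]:
            respond(event)  # act immediately, per event, not per batch

process_stream([
    {"user": "@alice", "text": "complaint: my order is late"},
    {"user": "@bob", "text": "loving the new release"},
])

print(responses)  # → ['replied to @alice']
```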

4. Predictive analytics and in-memory analytics:

If data is being generated in a variety of formats (structured and unstructured), in high volume and at high velocity, the only way it can be used effectively for decision making is through Predictive Analytics and in-memory data analytics. Information systems in the future will have to be designed with this aspect in mind.
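A minimal sketch of the idea, not a production model: recent observations are held in RAM and a simple moving-average forecast is computed from them, the kind of lightweight predictive step that in-memory analytics makes possible at high velocity. The class name and window size are illustrative assumptions.

```python
from collections import deque

class InMemoryForecaster:
    """Hold the last n observations in memory and predict the next value."""

    def __init__(self, window=3):
        self.window = deque(maxlen=window)  # old readings drop off automatically

    def observe(self, value):
        self.window.append(value)

    def predict(self):
        # Forecast the next reading as the mean of the in-memory window.
        return sum(self.window) / len(self.window)

f = InMemoryForecaster(window=3)
for reading in [10.0, 12.0, 14.0]:
    f.observe(reading)

print(f.predict())  # → 12.0
```

Real systems would of course use far richer models, but the principle is the same: keep the working set in memory so predictions keep pace with the incoming data.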

5. Most data are either user or machine generated:

Last but not least, most Big Data is generated either by end users/customers (such as social media data) or by machines/sensors outside the confines, or firewall, of a company. This is unlike the past, when most data were generated within a corporation's firewall (such as transaction data, inventory data or factory production data), with very little coming from outside. This will fundamentally transform the architecture of information systems in the future.

What do you think? Do you agree that Big Data will fundamentally change information systems and corporate IT? Please do share your thoughts:

Republished with author's permission from original post.

Harish Kotadia, Ph.D.
Dr. Harish Kotadia has more than twelve years' work experience as a hands-on CRM Program and Project Manager implementing CRM and Analytics solutions for Fortune 500 clients in the US. He also has about five years' work experience as a Research Executive in the Marketing Research and Consulting industry, working for leading MR organizations. Dr. Harish currently lives in Dallas, Texas, USA and works as a Practice Leader, Data Analytics and Big Data at a US-based global consulting company. Views and opinions expressed in this blog are his own.



