What Albert Einstein Can Teach Us about Tech Innovation



There’s an Albert Einstein quote about problem-solving that makes regular rounds in spam emails meant to inspire bored office workers. Misquoted often enough that its original wording is almost lost, the quote essentially says: “We cannot solve our problems with the same thinking we used when we created them.”

Surprisingly, this saying turns out to be highly relevant to a field that didn’t even exist in Einstein’s day – Business Intelligence. It might seem like a stretch, but bear with us.

The Data Analytics Conundrum

Businesses today face a problem that used to be much simpler: analyzing the growing variety and volume of data they collect, quickly and cleanly. Back in the day this could be done – and still is, albeit less successfully – with a simple Excel spreadsheet.

When Excel could no longer cut it, as datasets grew and spread across dozens of sources, more advanced BI software solutions stepped in to do the job. The most significant advance in the technology came with in-memory (RAM-based) databases, and for a while they did wonders. But in-memory solutions are rapidly becoming insufficient for modern businesses.

In response, 2014 became the Year of the Workarounds, in which businesses and vendors alike tried to patch over the major issues of in-memory computing with small fixes. Alas, as Einstein reminds us, it’s difficult to solve an underlying problem without looking at it in a different light. In 2015 we will see the next generation of databases and querying – In-Chip analytics. 2015 will be the year of innovation.

What’s There to Work Around?

Though in-memory solutions used to work well, today they don’t stand up to the challenge. There are a number of major issues that most BI software providers are failing to fix, instead patching them up or leaving customers to find their own solutions. The most common of these performance issues concern how the software computes large – and often messy – amounts of data.

Users face incredibly long query times, can analyze only very limited amounts of data, and cannot support larger numbers of concurrent users – the list goes on. On top of this, data now comes from many different places and in many different forms, making it almost impossible for a layman to work with it without IT staff or consultants handling data preparation. In-memory technology is also becoming more expensive: because it relies on a computer’s RAM (Random Access Memory), more and more RAM is needed for storage as data grows.
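To see why the RAM bill grows so quickly, a back-of-the-envelope estimate helps. The row width, row counts and overhead factor below are illustrative assumptions, not vendor figures – the point is simply that fully in-memory storage scales linearly with the data:

```python
# Illustrative RAM estimate for a dataset held entirely in memory.
# Row size, counts and the overhead factor are hypothetical.

BYTES_PER_GB = 1024 ** 3

def ram_needed_gb(rows: int, bytes_per_row: int, overhead: float = 1.5) -> float:
    """Raw data size times a rough index/structure overhead factor."""
    return rows * bytes_per_row * overhead / BYTES_PER_GB

# A 200-byte row (a few dozen mixed-type columns) at growing scales:
for rows in (10_000_000, 100_000_000, 1_000_000_000):
    print(f"{rows:>13,} rows -> {ram_needed_gb(rows, 200):6.1f} GB of RAM")
```

Ten times the rows means ten times the RAM, and server memory is far more expensive per gigabyte than disk – which is exactly the cost curve the paragraph above describes.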

2014 was the year of holding on to current in-memory solutions while working around the myriad problems they keep presenting. Companies have bought more and more hardware to meet their RAM needs, or hired entire teams dedicated to preparing data for analysis – slowing down every simple task. Other fixes have included segmenting data (reducing granularity) to make it easier for programs to digest, or limiting the number of queries run simultaneously.
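The “reducing granularity” workaround can be sketched in a few lines: event-level rows are pre-aggregated into coarser totals so less data has to fit in memory. The field names and records here are made up for illustration:

```python
# Minimal sketch of the granularity-reduction workaround:
# collapse per-event rows into one row per (day, region) key.
from collections import defaultdict

events = [
    ("2014-03-01", "EU", 120.0),
    ("2014-03-01", "EU", 80.0),
    ("2014-03-01", "US", 200.0),
    ("2014-03-02", "US", 50.0),
]

daily = defaultdict(float)
for day, region, amount in events:
    daily[(day, region)] += amount  # many rows become one per key

print(len(events), "rows reduced to", len(daily))  # 4 rows reduced to 3
```

The saving is real, but so is the loss: the individual transactions are gone, so any question about them can no longer be answered – the reduced data quality described next.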

All this does, at the end of the day, is reduce the quality of the data, complicate work processes and pile another expense onto already costly software. We appear to have hit a glass ceiling for in-memory technology: it can no longer promise reasonable performance or price for the increasing complexity and size of the data that businesses now gather, aggregate and analyze. What’s needed is a different type of solution that deals with these issues in a novel way.

Moving Beyond Workarounds to Innovative Solutions

In-Chip technology, the successor to in-memory technology, may be the innovation that puts an end to this year’s workarounds. One implementation, SiSense’s ElastiCube database, takes its name from its ability to stretch beyond the hard limits imposed by older technologies. Though released only a few years ago, it has already been adopted by many large businesses.

In-Chip analytics can solve the issues presented by previous technologies. The biggest change happens behind the scenes: the technology makes optimal use of disk storage, RAM and the CPU cache to achieve minimal query times, regardless of the size of the data being queried. Additionally, by automating the data preparation process, the software can handle most simple joins between datasets itself, without extensive IT intervention.
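The automated-join idea can be illustrated with a toy example: join two tables on whichever column names they share, so nobody has to hand-write the join logic. This is a conceptual sketch only – the function, table names and matching rule are hypothetical, not how any specific BI engine works internally:

```python
# Toy sketch of automated data preparation: join two row lists on
# their shared column names. Purely illustrative, not a real BI engine.

def auto_join(left: list[dict], right: list[dict]) -> list[dict]:
    keys = set(left[0]) & set(right[0])               # shared column names
    index = {tuple(r[k] for k in keys): r for r in right}
    return [
        {**l, **index[tuple(l[k] for k in keys)]}     # merge matching rows
        for l in left
        if tuple(l[k] for k in keys) in index
    ]

orders = [{"customer_id": 1, "total": 99.0}]
customers = [{"customer_id": 1, "name": "Acme"}]
print(auto_join(orders, customers))
# [{'customer_id': 1, 'total': 99.0, 'name': 'Acme'}]
```

A real engine would add type checking, fuzzier matching and query optimization, but the end-user experience is the same: related data simply arrives joined.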

Thus, instead of working around the problem – i.e., limiting query possibilities or the scope of data being looked at – In-Chip technology manages to find an innovative way to tackle the heart of the problem, and gives business intelligence end-users the performance rates they have grown used to, without compromising the quantity or quality of the data they’re using.

The only real solution to today’s big data problem will be one that doesn’t rely on yesterday’s technologies. In 2015 we will have to stop trying to solve our problems with the same thinking we used when creating them – and so we should expect to see further innovation in BI technologies. Hopefully, it will be the year of thinking outside the limitations of outdated technologies and workarounds built upon them. But only if we listen to Einstein’s sage advice.

Saar Bitner
Saar Bitner is the VP of Marketing at SiSense, the award-winning business analytics and dashboard software that lets non-techies easily analyze and visualize big data sets from multiple sources.

