If you look at the history of Big Data requirements (volume, velocity and variety), and the NoSQL platforms supporting those requirements, you see a history of organizations and development teams breaking the mold of traditional information technology (IT) programs. Instead of following traditional IT methodologies to solve Big Data issues, these teams pushed the envelope and invented new technologies to solve those "volume, velocity and variety" problems. More often than not, these efforts were accomplished using collaborative, bottom-up methodologies, such as Open Source, rather than the rigid, top-down approaches found in traditional product development. Specifically, if you look at the history of Hadoop development at Yahoo, you see an approach that sought the input and widespread resources of the Open Source movement rather than a more rigid proprietary approach.
Several years into the Big Data "era," you would think that IT organizations would have recognized this change and adapted. However, a recent survey from the NoSQL folks at Couchbase found that a continued "lack of flexibility" and "inability to scale" are the driving forces behind NoSQL implementation funding in 2012.
Do you agree or disagree with the survey results? Are you seeing similar implementation drivers in your organization?
Post your comments below or ping me directly on Twitter ( @JohnLMyers44 ).
Related EMA Content
- Non-Sequitor in NoSQL
- Is there a NoSQL Identity Crisis?
- Pentaho, DataStax Build Ties To Ease NoSQL Data Movement (informationweek.com)
- Who’s topping the big data charts? (news.cnet.com)