I admit, I cribbed the title of this post from an article by Stephen Swoyer at TDWI.  Stephen's article covered a JasperSoft survey that found traditional ETL and data integration (DI) tools in use in 59 percent of BI projects, and traditional RDBMSs in use in 60 percent of respondents' Big Data projects.  When you sit back and consider this, it makes perfect sense.  Companies are not going to rip and replace their existing infrastructure with a whole new set of tools; instead, they are extending their data environments to incorporate new ones.

This is important to keep in mind when you look at the BI and analytical tools that will be used in Big Data projects.  If someone analyzes their data in Excel or visualizes it in Tableau, the fact that their data now lives in Hadoop, MongoDB, or Cassandra does not mean they will stop using Excel or Tableau.  They will want to keep using the tools they know.  Vendors such as Microsoft and Tableau have recognized this and are working hard to make sure their products continue to work with Big Data sources: Microsoft has built an Excel add-in that lets Excel work with Hive data, and Tableau has built connectivity to Hadoop distributions like MapR and DataStax directly into its product.

In fact, Hadoop distributors such as DataStax, Hortonworks, and MapR have gone to great lengths to ship good-quality ODBC drivers, so that any traditional BI or reporting tool can continue to analyze the data in your Hadoop cluster effectively and efficiently.  Open data access standards like ODBC are key to keeping traditional tools a big part of Big Data.
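To make the ODBC point concrete, here is a minimal sketch of what that access path looks like from any SQL-speaking client, written in Python with pyodbc.  The DSN name (HiveDSN) and the web_logs table are hypothetical stand-ins; in practice you would configure the DSN in your ODBC administrator to point at a Hive ODBC driver and your cluster.

import pyodbc

# Connect through the Hive ODBC driver via a pre-configured DSN.
# Hive does not support transactions, so autocommit is enabled.
conn = pyodbc.connect("DSN=HiveDSN", autocommit=True)
cursor = conn.cursor()

# Plain SQL goes in; the driver handles the translation to Hive.
# This is exactly what lets Excel or Tableau issue ordinary queries
# against a Hadoop cluster without knowing anything Hadoop-specific.
cursor.execute("SELECT page, COUNT(*) AS hits FROM web_logs GROUP BY page")
for page, hits in cursor.fetchall():
    print(page, hits)

cursor.close()
conn.close()

The same pattern applies to any ODBC-capable tool: point it at the DSN, send SQL, get rows back.  That is why a good ODBC driver matters more to most BI users than the particulars of the Big Data store behind it.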

You can read Stephen's article here: http://tdwi.org/Articles/2012/09/11/Big-Data-Tools.aspx

You can read about Simba's Hive ODBC Driver with SQL Connector here: http://www.simba.com/Apache-Hadoop-Hive-ODBC-Driver-SQL-Connector.htm