Pentaho Kettle now (freely!) available for your Cassandra/Hadoop applications
I’ve had this conversation many times. Many, many times. At conferences, user group meetings, and so on, people come up to me who are sold on using Cassandra and/or Hadoop, but they want an easy way to move, and perhaps transform, data from their legacy systems (e.g. Oracle, SQL Server) into Cassandra and Hadoop. Up until now, I haven’t had a really good answer for them beyond some manual and possibly time-consuming development work.
Now I’ve got a much better response.
Today, we announced our partnership with Pentaho, which provides you with a very powerful and easy-to-use solution for moving data into and out of Cassandra, and also manipulating data in Cassandra itself. And part of the great news is that it’s completely free for you to use.
I’ve used Pentaho in the past and have always appreciated Kettle’s feature set and the engine itself, which typically moves fairly large amounts of data very quickly. I like the intuitive, drag-and-drop interface and the powerful transformation capabilities it provides. Pentaho has made sure that all of that functionality now supports Cassandra.
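For those who like to see things in code, here is a minimal sketch of how a transformation designed in Spoon, for example one with a Table Input step reading rows from Oracle and a Cassandra output step writing them out, can be run programmatically with Kettle’s Java API. The file name and the transformation’s contents are placeholders of my own, not an official example:

```java
import org.pentaho.di.core.KettleEnvironment;
import org.pentaho.di.trans.Trans;
import org.pentaho.di.trans.TransMeta;

public class RunOracleToCassandra {
    public static void main(String[] args) throws Exception {
        // Initialize the Kettle runtime (loads core and plugin steps,
        // including the Cassandra steps).
        KettleEnvironment.init();

        // Load a transformation built in Spoon. The .ktr file here is a
        // placeholder; it would contain a Table Input step reading from
        // Oracle and a Cassandra output step writing the rows out.
        TransMeta transMeta = new TransMeta("oracle_to_cassandra.ktr");

        // Execute the transformation and wait for it to finish.
        Trans trans = new Trans(transMeta);
        trans.execute(null); // no extra command-line arguments
        trans.waitUntilFinished();

        if (trans.getErrors() > 0) {
            throw new RuntimeException("Transformation finished with errors");
        }
    }
}
```

The same transformation can also be launched from the command line with the pan script that ships with Kettle (for example, pan.sh -file=oracle_to_cassandra.ktr), which makes it easy to schedule the load as a regular job.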
Pentaho’s offering works with community Cassandra, which is what we bundle into our DataStax Community Edition. It also works with DataStax Enterprise Edition, connecting to both the Cassandra and Hadoop distributions that make up the DataStax Enterprise big data platform.
You can download Kettle and find more information on using it with Cassandra and Hadoop on both our site and the Pentaho website. Also, look for upcoming tutorials and web presentations from us that will show you how best to use Kettle with Cassandra and Hadoop.