Configure Pentaho for Cloudera CDH4
Outdated Material
This page documents an outdated Kettle version (4.3) and has been archived. If you found it via search, make sure it is what you want.
Client Configuration
These instructions are for the Cloudera CDH4 MRv1 distribution. For earlier versions, see Configure Pentaho for Cloudera and Other Hadoop Versions.
Kettle Client (PDI)
- Download and extract Kettle 4.3.0 CE from the Downloads page.
The Kettle Client comes pre-configured for Apache Hadoop 0.20.2. If you are using that distribution and version, no further configuration is required.
- Configure the PDI Client for CDH4 MRv1 (the full sequence is sketched as a shell script after this list):
- Delete the pentaho-big-data-plugin directory found at $PDI_HOME/plugins.
- Download the CDH4 MRv1 build of the Pentaho Big Data plugin from http://ci.pentaho.com/job/BRANCH_CDH4_pentaho-big-data-plugin/, unzip it, and move the pentaho-big-data-plugin directory to $PDI_HOME/plugins.
- Delete the following from $PDI_HOME/libext/bigdata:
- hadoop-0.20.2-core.jar
- hbase-0.90.3.jar
- zookeeper-3.3.2.jar
- Delete the following from $PDI_HOME/libext/bigdata/hive:
- hive-exec-0.7.0-pentaho-1.0.1.jar
- hive-metastore-0.7.0-pentaho-1.0.1.jar
- hive-service-0.7.0-pentaho-1.0.1.jar
- libfb303.jar
- libthrift.jar
- Delete: $PDI_HOME/libext/google/google-collections-1.0-rc5.jar
- Copy the following jars from the CDH4 MRv1 installation to $PDI_HOME/libext/bigdata:
- avro-1.5.4.jar
- commons-configuration-1.6.jar
- hadoop-auth-2.0.0-cdh4.0.0.jar
- hadoop-common-2.0.0-cdh4.0.0.jar
- hadoop-core-2.0.0-mr1-cdh4.0.0.jar
- hadoop-hdfs-2.0.0-cdh4.0.0.jar
- hbase-0.92.1-cdh4.0.0-security.jar
- protobuf-java-2.4.0a.jar
- zookeeper-3.4.3-cdh4.0.0.jar
- Copy the following jars from the CDH4 MRv1 installation to $PDI_HOME/libext/bigdata/hive:
- hive-builtins-0.8.1-cdh4.0.0.jar
- hive-exec-0.8.1-cdh4.0.0.jar
- hive-metastore-0.8.1-cdh4.0.0.jar
- hive-service-0.8.1-cdh4.0.0.jar
- libfb303-0.7.0.jar
- libthrift-0.7.0.jar
- Move $PDI_HOME/plugins/pentaho-big-data-plugin/lib/guava-11.0.2.jar to the $PDI_HOME/libext/bigdata directory.
- Copy the core-site.xml, hdfs-site.xml, and mapred-site.xml configuration files from your Hadoop cluster to the $PDI_HOME directory.
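Taken together, the steps above amount to the shell sequence below. This is a minimal sketch, not an official script: the PDI_HOME, CDH_JARS, and HADOOP_CONF values are hypothetical placeholders (where the CDH4 jars and cluster config files live varies by how CDH4 was installed), and the plugin zip is assumed to be already downloaded and extracted into the working directory.

```bash
#!/bin/bash
# Sketch only: all three paths below are placeholders -- adjust for your layout.
PDI_HOME=/opt/pentaho/data-integration   # your Kettle 4.3.0 install
CDH_JARS=/usr/lib/hadoop/lib             # wherever the CDH4 MRv1 client jars live
HADOOP_CONF=/etc/hadoop/conf             # your cluster's configuration directory

# Swap the Big Data plugin for the CDH4 MRv1 build (assumes the zip from the
# CI job above is already extracted into the current directory).
rm -r "$PDI_HOME/plugins/pentaho-big-data-plugin"
mv pentaho-big-data-plugin "$PDI_HOME/plugins/"

# Remove the stock Apache Hadoop 0.20.2 jars.
cd "$PDI_HOME/libext/bigdata"
rm hadoop-0.20.2-core.jar hbase-0.90.3.jar zookeeper-3.3.2.jar
(cd hive && rm hive-exec-0.7.0-pentaho-1.0.1.jar \
               hive-metastore-0.7.0-pentaho-1.0.1.jar \
               hive-service-0.7.0-pentaho-1.0.1.jar \
               libfb303.jar libthrift.jar)
rm "$PDI_HOME/libext/google/google-collections-1.0-rc5.jar"

# Copy in the CDH4 MRv1 client jars.
for jar in avro-1.5.4.jar commons-configuration-1.6.jar \
           hadoop-auth-2.0.0-cdh4.0.0.jar hadoop-common-2.0.0-cdh4.0.0.jar \
           hadoop-core-2.0.0-mr1-cdh4.0.0.jar hadoop-hdfs-2.0.0-cdh4.0.0.jar \
           hbase-0.92.1-cdh4.0.0-security.jar protobuf-java-2.4.0a.jar \
           zookeeper-3.4.3-cdh4.0.0.jar; do
  cp "$CDH_JARS/$jar" .
done
for jar in hive-builtins-0.8.1-cdh4.0.0.jar hive-exec-0.8.1-cdh4.0.0.jar \
           hive-metastore-0.8.1-cdh4.0.0.jar hive-service-0.8.1-cdh4.0.0.jar \
           libfb303-0.7.0.jar libthrift-0.7.0.jar; do
  cp "$CDH_JARS/$jar" hive/
done

# Relocate guava out of the plugin and pull the cluster config files.
mv "$PDI_HOME/plugins/pentaho-big-data-plugin/lib/guava-11.0.2.jar" .
cp "$HADOOP_CONF"/core-site.xml "$HADOOP_CONF"/hdfs-site.xml \
   "$HADOOP_CONF"/mapred-site.xml "$PDI_HOME/"
```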
Pentaho Report Designer (PRD)
- Download and extract PRD from the Downloads page.
PRD comes pre-configured for Apache Hadoop 0.20.2. If you are using that distribution and version, no further configuration is required.
- Configure PRD for CDH4 MRv1 (the full sequence is sketched as a shell script after this list):
- Delete the pentaho-big-data-plugin directory found at $PRD_HOME/plugins.
- Download the CDH4 MRv1 build of the Pentaho Big Data plugin from http://ci.pentaho.com/job/BRANCH_CDH4_pentaho-big-data-plugin/, unzip it, and move the pentaho-big-data-plugin directory to $PRD_HOME/plugins.
- Delete the following from $PRD_HOME/lib/bigdata:
- hadoop-0.20.2-core.jar
- hbase-0.90.3.jar
- zookeeper-3.3.2.jar
- Delete the following from $PRD_HOME/lib/jdbc:
- hive-exec-0.7.0-pentaho-1.0.1.jar
- hive-metastore-0.7.0-pentaho-1.0.1.jar
- hive-service-0.7.0-pentaho-1.0.1.jar
- libfb303-0.5.0.jar
- libthrift-0.5.0.jar
- Copy the following jars from the CDH4 MRv1 installation to $PRD_HOME/lib/bigdata:
- avro-1.5.4.jar
- commons-configuration-1.6.jar
- hadoop-auth-2.0.0-cdh4.0.0.jar
- hadoop-common-2.0.0-cdh4.0.0.jar
- hadoop-core-2.0.0-mr1-cdh4.0.0.jar
- hadoop-hdfs-2.0.0-cdh4.0.0.jar
- hbase-0.92.1-cdh4.0.0-security.jar
- protobuf-java-2.4.0a.jar
- zookeeper-3.4.3-cdh4.0.0.jar
- Copy the following jars from the CDH4 MRv1 installation to $PRD_HOME/lib/jdbc:
- hive-builtins-0.8.1-cdh4.0.0.jar
- hive-exec-0.8.1-cdh4.0.0.jar
- hive-metastore-0.8.1-cdh4.0.0.jar
- hive-service-0.8.1-cdh4.0.0.jar
- libfb303-0.7.0.jar
- libthrift-0.7.0.jar
- Move $PRD_HOME/plugins/pentaho-big-data-plugin/lib/guava-11.0.2.jar to the $PRD_HOME/lib/bigdata directory.
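As with PDI, the PRD steps reduce to a short shell sequence. Again a sketch with hypothetical placeholder paths (PRD_HOME, CDH_JARS); note the jars land under lib/bigdata and lib/jdbc instead of libext, and the PRD list above has no config-file step, so none appears here.

```bash
#!/bin/bash
# Sketch only: placeholder paths -- adjust for your environment.
PRD_HOME=/opt/pentaho/report-designer    # your PRD install
CDH_JARS=/usr/lib/hadoop/lib             # wherever the CDH4 MRv1 client jars live

# Swap in the CDH4 MRv1 build of the Big Data plugin (zip already extracted here).
rm -r "$PRD_HOME/plugins/pentaho-big-data-plugin"
mv pentaho-big-data-plugin "$PRD_HOME/plugins/"

# Remove the stock Apache Hadoop 0.20.2 jars.
cd "$PRD_HOME/lib"
rm bigdata/hadoop-0.20.2-core.jar bigdata/hbase-0.90.3.jar \
   bigdata/zookeeper-3.3.2.jar
rm jdbc/hive-exec-0.7.0-pentaho-1.0.1.jar \
   jdbc/hive-metastore-0.7.0-pentaho-1.0.1.jar \
   jdbc/hive-service-0.7.0-pentaho-1.0.1.jar \
   jdbc/libfb303-0.5.0.jar jdbc/libthrift-0.5.0.jar

# Copy in the CDH4 MRv1 client jars.
for jar in avro-1.5.4.jar commons-configuration-1.6.jar \
           hadoop-auth-2.0.0-cdh4.0.0.jar hadoop-common-2.0.0-cdh4.0.0.jar \
           hadoop-core-2.0.0-mr1-cdh4.0.0.jar hadoop-hdfs-2.0.0-cdh4.0.0.jar \
           hbase-0.92.1-cdh4.0.0-security.jar protobuf-java-2.4.0a.jar \
           zookeeper-3.4.3-cdh4.0.0.jar; do
  cp "$CDH_JARS/$jar" bigdata/
done
for jar in hive-builtins-0.8.1-cdh4.0.0.jar hive-exec-0.8.1-cdh4.0.0.jar \
           hive-metastore-0.8.1-cdh4.0.0.jar hive-service-0.8.1-cdh4.0.0.jar \
           libfb303-0.7.0.jar libthrift-0.7.0.jar; do
  cp "$CDH_JARS/$jar" jdbc/
done

# Relocate guava out of the plugin.
mv "$PRD_HOME/plugins/pentaho-big-data-plugin/lib/guava-11.0.2.jar" bigdata/
```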
Pentaho Business Intelligence Server (BI Server)
- Download and extract BI Server from the Downloads page.
The BI Server comes pre-configured for Apache Hadoop 0.20.2. If you are using that distribution and version, no further configuration is required.
- Configure the BI Server for CDH4 MRv1 (the full sequence is sketched as a shell script after this list):
- Delete the pentaho-big-data-plugin directory found at $BI_SERVER_HOME/pentaho-solutions/system/kettle/plugins.
- Download the CDH4 MRv1 build of the Pentaho Big Data plugin from http://ci.pentaho.com/job/BRANCH_CDH4_pentaho-big-data-plugin/, unzip it, and move the pentaho-big-data-plugin directory to $BI_SERVER_HOME/pentaho-solutions/system/kettle/plugins.
- Delete the following from $BI_SERVER_HOME/tomcat/webapps/pentaho/WEB-INF/lib:
- hadoop-0.20.2-core.jar
- hbase-0.90.3.jar
- hive-exec-0.7.0-pentaho-1.0.1.jar
- hive-metastore-0.7.0-pentaho-1.0.1.jar
- hive-service-0.7.0-pentaho-1.0.1.jar
- libfb303-0.5.0.jar
- libthrift-0.5.0.jar
- zookeeper-3.3.2.jar
- Copy the following jars from the CDH4 MRv1 installation to $BI_SERVER_HOME/tomcat/webapps/pentaho/WEB-INF/lib:
- avro-1.5.4.jar
- commons-configuration-1.6.jar
- hadoop-auth-2.0.0-cdh4.0.0.jar
- hadoop-common-2.0.0-cdh4.0.0.jar
- hadoop-core-2.0.0-mr1-cdh4.0.0.jar
- hadoop-hdfs-2.0.0-cdh4.0.0.jar
- hbase-0.92.1-cdh4.0.0-security.jar
- hive-builtins-0.8.1-cdh4.0.0.jar
- hive-exec-0.8.1-cdh4.0.0.jar
- hive-metastore-0.8.1-cdh4.0.0.jar
- hive-service-0.8.1-cdh4.0.0.jar
- libfb303-0.7.0.jar
- libthrift-0.7.0.jar
- protobuf-java-2.4.0a.jar
- zookeeper-3.4.3-cdh4.0.0.jar
- Move $BI_SERVER_HOME/pentaho-solutions/system/kettle/plugins/pentaho-big-data-plugin/lib/guava-11.0.2.jar to the $BI_SERVER_HOME/tomcat/webapps/pentaho/WEB-INF/lib directory.
- Place the Hadoop configuration files (core-site.xml, hdfs-site.xml, mapred-site.xml) from your Hadoop cluster into $BI_SERVER_HOME/tomcat/webapps/pentaho/WEB-INF/classes.
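The BI Server variant differs only in where the jars live (the Tomcat webapp's WEB-INF/lib) and where the cluster config files go (WEB-INF/classes). Once more a sketch with hypothetical placeholder paths (BI_SERVER_HOME, CDH_JARS, HADOOP_CONF):

```bash
#!/bin/bash
# Sketch only: placeholder paths -- adjust for your environment.
BI_SERVER_HOME=/opt/pentaho/biserver-ce
CDH_JARS=/usr/lib/hadoop/lib             # wherever the CDH4 MRv1 client jars live
HADOOP_CONF=/etc/hadoop/conf             # your cluster's configuration directory
PLUGINS="$BI_SERVER_HOME/pentaho-solutions/system/kettle/plugins"
WEBLIB="$BI_SERVER_HOME/tomcat/webapps/pentaho/WEB-INF/lib"

# Swap in the CDH4 MRv1 build of the Big Data plugin (zip already extracted here).
rm -r "$PLUGINS/pentaho-big-data-plugin"
mv pentaho-big-data-plugin "$PLUGINS/"

# Remove the stock Apache Hadoop 0.20.2 jars from the webapp.
cd "$WEBLIB"
rm hadoop-0.20.2-core.jar hbase-0.90.3.jar \
   hive-exec-0.7.0-pentaho-1.0.1.jar hive-metastore-0.7.0-pentaho-1.0.1.jar \
   hive-service-0.7.0-pentaho-1.0.1.jar libfb303-0.5.0.jar \
   libthrift-0.5.0.jar zookeeper-3.3.2.jar

# Copy in the CDH4 MRv1 client jars.
for jar in avro-1.5.4.jar commons-configuration-1.6.jar \
           hadoop-auth-2.0.0-cdh4.0.0.jar hadoop-common-2.0.0-cdh4.0.0.jar \
           hadoop-core-2.0.0-mr1-cdh4.0.0.jar hadoop-hdfs-2.0.0-cdh4.0.0.jar \
           hbase-0.92.1-cdh4.0.0-security.jar hive-builtins-0.8.1-cdh4.0.0.jar \
           hive-exec-0.8.1-cdh4.0.0.jar hive-metastore-0.8.1-cdh4.0.0.jar \
           hive-service-0.8.1-cdh4.0.0.jar libfb303-0.7.0.jar \
           libthrift-0.7.0.jar protobuf-java-2.4.0a.jar \
           zookeeper-3.4.3-cdh4.0.0.jar; do
  cp "$CDH_JARS/$jar" .
done

# Relocate guava and drop the cluster config files onto the webapp classpath.
mv "$PLUGINS/pentaho-big-data-plugin/lib/guava-11.0.2.jar" .
cp "$HADOOP_CONF"/core-site.xml "$HADOOP_CONF"/hdfs-site.xml \
   "$HADOOP_CONF"/mapred-site.xml \
   "$BI_SERVER_HOME/tomcat/webapps/pentaho/WEB-INF/classes/"
```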