Pentaho Data Integration - Java API Examples
In General
- The PDI SDK can be found in "Embedding and Extending Pentaho Data Integration" within the Developer Guides.
- For questions or discussions about this, please use the forum or check the developer mailing list.
- Get the source code here.
- Please read the Development Guidelines.
Pentaho Kettle Component
This component is a good example of how to call a transformation or a job from another Java program.
It can be found in the package org.pentaho.plugin.kettle in the Pentaho-BI-Server project.
Get the source code here.
Program your own Kettle transformation
Recommendation for upward compatibility: If you want to create your own transformation dynamically (e.g. from meta-data), generate an XML file (KTR) instead of building it through the API. The XML file format has remained compatible since the first version of Kettle. The same applies to jobs.
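In that spirit, here is a minimal sketch of writing generated transformation XML to a .ktr file using plain java.io, the same way the TransBuilder example below saves its output. The XML string in main() is only a placeholder; in real code it comes from TransMeta.getXML().

```java
import java.io.DataOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

public class KtrWriter
{
    /** Writes transformation XML to a .ktr file as UTF-8, as the TransBuilder example does. */
    public static void writeKtr(String xml, String fileName) throws IOException
    {
        DataOutputStream dos = new DataOutputStream(new FileOutputStream(new File(fileName)));
        try
        {
            dos.write(xml.getBytes("UTF-8"));
        }
        finally
        {
            dos.close();
        }
    }

    public static void main(String[] args) throws IOException
    {
        // Placeholder XML; a real KTR document is produced by TransMeta.getXML().
        writeKtr("<transformation/>", "example.ktr");
        System.out.println("Wrote example.ktr");
    }
}
```

Because the file is just XML on disk, it can be versioned, inspected, and opened in Spoon like any hand-made transformation.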
The example described below performs the following actions:
- Create a new transformation
- Save the transformation as an XML file
- Generate the SQL for the target table
- Execute the transformation
- Drop the target table to make this program repeatable
The complete source code for the example is included in the distribution zip file (Kettle version 2.1.3 or higher), which you can find in the downloads section. After unzipping this file, you can find the source code in the "TransBuilder.java" file in the "extra" directory.
The Javadoc for the Kettle Java API can be generated with the Javadoc tool, e.g. in Eclipse: Project / Generate Javadoc, or by exporting to Javadoc. When the wizard opens, set the location of the Javadoc command by entering its absolute path (e.g. C:\Program Files\Java\jdk1.5.0_12\bin\javadoc.exe).
// Generate the transformation.
TransMeta transMeta = TransBuilder.buildCopyTable(
        transformationName,
        sourceDatabaseName, sourceTableName, sourceFields,
        targetDatabaseName, targetTableName, targetFields);

// Save it as a file:
String xml = transMeta.getXML();
DataOutputStream dos = new DataOutputStream(new FileOutputStream(new File(fileName)));
dos.write(xml.getBytes("UTF-8"));
dos.close();
System.out.println("Saved transformation to file: " + fileName);

// OK, what's the SQL we need to execute to generate the target table?
String sql = transMeta.getSQLStatementsString();

// Execute the SQL on the target table:
Database targetDatabase = new Database(transMeta.findDatabase(targetDatabaseName));
targetDatabase.connect();
targetDatabase.execStatements(sql);

// Now execute the transformation...
Trans trans = new Trans(transMeta);
trans.execute(null);
trans.waitUntilFinished();

// For testing/repeatability, we drop the target table again
targetDatabase.execStatement("drop table " + targetTableName);
targetDatabase.disconnect();
Below is the source code for the method that creates the transformation:
/**
 * Creates a new Transformation using input parameters such as the tablename to read from.
 *
 * @param transformationName The name of the transformation
 * @param sourceDatabaseName The name of the database to read from
 * @param sourceTableName The name of the table to read from
 * @param sourceFields The field names we want to read from the source table
 * @param targetDatabaseName The name of the target database
 * @param targetTableName The name of the target table we want to write to
 * @param targetFields The names of the fields in the target table (same number of fields as sourceFields)
 * @return A new transformation metadata object
 * @throws KettleException In the rare case something goes wrong
 */
public static final TransMeta buildCopyTable(String transformationName,
        String sourceDatabaseName, String sourceTableName, String[] sourceFields,
        String targetDatabaseName, String targetTableName, String[] targetFields)
        throws KettleException
{
    EnvUtil.environmentInit();
    try
    {
        //
        // Create a new transformation...
        //
        TransMeta transMeta = new TransMeta();
        transMeta.setName(transformationName);

        // Add the database connections.
        // ('databasesXML' is a field of the TransBuilder class holding the connection definitions.)
        for (int i = 0; i < databasesXML.length; i++)
        {
            DatabaseMeta databaseMeta = new DatabaseMeta(databasesXML[i]);
            transMeta.addDatabase(databaseMeta);
        }

        DatabaseMeta sourceDBInfo = transMeta.findDatabase(sourceDatabaseName);
        DatabaseMeta targetDBInfo = transMeta.findDatabase(targetDatabaseName);

        //
        // Add a note
        //
        String note = "Reads information from table [" + sourceTableName + "] on database [" + sourceDBInfo + "]" + Const.CR;
        note += "After that, it writes the information to table [" + targetTableName + "] on database [" + targetDBInfo + "]";
        NotePadMeta ni = new NotePadMeta(note, 150, 10, -1, -1);
        transMeta.addNote(ni);

        //
        // Create the source step...
        //
        String fromstepname = "read from [" + sourceTableName + "]";
        TableInputMeta tii = new TableInputMeta();
        tii.setDatabaseMeta(sourceDBInfo);
        String selectSQL = "SELECT " + Const.CR;
        for (int i = 0; i < sourceFields.length; i++)
        {
            if (i > 0) selectSQL += ", "; else selectSQL += " ";
            selectSQL += sourceFields[i] + Const.CR;
        }
        selectSQL += "FROM " + sourceTableName;
        tii.setSQL(selectSQL);

        // ('log' is a LogWriter field of the TransBuilder class.)
        StepLoader steploader = StepLoader.getInstance();
        String fromstepid = steploader.getStepPluginID(tii);
        StepMeta fromstep = new StepMeta(log, fromstepid, fromstepname, (StepMetaInterface) tii);
        fromstep.setLocation(150, 100);
        fromstep.setDraw(true);
        fromstep.setDescription("Reads information from table [" + sourceTableName + "] on database [" + sourceDBInfo + "]");
        transMeta.addStep(fromstep);

        //
        // Add logic to rename fields:
        // use metadata logic in SelectValues, use SelectValueInfo...
        //
        SelectValuesMeta svi = new SelectValuesMeta();
        svi.allocate(0, 0, sourceFields.length);
        for (int i = 0; i < sourceFields.length; i++)
        {
            svi.getMetaName()[i] = sourceFields[i];
            svi.getMetaRename()[i] = targetFields[i];
        }
        String selstepname = "Rename field names";
        String selstepid = steploader.getStepPluginID(svi);
        StepMeta selstep = new StepMeta(log, selstepid, selstepname, (StepMetaInterface) svi);
        selstep.setLocation(350, 100);
        selstep.setDraw(true);
        selstep.setDescription("Rename field names");
        transMeta.addStep(selstep);

        TransHopMeta shi = new TransHopMeta(fromstep, selstep);
        transMeta.addTransHop(shi);
        fromstep = selstep;

        //
        // Create the target step: add the TableOutputMeta step...
        //
        String tostepname = "write to [" + targetTableName + "]";
        TableOutputMeta toi = new TableOutputMeta();
        toi.setDatabase(targetDBInfo);
        toi.setTablename(targetTableName);
        toi.setCommitSize(200);
        toi.setTruncateTable(true);
        String tostepid = steploader.getStepPluginID(toi);
        StepMeta tostep = new StepMeta(log, tostepid, tostepname, (StepMetaInterface) toi);
        tostep.setLocation(550, 100);
        tostep.setDraw(true);
        tostep.setDescription("Write information to table [" + targetTableName + "] on database [" + targetDBInfo + "]");
        transMeta.addStep(tostep);

        //
        // Add a hop between the two steps...
        //
        TransHopMeta hi = new TransHopMeta(fromstep, tostep);
        transMeta.addTransHop(hi);

        // The transformation is complete, return it...
        return transMeta;
    }
    catch (Exception e)
    {
        throw new KettleException("An unexpected error occurred creating the new transformation", e);
    }
}
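The SELECT statement construction in the source step above can be exercised in isolation. A minimal sketch of the same loop in plain Java (the class and method names are illustrative, and "\n" stands in for Const.CR):

```java
public class SelectSqlSketch
{
    // Stand-in for Const.CR from the Kettle API.
    static final String CR = "\n";

    /** Builds the SELECT statement the same way buildCopyTable does for the TableInput step. */
    static String buildSelect(String table, String[] fields)
    {
        StringBuilder sql = new StringBuilder("SELECT " + CR);
        for (int i = 0; i < fields.length; i++)
        {
            // Prefix every field after the first with a comma, as in the original loop.
            sql.append(i > 0 ? ", " : " ");
            sql.append(fields[i]).append(CR);
        }
        sql.append("FROM ").append(table);
        return sql.toString();
    }

    public static void main(String[] args)
    {
        System.out.println(buildSelect("customers", new String[] { "id", "name" }));
    }
}
```

Putting each field on its own line keeps the generated SQL readable when it later appears in the step's dialog in Spoon.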
Running an existing transformation
If you have already created a transformation and saved it in a .ktr or .xml file, you can use code like the following to run it.
public static void runTransformation(String filename)
{
    try
    {
        StepLoader.init();
        EnvUtil.environmentInit();
        TransMeta transMeta = new TransMeta(filename);
        Trans trans = new Trans(transMeta);
        trans.execute(null); // You can pass arguments instead of null.
        trans.waitUntilFinished();
        if (trans.getErrors() > 0)
        {
            throw new RuntimeException("There were errors during transformation execution.");
        }
    }
    catch (KettleException e)
    {
        // TODO Put your exception-handling code here.
        System.out.println(e);
    }
}