PDI Job Entry Plugin Development
Owned by Former user (Deleted)
Last updated: Mar 16, 2011
TODO