Pentaho Data Integration
PDI Developer information
Pentaho Data Integration (Kettle) Tutorial
Pentaho Data Integration 3.0 migration guide
• Pentaho Data Integration Case Studies
• Pentaho Data Integration - Java API Examples
Pentaho Data Integration Job Entries
• Decrypt files with PGP
• Encrypt files with PGP
• Abort Job
• Add filenames to result
• Amazon EMR Job Executor
• Amazon Hive Job Executor
• Build Model (Job Entry)
• BulkLoad from Mysql to file
• BulkLoad into MSSQL
• Bulkload into MySQL
• Check Db connections
• Check Files Locked
• Check if a folder is empty
• Check if connected to Repository
• Check if XML file is well formed
• Check webservice availability
• Checks if files exist
• Columns exist in a table
• Compare folders
• Convert file between Windows and Unix
• Copy Files
• Create a file
• Create a folder
• Delete a file
• Delete filenames from result
• Delete Files
• Delete folders
• Display Msgbox info
• DTD Validator (Job Entry)
• Dummy Job Entry
• Evaluate files metrics
• Example Plugin
• Export repository to XML file
• File compare
• File Exists (Job Entry)
• FTP Delete
• Get a file with FTP
• Get a file with FTPS
• Get a file with SFTP
• Get Mails from POP
• Hadoop Copy Files
• Hadoop Job Executor
• HL7 MLLP Acknowledge
• HL7 MLLP Input
• HTTP
• JavaScript (job entry)
• Job (Job Entry)
• Mail
• Move files
• MS Access Bulk Load (Deprecated)
• Oozie Job Executor
• Palo Cube Create (Deprecated)
• Palo Cube Delete (Deprecated)
• Pentaho MapReduce
• Ping a host
• Process result filenames
• Publish Model (Job Entry)
• Put a file with FTP
• Put a file with SFTP
• Send information using Syslog
• Send Nagios passive check
• Send SNMP trap
• Set variables (job entry)
• Shell
• SQL
• Sqoop Export
• Sqoop Import
• SSH2 Get
• SSH2 Put
• Start
• Start a PDI Cluster on YARN
• Stop a PDI Cluster on YARN
• Success
• Table Exists (Job Entry)
• Talend Job Execution (Deprecated)
• Telnet
• Transformation (job entry)
• Truncate tables
• Unzip file
• Upload files to FTPS
• Verify file signature with PGP
• Wait for
• Wait for a file
• Wait for SQL
• Write to File
• Write to log
• XSD Validator (Job Entry)
• XSL Transformation (Job Entry)
• Zip file
Pentaho Data Integration Screenshots
Pentaho Data Integration Recorded Demos
Pentaho Data Integration v3.2 Job Entries
Slave servers and clustering
Special database issues and experiences
Spoon User Guide
• Step performance monitoring
• What's new in PDI version 3.1
• What's new in PDI version 3.2
Special Operating System issues and experiences
Writing your own Pentaho Data Integration Plug-In
Documenting Pentaho Data Integration (Kettle) Projects
• Kettle dependency management
Kettle Exchange
• Monitoring SWT Graphics Resources with Sleak
Data Quality Integration Home
• Partitioning data with PDI
• Import User Documentation
• Configuring log tables for concurrent access
Pentaho Data Integration (aka Kettle) Concepts, Best Practices and Solutions
• Pig Script Executor
• Marketplace
The Thin Kettle JDBC driver
• Database transactions in jobs and transformations
• Job checkpoints and restartability
• Carte Configuration
• Column Format
• MongoDB Output IC
• NuoDB
• Documentation Template for Steps and Job Entries
• MongoDB Input IC
• Services_Yarn_Documentation
Alfresco Output Plugin for Kettle
Pentaho Data Integration Steps
• What's new in PDI 4.0