Work in Progress
Limitations
For known limitations, see the Limitations section on the JDBC and SQL Reference page.

Roadmap
The following items are next on the agenda:
- Caching of service data in memory with a validity time-out
- Caching of query results in memory with a validity time-out
- Writing to a service transformation (INSERT INTO)
- ...
Frequently Asked Questions
Are there any limitations on the types and number of input connections that can supply data via the thin Kettle JDBC driver?
- No. Any PDI input step can be used, and there is no restriction on the number of inputs that feed data into the service transformation.
- You can also use SQL to invoke an orchestration (see the sketch below).
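For example, any JDBC client can connect to the driver and query a data service like an ordinary table. The sketch below is illustrative only: the driver class name varies by PDI version, and the URL, port, credentials, and the service name my_service are placeholders; check your installation and the JDBC and SQL Reference page for the exact values.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ThinDriverExample {
    public static void main(String[] args) throws Exception {
        // Assumed driver class; older builds ship
        // org.pentaho.di.core.jdbc.ThinDriver instead.
        Class.forName("org.pentaho.di.trans.dataservice.jdbc.ThinDriver");

        // Placeholder host, port, and credentials for a Carte/DI server.
        String url = "jdbc:pdi://localhost:8080/kettle";
        try (Connection conn = DriverManager.getConnection(url, "cluster", "cluster");
             Statement stmt = conn.createStatement();
             // "my_service" is a placeholder for the name of the data service
             // exposed by your service transformation.
             ResultSet rs = stmt.executeQuery("SELECT * FROM my_service")) {
            while (rs.next()) {
                System.out.println(rs.getString(1)); // print the first column of each row
            }
        }
    }
}
```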
Are there any limitations in terms of data volumes or result set sizes?
- No, there are no fixed limits; however, the following characteristics must be considered:
- All data is streamed through the transformation and the JDBC driver, so clients should consume results row by row (see the sketch after this list).
- A transformation is executed for each individual JDBC connection; there is no cache sharing across connections.
- There are limitations for specific SQL queries; see the Limitations section on the JDBC and SQL Reference page.
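Because rows are streamed, a client can process results of arbitrary size without holding them in memory, provided it iterates the ResultSet as rows arrive. A minimal sketch, using the same placeholder URL, credentials, and service name as above; note that setFetchSize is only a hint, which a driver may ignore:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class StreamingRead {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:pdi://localhost:8080/kettle"; // placeholder host/port
        try (Connection conn = DriverManager.getConnection(url, "cluster", "cluster");
             Statement stmt = conn.createStatement()) {
            stmt.setFetchSize(1000); // streaming hint only; drivers may ignore it
            try (ResultSet rs = stmt.executeQuery("SELECT * FROM my_service")) {
                long rows = 0;
                while (rs.next()) {
                    rows++; // process each row as it arrives instead of
                            // collecting the whole result set in memory
                }
                System.out.println("Rows streamed: " + rows);
            }
        }
    }
}
```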
Do all Pentaho products and modules work with the thin Kettle JDBC driver? For example: Can I create a Mondrian cube definition against it or can I create a Pentaho Metadata view?
- Yes, all Pentaho tools and products are certified to work with the thin Kettle JDBC driver, subject to the single-table limitation.
- Pentaho C*Tools applications are also supported when they have been developed by Pentaho Professional Services.
Will Pentaho support using the thin Kettle JDBC driver with third-party tools?
- Pentaho supports the JDBC API and explicitly documents the SQL dialect elements that are supported.
- Third-party tools may work with PDI-DS, but there are no immediate plans to certify them.
- We already support a broad range of SQL, but we are not SQL-92 compliant.
- The thin Kettle JDBC driver covers the common types of queries that Pentaho tools and products generate (illustrated in the sketch below).
- For further details, please see the JDBC and SQL Reference page.
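To make the dialect coverage concrete, the shapes below are typical of what Pentaho tools generate: projection, filtering, aggregation, and ordering against a single data service table. The service name my_service and its columns are placeholders, and the authoritative grammar is on the JDBC and SQL Reference page.

```java
/**
 * Illustrative query shapes the thin driver typically covers.
 * "my_service" is a placeholder data service name. Note the single-table
 * scope mentioned above: each query reads from one data service, no joins.
 */
public class QueryShapes {
    static final String PROJECTION =
        "SELECT customer, sales FROM my_service";
    static final String FILTER =
        "SELECT * FROM my_service WHERE country = 'US' AND sales > 1000";
    static final String AGGREGATION =
        "SELECT country, SUM(sales) AS total_sales FROM my_service " +
        "GROUP BY country HAVING SUM(sales) > 10000 ORDER BY total_sales DESC";

    public static void main(String[] args) {
        System.out.println(PROJECTION);
        System.out.println(FILTER);
        System.out.println(AGGREGATION);
    }
}
```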