Details
Assignee
Reporter
Labels
Edit Screen Preamble
<div class="notify info" style="margin-bottom: 10px;">
If you are a Pentaho customer, please use the <a href="http://support.pentaho.com">Customer Support portal</a> to log issues.
<p />
This system is used for logging bugs and enhancement requests only. Please use our <a href="https://community.pentaho.com">community at https://community.pentaho.com</a> if you have questions, configuration issues, or an issue with a Marketplace plugin, as Pentaho does not support Marketplace plugins unless they are written by Pentaho.
<p />
Lastly, when creating a bug, please provide as much detail as possible. To prevent unnecessary delays in reviewing your issue, please attach complete server logs, SQL/MDX logs where applicable, schemas, etc. Screenshots and screen recordings are also especially helpful in demonstrating the issue.
<p />
Thank you so much,<br />
The Pentaho Team
</div>
PDI Sub-component
Notice
<div class="notify info" style="margin-bottom: 10px;">
When an issue is open, the "Fix Version/s" field conveys a target, not necessarily a commitment. When an issue is closed, the "Fix Version/s" field conveys the version that the issue was fixed in.
</div>
In 7.1 and prior, customers were able to parameterize the AWS Access Key ID and Secret Access Key as environment variables in kettle.properties.
In the attached job you can see we are using a Set Variables step to define a local kettle.properties file.
This file contains the following two variables:
AWS_ACCESS_KEY_ID=111111111111111111111111
AWS_SECRET_ACCESS_KEY=xcxccxcxcxcxcxcxcxcxcxcxcxc
In the transformation the access key and secret access key are then replaced with the following:
<access_key>${AWS_ACCESS_KEY_ID}</access_key>
<secret_key>${AWS_SECRET_ACCESS_KEY}</secret_key>
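To illustrate the behavior being relied on here, below is a minimal sketch of the kind of ${VAR} substitution PDI performs when resolving these references against kettle.properties entries. This is NOT PDI's actual implementation (PDI has its own variable-resolution code); the class and method names are invented for illustration only.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch of ${VAR} resolution against kettle.properties-style
// entries. Not PDI's real code; names are illustrative only.
public class VarSubstitution {
    private static final Pattern VAR = Pattern.compile("\\$\\{([A-Za-z0-9_]+)\\}");

    public static String substitute(String text, Map<String, String> props) {
        Matcher m = VAR.matcher(text);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            // If the variable is undefined, leave the ${VAR} reference as-is,
            // which is how an unresolved variable then surfaces in the URI.
            String value = props.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(value));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> props = Map.of(
            "AWS_ACCESS_KEY_ID", "111111111111111111111111",
            "AWS_SECRET_ACCESS_KEY", "xcxccxcxcxcxcxcxcxcxcxcxcxc");
        System.out.println(substitute(
            "<access_key>${AWS_ACCESS_KEY_ID}</access_key>", props));
    }
}
```

The point of the sketch: in 7.1 the two ${...} references above resolve to the values defined in kettle.properties before the S3 connection is opened; in 8.1 that resolution does not appear to happen for this step.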
As you can see, executing this in 7.1 works (see screenshot), and the file is loaded to the S3 bucket.
Attempting to run the same job and transformation on 8.1, these variables are not read, resulting in a failed job with the following stack trace:
2018/08/29 12:07:34 - S3 File Output.0 - Released server socket on port 0
2018/08/29 12:07:35 - S3 File Output.0 - We can not find parent folder [s3://s3/carlos-sample]!
2018/08/29 12:07:35 - S3 File Output.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : Couldn't open file s3://s3/carlos-sample/SamplewithVariables.txt
2018/08/29 12:07:35 - S3 File Output.0 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : org.pentaho.di.core.exception.KettleException:
2018/08/29 12:07:35 - S3 File Output.0 - Error opening new file : org.apache.commons.vfs2.FileSystemException: Could not create folder "s3://s3/carlos-sample".
2018/08/29 12:07:35 - S3 File Output.0 -
2018/08/29 12:07:35 - S3 File Output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.initFileStreamWriter(TextFileOutput.java:235)
2018/08/29 12:07:35 - S3 File Output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.initOutput(TextFileOutput.java:864)
2018/08/29 12:07:35 - S3 File Output.0 - at org.pentaho.di.trans.steps.textfileoutput.TextFileOutput.init(TextFileOutput.java:835)
2018/08/29 12:07:35 - S3 File Output.0 - at org.pentaho.amazon.s3.S3FileOutput.init(S3FileOutput.java:56)
2018/08/29 12:07:35 - S3 File Output.0 - at org.pentaho.di.trans.step.StepInitThread.run(StepInitThread.java:69)
2018/08/29 12:07:35 - S3 File Output.0 - at java.lang.Thread.run(Thread.java:748)
2018/08/29 12:07:35 - trans1 - Step [TI:MEMFile.0] initialized flawlessly.
2018/08/29 12:07:35 - trans1 - Step [S3 File Output.0] initialized flawlessly.
2018/08/29 12:07:35 - trans1 - Transformation has allocated 2 threads and 1 rowsets.
2018/08/29 12:07:35 - TI:MEMFile.0 - Starting to run...
2018/08/29 12:07:35 - S3 File Output.0 - Starting to run...
2018/08/29 12:07:35 - trans1 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : Errors detected!
2018/08/29 12:07:35 - S3 File Output.0 - Finished processing (I=0, O=0, R=0, W=0, U=0, E=1)
2018/08/29 12:07:35 - trans1 - Transformation detected one or more steps with errors.
2018/08/29 12:07:35 - trans1 - Transformation is killing the other steps!
2018/08/29 12:07:35 - TI:MEMFile.0 - Opening file: file:///C:/Users/carlopez/Desktop/CASES_IN_PROGRESS/S3CSV_INPUT/sample/customers.csv
2018/08/29 12:07:35 - TI:MEMFile.0 - This is a compressed file being handled by the None provider
2018/08/29 12:07:35 - TI:MEMFile.0 - Finished processing (I=1, O=0, R=0, W=0, U=1, E=0)
2018/08/29 12:07:35 - trans1 - ERROR (version 8.1.0.0-365, build 8.1.0.0-365 from 2018-04-30 09.42.24 by buildguy) : Errors detected!
2018/08/29 12:07:35 - Main - Finished job entry [Transformation] (result=[false])
2018/08/29 12:07:35 - Main - Finished job entry [Set variables] (result=[false])
2018/08/29 12:07:35 - Main - Job execution finished
2018/08/29 12:07:35 - Spoon - Job has ended.
Please see the attached screenshot of the 8.1 job failing.
According to BACKLOG-23020, addressing this bug requires defaulting the S3 Text Output to "s3n://s3n"; however, this only happens with brand-new transformations.
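Since the s3n default reportedly only applies to brand-new transformations, existing .ktr files would presumably need their URIs rewritten by hand. Below is a hedged sketch of one possible way to do that as a bulk text rewrite; this is an assumption about a workaround, not a fix confirmed by BACKLOG-23020, and it edits files in place, so back them up first.

```java
// Hypothetical workaround sketch: rewrite s3:// URIs to s3n:// in an
// existing .ktr file. Not an officially documented fix.
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SchemeRewrite {
    public static String rewrite(String ktrXml) {
        // "s3n://" does not contain the substring "s3://", so URIs that
        // already use the s3n scheme are left untouched.
        return ktrXml.replace("s3://", "s3n://");
    }

    public static void main(String[] args) throws IOException {
        Path ktr = Path.of(args[0]);
        Files.writeString(ktr, rewrite(Files.readString(ktr)));
    }
}
```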