Issue
I'm trying to upload a jpg file to an AWS S3 bucket with Camel's aws-s3 producer. Can this approach work, and if so, how? At the moment I only get an IOException and can't figure out what the next step would be. I know I could implement the upload using TransferManager from the aws-sdk, but for now I'm only interested in Camel's aws-s3 endpoint.
Here is my route with Camel 2.15.3:
public void configure() {
    from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
        .setHeader(S3Constants.KEY, constant("CamelFile"))
        .to("aws-s3://<bucket-name>?region=eu-west-1&accessKey=<key>&secretKey=RAW(<secret>)");
}
and the exception I get when running that route:
com.amazonaws.AmazonClientException: Unable to create HTTP entity: Stream Closed
at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:244)
at com.amazonaws.http.HttpRequestFactory.createHttpRequest(HttpRequestFactory.java:122)
at com.amazonaws.http.AmazonHttpClient.executeHelper(AmazonHttpClient.java:415)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:273)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:3660)
at com.amazonaws.services.s3.AmazonS3Client.putObject(AmazonS3Client.java:1432)
at org.apache.camel.component.aws.s3.S3Producer.processSingleOp(S3Producer.java:209)
at org.apache.camel.component.aws.s3.S3Producer.process(S3Producer.java:71)
at org.apache.camel.util.AsyncProcessorConverterHelper$ProcessorToAsyncProcessorBridge.process(AsyncProcessorConverterHelper.java:61)
at org.apache.camel.processor.SendProcessor.process(SendProcessor.java:129)
at org.apache.camel.management.InstrumentationProcessor.process(InstrumentationProcessor.java:77)
at org.apache.camel.processor.RedeliveryErrorHandler.process(RedeliveryErrorHandler.java:448)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:118)
at org.apache.camel.processor.Pipeline.process(Pipeline.java:80)
at org.apache.camel.processor.CamelInternalProcessor.process(CamelInternalProcessor.java:190)
at org.apache.camel.component.file.GenericFileConsumer.processExchange(GenericFileConsumer.java:439)
at org.apache.camel.component.file.GenericFileConsumer.processBatch(GenericFileConsumer.java:211)
at org.apache.camel.component.file.GenericFileConsumer.poll(GenericFileConsumer.java:175)
at org.apache.camel.impl.ScheduledPollConsumer.doRun(ScheduledPollConsumer.java:174)
at org.apache.camel.impl.ScheduledPollConsumer.run(ScheduledPollConsumer.java:101)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Stream Closed
at java.io.FileInputStream.readBytes(Native Method)
at java.io.FileInputStream.read(FileInputStream.java:246)
at com.amazonaws.services.s3.internal.RepeatableInputStream.read(RepeatableInputStream.java:167)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:73)
at com.amazonaws.services.s3.internal.MD5DigestCalculatingInputStream.read(MD5DigestCalculatingInputStream.java:88)
at com.amazonaws.internal.SdkFilterInputStream.read(SdkFilterInputStream.java:73)
at com.amazonaws.event.ProgressInputStream.read(ProgressInputStream.java:151)
at java.io.FilterInputStream.read(FilterInputStream.java:107)
at org.apache.http.util.EntityUtils.toByteArray(EntityUtils.java:136)
at org.apache.http.entity.BufferedHttpEntity.<init>(BufferedHttpEntity.java:63)
at com.amazonaws.http.HttpRequestFactory.newBufferedHttpEntity(HttpRequestFactory.java:242)
... 27 more
Solution
I did some digging and found one solution. The route works if you convert the file contents to a byte array before passing them to the aws-s3 endpoint, like this:
from("file://src/data?fileName=file.jpg&noop=true&delay=15m")
    .convertBodyTo(byte[].class)
    .setHeader(S3Constants.CONTENT_LENGTH, simple("${in.header.CamelFileLength}"))
    .setHeader(S3Constants.KEY, simple("${in.header.CamelFileNameOnly}"))
    .to("aws-s3://{{awsS3BucketName}}"
        + "?deleteAfterWrite=false&region=eu-west-1"
        + "&accessKey={{awsAccessKey}}"
        + "&secretKey=RAW({{awsAccessKeySecret}})")
    .log("done.");
}
The S3Constants.CONTENT_LENGTH header must also be set to the file length in bytes; here it is taken from the CamelFileLength header that Camel's file consumer sets on the exchange.
The solution above reads the whole file into memory, so it isn't ideal for every situation. However, it is also the simplest way I know of to use the aws-s3 producer endpoint. I'm still happy to hear about other (and better) solutions.
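As a side note, here is a JDK-only sketch of why the byte[] conversion helps. My assumption (based on the stack trace, which shows the SDK re-reading the body through MD5DigestCalculatingInputStream and BufferedHttpEntity) is that the S3 client needs more than one pass over the body, which fails once a plain FileInputStream has been consumed and closed, while a byte[] can back a fresh ByteArrayInputStream for every pass:

```java
import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class RepeatableBodyDemo {
    public static void main(String[] args) throws Exception {
        Path tmp = Files.createTempFile("camel-s3-demo", ".bin");
        Files.write(tmp, new byte[] {1, 2, 3, 4});

        // A plain FileInputStream can be consumed only once; after it is
        // closed, any further read fails, much like the route above did.
        InputStream fileStream = new FileInputStream(tmp.toFile());
        fileStream.readAllBytes();
        fileStream.close();
        try {
            fileStream.read();
        } catch (IOException e) {
            System.out.println("IOException: " + e.getMessage());
        }

        // Buffering the file into a byte[] allows handing out a fresh,
        // repeatable stream for each pass over the body.
        byte[] body = Files.readAllBytes(tmp);
        InputStream firstPass = new ByteArrayInputStream(body);
        InputStream secondPass = new ByteArrayInputStream(body);
        System.out.println("first pass read " + firstPass.readAllBytes().length + " bytes");
        System.out.println("second pass read " + secondPass.readAllBytes().length + " bytes");

        Files.delete(tmp);
    }
}
```

This is only an illustration of the stream semantics, not of Camel's internals; the class name and temp file are made up for the demo.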
Answered By - jnupponen