Hello,
We are running a MapR cluster, and WebHDFS is not supported by MapR, so we are trying to write to Hadoop through HttpFS.
Our webhdfs output config:

```
@type webhdfs
host mapr-mapr-master-0
port 14000
path "/uhalogs/docker/docker-%M.log"
time_slice_format %M
flush_interval 5s
username mapr
httpfs true
```
When using the fluentd plugin, logs are appended correctly to an existing file. But if the file does not exist (we use a timestamp-based filename), we get a WebHDFS::ServerError instead of the WebHDFS::FileNotFoundError that would, I assume, trigger creation of the file.
Error 500 returned by MapR:

```json
{
  "RemoteException": {
    "message": "Append failed for file: /uhalogs/docker/testfile.log, error: No such file or directory (2)",
    "exception": "IOException",
    "javaClassName": "java.io.IOException"
  }
}
```
Logs from fluent-plugin-webhdfs:

```
2017-01-12 13:59:09 +0000 [warn]: failed to communicate hdfs cluster, path: /uhalogs/docker/docker-58.log
2017-01-12 13:59:09 +0000 [warn]: temporarily failed to flush the buffer. next_retry=2017-01-12 14:00:13 +0000 error_class="WebHDFS::ServerError" error="{\"RemoteException\":{\"message\":\"Append failed for file: \\/uhalogs\\/docker\\/docker-58.log, error: No such file or directory (2)\",\"exception\":\"IOException\",\"javaClassName\":\"java.io.IOException\"}}" plugin_id="object:3fe5f920c960"
2017-01-12 13:59:09 +0000 [warn]: suppressed same stacktrace
```
Related code:
https://github.com/fluent/fluent-plugin-webhdfs/blob/master/lib/fluent/plugin/out_webhdfs.rb#L262
What I am not sure about, and I cannot find a proper HttpFS specification for on the web, is: is this a bad HttpFS implementation on the MapR side, or should the fluentd plugin handle this exception as well?
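In case it helps the discussion, here is a minimal sketch of how the plugin's append-then-create logic could tolerate MapR's behaviour: rescue WebHDFS::ServerError too, and fall back to `create` only when the wrapped message indicates a missing file. This is a hypothetical workaround, not the plugin's actual code; the `WebHDFS` exception classes below are stand-ins for the real webhdfs gem, and `send_data`/`FakeMapRClient` are illustrative names.

```ruby
# Stand-ins for the real webhdfs gem's exception classes (assumption:
# the gem defines WebHDFS::ServerError and WebHDFS::FileNotFoundError).
module WebHDFS
  class ServerError < StandardError; end
  class FileNotFoundError < StandardError; end
end

# Hypothetical variant of the plugin's write path. `client` is assumed
# to respond to #append(path, data) and #create(path, data) like the
# webhdfs gem's client. Returns which operation was performed.
def send_data(client, path, data)
  client.append(path, data)
  :append
rescue WebHDFS::FileNotFoundError
  # Normal WebHDFS: a missing file surfaces as 404 -> create it.
  client.create(path, data)
  :create
rescue WebHDFS::ServerError => e
  # MapR's HttpFS wraps the missing-file case in a generic 500,
  # so inspect the message before deciding to create the file.
  raise unless e.message.include?('No such file or directory')
  client.create(path, data)
  :create
end

# Minimal fake client emulating MapR's observed behaviour, for demonstration.
class FakeMapRClient
  def initialize
    @files = {}
  end

  def append(path, data)
    unless @files.key?(path)
      raise WebHDFS::ServerError,
            "Append failed for file: #{path}, error: No such file or directory (2)"
    end
    @files[path] << data
  end

  def create(path, data)
    @files[path] = data.dup
  end
end
```

With this fake client, the first write to a new path falls back to `create`, and subsequent writes append as usual. The obvious downside is matching on an error-message string, which is fragile; that is part of why a fix on the MapR side (returning a proper 404) would be preferable.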
Thank You
Alban