Writing a unified Log4j appender for Blobs, Files and Streams:
Events are best generated once, even when they go to different destinations. This is the principle behind the appender technique used in many software development projects. It is most popularly applied to logging, where the corresponding Java library is log4j. Software components write to the log once, regardless of how the entries are displayed or stored. An appender is simply a serializer of events with the flexibility to send them to different destinations simultaneously. Usually these entries go to the console, a file, or both. With the popularity of web-accessible blobs and continuously appended streams, we have new log4j destinations.
This article explains how to write an appender for blobs, files and streams at the same time:
The first step is the setup. This involves specifying a configuration for each of blob, stream and file. Each configuration consists of an appender and a logger, and the set of configurations appears as a collection. The appender describes the destination: its name, its target and the pattern or prefix to be used. It can also include a burst filter that limits the rate. The logger determines the verbosity and the characteristics of log generation. Specifying one such configuration each for blob, file and stream completes the setup, as in the sketch below.
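As an illustration, a log4j2 configuration along these lines declares one appender per destination together with a logger. The BlobAppender and StreamAppender elements and their attributes are hypothetical placeholders for whatever custom plugins are registered; File, PatternLayout, BurstFilter and the Root logger are standard log4j2 elements:

<Configuration status="warn">
  <Appenders>
    <File name="fileAppender" fileName="logs/app.log">
      <PatternLayout pattern="%d{ISO8601} [%t] %-5level %logger{36} - %msg%n"/>
    </File>
    <BlobAppender name="blobAppender" container="logs" prefix="app/"> <!-- hypothetical plugin -->
      <PatternLayout pattern="%d{ISO8601} %-5level %msg%n"/>
      <BurstFilter level="INFO" rate="16" maxBurst="100"/> <!-- limits the rate -->
    </BlobAppender>
    <StreamAppender name="streamAppender" stream="logStream"> <!-- hypothetical plugin -->
      <PatternLayout pattern="%d{ISO8601} %-5level %msg%n"/>
    </StreamAppender>
  </Appenders>
  <Loggers>
    <Root level="info">
      <AppenderRef ref="fileAppender"/>
      <AppenderRef ref="blobAppender"/>
      <AppenderRef ref="streamAppender"/>
    </Root>
  </Loggers>
</Configuration>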
The second step is the implementation of the custom appender, which is usually passed to the application as a runtime dependency or, if necessary, as a compile-time dependency, such as when certain properties can be set on the appender only via code rather than declaration. The custom appender extends the well-known log4j AbstractAppender class and implements the method that registers it as an appender along with the method that takes the data to be appended to the target. In data-structure terms, these correspond to initializing the structure and adding entries to it; the latter is the actual logic for handling an event. Usually this appender is annotated as a Plugin and the registration method carries the PluginFactory annotation, as sketched below.
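A minimal sketch of such an appender against the log4j 2.x API might look like the following. The class name UnifiedAppender and the writeToBlob/writeToFile/writeToStream helpers are hypothetical placeholders for the actual destination clients:

import java.io.Serializable;

import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Core;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.Layout;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.Property;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
import org.apache.logging.log4j.core.layout.PatternLayout;

@Plugin(name = "UnifiedAppender", category = Core.CATEGORY_NAME,
        elementType = Appender.ELEMENT_TYPE, printObject = true)
public final class UnifiedAppender extends AbstractAppender {

    private UnifiedAppender(String name, Filter filter,
                            Layout<? extends Serializable> layout, boolean ignoreExceptions) {
        super(name, filter, layout, ignoreExceptions, Property.EMPTY_ARRAY);
    }

    // Registration: log4j calls this factory when it sees <UnifiedAppender .../> in the configuration.
    @PluginFactory
    public static UnifiedAppender createAppender(
            @PluginAttribute("name") String name,
            @PluginElement("Layout") Layout<? extends Serializable> layout,
            @PluginElement("Filter") Filter filter) {
        if (layout == null) {
            layout = PatternLayout.createDefaultLayout();
        }
        return new UnifiedAppender(name, filter, layout, true);
    }

    // Event handling: serialize the event once, then fan out to every destination.
    @Override
    public void append(LogEvent event) {
        byte[] serialized = getLayout().toByteArray(event);
        writeToBlob(serialized);    // hypothetical blob client
        writeToFile(serialized);    // hypothetical file writer
        writeToStream(serialized);  // hypothetical stream producer
    }

    private void writeToBlob(byte[] bytes)   { /* append to the web-accessible blob */ }
    private void writeToFile(byte[] bytes)   { /* append to the local or shared file */ }
    private void writeToStream(byte[] bytes) { /* append to the continuously appended stream */ }
}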
With these two steps, the appender is ready to be used by an application that wants to log to blob, file and stream all at the same time. The application refers to the appender in its configuration by the same name that the Plugin annotation declared, and the logger defines the logging level to be used with this appender.
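From the application's point of view nothing changes; assuming the configuration references the hypothetical UnifiedAppender above by name, logging proceeds through the usual log4j API:

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class Application {
    private static final Logger LOGGER = LogManager.getLogger(Application.class);

    public static void main(String[] args) {
        // One call; the event is generated once and fanned out to blob, file and stream.
        LOGGER.info("application started");
    }
}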
The above describes only the basic implementation. The appender can be made production ready by improving its reliability with error handling. It can supply a custom error handler that reports errors from the appender, such as when it is used with a logging level lower than the one specified in the configuration, as shown below.
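One common way to do this, and an assumption on my part about how the sketch above would evolve, is to guard each write and route failures through the handler inherited from AbstractAppender while honoring the ignoreExceptions flag:

import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AppenderLoggingException;

// Inside the hypothetical UnifiedAppender from the earlier sketch:
@Override
public void append(LogEvent event) {
    try {
        byte[] serialized = getLayout().toByteArray(event);
        writeToBlob(serialized);
        writeToFile(serialized);
        writeToStream(serialized);
    } catch (Exception e) {
        // Report through the appender's ErrorHandler instead of failing silently.
        error("Unable to append event to blob/file/stream", event, e);
        if (!ignoreExceptions()) {
            throw new AppenderLoggingException(e);
        }
    }
}

A custom ErrorHandler can also be installed on the appender with setHandler if the default reporting is not sufficient.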
Finally, the appender should follow the architectural standard set by the Application Block paradigm so that the implementation never interferes with the functionality of the applications generating the events.
Events can easily be generated with: https://github.com/ravibeta/JavaSamples/tree/master/EventGenerator