I am looking to visualize PBS mom_logs in Kibana on the ELK Stack. As a prerequisite, I need to design a Grok filter. I already have the ELK Stack set up, and I am able to send PBS mom_logs to Kibana.
Example of a log line to be parsed by a Logstash filter:
Log: localhost GET /v2/applink/5c2f4bb3e9fda1234edc64d 400 46ms 5bc6e716b5d6cb35fc9687c0
For example, a mom log line has multiple fields such as the date, timestamp, and job ID. My question is: how do I determine which parameters need to be defined in the Grok filter?
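As a sketch, the example log line above could be matched with standard grok patterns along these lines. Note that the field names (e.g. `duration_ms`, `session_id`) are my own guesses at what the tokens mean, not anything defined by PBS or Logstash:

```
filter {
  grok {
    match => {
      # localhost GET /v2/applink/... 400 46ms 5bc6e716b5d6cb35fc9687c0
      "message" => "%{IPORHOST:client} %{WORD:method} %{URIPATHPARAM:request} %{NUMBER:status:int} %{NUMBER:duration_ms}ms %{NOTSPACE:session_id}"
    }
  }
}
```

In general, you decide the parameters yourself: break the line into its tokens, pick a built-in grok pattern that matches each token's shape, and attach whatever field name you want to see in Kibana. The Kibana Dev Tools Grok Debugger is handy for testing patterns interactively.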
The mom log record format is the same as that used by other PBS daemons, with the addition of the thread number and the daemon name in the log record. The fields are as follows:
event_code: please refer to Table 2-38 "tracejob Filters" in the RG-229 (https://www.pbsworks.com/pdfs/PBSReferenceGuide19.2.1.pdf)
daemon_name: pbs_mom
object_type: type of object the message is about (Job, Que, Svr, Req, Node, Hook)
object_name: name of the specific object
message_text: text of the log message
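Assuming the usual semicolon-delimited PBS log layout (`date time;event_code;daemon_name;object_type;object_name;message_text`), a minimal grok sketch for that record format might look like this; check it against your actual lines, since the thread number mentioned above may need an extra token:

```
filter {
  grok {
    match => {
      # e.g. 03/25/2019 11:00:00;0002;pbs_mom;Svr;Log;Log opened
      "message" => "%{DATESTAMP:log_time};%{WORD:event_code};%{DATA:daemon_name};%{WORD:object_type};%{DATA:object_name};%{GREEDYDATA:message_text}"
    }
  }
}
```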
It is good to see your project with mom_logs and Kibana. Please share your experiences with the community once you have some results; this might spark some interest and discussion.
Just to give an overview, the ELK Stack consists of four main components: Elasticsearch, Logstash, Kibana, and Filebeat (to be installed on the client machine).
Elasticsearch, Logstash, and Kibana were installed on a CentOS 7 instance, and Filebeat was installed on the PBS Pro cluster (which has a shared file system).
Filebeat can be installed on multiple clients in case the log data comes from different sources.
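As an illustration, a minimal Filebeat configuration on the PBS side could look like the following. The mom_logs path and the Logstash host are assumptions; substitute your own $PBS_HOME location and ELK node address:

```
filebeat.inputs:
  - type: log
    enabled: true
    paths:
      # assumed default PBS_HOME; adjust to your installation
      - /var/spool/pbs/mom_logs/*

output.logstash:
  # hostname/port of the Logstash instance on the ELK node (assumed)
  hosts: ["elk-server:5044"]
```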
The Grok filter in Logstash helps parse the pbs_mom logs. It needs to be designed according to your requirements.
I analyzed PBS mom_logs through Kibana and created a sample dashboard that visualizes the data. Below are the snapshots:
Is it possible to get a walkthrough document on how to get all of this configured? I had been meaning to play with either the ELK or EFK stack for a long time now, but never found the time.