Create a log stream for each log file in CloudWatch Logs

I am using the AWS CloudWatch Logs agent to send application logs to CloudWatch Logs. In the agent's configuration file on my EC2 instance, I have this entry:

[/scripts/application]
datetime_format = %Y-%m-%d %H:%M:%S
file = /workingdir/customer/logfiles/*.log
buffer_duration = 5000
log_stream_name = {instance_id}
initial_position = start_of_file
log_group_name = /scripts/application

With this configuration, all the log files in /workingdir/customer/logfiles are sent to CloudWatch Logs on a single log stream whose name is the instance ID.

My question: I want a separate log stream for each log file, so that the logs are quicker and easier to read. In other words, every time a new log file appears, a new log stream should be created for it automatically.

I thought about doing this with a shell script in a cron job (a rough sketch of that idea is further down), but then I would have to change many other parts of the architecture, so I am looking for a way to do it in the config file itself. The documentation says:

log_stream_name

Specifies the destination log stream. You can use a literal string, the predefined variables ({instance_id}, {hostname}, {ip_address}), or a combination of both to define the log stream name. A log stream is created automatically if it does not already exist.
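
As far as I can tell, those three variables are the only dynamic parts available; a stream name like the one below (just an example) would be valid, but there is no variable for the source file name:

log_stream_name = application-{instance_id}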

Log file names may not be 100% predictable, but they always have this structure:

CustomerName-YYYY-mm-dd.log

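The closest workaround I can see with those variables is to keep one section per customer, matching on the CustomerName prefix and giving each section a literal stream name, roughly like this (CustomerA and CustomerB are placeholders for real customer names, and, if I read the agent docs correctly, a wildcard makes the agent follow only the most recently modified matching file, which fits the one-file-per-day pattern):

[/scripts/application/CustomerA]
datetime_format = %Y-%m-%d %H:%M:%S
file = /workingdir/customer/logfiles/CustomerA-*.log
buffer_duration = 5000
log_stream_name = CustomerA
initial_position = start_of_file
log_group_name = /scripts/application

[/scripts/application/CustomerB]
datetime_format = %Y-%m-%d %H:%M:%S
file = /workingdir/customer/logfiles/CustomerB-*.log
buffer_duration = 5000
log_stream_name = CustomerB
initial_position = start_of_file
log_group_name = /scripts/application

But that only gives one stream per customer, not per file, and I would still have to edit the config and restart the agent whenever a new customer appears, so it does not really solve the problem.
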
There is also another problem. The documentation says:

The agent that is running must be stopped and restarted for the configuration changes to take effect.
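
For completeness, the cron-job fallback I mentioned would look roughly like the sketch below: regenerate one config section per log file and restart the agent only when the generated config actually changed. The config path, state file location and service name here are assumptions about my setup, not verified values.

#!/usr/bin/env python3
# Cron sketch (assumptions: the agent config lives at /etc/awslogs/awslogs.conf
# and the agent runs as the "awslogs" service; adjust both to the real setup).
import glob
import os
import subprocess

LOG_DIR = "/workingdir/customer/logfiles"     # where the application logs live
CONFIG_PATH = "/etc/awslogs/awslogs.conf"     # assumed agent config location
LOG_GROUP = "/scripts/application"

HEADER = "[general]\nstate_file = /var/lib/awslogs/agent-state\n\n"

SECTION = """[{name}]
datetime_format = %Y-%m-%d %H:%M:%S
file = {path}
buffer_duration = 5000
log_stream_name = {stream}
initial_position = start_of_file
log_group_name = {group}

"""

def build_config():
    # One section per log file; the stream name is the file name without ".log",
    # e.g. CustomerName-2017-05-30.log -> stream "CustomerName-2017-05-30".
    parts = [HEADER]
    for path in sorted(glob.glob(os.path.join(LOG_DIR, "*.log"))):
        stream = os.path.splitext(os.path.basename(path))[0]
        parts.append(SECTION.format(name=stream, path=path, stream=stream, group=LOG_GROUP))
    return "".join(parts)

def main():
    new_config = build_config()
    try:
        with open(CONFIG_PATH) as f:
            old_config = f.read()
    except FileNotFoundError:
        old_config = ""
    if new_config == old_config:
        return  # no new log files, leave the running agent alone
    with open(CONFIG_PATH, "w") as f:
        f.write(new_config)
    # The agent only picks up config changes after a restart (service name assumed).
    subprocess.run(["service", "awslogs", "restart"], check=True)

if __name__ == "__main__":
    main()

I would prefer to avoid this because of the restart requirement, which is exactly the problem quoted above.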

So, how can I set the log stream name per log file directly in the config in this case?

Any ideas, suggestions, or workarounds are greatly appreciated.
