Loggly - Refactoring Context Format - Hitting the Unique Field Name Indexing Limit
Over the past few months, we have been logging to Loggly incorrectly. Our contexts have historically been a numeric array of strings:
['message1', 'message2', 'message3', ...]
We now want to send Loggly an associative array that uses fewer keys.
An example of a new loggly payload:
['orderId' => 123, 'logId' => 456, 'info' => json_encode(SOMEARRAY)]
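To illustrate why the new format helps: with the old numeric array, Loggly indexes each position as its own field (json.context.0, json.context.1, ...), so the number of unique field names grows with the data. A minimal sketch of the contrast (the $someArray payload and its keys are hypothetical, standing in for SOMEARRAY above):

```php
<?php
// Old format: every array position becomes a distinct indexed field
// (json.context.0, json.context.1, ...), so unique field names grow unbounded.
$oldContext = ['message1', 'message2', 'message3'];

// New format: a fixed set of keys; variable data is collapsed into a single
// JSON-encoded string field, so only three unique field names are ever indexed.
$someArray = ['queue' => 'emails', 'demandId' => 42]; // hypothetical payload
$newContext = [
    'orderId' => 123,
    'logId'   => 456,
    'info'    => json_encode($someArray),
];

// Unique top-level field names Loggly would index from this context:
echo count($newContext), "\n"; // 3, regardless of payload size
```

Because the `info` value is a single JSON string rather than a nested object, Loggly sees one field instead of one per nested key, which keeps the account's unique-field count bounded.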
When testing the new, cleaner logging format, Loggly returned the following message:
2 out of 9 fields submitted in this case were not indexed because the maximum allowed number of unique field names (100) for this account was exceeded. The following fields were affected: [json.context.queue, json.context.demandId]
We are on a 30-day retention plan. Does this mean that, for our contexts to be indexed correctly, we need to wait 30 days for the old indexed logs to expire? Is there a way to rebuild the indexing to accommodate the new-format logs?