feat: Add JSON logging capability to integrated_channels #2460
sameeramin wants to merge 12 commits into master from …
Conversation
(force-pushed: 22444a6 → de952bc)
(force-pushed: b6ad6d5 → 0b6800a)
pwnage101 left a comment:
Please do not add a layer of abstraction to functionality which already supports pluggable handlers. Using LOGGER.*() function calls directly inside of production code is also more pythonic, IMO.
Hey @pwnage101, thank you for reviewing the PR. I absolutely agree with the approach you proposed; let me refactor my code.
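For reference, the handler-based approach being agreed on here can be sketched with plain Python `logging`; the `JsonFormatter` class, the field names, and the logger name below are illustrative assumptions, not code from this PR:

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON object (illustrative sketch)."""

    def format(self, record):
        payload = {
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),  # applies %-style message args
        }
        return json.dumps(payload)

# The JSON concern lives in a pluggable handler/formatter, so production
# code keeps calling LOGGER.info()/LOGGER.error() directly, with no extra
# abstraction layer on top of the logging module.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
LOGGER = logging.getLogger("integrated_channels.example")  # hypothetical name
LOGGER.addHandler(handler)
LOGGER.setLevel(logging.INFO)
LOGGER.info("transmission finished for %s", "some-channel")
```

Wiring the formatter in through the handler is what keeps call sites unchanged: only logging configuration knows about JSON.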
1a480f7 to
235515d
Compare
FYI, our current Datadog log parsing configuration does NOT support JSON parsing. You need to come up with a proposal to add JSON parsing to the Datadog Grok Parser config in a way that is still backwards-compatible. For example, just changing …
I minimally tested this approach and it seems to work, but I highly recommend you do your own testing to validate the approach for more scenarios:

```diff
-_message_with_keyvalue %{regex("(?<message>.*)"):message_keys:keyvalue}
+_message_with_keyvalue (%{regex(".*(?= \\{.*\\})"):message} %{data:json_keys:json}|%{regex("(?<message>.*)"):message_keys:keyvalue})
```

This matches existing messages, optionally with keyvalue pairs, but if the message ends with JSON, it will switch to the alternative pattern, which uses the `%{data:json_keys:json}` matcher. Please perform testing on the actual Grok parser edit page (but do not hit save until you get peer review on the new regex): https://app.datadoghq.com/logs/pipelines/pipeline/of_y4sdqStWBHmooL2W3UQ/processors/edit/Wz-1KN_ISsyavH79U5dmgg
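To see how the proposed lookahead behaves, here is a small Python translation of the `.*(?= \{.*\})` portion of the pattern; the sample log line is made up for illustration, and this only approximates the Grok semantics:

```python
import json
import re

# The lookahead from the proposed Grok pattern, ported to a Python regex:
# match everything up to (but not including) a trailing " {...}" JSON blob.
MESSAGE_RE = re.compile(r'.*(?= \{.*\})')

line = 'Transmitting record succeeded {"channel": "SAP_SF", "status": 200}'
m = MESSAGE_RE.match(line)
message = m.group(0)                          # plain-text prefix
structured = json.loads(line[m.end() + 1:])   # trailing JSON blob
print(message)              # Transmitting record succeeded
print(structured["status"])  # 200

# A line with no trailing JSON does not match this branch at all; in the
# Grok pattern, the alternation falls back to the old keyvalue branch.
print(MESSAGE_RE.match("plain message, no json") is None)  # True
```

Because `.*` is greedy, the match keeps the longest prefix whose remainder still looks like ` {...}`, so JSON-looking fragments inside the message body do not split it early.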
This one is even better, IMO:
- Supports log lines with only a message: …
- And also supports log lines with extra structured data: …
Ignore my latest comments; I did not realize the entire log line would be JSON, which would trigger the JSON preprocessing step in the Datadog logs pipeline.


This PR implements enhanced structured logging utilities for Integrated Channels to improve Datadog monitoring and alerting, addressing log level inconsistencies and poor structured data parsing identified in ENT-8024.
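As a rough illustration of the point above about whole-line JSON (the field names and logger name are hypothetical, not the PR's actual utilities): a log line that is entirely JSON is handled by Datadog's JSON preprocessing step with no Grok changes at all.

```python
import json
import logging
import sys

# Minimal sketch, assuming a plain %(message)s format so the emitted
# line is exactly the JSON document and nothing else.
logging.basicConfig(stream=sys.stdout, format="%(message)s", level=logging.INFO)
LOGGER = logging.getLogger("integrated_channels.sap_success_factors")  # hypothetical

event = {
    "message": "Transmission of learner data succeeded",
    "level": "INFO",
    "enterprise_customer_uuid": "0000-...-0000",  # hypothetical placeholder field
}
LOGGER.info(json.dumps(event))  # the whole log line is one JSON object
```

Note that any extra prefix in the log format (timestamps, level names) would stop the line from being pure JSON and push it back onto the Grok-parsing path discussed above.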
Merge checklist:
- [ ] … (`requirements/*.txt` files)
  - `base.in` if needed in production but edx-platform doesn't install it
  - `test-master.in` if edx-platform pins it, with a matching version
- [ ] `make upgrade && make requirements` have been run to regenerate requirements
- [ ] `make static` has been run to update webpack bundling if any static content was updated
- [ ] `./manage.py makemigrations` has been run
  - `./manage.py lms makemigrations` in the shell.

Post merge:
- (so basically once your build finishes, after maybe a minute you should see the new version in PyPI automatically (on refresh))
- `make upgrade` in edx-platform will look for the latest version in PyPI.