In the past few weeks, as the Valorum team has had fewer "day-job" hours to dedicate to maintaining scrapers, we've seen a number of data integrity issues pop up (thanks to the CAN team for opening issues and helping us!).
Most of these issues seem like they could be caught automatically, rather than manually by inspecting or using the data.
I'd like to start a discussion about how we could automate a scheduled check/validation of the data.
I'm opening this issue as a place to host that discussion.
My eventual plan is to turn the items identified in this issue into a DAG that runs on a schedule as part of our Airflow setup.
What I'm asking for is ideas and suggestions for the different steps that could be part of that DAG.
Some ideas are:
- `*_total` columns don't have decreases in their value, as they are meant to be cumulative (ref: "Arkansas counties have 0 *_tests_total numbers since 2020-09-12" #125)

cc @cc7768 @tlyon3 @mikelehen (please ping anyone else I missed!)
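To make the monotonicity idea concrete, here is a minimal sketch of what that check could look like as one step of the DAG. The column names (`fips`, `dt`, `tests_total`) and the DataFrame shape are hypothetical, just stand-ins for whatever our actual schema uses:

```python
import pandas as pd


def find_cumulative_decreases(df: pd.DataFrame, group_cols: list, value_col: str) -> pd.DataFrame:
    """Return rows where a cumulative *_total column decreases.

    Sorts within each group by date, then flags any row whose value is
    strictly lower than the previous observation for the same group.
    """
    df = df.sort_values(group_cols + ["dt"])
    decreased = df.groupby(group_cols)[value_col].diff() < 0
    return df[decreased]


# Hypothetical example: a county-level cumulative test count that drops to 0,
# similar in spirit to the Arkansas issue referenced above.
data = pd.DataFrame({
    "fips": ["05001"] * 4,
    "dt": pd.to_datetime(["2020-09-10", "2020-09-11", "2020-09-12", "2020-09-13"]),
    "tests_total": [100, 120, 0, 0],
})
bad_rows = find_cumulative_decreases(data, ["fips"], "tests_total")
# bad_rows flags the 2020-09-12 observation, where the total fell from 120 to 0.
```

A DAG task could run this per variable and open an alert (or an issue) whenever `bad_rows` is non-empty.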