As with a penetration test or security audit, you don't know where you're out of compliance until you actually check; you don't normally shut down your site in the meantime.
The correct way to comply with this is to state that you don't believe you're out of compliance, and to request specific guidance on which particular datasets are out of compliance and should be removed.
I wrote a webapp to pull your local COVID wastewater data from the CDC's data feeds. It's still working, so they either left their feeds in place or haven't gotten to them yet.
The main site was replaced with a mostly blank page earlier today.
The bizarre thing was that the page source at that time contained almost nothing except some odd JavaScript to build a "508 Daily Report" table of Jira tickets or some nonsense.
https://old.reddit.com/r/DataHoarder/comments/1iekywr/cdc_we...
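For illustration, a minimal sketch of how such a tool might build a query against the CDC's public Socrata API. The dataset id and field names here are assumptions for the NWSS wastewater metrics dataset, not details taken from the comment, and the sketch only constructs the URL rather than fetching it:

```python
from urllib.parse import urlencode

BASE = "https://data.cdc.gov/resource"
# Assumed Socrata dataset id for NWSS SARS-CoV-2 wastewater metrics.
DATASET_ID = "2ew6-ywp6"

def build_query_url(county: str, state: str, limit: int = 100) -> str:
    """Build a Socrata SoQL query URL filtering wastewater rows by county.

    Field names (county_names, wwtp_jurisdiction, date_end) are assumptions
    about the dataset schema.
    """
    params = {
        "$where": (
            f"county_names like '%{county}%' "
            f"AND wwtp_jurisdiction = '{state}'"
        ),
        "$order": "date_end DESC",
        "$limit": limit,
    }
    return f"{BASE}/{DATASET_ID}.json?{urlencode(params)}"

url = build_query_url("Suffolk", "Massachusetts")
```

A client would then fetch that URL and render the returned JSON rows; keeping the query construction separate makes it easy to point the same code at a mirror if the official feed disappears.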
https://madmonk13.github.io/covid_ww_levels/
[edit] Oh, I see, the actual 2020 census data is down: https://www.census.gov/2020results
The page says it's down for maintenance.
> The bizarre thing was that the page source at that time contained almost nothing except some odd JavaScript to build a "508 Daily Report" table of Jira tickets or some nonsense.
That view appears to still be up here: https://www.census.gov/main/
I can't really blame someone for a bad deployment given the raging government dumpster fires at the moment.