i hate how black tv shows now are almost always centered around racism and explaining racism/microaggressions to white people, as opposed to just having shows about black people simply living their lives, with the subject of racism coming up naturally the way it does in real life
Right like when did this shit become entertaining