The Highland Park Shooter Published His Plan for All to See
Is it possible to start holding social media accountable?
Archived footage from a YouTube channel owned by suspected gunman Robert Crimo III depicted violence that foreshadowed what he would do during the July 4 parade in Highland Park, a Chicago suburb.
On Monday, Crimo climbed to a rooftop overlooking the parade route in Highland Park and fired more than 70 rounds into the crowd with a legally obtained high-powered rifle. The shooting left seven people dead and more than 30 injured. Crimo, dressed in women's clothing, blended into the crowd and evaded police for several hours.
On the since-deleted YouTube channel Zerotwo, Crimo revealed violent tendencies and foretold the bloody events that would happen on the Fourth of July. “The most recent video on the channel, uploaded eight months ago, included a cartoon figure shooting people and a voice-over that implied violence,” according to NBC News. Another video “shows a wide shot of an empty street in the early morning. The camera is moving as if attached to the back of a vehicle driving down the street, and an emergency siren sound can be heard over the footage, although it is unclear whether the audio was edited into the footage.”
Crimo, who also went by the moniker Awake the Rapper, posted another violence-laced video that included this voice-over: “It is my destiny. Everything has led up to this. Nothing can stop me, not even myself. Is there such a thing as free will, or has this been planned out like a cosmic recipe? It is what I’ve been waiting for in the back of my head, ready to be awakened. It’s what I was sent here to do, like a sleepwalker walking steady with my head held high, like a sleepwalker walking blindly into the night.”
This is not the first time a suspected shooter has posted his intentions on social media. Prior to his attack at a Tops Friendly Market in Buffalo in May, Payton Gendron posted a manifesto that included racist beliefs. The incident, which Gendron live-streamed, left 10 people dead and three injured; all but two of the victims were Black.
Gendron’s preferred social media platforms were those considered by some to be more “radical,” like Twitch, 4chan and 8chan. But Crimo used YouTube, a mainstream platform with an estimated 2.24 billion users, to outline a plan for violence and murder. His social media posts and videos have been removed, but they stayed up long enough for investigators to realize that murderous intentions had been posted in plain sight.
This raises the question: Can social media be held accountable for the spate of mass shootings in the United States?
At least one state attorney general is planning to find out. Letitia James, Attorney General of New York, has launched an investigation into the social media platforms that Gendron allegedly used to discuss and even live-stream the Buffalo shooting.
"The terror attack in Buffalo has once again revealed the depths and danger of the online forums that spread and promote hate," James said. "The fact that an individual can post detailed plans to commit such an act of hate without consequence, and then stream it for the world to see is bone-chilling and unfathomable. As we continue to mourn and honor the lives that were stolen, we are taking serious action to investigate these companies for their roles in this attack."
But how liable can social media platforms be for threats or acts of violence posted to their channels? Section 230 of the Communications Decency Act states that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” In other words, YouTube cannot be held directly responsible for the violent content Crimo posted on his channel. In practice, that amounts to broad immunity for social media platforms.
In its user guidelines, YouTube notes that content will be removed if it “(1) is in breach of this Agreement or (2) may cause harm to YouTube, our users, or third parties,” but doesn’t define what counts as harm. Facebook says it does not allow “hate speech, credible threats or direct attacks on an individual or group” as well as “content that contains self-harm or excessive violence.” Rules and regulations for 4chan are detailed but appear to place fewer restrictions on what can be posted.
Social media platforms impose many self-determined rules for their sites. That’s why Twitter can suspend certain accounts like the Babylon Bee or Donald Trump but keep other controversial accounts active, and Facebook can give users a warning or put them in “Facebook jail” for doing something as innocuous as posting too many times.
The First Amendment’s guarantee of free speech may limit what penalties, if any, can be imposed on social media platforms that allow violent content and videos. Perhaps, then, it is less about government involvement and more about moral obligation.
If one platform determines, for instance, that a tweet in which Trump uses the term “American Patriots” to describe some of his supporters can be “interpreted as support for those committing violent acts at the US Capitol,” might another platform remove hate-filled rants or violent videos to protect potential victims? Can, or should, YouTube and other platforms report users who talk about committing deadly acts to the FBI? If so, is it possible that some of these murderous events could have been prevented? Or would that violate someone’s rights, even if that person intended to do harm? These are complex questions with no simple answers in sight.