Idaho senator Jim Risch, the top Republican on the Foreign Relations Committee and a member of the Intelligence Committee, says he’d be surprised if foreign actors didn’t mimic the digital pressure campaign that experts say caused the bank runs. “We see all kinds of input from foreign actors trying to do harm to the country, so it’s really an obvious avenue for somebody to try to do that,” Risch says.
Some experts think the threat is real. “The fear is not overblown,” Peter Warren Singer, strategist and senior fellow at New America, a Washington-based think tank, told WIRED via email. “Most cyber threat actors, whether criminals or states, don’t create new vulnerabilities, but notice and then take advantage of existing ones. And it is clear that both stock markets and social media are manipulatable. Add them together and you multiply the manipulation potential.”
In the aftermath of the GameStop meme-driven rally—which was partly fueled by a desire to wipe out hedge funds shorting the stock—experts warned the same techniques could be used to target banks. In a paper for the Carnegie Endowment, published in November 2021, Claudia Biancotti, a director at the Bank of Italy, and Paolo Ciocca, an Italian finance regulator, warned that financial institutions were vulnerable to similar market manipulation.
“Finance-focused virtual communities are growing in size and potential economic and social impact, as demonstrated by the role played by online groups of retail traders in the GameStop case,” they wrote. “Such communities are highly exposed to manipulation, and may represent a prime target for state and nonstate actors conducting malicious information operations.”
The government’s response to the Silicon Valley Bank collapse, in which depositors’ money was quickly protected, shows banks can be hardened against this kind of event, says Cristián Bravo Roman, an expert on AI, banking, and contagion risk at Western University in Ontario. “All the measures that were taken to restore trust in the banking system limit the ability of a hostile attacker,” he says.
Roman says federal officials now see, or at least should see, the real cyberthreat posed by mass digital hysteria, and may strengthen provisions designed to protect smaller banks against runs. “It completely depends on what happens after this,” Roman says. “The truth is, the banking system is just as political as it is economic.”
Preventing the swell of online panic, whether real or fabricated, is far more complicated. Social media sites in the US can’t be easily compelled to remove content, and they are protected by Section 230 of the Communications Decency Act of 1996, which shields tech companies from liability for what others write on their platforms. While that provision is currently being challenged in the US Supreme Court, it’s unlikely lawmakers would want to limit what many see as free speech.
“I don’t think that social media can be regulated to censor talk about a bank’s financial condition unless there is deliberate manipulation or misinformation, just as that might be in any other means of communicating,” says Senator Richard Blumenthal, a Connecticut Democrat.
“I don’t think we should offer a systemic response to a localized problem,” says North Dakota Republican senator Kevin Cramer—although he adds that he wants to hear “all the arguments.”
“We need to be very cautious to not get in the way of speech,” Cramer says. “But when speech becomes designed specifically to short a market, for example, or to lead to an unnecessary run on the bank, we have to be reasonable about it.”
While some members of Congress are using the run on Silicon Valley Bank to revive conversations about the regulation of social media platforms, other lawmakers are, once again, looking to tech companies themselves for solutions.

“We need to be better at discovering and exposing bots. We need to understand the source,” says Senator Angus King, a Maine Independent.
King, a member of the Senate Intelligence Committee, says Washington can’t solve all of Silicon Valley’s problems, especially when it comes to cleaning up bots. “That has to be them,” he says. “We can’t do that.”