More than 50 executives across the fields of tech, finance, retail, and real estate signed onto a statement released Wednesday by the Leadership Now Project, a group founded by Harvard Business School alumni focused on protecting democracy.
“America has successfully held elections through previous challenges, like the Civil War, World Wars I and II, and the 1918 flu pandemic… we can and must do so again,” the group said in the statement. “As business leaders, we know firsthand that the health of America’s economy and markets rests on the founding principle of our democracy: elections where everyone’s vote is counted.”
Earlier this month, the editors of Scientific American published an all-out endorsement of Joe Biden for president, something unprecedented in the journal’s 175-year history. Then, last week, all of the New England Journal of Medicine’s editors signed a scathing critique of the Trump administration’s handling of the COVID-19 emergency, calling for Trump to be voted out of office.
In truth, both editorials offer several valid criticisms of the administration on scientific grounds. And to be clear: The present article is not making any counter-endorsement of Donald Trump—far from it.
Rather, we pose an important question: Are high-profile scientists crossing a dangerous line by using their trusted platforms to influence the election? Based on behavioral science, we believe they are, and that their actions risk diminishing the public’s trust in science.
A group of tech companies dismantled a powerful hacking tool used by Russian attackers just three weeks before the US presidential election. On Monday, Microsoft announced actions against Trickbot, a Russian botnet that’s infected more than a million computers since 2016 and that’s behind scores of ransomware attacks.
Cybersecurity experts have raised concerns about ransomware attacks casting doubt on election results. While a ransomware attack would not change votes, only lock up machines, the chaos stirred by such a cyberattack could sow uncertainty about the outcome.
Election officials in most states have offline backup measures in the event of a ransomware attack, but have a harder time tackling the disinformation that comes with getting hacked.
An expected surge in election-related volatility in the U.S. stock market is paving the way for Asian shares to make a run at besting their American peers.
Since hitting an all-time low relative to the S&P 500 on Sept. 2, the MSCI Asia Pacific Index has outperformed the U.S. benchmark by almost five percentage points. That nascent trend is expected to persist at least through the November poll and potentially beyond, according to strategists.
“There is a better than average chance that Asian stocks will outperform U.S. stocks over the course of the next month,” said Eoin Murray, head of investment for international business at Federated Hermes. “The volatility rise will be more pronounced in U.S. risk assets, and will pervade more globally but with less strength.”
Fears about a contested election result and President Donald Trump’s decision not to push for further stimulus have fueled expectations of heightened volatility in U.S. markets.
Twitter is imposing tough new rules that bar candidates from declaring premature victory and tighten its measures against misinformation, calls for political violence and misleading commentary in the days leading up to and following the Nov. 3 U.S. election.
The social platform will remove tweets that encourage violence or call for people to interfere with election results. Tweets that falsely claim a candidate has won will be labeled to direct users to the official U.S. election page.
The effort is part of what Gen. Paul Nakasone, the head of Cyber Command, calls “persistent engagement,” or the imposition of cumulative costs on an adversary by keeping them constantly engaged. And that is a key feature of CyberCom’s activities to help protect the election against foreign threats, officials said.
“Right now, my top priority is for a safe, secure, and legitimate 2020 election,” Nakasone said in August in a set of written responses to Washington Post questions. “The Department of Defense, and Cyber Command specifically, are supporting a broader ‘whole-of-government’ approach to secure our elections.”
Trickbot is malware that can steal financial data and drop other malicious software onto infected systems. Cybercriminals have used it to install ransomware, a particularly nasty form of malware that encrypts users’ data, for which the criminals then demand payment, usually in cryptocurrency, to unlock it.
Twitter’s moves, like those announced recently by Facebook, are aimed mainly at combating efforts to manipulate the political landscape at critical moments in the hotly contested national vote. The policy changes are the culmination of years of reforms intended to prevent a repeat of 2016’s electoral debacle on social media, when disinformation, false news reports and Russian interference rampaged virtually unchecked across all major platforms.
“Twitter has a critical role to play in protecting the integrity of the election conversation, and we encourage candidates, campaigns, news outlets and voters to use Twitter respectfully and to recognize our collective responsibility to the electorate to guarantee a safe, fair and legitimate democratic process this November,” company officials said in a blog post published at noon Friday. The authors were Vijaya Gadde, the Legal, Policy and Trust & Safety Lead at Twitter, and Kayvon Beykpour, its product lead.
The social network says the move is intended to limit misinformation and abuse of its service, following broad criticism that it has not done enough to stamp out falsehoods on its platform. Facebook hasn’t said how long the ad suspension will last, but in an internal memo to its sales staff that was obtained by the Washington Post, executives told staff to tell advertisers the ban would last a week.
The changes less than a month before Election Day underscore how tech companies are scrambling to address a fast-changing political environment.
Tech companies have been making key changes to rein in disinformation since Russia used their platforms in 2016 to divide and sow discord among Americans. But critics say many of those steps to limit foreign influence haven’t gone far enough to address disinformation emanating from within the United States – often from the megaphone of the president.
Facebook on Wednesday said it will stop running political or social issue ads after the US polls close on November 3 to reduce chances of confusion or abuse.
The leading social network also said that any posts prematurely declaring a winner or contesting the count will be labeled with reliable information from news outlets and election officials.
“If a candidate or party declares premature victory before a race is called by major media outlets, we will add more specific information in the notifications that counting is still in progress and no winner has been determined,” said vice president of integrity Guy Rosen.
Facebook and other social networks have been tightening rules as they gear up for post-election scenarios, including efforts by President Donald Trump to wrongly claim victory or contend the outcome is not legitimate.
The California-based internet giant has been under pressure to avoid being used to spread misinformation.