Chelsea Manning on Sharing Military Documents With WikiLeaks: ‘It Wasn’t a Mistake’

Chelsea Manning issues grim warnings about cybersecurity
Chelsea Manning, the former Army intelligence analyst and whistleblower who was convicted of leaking classified information, talked about re-entering civilian life after spending seven years in federal prison.


In the year since President Obama commuted her 35-year sentence for stealing classified U.S. military files and diplomatic cables and leaking them to WikiLeaks, Manning has resumed her public advocacy — declaring her candidacy for the U.S. Senate in Maryland, granting press interviews and speaking at public forums like today’s appearance at South by Southwest.

But Manning said her private life has been less than the “happy ending” she imagined it would be, once she left her prison cell in Fort Leavenworth in Kansas.

“I’m dealing with a lot of loneliness and struggling to adjust to life,” Manning said in conversation with Vogue’s creative digital director, Sally Singer.

“I’m not afraid of being a former prisoner,” Manning said. “I’m not afraid of being a trans person. I’m not afraid of being who I am and saying this is why I did—what happened. And I’m going to continue to do things.”

Manning talked about the difficulty of re-entering civilian life following incarceration: she emerged from federal prison displaced, without so much as a valid driver’s license. The transition was complicated by the fact that her name and gender marker had changed.

She said when she got out, she saw predictive policing as a “feedback loop” in which neighborhoods that were already heavily policed because of racism and bias were targeted even more heavily once biased data was fed into algorithms.

“I couldn’t get an apartment,” Manning said. “For a while I crashed in lower Manhattan. I couldn’t move anywhere. I didn’t have a photo ID. I didn’t have access to my bank account. I had to wait for my lawyers to give me an allowance from my own bank accounts.”

Manning has since returned to her native Maryland, where she is struggling to adjust to living alone, without being surrounded by inmates.

Manning was echoing a common critique of artificial intelligence systems, and one that she’s made before. Facial recognition algorithms trained largely on white faces fail to recognize dark skin, for instance, and predictive policing input is skewed by departments heavily policing certain neighborhoods. Manning called for an ethical framework to govern software development. “We as technologists and as developers, especially those of us that work on systems that affect millions of people — and yes, I’m talking about the Twitter algorithms, the Google algorithms, as well as predictive policing — we need to be aware of the consequences of what we’re making,” she said. Just as doctors have a code of ethics, she argued, software developers should have one too.

“I’m not used to it,” Manning said. “I get lonely, especially at night. Some of the darkest and loneliest moments that I’ve had are at 1 o’clock in the morning. I’m in this big apartment and I’m all by myself.”

Manning said she’s also found America to be a more frightening place than when she entered prison, a nation marked by a militarization of policing and caustic political rhetoric.

Manning compared her work on predictive analysis in the Army a decade ago to how she fears modern programmers have approached artificial intelligence. “The idea of using algorithms in government, and in making decisions about credit reporting, for instance, is that it’s better. That if we just write a better algorithm, a more accurate algorithm, if I just math the crap out of this problem … if I just math it really well, I can problem-solve. And I came into Iraq with that mindset,” she said. “The algorithms themselves are not unbiased. We put our biases in there when we write it. And we also feed it data that might be biased to begin with.”

“It’s become so much darker and scarier,” said Manning. “This has been decades in the making and it’s not an aberration. The political rhetoric and style of governance we’ve been seeing is not an aberration. It’s the conclusion of systems that we’ve built.”

Part of the problem, Manning said, was the collection of huge amounts of information that can be repurposed over time. “I operate at paranoia levels of security. That said, what I advise people the most is: be self-aware of the information you’re putting out there.” That includes information that’s technically voluntary, like allowing location tracking by phone apps, as well as data that’s incredibly difficult to keep private, like purchase histories that can be sold to advertisers.

Manning said she sees overtones of what she encountered during her service in Iraq, where, then known as Bradley Manning, the Army private observed a stifling of political dissent and indifference to ethnic cleansing.

“I see this mindset on the streets of American cities today, where police are viewing entire neighborhoods as criminals. This mentality and the tools that are used — the algorithms I worked on in Iraq — have found their way into policing.”

Manning talked about her military work in predictive analytics — how she used math to anticipate when the next attack might occur. She said she struggled to convince senior officers that U.S. military offensives were contributing to a “feedback loop” that would produce the same predictable reaction.

Technologists and developers who work on systems that affect millions of people have a responsibility to be aware of the potential consequences of the tools they develop, Manning argued. She said she herself has been under pressure to ship code quickly and has gotten caught up in the development process without thinking much about the bigger picture.

The former intelligence analyst talked about the biases inherent in these algorithms — both in who writes the code, and the data that’s entered. This has implications for civilian police forces that rely on algorithms to inform decisions, she said.

Manning noted that before she served in Iraq, she worked with marketing data, writing predictive algorithms based on purchase history to try to determine where future customers might be and how to maintain customer loyalty. In Iraq, Manning said, she reused that knowledge to write algorithms for war.

“We have to remember statistics themselves are not unbiased,” Manning said. “If you’re over-policing a neighborhood, and you feed that into an algorithm, it will predict more crime. So it’s not unbiased.”
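The “feedback loop” Manning describes is easy to see in a toy simulation. The sketch below is purely illustrative and not based on any real system: the two neighborhoods, patrol counts, and incident rate are invented for the example. Both neighborhoods have the same underlying incident rate, but incidents are only recorded where patrols are present, and each round’s patrols are allocated in proportion to what has already been recorded.

```python
import random

random.seed(1)

TRUE_RATE = 0.3                  # identical underlying incident rate in both neighborhoods
patrols = {"A": 8, "B": 2}       # neighborhood A starts out over-policed
recorded = {"A": 0, "B": 0}      # cumulative incidents that actually enter the dataset

for _ in range(10):
    # Recording step: an incident only enters the data if a patrol is there to observe it.
    for hood, n in patrols.items():
        recorded[hood] += sum(random.random() < TRUE_RATE for _ in range(n))

    # "Predictive" step: allocate the next round's 10 patrols in proportion to recorded
    # incidents (+1 smoothing keeps the allocation well defined when counts are zero).
    total = sum(recorded.values())
    patrols = {hood: round(10 * (count + 1) / (total + 2)) for hood, count in recorded.items()}

# Despite identical true rates, A accumulates far more recorded incidents than B,
# so the model keeps sending nearly all patrols back to A.
print(recorded, patrols)
```

Nothing in the allocation rule looks biased on its face; the skew comes entirely from data that measures where police were looking rather than where incidents actually occurred, which is the sense in which the statistics are “not unbiased.”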

Manning called on developers to adopt a code of ethics — to take responsibility for the software programs they write to guard against potential abuse. She said in her own experience as a coder, the algorithms she wrote to predict consumer purchasing behavior, based on past history, informed her work in Iraq.

It’s also possible that engineers feed a system biased data, she said, pointing to predictive policing: if a police department is already over-policing a neighborhood due to racism and biases, the data it puts into the algorithm will already be tainted.

“I reused that knowledge to write software in Iraq,” Manning said. “It’s the same algorithmic basis … It was not for marketing anymore. It was to find people to kill.”
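Her “same algorithmic basis” point is structural: a generic routine that ranks locations by accumulated event history neither knows nor cares what the events represent. The snippet below is a hypothetical illustration of that kind of reuse, not a reconstruction of any software Manning actually wrote; the function, field names, and weights are invented.

```python
from collections import defaultdict
from typing import Iterable, List, Tuple

def rank_locations(events: Iterable[Tuple[str, float]]) -> List[str]:
    """Rank locations by summed event weights. The routine is domain-agnostic:
    it has no idea whether an event is a purchase or an incident report."""
    scores = defaultdict(float)
    for location, weight in events:
        scores[location] += weight
    return sorted(scores, key=scores.get, reverse=True)

# Marketing use: weight purchases to guess where future customers will be.
purchases = [("store_12", 1.0), ("store_07", 0.5), ("store_12", 1.0)]
print(rank_locations(purchases))    # ['store_12', 'store_07']

# The identical routine, fed incident reports instead of purchases, produces a target list.
reports = [("district_3", 1.0), ("district_9", 0.5), ("district_3", 1.0)]
print(rank_locations(reports))      # ['district_3', 'district_9']
```

Only the meaning attached to the inputs changes between the two calls, which is why Manning frames the problem as one of developer ethics rather than of any particular model.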

Manning expressed no regret for her decision, in 2010, to leak more than 70,000 files to WikiLeaks, including video taken during an American helicopter attack in Baghdad in which civilians were killed.

Manning said she had gone into Iraq with the mindset that algorithms could make things better. But during her service, she said, she noticed people would pick and choose the data they would and wouldn’t believe.

“I’m more focused on what’s happening in 2018 than I am with what happened in 2010,” Manning said.

Chelsea Manning has no regrets.

Speaking at the SXSW Conference Tuesday morning, the former U.S. Army intelligence analyst said she had no regrets about her “data dump” of hundreds of thousands of classified military documents to WikiLeaks in 2010.

“I made a decision to do something and I made that decision and I’m owning that decision. When it comes to something like that it’s not about second-guessing it or regretting it,” Manning told the audience in Austin, Tex.

Manning, who served seven years in prison for violating the Espionage Act before her sentence was commuted by President Obama in 2017, says she has a tendency to look forward instead of backward. She is doing just that now, having recently filed to run for the U.S. Senate in her home state of Maryland.

While she noted that her appearance at the Texas tech conference was “not a campaign appearance,” Manning spent a lot of time discussing the current political environment, which she described as “hostile.”

Being released from prison and being thrown headfirst into the new Trump administration was “everything that I feared,” she recalled. “The militarization of police, styles of policing, everything’s changed and become so much darker and so much scarier.” She does not, however, see the current administration as particularly unique: “The political rhetoric and style of governance we’ve been seeing is not an aberration.”

In addition to launching her political campaign, Manning is writing a book and working on a documentary about her life. She gained national attention for undergoing her gender transition—she identifies as a trans woman—while still in prison.
