
[liberationtech] New Yorker debuts Aaron Swartz's 'Strongbox.'

Eleanor Saitta ella at
Thu May 16 07:45:59 PDT 2013

On 2013.05.16 10.45, Fabio Pietrosanti (naif) wrote:
> On 5/16/13 12:05 AM, Eleanor Saitta wrote:
>> Which parts of the Dead Drop architecture do you think are
>> unnecessary for a leaking platform?
> First of all, "leaking" is not necessarily "whistleblowing" (it's
> like the cracking vs. hacking "wording debate" :P).

Well, in this case, the system was designed to receive leaked
documents, fairly specifically; I think that's probably a reasonable
term here.

> If I had to take action on DeadDrop I would simplify as
> follows: - Make everything work with only 1 server

Why do you think that less compartmentalization will result in a more
secure system, if that system is likely to be under active attack by
corporate and nation state security forces?

> A journalist (or a group of journalists) needs to work on received 
> material "online", not "offline", because they need to search 
> databases, browse Google, and apply investigative techniques to 
> investigate the topic. And they need to do it efficiently, because
> time is always a scarce resource.

There is a difference between "reading leaked documents" and doing
investigation.  It's perfectly reasonable to have another laptop right
next to the viewing workstation, where story notes go, searches are
run, less confidential background material is looked at, etc.

> Additionally they need, for efficiency purposes, to "collaborate" on
> the received material, and to do so there are excellent platforms
> for sharing it, or a DMS (document management system) like Alfresco,
> which can help with extracting text, applying semantic analysis, and
> collaborating on documents.

This depends on the kind of documents you're talking about, and the
kind of story.  If you've been given a dump of millions of documents
that need to be analyzed in the manner you're talking about, sure.
Not all leaks look like that; many don't.  In a case like this, it
might be a reasonable decision to, having looked at a document dump,
move it to a non-airgapped machine where it can be accessed in a
collaborative way.  However, one might well not want to bring over
potentially incriminating records of messages with a source into that
environment, and one might wish to ensure that unnecessary metadata
had been removed from documents first, again to protect sources.
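To make the metadata point concrete: a minimal sketch of that kind of
scrubbing, using .docx files as an example (this is a hypothetical
helper, not part of DeadDrop; real tools such as the Metadata
Anonymisation Toolkit handle many more formats and far more residue):

```python
import zipfile

def strip_docx_metadata(src_path, dst_path):
    """Copy a .docx (which is a zip of XML parts), omitting the
    docProps/ entries that carry author, company, and revision
    metadata.  Illustrative only -- embedded images, tracked
    changes, etc. would still need separate scrubbing."""
    with zipfile.ZipFile(src_path) as src, \
         zipfile.ZipFile(dst_path, "w", zipfile.ZIP_DEFLATED) as dst:
        for item in src.infolist():
            if item.filename.startswith("docProps/"):
                continue  # drop core.xml / app.xml metadata parts
            dst.writestr(item.filename, src.read(item.filename))
```

The point isn't this particular script; it's that scrubbing is a
deliberate step that has to happen before anything crosses the airgap.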

> So I really think it's unrealistic to handle dozens or hundreds of 
> submissions per month by copying received data offline, decrypting
> and analyzing it offline through a different workstation.

What do you base your assumptions of submission rate and workload on?

> IMHO in a realistic workflow, the journalist first quickly
> "evaluates" the data received, identifying whether it's spam or
> ham, defines how securely that data should be handled, and then
> applies the "appropriate operational security procedures" depending
> on the data received.

If you do this on a non-airgapped machine that's been compromised and
you figure out that what you've been handed is serious, it's a bit
late, no?  Operational security isn't magic sauce you can spread
around afterwards.

> - Too Many Servers: Looking at the architecture,
> we see that there are 4 servers, 1 switch, and several pieces of
> dedicated hardware for operational security (external encrypted
> hard drives), with a quite complex installation procedure.
> This increases the cost and effort required to start up a
> whistleblowing initiative in terms of hardware, software, services,
> and the skill set required.

...because this is what's needed, in this architecture.  You're
talking about analyzing hundreds of submissions a month
collaboratively and using large scale document analysis systems, and
you're worried about buying a few boxes and hiring a sysadmin?

> - Too Much Customized Software: Looking at the installation
> procedure, there are several customized procedures and pieces of
> software, such as a "Hardened GRSecurity" Linux kernel, which
> requires manually maintaining security updates for every kernel
> release, and a manual setup of a Certification Authority (with
> OpenSSL), requiring manual handling and management of certificates
> via the command line.

Well, if folks start shipping properly hardened distributions (and
there are some arguments for moving over to Tails, for this reason),
then this'd be a bit less work.  Again, just because it's hard doesn't
mean it's not necessary.
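For anyone unfamiliar with what that manual CA setup actually
involves, it's roughly this kind of OpenSSL workflow (a sketch with
made-up names; the exact commands in the DeadDrop install docs may
differ):

```shell
# Create a CA key and self-signed root certificate (valid 1 year)
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout ca.key -out ca.crt -days 365 \
  -subj "/CN=Example DeadDrop CA"

# Create a server key and a certificate signing request
openssl req -newkey rsa:2048 -nodes \
  -keyout server.key -out server.csr \
  -subj "/CN=deaddrop.example"

# Sign the CSR with the CA
openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key \
  -CAcreateserial -out server.crt -days 365

# Verify the resulting chain
openssl verify -CAfile ca.crt server.crt
```

Fiddly, yes, but it's a one-time setup cost, not an ongoing per-story
burden.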

> I just find it overkill for general use.

What's "general" use?


-- 
Ideas are my favorite toys.

