[liberationtech] The worrisome trend toward liability in networking technology

Josh jdsaxe at gmail.com
Mon Nov 7 14:56:11 PST 2011


Thanks Daniel, I think you make important points.

Just to corroborate what you say, the UNIX command

    tcpdump -A > log & tail -f log | grep "protest"

would monitor network traffic for mentions of the word "protest."  Give a
developer a couple of hours and they could write you a Python script that
does much more.
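A minimal sketch of what such a script might look like (the watchwords,
sample lines, and file handling are hypothetical; it assumes the ASCII
output of `tcpdump -A` has already been saved to a log file):

```python
# Minimal sketch: scan an ASCII packet dump (e.g. the output of
# `tcpdump -A > log`) for a list of watchwords, counting hits per word.
# The watchword list and sample lines below are hypothetical examples.
from collections import Counter

def scan_dump(lines, watchwords):
    """Return a Counter mapping each watchword to the number of
    dump lines in which it appears (case-insensitive)."""
    hits = Counter()
    lowered = [w.lower() for w in watchwords]
    for line in lines:
        line_lower = line.lower()
        for word in lowered:
            if word in line_lower:
                hits[word] += 1
    return hits

if __name__ == "__main__":
    sample = [
        "GET /news/protest-coverage HTTP/1.1",
        "Host: example.org",
        "User posted: join the PROTEST tomorrow",
    ]
    print(scan_dump(sample, ["protest"]))
```

In practice one would read the log file incrementally rather than from a
list, but the point stands: keyword surveillance over captured traffic is
an afternoon's work, not an exotic capability.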

I think in this discussion it's important to remember that governments
get their surveillance power from their control of the physical medium
of the Internet; the software applications being talked about here are
the icing on the cake.  Certainly software that parses application
protocols and reconstructs what the user's web browser looks like could
be helpful to an intelligence analyst working for the Syrian government,
but governments can also hire analysts to sift through the more
voluminous, noisy data produced by more primitive tools.

In either case the problem of educating activists to avoid surveillance
remains.  If they are going to ensure the security of their communications
to some reasonable degree of certainty, activists need to understand the
ways in which their IP traffic is exposed to interlopers, and they need to
know strategies for circumventing surveillance, as you say.

So that said, I think you're probably right about regulating filtering
software as a red herring.  I don't think such a tactic, for those of us in
countries like the U.S., will be particularly effective in actually
stopping oppressive regimes from spying on their citizens, even if it
succeeds in depriving these governments of specific monitoring software.

It could, however, be a useful tactic for getting the message out to the
public that such monitoring is going on.  But I would worry that it
confuses people: if their takeaway is that "Blue Coat" is what enables
the monitoring, and that stopping Blue Coat seriously hampers
surveillance, they have been handed an incorrect mental model of how
Internet surveillance works.

Josh

On Mon, Nov 7, 2011 at 4:18 PM, Daniel Colascione
<dan.colascione at gmail.com> wrote:

> I've read with interest the recent discussion on this list of the
> ramifications of the use of western filtering technology by oppressive
> regimes. While the problem warrants serious discussion, I feel that
> the direction of remedies is veering dangerously and I'd like to adopt
> a bit of a contrary position on the nature of an appropriate response:
>
> We're all outraged by the use by oppressive regimes of western
> filtering and network monitoring technology. But we should pause
> before acting on this outrage: the situation is more subtle than might
> be immediately apparent.  Oppressive regimes don't censor for a
> profit: the filtering and monitoring are existential and ideological
> imperatives. While these regimes (like all rational actors) seek to
> minimize costs, they would continue to censor even at greatly
> increased prices. Of course, there is some price at which these
> regimes would discontinue censorship; but, as the failure of
> conventional economic sanctions to discourage other behavior has
> shown, oppressive regimes are nowhere near being inconvenienced enough
> to change their behavior.
>
> If the efforts discussed here were successful and oppressive regimes
> were somehow cut off from western filtering technology, we wouldn't
> see an end to Internet censorship: instead, we would see regimes
> invest in domestically-developed solutions, which are actually less
> expensive than one might think: anyone can create professional-quality
> software for free, and hardware can be manufactured with a little
> capital investment. Iran has already made headway in this area. All
> the sanctions, codes of conduct, lawsuits, and so on we've been
> discussing here amount to an increase in the cost of censorship, and
> oppressive regimes simply aren't cost-bound at this point. Now, all
> things being equal, the world becomes a better place as censorship
> becomes more expensive. If all things were equal, I would support the
> ideas being raised on this list.
>
> But not all things are equal: the measures under discussion have
> serious externalities that warrant discussion, and they could affect
> the entire technology industry.
>
> 1) How do we decide what technology is subject to regulation? Network
> technology is quintessentially dual-use: anyone who has administered a
> network knows that the same features that allow us to block outbound
> attacks, accelerate the web via transparent caching, scan our servers
> for vulnerabilities, detect cross-site scripting attacks, and scan our
> email for spam can also be used for malicious ends. Cluster bombs and
> Saturday night specials have no legitimate purpose, but proxies,
> firewalls, and logging tools have many.
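To make the dual-use point concrete, a hypothetical sketch: the same
matching routine serves as a malware filter or as a censorship filter
depending entirely on the rule list the operator supplies (the rules and
URLs below are illustrative, not real deployments):

```python
# Hypothetical sketch: one generic filter routine, two very different
# uses.  Whether this is a security tool or a censorship tool depends
# solely on the blocklist the operator loads; the code cannot tell
# the difference.

def is_blocked(url, blocklist):
    """Return True if any blocklist pattern appears in the URL."""
    return any(pattern in url for pattern in blocklist)

# A network admin blocking a known malware domain:
security_rules = ["malware-update.example"]
# An operator blocking political content instead:
censorship_rules = ["news-site.example/protest"]

assert is_blocked("http://malware-update.example/payload", security_rules)
assert is_blocked("http://news-site.example/protest/live", censorship_rules)
```

Any regulation keyed to the software's features, rather than to how an
operator configures it, sweeps up both uses at once.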
>
> The only reasonable way to draw a line between filtering and general
> network technology is to use intent, which is a notoriously vague
> standard. If marketing censorship technology were to become attached
> to serious liability issues, vendors would develop software with
> precisely the same features and market it with "wink, wink, nudge,
> nudge" references to benign network management.
>
> 2) Where does responsibility end? Brett Solomon suggests that we
> attach liability to technology, but the concept isn't entirely clear.
>
> Consider Blue Coat: the exact mechanism by which Syria acquired Blue
> Coat's technology remains unclear. If Blue Coat knowingly sold this
> technology to Syria, it ought to be prosecuted under applicable
> laws. But if it turns out that Syria indirectly obtained Blue Coat's
> technology through the gray market or outright piracy, I feel it'd be
> difficult to attach legal or moral blame to the company.
>
> Let's suppose we do hold Blue Coat liable for Syria's use of its
> products because it provided automated software updates or other
> routine and free support: to avoid future liability, companies would
> be forced to consult blacklists and prevent certain IP address ranges
> from downloading updates, manuals, and other materials. (Or they could
> be required to go through a more robust process to verify the
> nationality of anyone requesting support.)
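As a rough illustration of how cheap, and how porous, such a vendor-side
gate would be (the ranges below are reserved documentation prefixes from
RFC 5737, standing in for a hypothetical embargo list):

```python
# Rough sketch of the vendor-side check described above: refuse update
# downloads from blacklisted IP ranges.  The ranges here are reserved
# documentation prefixes (RFC 5737), not a real embargo list.
import ipaddress

EMBARGOED = [ipaddress.ip_network(n) for n in ("192.0.2.0/24", "198.51.100.0/24")]

def updates_allowed(client_ip):
    """Return False if the client address falls in an embargoed range."""
    addr = ipaddress.ip_address(client_ip)
    return not any(addr in net for net in EMBARGOED)

# The check is trivially defeated: a request relayed through any proxy
# outside the listed ranges presents the proxy's address instead.
assert updates_allowed("203.0.113.7")      # relay outside the embargo list
assert not updates_allowed("192.0.2.25")   # direct request from a listed range
```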
>
> These restrictions would be wholly ineffective: any marginally
> competent network administrator could obtain support via third parties
> and proxy servers, yet every networking firm would need to bear the
> expense of implementing them.
>
> After the failure of these measures, what would be next? Would we
> compel networking vendors to include "kill switches" that could be
> used to disable network technology remotely if it's found in an
> unapproved context?  Should devices refuse to function without the
> continuous approval of the original manufacturer?
>
> Never mind the financial cost of these measures: instead, imagine the
> potential for misuse. What happens when bad actors inevitably
> commandeer these safeguards to cause mischief? What happens when
> governments force companies to cut off support for (and perhaps
> disable outright) network equipment in a rival nation?
>
> 3) What about open source software: do we subject it to the same
> regulations, effectively destroying the tradition of openness that
> fosters its development?  Or do we exempt open source software from
> regulation, allowing oppressive regimes to access filtering technology
> almost as easily as before, while penalizing companies that try to
> innovate with new networking technology they'd like to keep
> proprietary?  Should downloading Squid or Nessus warrant a background
> check? The result of such a scheme would be a retroactive,
> self-inflicted defeat in the crypto wars of the 1990s.
>
> When we ask for regulation of filtering technology, we're asking for
> the very dystopia we're all trying to avoid. Brett Solomon's proposal
> for "human rights by design" is in fact "oppression by
> design". Because software cannot judge the moral orientation of the
> bytes it processes, the only mechanism with which we can enforce
> human rights is authoritative third-party control, which can
> be used to do more evil than it can ever do good.
>
> Look: it makes sense to enforce simple, clear measures to disrupt
> censorship and surveillance: we already effectively prohibit direct
> transfer of technology to oppressive regimes, as anyone who has dealt
> with OFAC can attest. But the increasingly strenuous measures proposed
> here, e.g. kill switches and licensing, come with diminishing returns
> and serious costs to our budgets and liberties, and any "victory"
> would be Pyrrhic.
>
> Most participants in this discussion have analogized filtering
> software to weapons technology, but the better analogy is drug
> prohibition: both drugs and filtering technology can be produced
> without massive capital investment; both can be produced clandestinely
> (albeit at higher cost relative to commercial production); both are in
> high demand; both are easily moved on black and gray markets; both
> often have perfectly legitimate uses impossible to distinguish from
> illicit ones; both may very well have important adverse effects on
> society.  Most importantly, in both cases, attempts to crack down on
> the trade simply move it underground while simultaneously causing
> collateral damage to society as a whole.
>
> I may attract some flames for the sentiment, but I believe that
> outrage over the use of western filtering technology is a red
> herring. Our resources are better spent on 1) circumventing the
> filtering that will inevitably be put in place, and 2) activism that
> will reduce the demand for censorship and surveillance in the first
> place.
>
>
>
> _______________________________________________
> liberationtech mailing list
> liberationtech at lists.stanford.edu