
[liberationtech] The worrisome trend toward liability in networking technology

Jillian York jyork at cyber.law.harvard.edu
Wed Nov 9 09:37:20 PST 2011


Hi Collin,

I'm not sure I can speak for EFF just yet on this front...to be frank, this
is something we're still mulling over.  On a personal level, I would say
that regulations are probably--for better or worse--inevitable.  I say "for
better or worse" because I have grave concerns about their implementation.
On the one hand, something *like* EFF's "Know your customer" standards
could be implemented as a licensing scheme; on the other hand, we could end
up with GOFA, where countries are split into "good" and "bad" lists--and,
depending on how that's implemented, it could mean our allies (e.g.,
Bahrain, Saudi) are "good", despite their use of surveillance technology
and their torture of detainees.

I know that doesn't sound too optimistic, but I am very interested in
hearing from others as to how regulations might be enacted.

Best,
Jillian


On Tue, Nov 8, 2011 at 6:28 PM, Collin Anderson
<collin at averysmallbird.com> wrote:

> Jillian,
>
> A number of individuals, including myself, are interested in using the
> opening presented by Sen. Mark Kirk's public comments regarding Western
> involvement with censorship regimes to encourage specific legislation
> regarding the regulation of the export of surveillance equipment. Speaking
> both as an activist and as a representative of EFF, do you
> believe there is a future in legislation, or that industry
> self-regulation backed by standard sanctions is the best we can achieve?
>
>
> On Mon, Nov 7, 2011 at 6:03 PM, Jillian York <jyork at cyber.law.harvard.edu> wrote:
>
>> *It could, however, be a useful tactic for getting the message out to
>> the public that such monitoring is going on.  But, I would worry that it
>> confuses people - if their takeaway is that they think "Blue Coat" is what
>> enables the monitoring, and that stopping Blue Coat seriously hampers
>> surveillance, they have been handed an incorrect mental model of how
>> Internet surveillance works.*
>>
>> To tack on to that (because I agree, and because I forgot to say it in my
>> longer email): Right now, Syrian friends do see Blue Coat very much as an
>> American export.  In the sense that all other US "net freedom" policy is
>> moot when American companies are doing this, there's yet another reason for
>> such tactics.  We're not exactly "winning hearts and minds" with this stuff.
>>
>>
>> On Mon, Nov 7, 2011 at 2:56 PM, Josh <jdsaxe at gmail.com> wrote:
>>
>>> Thanks Daniel, I think you make important points.
>>>
>>> Just to corroborate what you say, the UNIX command
>>>
>>> *tcpdump -A > log & tail -f log | grep "protest"*
>>>
>>> would monitor network traffic for mentions of the word "protest."  Give
>>> a developer a couple of hours and they could write you a Python script
>>> that could do much more.
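
[Editor's note: a minimal sketch of the kind of script described above, assuming the ASCII payload log that `tcpdump -A` writes as input. The watch-list and `scan_lines` helper are hypothetical, purely for illustration:]

```python
import re

# Hypothetical watch-list; a real deployment would load this from a config.
WATCH_WORDS = {"protest", "demonstration"}

def scan_lines(lines, watch_words=WATCH_WORDS):
    """Return (line_number, word) pairs for every watch-word hit.

    `lines` is an iterable of decoded text lines, e.g. read from the
    log file that `tcpdump -A > log` produces.
    """
    hits = []
    for lineno, line in enumerate(lines, start=1):
        for word in watch_words:
            # Case-insensitive whole-word match, like `grep -iw`.
            if re.search(r"\b" + re.escape(word) + r"\b", line, re.IGNORECASE):
                hits.append((lineno, word))
    return hits

# Example: scanning a few captured payload lines for mentions.
sample = [
    "GET /news HTTP/1.1",
    "...planning a PROTEST downtown...",
    "nothing to see here",
]
print(scan_lines(sample))  # → [(2, 'protest')]
```

The point stands: this is an afternoon's work with no special tooling, which is why depriving a government of any one commercial product does little.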
>>>
>>> I think in this discussion perhaps it's important to remember that
>>> governments get their surveillance power from their control of the physical
>>> medium of the Internet; the software applications being talked about here
>>> are the icing on the cake.  Certainly software that parses application
>>> protocols and reconstructs what the user's web browser looks like could be
>>> helpful for an intelligence analyst working for the Syrian government, but
>>> governments can also hire analysts to sift through more voluminous, noisy
>>> data from more primitive tools.
>>>
>>> In either case the problem of educating activists to avoid surveillance
>>> remains.  If they are going to ensure the security of their communications
>>> to some reasonable degree of certainty, activists need to understand the
>>> ways in which their IP traffic are exposed to interlopers, and they need to
>>> know strategies for circumventing surveillance, as you say.
>>>
>>> So that said, I think you're probably right about regulating filtering
>>> software as a red herring.  I don't think such a tactic, for those of us in
>>> countries like the U.S., will be particularly effective in actually
>>> stopping oppressive regimes from spying on their citizens, even if it
>>> succeeds in depriving these governments of specific monitoring software.
>>>
>>> It could, however, be a useful tactic for getting the message out to the
>>> public that such monitoring is going on.  But, I would worry that it
>>> confuses people - if their takeaway is that they think "Blue Coat" is what
>>> enables the monitoring, and that stopping Blue Coat seriously hampers
>>> surveillance, they have been handed an incorrect mental model of how
>>> Internet surveillance works.
>>>
>>> Josh
>>>
>>> On Mon, Nov 7, 2011 at 4:18 PM, Daniel Colascione <
>>> dan.colascione at gmail.com> wrote:
>>>
>>>> I've read with interest the recent discussion on this list of the
>>>> ramifications of the use of western filtering technology by oppressive
>>>> regimes. While the problem warrants serious discussion, I feel that
>>>> the direction of remedies is veering dangerously and I'd like to adopt
>>>> a bit of a contrary position on the nature of an appropriate response:
>>>>
>>>> We're all outraged by the use by oppressive regimes of western
>>>> filtering and network monitoring technology. But we should pause
>>>> before acting on this outrage: the situation is more subtle than might
>>>> be immediately apparent.  Oppressive regimes don't censor for a
>>>> profit: the filtering and monitoring are existential and ideological
>>>> imperatives. While these regimes (like all rational actors) seek to
>>>> minimize costs, they would continue to censor even at greatly
>>>> increased prices. Of course, there is some price at which these
>>>> regimes would discontinue censorship: but, as the failure of
>>>> conventional economic sanctions to discourage other behavior has
>>>> shown, oppressive regimes are nowhere near being inconvenienced enough
>>>> to change their behavior.
>>>>
>>>> If the efforts discussed here were successful and oppressive regimes
>>>> were somehow cut off from western filtering technology, we wouldn't
>>>> see an end to Internet censorship: Instead, we would see regimes
>>>> invest in domestically-developed solutions, which are actually less
>>>> expensive than one might think: anyone can create professional-quality
>>>> software for free, and hardware can be manufactured with a little
>>>> capital investment. Iran has already made headway in this area. All
>>>> the sanctions, codes of conduct, lawsuits, and so on we've been
>>>> discussing here amount to an increase in the cost of censorship, and
>>>> oppressive regimes simply aren't cost-bound at this point. Now, all
>>>> things being equal, the world becomes a better place as censorship
>>>> becomes more expensive. If all things were equal, I would support the
>>>> ideas being raised on this list.
>>>>
>>>> But not all things are equal: the measures under discussion have
>>>> serious externalities that warrant discussion, and they could affect
>>>> the entire technology industry.
>>>>
>>>> 1) How do we decide what technology is subject to regulation? Network
>>>> technology is quintessentially dual-use: anyone who has administered a
>>>> network knows that the same features that allow us to block outbound
>>>> attacks, accelerate the web via transparent caching, scan our servers
>>>> for vulnerabilities, detect cross-site scripting attacks, and scan our
>>>> email for spam can also be used for malicious ends. Cluster bombs and
>>>> Saturday night specials have no legitimate purpose, but proxies,
>>>> firewalls, and logging tools have many.
>>>>
>>>> The only reasonable way to draw a line between filtering and general
>>>> network technology is to use intent, which is a notoriously vague
>>>> standard. If marketing censorship technology were to carry serious
>>>> liability, vendors would develop software with precisely the
>>>> same features and market it with "wink, wink, nudge,
>>>> nudge" references to benign network management.
>>>>
>>>> 2) Where does responsibility end? Brett Solomon suggests that we
>>>> attach liability to technology, but the concept isn't entirely clear.
>>>>
>>>> Consider Blue Coat: the exact mechanism by which Syria acquired Blue
>>>> Coat's technology remains unclear. If Blue Coat knowingly sold this
>>>> technology to Syria, it ought to be prosecuted under applicable
>>>> laws. But if it turns out that Syria indirectly obtained Blue Coat's
>>>> technology through the gray market or outright piracy, I feel it'd be
>>>> difficult to attach legal or moral blame to the company.
>>>>
>>>> Let's suppose we do hold Blue Coat liable for Syria's use of its
>>>> products because it provided automated software updates or other
>>>> routine and free support: to avoid future liability, companies would
>>>> be forced to consult blacklists and prevent certain IP address ranges
>>>> from downloading updates, manuals, and other materials. (Or they could
>>>> be required to go through a more robust process to verify the
>>>> nationality of anyone requesting support.)
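
[Editor's note: to make the compliance burden concrete, the check every vendor's download server would have to bolt on might look roughly like this. A sketch using Python's standard `ipaddress` module; the blocked ranges are placeholder documentation addresses, and `update_allowed` is a hypothetical name:]

```python
import ipaddress

# Hypothetical embargoed ranges. Real export-control lists are far larger
# and change constantly -- maintaining them is part of the cost at issue.
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # documentation range, stand-in
    ipaddress.ip_network("198.51.100.0/24"),  # documentation range, stand-in
]

def update_allowed(client_ip: str) -> bool:
    """Return False if the requesting client falls in an embargoed range."""
    addr = ipaddress.ip_address(client_ip)
    return not any(addr in net for net in BLOCKED_RANGES)

print(update_allowed("203.0.113.42"))  # → False (embargoed range)
print(update_allowed("192.0.2.7"))     # → True
```

As the next paragraph notes, routing the request through a proxy outside the listed ranges defeats this entirely, yet every vendor would still have to build and maintain it.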
>>>>
>>>> These restrictions would be wholly ineffective: any marginally
>>>> competent network administrator could obtain support via third parties
>>>> and proxy servers. Yet every networking firm would need to bear the
>>>> expense of implementing them.
>>>>
>>>> After the failure of these measures, what would be next? Would we
>>>> compel networking vendors to include "kill switches" that could be
>>>> used to remotely disable network technology found in an
>>>> unapproved context?  Should devices refuse to function without the
>>>> continuous approval of the original manufacturer?
>>>>
>>>> Never mind the financial cost of these measures: instead, imagine the
>>>> potential for misuse. What happens when bad actors inevitably
>>>> commandeer these safeguards to cause mischief? What happens when
>>>> governments force companies to cut off support for (and perhaps
>>>> disable outright) network equipment in a rival nation?
>>>>
>>>> 3) What about open source software: do we subject it to the same
>>>> regulations, effectively destroying the tradition of openness that
>>>> fosters its development?  Or do we exempt open source software from
>>>> regulation, allowing oppressive regimes to access filtering technology
>>>> almost as easily as before, but penalizing companies that try to
>>>> innovate with new networking technology they'd like to keep
>>>> proprietary? Should downloading Squid or Nessus warrant a background
>>>> check? The result of such a scheme would be a retroactive,
>>>> self-inflicted defeat in the crypto wars of the 1990s.
>>>>
>>>> When we ask for regulation of filtering technology, we're asking for
>>>> the very dystopia we're all trying to avoid. Brett Solomon's proposal
>>>> for "human rights by design" is in fact "oppression by
>>>> design". Because software cannot judge the moral orientation of the
>>>> bytes it processes, the only mechanism with which we can enforce
>>>> human rights is authoritative third-party control, which can be
>>>> used to do more evil than it can ever do good.
>>>>
>>>> Look: it makes sense to enforce simple, clear measures to disrupt
>>>> censorship and surveillance: we already effectively prohibit direct
>>>> transfer of technology to oppressive regimes, as anyone who has dealt
>>>> with OFAC can attest. But the increasingly strenuous measures proposed
>>>> here, e.g. kill switches and licensing, come with diminishing returns
>>>> and serious costs to our budgets and liberties, and any "victory"
>>>> would be Pyrrhic.
>>>>
>>>> Most participants in this discussion have analogized filtering
>>>> software to weapons technology, but the better analogy is drug
>>>> prohibition: both drugs and filtering technology can be produced
>>>> without massive capital investment; both can be produced clandestinely
>>>> (albeit at higher cost relative to commercial production); both are in
>>>> high demand; both are easily moved on black and gray markets; both
>>>> often have perfectly legitimate uses impossible to distinguish from
>>>> illicit ones; both may very well have important adverse effects on
>>>> society.  Most importantly, in both cases, attempts to crack down on
>>>> the trade simply move it underground while simultaneously causing
>>>> collateral damage to society as a whole.
>>>>
>>>> I may attract some flames for the sentiment, but I believe that
>>>> outrage over the use of western filtering technology is a red
>>>> herring. Our resources are better spent on 1) circumventing the
>>>> filtering that will inevitably be put in place, and 2) activism that
>>>> will reduce the demand for censorship and surveillance in the first
>>>> place.
>>>>
>>>>
>>>>
>>>> _______________________________________________
>>>> liberationtech mailing list
>>>> liberationtech at lists.stanford.edu
>>>>
>>>> Should you need to change your subscription options, please go to:
>>>>
>>>> https://mailman.stanford.edu/mailman/listinfo/liberationtech
>>>>
>>>> If you would like to receive a daily digest, click "yes" (once you
>>>> click above) next to "would you like to receive list mail batched in a
>>>> daily digest?"
>>>>
>>>> You will need the user name and password you receive from the list
>>>> moderator in monthly reminders.
>>>>
>>>> Should you need immediate assistance, please contact the list moderator.
>>>>
>>>> Please don't forget to follow us on
>>>> http://twitter.com/#!/Liberationtech
>>>>
>>>
>>>
>>>
>>
>>
>>
>> --
>> jilliancyork.com | @jilliancyork | tel: +1-857-891-4244
>>
>>
>>
>>
>>
>
>
>
> --
> *Collin David Anderson*
> averysmallbird.com | @cda | Washington, D.C.
>
>


-- 
jilliancyork.com | @jilliancyork | tel: +1-857-891-4244