[liberationtech] decloudification: reverting to typewriters?

Shava Nerad shava23 at gmail.com
Wed Jul 17 15:44:18 PDT 2013


Surely Russians have the equivalent of Tempest?

https://en.wikipedia.org/wiki/Tempest_(codename)

Funny, I haven't heard about emanation hardening standards for ages.

Frankly, I think you could rewrite that article regarding absolutely any
bureaucracy in the world, taking out all references to Snowden and the
current crisis.  They all use three part forms.  The paperless office is a
sham.  They all use faxes and huge amounts of letters and paper.  This is
not new.

And "Neuromancer" -- the novel in which William Gibson popularized the term
"cyberspace" -- was written on a manual typewriter in 1983.
http://en.wikipedia.org/wiki/William_Gibson.  There are still writers who
write on them -- or on legal pads.  I am not one of them.
http://site.xavier.edu/polt/typewriters/tributes.html

A friend of mine works with USAID to take very old old old computers with
Ubuntu (appropriately enough) systems on them to African clinics to
automate medical records -- not necessarily because computerized records are
all that much easier to maintain in many of these environments (electricity
is solar, parts can be hard to come by), but because, actually, it's
sometimes just as hard to maintain paper records in small clinics, and at
least the computer records can be searched easily.  They are less subject
to rot, although they still have environmental vulnerabilities in some
locales.  In full clinics, it's a real win.

We tend to forget that vast swaths of the world and many institutions are
simply not wired, psychologically or physically.

But yes, I've been ranting about cloud services all along, tin-hat that I
am...talking about these arcane third party doctrine things, all that.  I'm
such a freaking social engineering geek.  If you count, you know, politics,
legal, government affairs, tech & society, digital divide/inclusion, civic
media, whatever..;)

To be a bit broad and shorthand...

Between keyloggers, emanations, moles, agents provocateurs, whatever --
really, what we have is an age of casual reliance on tools and "social
media friends" that is reverting to traditional organizing and measures of
trust.  And very few of us have those sensitivities and skill sets well
developed any more.  It's not that we have to rely on typewriters.  We have
to rely on discretion and craft.

We have to remember what good boundaries are, and learn a little more
diplomacy.  How do you teach it on short notice?  Fascinating prospect.

That's a more subtle training protocol, teaching protocol -- the human
kind, not the network kind.  It's a discipline we used to be raised with, a
sort of manners.  A polite paranoia, that is particularly foreign to modern
American culture with its emphasis on "authenticity." Our culture can have
a reputation as extremely brash and flatland in terms of simultaneous
multi-layered trust relationships, particularly the further north or west
you go.  (Obviously grossly generalized -- many, even huge, exceptions for
situations such as northern New England vs. southern New England,
ethnic/transplant overlay cultures in Hawai'i, Florida, etc.)

We haven't had to do that for a generation or so.  Especially the west
coast culture here.  Brash nearly doesn't cover it.  Modern Silicon Valley
is the apex.  It moves at the speed of venture funding specifically because
it has grown to require no trust relationships at all from any parties
involved, no real reputation (fail early and often; fund 40 ideas on the
chance that one will hit big), and the culmination of the game-theory rush of
the financial anarchists who develop ideas without much interest in social
impact in terms of product or workforce or privacy.  Some of the people who
run SV know about duplicity, but the people who come into the dev teams and
certainly the people who feed their products (often *as* users-as-product)
are conditioned to be very uncritical regarding trust.  Otherwise, the
system would be...criticized.

To view oneself as possibly having to shake someone's hand and smile and
yet view them as a possible adversary is anathema to a lot of young people
I meet.  It's not a game or an aspect of business/government/advocacy
politics they want to touch.  It's actually a taboo thing that makes them
view me with distaste that I should find it an acceptable idiom of
interaction in my work.  I am delaying the transformation of the world into
the happier place they want it to be, I'm told.

As a side note, I had a person on G+ tell me that I am missing the Great
Awakening (which seems to be some great bhakti-style new-age-y enlightenment
movement online) and he pities me. <blink>  This Great Awakening means that
he doesn't have to get involved in activism of any sort that involves going
out of his door to a meeting because it doesn't matter that laws and courts
and all are not online.  Soon everything will be online (transhuman,
electronically prayerful, who knows...).  None of that will matter.  Only
online relationships will matter and the unpleasant real world will not
harsh their mellow, I suppose.  This seems to have NO relationship to the
charismatic Christian movement's Great Awakenings in American history.
It's very earthy-crunchy crystal new-age stuff.  And seriously into
conspiracy theories, and anything on the "outside" that can smack their
amygdala about the bad real world.

So the outcome of being uncritical regarding trust seems to be morphing
into religion?  Or just tying into what Buddhists might call
"idiot compassion" (pardon my judgmental outlook) online.  Online ==
heaven.  RL == hell.

Really, the Bronies are doing real good work in the world.  Friendship *is*
magic.  These people strike me as pod people.

But, back to the point, in a lesser aspect, there are a lot of young
professionals who are far more used to relying on crypto rather than
couching language to protect their communications and contexts.  We used to
do otherwise, by understanding that our communications were vulnerable and
communicating in public only that which was open to scrutiny, and reserving
our secrets to contexts we were reasonably assured were confidential or
harmless.

We never have had control of our data, really, from the moment it was open
to the net or could be dialed into.  Or before that, if people could walk
in and make copies.  Or before that, if government or others were willing
to pay someone to report on our activities.

What do you want to be confidential?  Who needs to know?  Where do you
extend trust?  And why is it required?  What are the consequences of that
failing?  And how do you detect failure?  How do you recover?  And how do
you retaliate, or reveal the failure, or take whatever measures, if you do?
What are the consequences of that?
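For the programmers on the list, those questions can even be written down as a checklist you fill out per secret.  This is a purely illustrative sketch -- none of these names come from any real tool, and the "exposure" heuristic is just one crude way to surface avoidable trust:

```python
# Illustrative sketch only: the trust questions above as a small data
# structure, one entry per piece of confidential information.
from dataclasses import dataclass

@dataclass
class TrustDecision:
    secret: str                   # what do you want to be confidential?
    who_needs_to_know: list       # who actually needs to know?
    trusted_parties: list         # where do you extend trust, and why?
    failure_consequence: str      # what happens if that trust fails?
    detection: str                # how do you detect failure?
    recovery: str                 # how do you recover?

    def exposure(self):
        # Crude proxy: every trusted party who doesn't strictly need
        # to know is avoidable risk worth questioning.
        return [p for p in self.trusted_parties
                if p not in self.who_needs_to_know]

plan = TrustDecision(
    secret="source's identity",
    who_needs_to_know=["editor"],
    trusted_parties=["editor", "cloud email provider"],
    failure_consequence="source exposed",
    detection="periodic check-in / canary",
    recovery="pre-agreed fallback channel",
)
print(plan.exposure())  # ['cloud email provider']
```

The point isn't the code; it's that writing the answers down, per secret, forces the human decisions the paragraph above is asking for.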

These are risk management measures that predate computing.  We haven't
asked a lot of them for a long time.  They are not about paper or
computing.  They are about people.  Who has access to information and
decisions.  Who is walking out the gate with a Lady Gaga CD or a personal
device or briefcase -- clearance or not?  Who is walking into this meeting,
and who does their brother work for?  We are caught between a world where
we want to trust people who seem to be "with us" and a world where we could
spend all of our time asking too many questions.

The answer to the world (says the woman who worked with Tor) is to minimize
risk, not think you have eliminated it.  Not to work in a room with a
typewriter, but to manage risks efficiently, develop judgment based on
human decisions, not just tools, and develop discretion and discipline and
mindfulness *and plans that are agreed upon* -- and move forward.
Liberation technology where people are the focus, and the tools are only
force multipliers.

Technology is only ever an extension of the capacity of human culture.  We
forget human culture as the driver at our peril.  For the people.  Culture,
history, diplomacy, politics, military science, psychology in their name
for their benefit.  Then the toolkits.  Isn't that why we are here?

Ultimately, though, I suspect the Guardian reporter who filed the story was
having fun with a Russia story -- and that the Russian story was sniping at
the government, or just getting attention with some pretty banal
bureaucratic facts.

But why not take it as a teachable moment?  That's why it made a good
story.

Everything old is new again...  Evgeny is probably smiling.

yrs,
SN

On Wed, Jul 17, 2013 at 6:52 AM, A.Cammozzo <a.cammozzo at gmail.com> wrote:

> According to newspapers [1], one of the outcomes of the NSA leaks is to
> push Russian secret services to use typewriters.
>
> Sounds a bit like a joke... how serious is this?
>
> However it's very likely that some form of de-cloudification is going to
> happen.
> What steps will government and corporate IT take to get back full (or at
> least reasonable) control of their data?
> Using paper as an extreme form of data protection may perhaps be viable
> for some secret service, but surely not as a general case!
>
>
> Alberto
>
>
>
> [1]
> <
> http://www.guardian.co.uk/world/2013/jul/11/russia-reverts-paper-nsa-leaks
> >
>
>     Russian article: <http://izvestia.ru/news/553314>
>
> -
> A.Cammozzo
> http://tagMeNot.info
> http://cammozzo.com/en
>
> --
> Too many emails? Unsubscribe, change to digest, or change password by
> emailing moderator at companys at stanford.edu or changing your settings at
> https://mailman.stanford.edu/mailman/listinfo/liberationtech
>



-- 

Shava Nerad
shava23 at gmail.com

