[liberationtech] Fwd: Now Anyone Can Create Their Own Personalized Alexa Skill in Just Minutes

Shava Nerad shava23 at gmail.com
Thu Apr 26 20:34:23 PDT 2018


While I agree there is no way to use an Alexa device -- or any app or
device with an always-listening "phone home" microphone -- without privacy
concerns (check your apps for permissions, folks!  You may be
amazed...), *privacy is a slider*.  You meter out your privacy for
services, just as you meter
out your labor in exchange for wages.  The difference is, your privacy
includes information that can be replicated, so you have to decide how much
you understand and trust the receptacle, and how much you value the
services, and balance the risk/reward -- not only for yourself, but for
society.

While some people find the risk for society to be unacceptably high, others
find that (as with file sharing) the risk to society of sharing information
prolifically and diluting its artificial value created by scarcity is
acceptable to them.  So the risk to artistic production, publishing
models, etc., is ignored by the file-sharing community on the one hand,
and the risk to PII/privacy is ignored by big data on the other, for the
sake of having the cookie NAOW.

In file sharing, the artificial value of scarcity was created by IP laws
and licensing.  In privacy, by PII protections, personal choices, and opsec.

In current days, the prevailing choice is to make PII cheap, by brokering
it prolifically for services, reserving very little, and engaging in very
little consumer education (self- or industry/govt/public school/etc).  This
means the effective value of the market of PII is cheapened, so long as the
market is uneducated as to the value that increasing scarcity would add to
their PII, and disorganized in exercising their power as a market to demand
more value for their assets.

It's nearly the opposite of the prisoner's dilemma -- so long as most
people will give up their genetic code for a coupon for a cheeseburger
(regardless of what they overtly express about privacy concerns, their
actions rarely match), reserving your individual information as .0001% of
the market will remain insignificant and unthreatening in the scenario.
You could give yourself a stroke stressing about privacy, sure, but unless
you can educate that other 99.9999% to understand the true potential value
of their PII beyond ad-supported email clients and cat-meme sharing
platforms?  You're SOL.  You will never see the last one out.  You will die
waiting.

Recently, since I live with permanent disability that leaves me flat on my
butt in bed sometimes for weeks, half dead with pain, fatigue and profound
malaise (it is what it is), I got an Echo Dot.  I'm a privacy advocate.
But I believe privacy is a slider.

I live alone and my life at this point is profoundly consistent and routine
and boring, day to day, if you consider slogging through involuntary
retirement in public subsidized senior housing with medical supports in
Cambridge MA boring, though I do try to keep blogging, researching, and I'm
slowly slogging through a book.  I don't talk to myself, and I rarely even
talk on the phone, preferring text communications.  Alexa (Amazon) knows I
love to sleep to Mozart, Yo-Yo Ma, or SomaFM's Suburbs of Goa, and to listen
to the Grateful Dead/jam bands, free jazz, minimalist compositions,
Kraftwerk, Nina Hagen, and Shostakovich cello concertos while I game, to do
housework to the Stones, and chill to Celtic harp -- and I am ok with
that.  "She" also knows I shop for bananas, masa, eggs, coconut milk, and
various odd things that end up on my shopping list which I call out as I'm
cooking. She knows when I take my meds, and so if I fall asleep, as I often
do, she is more reliable than the staff on the floor at waking me.  I do
suppose I talk to myself, in that I tell "her" "Please" and "Thank you,"
for my own soul's ease, not for "hers."  Guess what?  Now all of you know
all of this, too.  I am unconcerned about my privacy, even though this is a
publicly archived list.

When rms comes to visit, I unplug the power to the device before he
arrives, out of courtesy, lol.  (He still sniffs at me for having it.)

But as a person who sometimes can't get up to change the radio station,
having a voice-command widget with such flexibility -- as well as an "I've
fallen and I can't get up" device that doesn't charge me extra -- is a
bonus worth the privacy hit, which I find minimal.  I still feel shy about
it because of contributing to the cheapening of PII.

In my case, I believe I've made a thoughtful exchange.

yrs,


Shava Nerad
shava23 at gmail.com
https://patreon.com/shava23

On Thu, Apr 19, 2018 at 3:00 PM, Thomas Delrue <thomas at epistulae.net> wrote:

> (Dropping mailinglists other than LibTech...)
>
> On 04/19/2018 09:22 AM, Phil Shapiro wrote:
> > I do not own an Alexa device and am wary of privacy issues in
> > general.
>
> If /you're/ wary of privacy issues, then why encourage others to use it?
>
> > At the same time, I think there are ways of using this device that do
> > not raise privacy concerns.
>
> I think you're wrong; I don't think there is a way to use this device in
> a way that does not raise privacy concerns, at all. The same is true for
> Google Home.
> Just like malware tries to establish persistence on your machines, these
> devices exist to establish persistence for their true owners - which
> ain't you. The parallels with malware go further than that, but I'll
> leave it there...
>
> If you really must do something like this, consider Mycroft
> (https://mycroft.ai/; https://en.wikipedia.org/wiki/Mycroft_(software) )
> enclosed as a picroft (https://mycroft.ai/documentation/picroft/); it's
> not ideal, it still reaches out to someone else's servers, but at least
> it's open source, it's a start... and you can modify it to prevent it
> from doing that.
>
> There's a repository of skills, written in Python, over here:
> https://github.com/MycroftAI/mycroft-skills
>
> --
> Liberationtech is public & archives are searchable on Google. Violations
> of list guidelines will get you moderated: https://mailman.stanford.edu/
> mailman/listinfo/liberationtech. Unsubscribe, change to digest, or change
> password by emailing the moderator at zakwhitt at stanford.edu.
>