[liberationtech] DHS Social Media Scrapes

Fran Parker lilbambi at gmail.com
Thu Feb 9 12:48:21 PST 2012


Katitza,

Thank you so much. Excellent article. I totally missed it somehow!

Fran

On 2/9/12 3:33 PM, Katitza Rodriguez wrote:
>
> I wrote about it in August 2011.
>
> Mexican Newspaper Uncovers Systemic Monitoring Plans of Public Online
> Sources
> https://www.eff.org/deeplinks/2011/08/mexican-newspaper-uncovers-systemic-monitoring
>
>
>
> On Feb 9, 2012, at 6:07 PM, Carrie Budge <budge.caroline at gmail.com> wrote:
>>> Great, thank you for the help Andrew.
>>>
>>> I've learnt since writing earlier that the FBI has just put out an RFP
>>> for developers to create an app so that they can scrape social
>>> networks too. Delightful.
>>>
>>>
>>>
>>> //////////////////////
>>> carrie budge
>>>
>>> On 9 Feb 2012, at 17:05, Andrew Lewis<andrew at pdqvpn.com> wrote:
>>>
>>>> Keyword searching in most cases.
>>>>
>>>> Depending on the business unit, I'd be surprised if it was anything
>>>> more than a horrendously written app or a commercial product with an
>>>> obscene markup.
>>>>
>>>>
>>>>
>>>> Sent from my iPad
>>>>
>>>> On Feb 9, 2012, at 12:34 PM, caroline
>>>> budge<budge.caroline at gmail.com> wrote:
>>>>
>>>>> Hello Lib Tech community!
>>>>>
>>>>> I was referred to the community by a professor in the CAST
>>>>> department at Goldsmiths University in London. I am currently on an
>>>>> MA Digital Journalism course there, where I am investigating the
>>>>> recent incident in which two British tourists were turned away at LAX
>>>>> because the DHS had detected a tweet saying they were going to
>>>>> "destroy America" -- which is British slang for "party in America".
>>>>>
>>>>> The DHS has contracted a security company called General Dynamics
>>>>> to carry out a majority of their work scraping the internet.
>>>>>
>>>>> Does anyone in the community have a good idea of what sort of
>>>>> methods they are using to do so? I have a very basic understanding
>>>>> of programming (I've been scraping Twitter using Python recently),
>>>>> but I was wondering if the community had a better idea of how they
>>>>> are going about performing these scrapes.
>>>>>
>>>>> I hope that this is clear and that someone can help me write about
>>>>> the methods they are using to collate user data in real time.
>>>>>
>>>>> Thank you,
>>>>>
>>>>> Carrie
>>>>>
>>>>>
>>>>> _______________________________________________
>>>>> liberationtech mailing list
>>>>> liberationtech at lists.stanford.edu
>>>>>
>>>>> Should you need to change your subscription options, please go to:
>>>>>
>>>>> https://mailman.stanford.edu/mailman/listinfo/liberationtech
>>>>>
>>>>> If you would like to receive a daily digest, click "yes" (once you
>>>>> click above) next to "would you like to receive list mail batched
>>>>> in a daily digest?"
>>>>>
>>>>> You will need the user name and password you receive from the list
>>>>> moderator in monthly reminders.
>>>>>
>>>>> Should you need immediate assistance, please contact the list
>>>>> moderator.
>>>>>
>>>>> Please don't forget to follow us on
>>>>> http://twitter.com/#!/Liberationtech
>
>
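For readers wondering what the "keyword searching" Andrew describes might look like in practice, here is a minimal sketch in Python. The watchlist, post format, and matching logic are invented for illustration -- the actual DHS/General Dynamics keyword lists and tooling are not public -- but a naive filter like this shows exactly the failure mode behind the LAX incident: it cannot distinguish slang from a literal threat.

```python
import re

# Hypothetical watchlist -- purely illustrative, not the real DHS terms.
WATCHLIST = {"destroy", "bomb", "attack"}

def flag_posts(posts, keywords=WATCHLIST):
    """Return the posts whose text contains any watchlist keyword.

    Matching is case-insensitive and on whole words only. Note that
    this kind of bare keyword match has no notion of context, so
    "destroy America" (British slang for partying) and "this sourdough
    is the bomb" both get flagged alongside genuine threats.
    """
    pattern = re.compile(
        r"\b(" + "|".join(map(re.escape, keywords)) + r")\b",
        re.IGNORECASE,
    )
    return [p for p in posts if pattern.search(p["text"])]

sample = [
    {"user": "tourist", "text": "Free this week before I go and destroy America"},
    {"user": "baker", "text": "This sourdough is the bomb"},
    {"user": "quiet", "text": "Nice weather in London today"},
]

for hit in flag_posts(sample):
    print(hit["user"], "->", hit["text"])
```

Running this flags both the tourist and the baker while passing the harmless post, which is the core weakness of keyword-only monitoring at scale: every match still needs a human to judge intent.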


