In my last post I took a look at what permissions granularity is and how it might impact user behavior. The short version of that post's conclusion: if permissions granularity is not transparent – easy to understand and easy to use – most people will fall back on whatever the site defaults are. Of course, the incentive to use restrictions in the first place depends on understanding that 1) the stuff you are putting out is searchable and accessible to the general public, and 2) there are people in that category you don't necessarily want to see your stuff.
I remember an audience member at a conference I attended last year who was outraged that a potential employer might Google her and then base a judgment about her on her personal activity. And I've seen school kids squirm in horror as their Bebo and YouTube pages were looked at by teachers and parents. It's increasingly common for recruiters, universities, and other authoritative gatekeepers to use public social network information to fill in candidates' 'other interests': goodbye fervent interest in hang gliding and Byzantine pottery; hello getting drunk and pinching road cones.
It also seems fair to say that a large number of people rely on fairly flimsy strategies to avoid managing their data (or having to work out any permissions granularity). These include counting on the fact that their name is a fairly common one, simply playing the odds in the face of the sheer amount of information everyone else is putting out, and imagining that their social networking service is one that no one they feel uncomfortable with would possibly use.
Way back in 2002, Katz and Rice described the internet as a panopticon. Those of you who've flirted with Foucault or are interested in architecture will remember that the key characteristic of Bentham's prison design is that people keep themselves in line, because the possibility of being observed is always present. The panopticon encouraged self-policing since inmates were aware they could be seen (and subsequently punished) at all times. While Web 2.0 community sites have no realistic alternative to encouraging self-regulation through a participatory panopticism, the internet has not turned out to be a hotbed of self-denial and careful self-regulation. One of the conclusions of Pew's Digital Footprints report in December 2007 was that "Most internet users are not concerned about the amount of information available about them online, and most do not take steps to limit that information".
Partly this can be attributed to the charmed circle people believe themselves to be positioned in – the imaginary frameworks of space and place that allow for the fun interchange of information, the subjective psychogeographic environment alluded to in my title.
There’s a gap in perception between what many users believe to be
the context and audience that they are writing for – a closed group of
friends – and the numbers of people actually able to view their
information. Many users are unaware that the information they have
posted may be publicly available, and able to be searched for and read
by a much wider audience than their group of friends. Acquisti and
Gross (2006) characterise social networking services as "imagined communities" in recognition of the gap between users' perceptions of a private, closed network and the reality of who can access their information.
Additionally, there is the issue of time. Embarrassing or inappropriate stuff may
still be around in a few years’ time. We don’t know the full
consequences yet of a generation which has grown up online, or the
future implications of new types of search – for example social search,
which aggregates information from across a range of social networking
sites by your name or email address, or of the development of facial
recognition search software.
I’ve been working quite a bit around e-safety and digital literacy, so my thinking in this area is largely around presence issues – not just how we keep ourselves safe online but also how our online activity represents us to the rest of the online world. It’s becoming increasingly easy to track people’s unprotected conversations, and the rise of social search pretty much demolishes any illusory protection that acting within a silo might offer. The current tidal wave of lifestream apps further puts paid to the notion of the public internet being a series of discrete islands.
I agree wholeheartedly with the argument that any good
service should ensure members can get all of their data out both easily
and meaningfully (i.e. in some useful format that can be recognised and
repurposed by other tools and services). However, we also need to recognise that a lot of people who use the web don’t care about data portability. In fact, some of them even use services precisely because they seem closed and hard to get information out of, and when they do stumble across their data outside of its original context, it sometimes comes as a shock to them.
And recontextualisation isn’t just about taking information from one place and replanting it in another – it can be about someone from outside the charmed circle you imagine yourself addressing reading your stuff. This doesn’t mean we shouldn’t be pressing hard to open up services – it means we need to be mindful of the importance of context, and the value of closedness/closeness, to people using services.