Wednesday, 10 December 2008

My twittered notes on the Leweb Social panel

Platform Love: Getting Along - Panel

Panelists:

  • David Glazer - Director of Engineering, Google
  • Jeff Hansen - General Manager, Services Strategy/Live Mesh, Microsoft Corporation
  • Dave Morin - Senior Platform Manager, Facebook
  • David Recordon - Open Platforms Tech Lead, SixApart
  • Max Engel - Head of Data Availability Initiative, MySpace

Moderator: Marc Canter - CEO, Broadband Mechanics

Watching the 3 Davids, Max, Marc and Jeff talk social at LeWeb
says Marc Canter 'open is the new black' - and asks about the Open Stack
says @daveman692 google, yahoo, microsoft all building on the open stack - won't FaceBook become the underdog when openness wins?
Canter suggests OpenID will be the brand that ties the Open Stack together
max of MySpace "what we're doing with these standards is moving the web forward - when the web hits a roadblock it routes round it"
max of MySpace:"90% of our users think of themselves as URLs so OpenID is a natural fit for us"
Dave Glazer: the goal is to let users do anything they want to, with others, anywhere on the web. OpenID lets you log in anywhere
Dave Glazer: openSocial solves a different bit of the puzzle - JS APIs to run the same app in different social contexts REST APIs web to web
says @daveman692 the web is designed to be distributed, and the Open Stack fits this model
Jeff of Microsoft: live mesh is built on symmetric sync - supports Open Stack, OpenID shipping, OAuth looks good, support PortableContacts
Jeff of Microsoft: we're evaluating the OpenSocial gadget container
Marc canter "we're putting all our balls into ev williams vice"
Jeff: we offer lots of languages. Marc: lots of ways to put our balls in your vice
Max: we support OpenID, Oauth, OpenSocial but you can too
Marc: anything good for the Open Web is good for Google
Marc Canter wants a URL for each Gmail? DG: each one does have that, but only you can see it
Dave Glazer: there are 3 classes of information: Public, Private and Complicated - users should never be surprised by who can see what
says @davemorin facebook wants people to have a social context wherever they go
says @davemorin FaceBook had to create a Dynamic Privacy model for FB Connect @daveman692 calls shenanigans - LJ had those in 1999
asks @daveman692 of @davemorin why are you giving microsoft access to all our email addresses without asking permission?
Max of MySpace - we've shown that security and openness work together by using OAuth, and users can revoke them from within MySpace
Dave Glazer: need to separate the technical levers from the social customs. technology can't stop people putting your bizcard on the web
says @techcrunch "call bullshit on facebook" - broke integration with google. FB don't want an open stack, they may be forced into it
says @tommorris how can MS be on the panel after the debacle of Office OOXML which wasn't open or XML?
says @dave500hats could we get contacts with certain features eg tennis fans?
Dave Glazer: there's an open spec process to define new attributes in the spec - if you want to add one go and propose it

Monday, 8 December 2008

Cycling to new layers of freedom

Dave Winer used the public beta of Google Friend Connect to reflect on tech industry cycles:
A new generation of young techies comes along, takes a look at the current stack, finds it too daunting (rightly so) and decides to start over from scratch. They find that they can make things happen that the previous generation couldn't cause they were so mired in the complexity of the systems they had built. The new systems become popular with "power users" -- people who yearn to overcome the limits of the previous generation. It's exhilarating! [...]
The trick in each cycle is to fight complexity, so the growth can keep going. But you can't keep it out, engineers like complexity, not just because it provides them job security, also because they really just like it. But once the stack gets too arcane, the next generation throws their hands up and says "We're not going to deal with that mess."

Now, I may be a few years behind Dave, but I think he is throwing the baby out with the bathwater, or the stack out with the cycle here. Back when I started out, to get my computer to generate sound, I had to make my own D to A converter to attach to the parallel port, and for non-character graphics, my hardware hacker friends swapped the character generator ROM for RAM, and I had to code in assembler to swap the display data in time.

Now my son thinks nothing of mixing 10 polyphonic Midi tracks in an afternoon or editing hi-def video (and yes, it's on an OS I helped to make capable of that).

Dave's revolutionary impulsiveness has a germ of truth, but what really happens is that successful technologies become invisible infrastructure for the next things that build on them.

I no longer need to write assembler, heck I no longer need to write C code. Dave's very URL - scripting.com - shows how we have built up layers of utility to work upon.

HTTP, HTML, JSON, Atom and Javascript are infrastructure now. Our deepest role as developers is to build the invisible infrastructure for the next generation to take for granted, so they imagine new abstractions atop that. Dave did it with feeds.

What we're doing with the Open Stack — OpenID, OAuth, PortableContacts and OpenSocial — is part of this evolutionary cycle too. We're combining building blocks into a simplified whole that makes sense to people who want their websites to become social.

It comes down to what you can take for granted as the baseline to build the next exciting cycle on.

Thursday, 13 November 2008

OpenSocial’s birthday today


OpenSocial Reach chart
Originally uploaded by Kevin Marks
Just over a year ago, we launched OpenSocial to the web, with a few example applications and a lot of potential. Now, a year on, over 600 million social network users can use OpenSocial applications in their preferred social network sites.
Then, applications had to be embedded in sites as gadgets, which makes the social context clear for users, but means developers have to write some Javascript, and their code can only run when the user is looking at the site.
With OpenSocial 0.8 rolling out, the REST APIs mean that developers can integrate with social sites using server-side code directly, potentially delegating user registration, profiles and friend relationships to an already-trusted social site, and feeding activity updates back into them.
To do this, we are building an Open Stack, based on OpenID, XRDS-Simple, OAuth, PortableContacts and OpenSocial. By composing open standards in this way, we can make each one more valuable. The advantages of OpenID over email login in itself are not that obvious to users, but if the OpenID can be used to bring in your profile and contacts data - with your permission via OAuth - suddenly the added value is clear to users and developers alike. This connection was one of the exciting discussions at the Internet Identity Workshop this week - here's a video of myself, Steve Gillmor, David Recordon and Cliff Gerrish talking about it.
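As a rough illustration of the REST APIs mentioned above, the 0.8 protocol addresses people resources by URL path; here's a minimal sketch in Python (the container host below is a made-up example, and real requests must be signed with OAuth):

```python
# Sketch: composing an OpenSocial 0.8 REST request path for a user's
# friends list. Each social site exposes its own container endpoint;
# this base URL is hypothetical.
BASE = "http://social.example.com/api"

def friends_path(user_id="@me"):
    """People resource: the given user's friends group."""
    return "%s/people/%s/@friends" % (BASE, user_id)

print(friends_path())         # the logged-in user's own friends
print(friends_path("12345"))  # a specific user's friends
```

Because the addressing is a convention rather than a per-site API, the same client code can point at any conforming container just by changing the base URL.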

Saturday, 8 November 2008

Missing the point of OpenID

I'm puzzled by Dare's post on OpenID, as he is wilfully misunderstanding its advantages at each stage, and I know he's smarter than that. He gets it right that OpenID is a way to confirm that a user owns a URL, without the rigmarole required to do so for an email address. 

However, he then uses his unmemorable Facebook URL http://www.facebook.com/p/Dare_Obasanjo/500050028 as an example, rather than any of the memorable ones he actually uses and people refer to, such as http://www.25hoursaday.com/weblog/ or http://carnage4life.spaces.live.com/ or http://twitter.com/Carnage4Life

DeWitt Clinton did an excellent job of clearing up some of Dare's other inaccuracies, but he then rhetorically exaggerated thus:
URLs make fantastic identifiers — for the 0.1% of the web population that understands that they “are” a URL. Fortunately, the other 99.9% of the world (our parents, for example) already understand that they have an email address.

This is missing the huge population of the online world (our children, for example) who consider email a messy noisy way to talk to old people, or to sign up to services when forced to, but are happy using their MySpace or Bebo or Hi5 or LiveJournal or Blogger or Twitter URLs to refer to themselves.
As I said in URLs are People Too:
The underlying thing that is wrong with an email address is that its affordance is backwards - it enables people who have it to send things to you, but there's no reliable way to know that a message is from you. Conversely, URLs have the opposite default affordance - people can go look at them and see what you have said about yourself, and computers can go and visit them and discover other ways to interact with what you have published, or ask you permission for more.

Where I see OpenID providing a key advantage is in its coupling with URL-based endpoints that provide more information and save the user time. The OpenID to PortableContacts connection, as demonstrated by janrain, can add your friends (with permission) from an OpenID login directly via OAuth.
This makes the OpenID login instantly more useful than an email one, and by connecting to an OpenSocial endpoint too, you can couple activities you take on the wider web with the site you trust to be a custodian of your profile and friends data, so your friends can discover what you are doing elsewhere, and come and join you.
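That login-then-fetch flow can be sketched in outline (all URLs here are illustrative assumptions; a real relying party discovers the Portable Contacts endpoint from the OpenID URL and signs the fetch with OAuth):

```python
# Outline sketch of the OpenID + Portable Contacts flow:
# 1. The user logs in with an OpenID URL.
# 2. The relying party discovers a Portable Contacts endpoint for it.
# 3. The user grants read access via OAuth.
# 4. The relying party fetches the contacts:

def poco_contacts_url(endpoint):
    """Portable Contacts path for all contacts of the authorizing user."""
    return endpoint + "/@me/@all"

# hypothetical provider endpoint discovered in step 2:
print(poco_contacts_url("http://poco.example.org"))
```

The point of the composition is that no step requires handing your password for one site to another: the OpenID proves who you are, and the OAuth token scopes what the relying party may read.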

I'm looking forward to talking through these issues at the Internet Identity Workshop next week in Mountain View.

Friday, 7 November 2008

Blogging's not dead, it's becoming like air

One thing I learned at Technorati is that one sure-fire way to get linked to by bloggers is to write an article about blogging. Sure enough, The Economist and Nick Carr have, with their 'death of the blogosphere' articles, garnered a fair bit of linkage.

Their curious obsession with the Technorati Top 100 misses what is really happening. As JP points out, the old blogging crew are still around; they're just blogging less than those paid to do so a dozen times a day. Not because they are less interested or engaged, but because there are now many new ways to do what we used blogs for back then.

In 2001, if we wanted to share brief thoughts, we used a blog; to link to others’ posts, we used a blog. If we wanted a group discussion, we made a group blog.

With Technorati, and trackback and pingback, we built tools to follow cross-blog conversations, and learned that we are each others’ media. As I wrote in 2004:

The great thing about weblogs is when you discover someone. Someone who makes sense to you, or someone who surprises you with a viewpoint you hadn't thought of. Once you have found them you can subscribe to their feeds and see how they can keep inspiring or surprising you.
You can even start a blog, link to them, and join the conversation

A year later I reiterated:

By tracking people linking to me or mentioning my name, Technorati helps me in this distributed asynchronous conversation (that's how I found Mike and Dave's comments, after all). However, as I've said before, "I can read your thoughts, as long as you write them down first". In order to be in the conversation, you need to be writing and linking. Perforce, this means that those who write and link more, and are written about and linked to more, are those who most see the utility of it.

What has happened since is that the practices of blogging have become reified into mainstream usage. Through social networks and Twitter and Reader shared items and Flickr and HuffDuffer and all the other nicely-focused gesture spreading tools we have, the practice of blogging, of mediating the world for each other, has become part of the fabric of the net.
This may be the first blogpost I've written since August, but the many digital publics I'm part of have been flowing media and friendly gestures to and from me all the time.

Monday, 4 August 2008

Social Disease, or making magic?

Suw is musing thoughtfully on the overtones of describing something as ‘social’:

Monica thanked me for the explanation, saying that she was glad I had elaborated as she had thought, and I hope she forgives me for paraphrasing, that 'social software was something awful, like social workers'. That really made me think, and I haven't quite got to the end of where that throwaway comment has led me.

Is 'social' the problem with social software? Certainly in the UK, 'social' has some rather negative connotations: Social workers are often despised and derided as interfering, and often incompetent, busybodies. Social housing is where you put people at the bottom of the socioeconomic heap. Social sciences are the humanities trying to sound important by putting on sciency airs. Social climbers are people who know how to suck their way up the ladder. Social engineering is getting your way deviously, by using people's weaknesses against them. Social security is money you give people who can't be bothered to work for themselves. Socialism is an inherently flawed system that is prone to corruption. Social disease is venereal.


This reminds me of early in the Social Software story:
The SSA meeting was fairly chaotic - perhaps reflecting the diverse meanings of 'Social'. Clay Shirky did not show up (or if he did, did not speak up); Dave Winer later poured scorn on the efforts, implying it was all about social climbing.

Friedrich Hayek famously said that the word 'social' empties the noun it is applied to of its meaning. Hayek goes on:

...it has in fact become the most harmful instance of what, after Shakespeare's 'I can suck melancholy out of a song, as a weasel sucks eggs' (As You Like It, II, 5), some Americans call a 'weasel word'. As a weasel is alleged to be able to empty an egg without leaving a visible sign, so can these words deprive of content any term to which they are prefixed while seemingly leaving them untouched. A weasel word is used to draw the teeth from a concept one is obliged to employ, but from which one wishes to eliminate all implications that challenge one's ideological premises.


Perhaps the problem is that the social realm is the realm of trust, so saying things are social is asserting "trust me". As Adam Gopnik writes on magic in the New Yorker:

But the Too Perfect theory has larger meanings, too. It reminds us that, whatever the context, the empathetic interchange between minds is satisfying only when it is “dynamic,” unfinished, unresolved. Friendships, flirtations, even love affairs depend, like magic tricks, on a constant exchange of incomplete but tantalizing information. We are always reducing the claim or raising the proof. The magician teaches us that romance lies in an unstable contest of minds that leaves us knowing it’s a trick but not which one it is, and being impressed by the other person’s ability to let the trickery go on.[...]

I saw, too, that David Blaine is absolutely sincere in his belief that the way forward for a young magician lies not in mastering the tricks but in mastering the mind of the modern age, with its relentless appetite for speed and for the sensational-dressed-as-the-real. And I thought I sensed in Swiss the urge to say what all of us would like to say—that traditions are not just encumbrances, that a novel is not news, that an essay is a different thing from an Internet rant, that techniques are the probity and ethic of magic, the real work. The crafts that we have mastered are, in part, the tricks that we have learned, and though we know how much knowledge the tricks enfold, still, tricks is what they are.

Thursday, 31 July 2008

Open Source and Social Cloud Computing

Tim O'Reilly has written an excellent review post on Open Source and Cloud Computing which says, among other things:

The interoperable internet should be the platform, not any one vendor's private preserve.

So here's my first piece of advice: if you care about open source for the cloud, build on services that are designed to be federated rather than centralized. Architecture trumps licensing any time.

But peer-to-peer architectures aren't as important as open standards and protocols. If services are required to interoperate, competition is preserved. Despite all Microsoft and Netscape's efforts to "own" the web during the browser wars, they failed because Apache held the line on open standards. This is why the Open Web Foundation, announced last week at OScon, is putting an important stake in the ground. It's not just open source software for the web that we need, but open standards that will ensure that dominant players still have to play nice.

The "internet operating system" that I'm hoping to see evolve over the next few years will require developers to move away from thinking of their applications as endpoints, and more as re-usable components. For example, why does every application have to try to recreate its own social network? Shouldn't social networking be a system service?

This isn't just a "moral" appeal, but strategic advice.[...]

A key test of whether an API is open is whether it is used to enable services that are not hosted by the API provider, and are distributed across the web.


I think this API openness test is not strong enough. As I wrote in An API is a bespoke suit, a standard is a t-shirt, for me the key test is that implementations can interoperate without knowing of each others' existence, let alone having to have a business relationship. That's when you have an open spec.

The other thing I resist in the idea of an internet operating system is that the net is composable, not monolithic. You can swap in implementations of different pieces, and combine different specs that each solve one part of the problem, without having to be connected to everything else.

The original point of the cloud was that it stood for a solved piece of the problem, so you don't have to worry about its internal implementation.

Thus, the answer to "shouldn't social networking be a system service?" is yes, it should be a Social Cloud. That's exactly what we are working on in OpenSocial.

Monday, 28 July 2008

Here Comes Everybody - Tummlers, Geishas, Animateurs and Chief Conversation Officers help us listen

Bob Garfield's de haut en bas attack on web commenters upset two very skilled conversational catalysts, Ira Glass, and Derek Powazek. The false dichotomy of 'we choose who you get to hear' and 'total anarchic mob noise' was dismissed by Jack Lail too. At the same time, Ben Laurie explained how the IETF's open-to-all mailing lists can be hijacked by time-rich fools, talking about the Open Web Foundation.

At Supernova last month, listening to Clay Shirky talk about the problems of collective action reminded me of a small nit I have with his excellent book Here Comes Everybody (which you should all read). He talks about the deep changes that ridiculously easy group forming online has wrought, but he also explains that most of these groups fail, in various ways.

The key to this is finding people who play the role of conversational catalyst within a group, to welcome newcomers, rein in old hands and set the tone of the conversation so that it can become a community. Clay referred to Teresa Nielsen-Hayden, who is a great example of this, and I have had the privilege to discuss this with Teresa, Amy Muller, Christy Canida and others at the Troll Whispering session at Web2Open, and heard very similar stories from Gina Trapani, Annalee Newitz, Jessamyn West and Jeska Dzwigalski at The Users Are Revolting at SXSW.

The communities that fail, whether dying out from apathy or being overwhelmed by noise, are the ones that don't have someone there cherishing the conversation, setting the tone, creating a space to speak, and rapidly segregating those intent on damage. The big problem we have is that we don't have an English name for this role; they get called 'Moderators' (as Tom Coates thoroughly described) or 'Community Managers', and because when they're doing it right you see everyone's conversation, not their carefully crafted atmosphere, their role is often ignored.

In other languages there are words closer to this role - Suw and I thought of geisha a while back, whereas Teresa suggested the Yiddish Tummler - both Deb Schultz and Heather Gold liked that one. In French animateur has the broader connotations of discussion, leadership and guidance needed, but in English we are stuck with enervated latinate words like facilitator. Even an eloquent and charismatic presidential candidate had a difficult time explaining what a 'Community Organizer' does, around the same time that Bartlett was resorting to card tricks.

Which brings me back to Clay's book - in it he gives an account of the #joiito chatroom that completely misses the rôle that JeannieCool played there, making her sound like a n00b. The software tool, jibot, that has helped keep that conversation going for 5 years, was built to support Jeannie's role as conversational catalyst. I do hope he gets a chance to correct this in the next edition.

The broader issue is one that we are still working on - building rules for who gets to speak where and when, re-imagining the historic model of a single hegemonic public record that print Journalism still aspires to, from its roots in the coffeeshops of London into the many parallel publics we see on the web, and how legal precedents designed for a monopoly of speech make no sense here.

In the meantime, if your newspaper, social media initiative or website isn't working right, you need to find your tummler, geisha, animateur or conversational catalyst, but you should consider giving them a big name title like 'Chief Conversation Officer'.

Tuesday, 8 July 2008

Shortening URLs, or getting inbetween?

With the rise of short message systems like Twitter, there is a growth in URL shorteners (as each one's namespace gets full, others get shorter). Today bit.ly launched to big fanfare in the blogosphere.

I took a closer look. What I noticed is that the older generation of these - tinyurl.com and xrl.us - use a 301 Moved Permanently redirect, whereas bit.ly and is.gd use a 302 Found redirect, which means 'don't cache the redirected URL, keep checking the original'.

In other words, these services are saying in their HTTP responses that they may change what the short URLs point to in future, putting browsers, indexers and caches on notice that this may happen.

I also noticed that bit.ly, like tinyurl.com, lets you pick a custom label from their namespace, but if you do, it returns two 302 redirects in sequence (first to a more cryptic bit.ly URL, then to the external one you chose). I pointed bit.ly/k at this blog, so you can check it yourself with curl:

$ curl --head http://bit.ly/k
HTTP/1.1 302 Found
Location: http://bit.ly/fwNKA

$ curl --head http://bit.ly/fwNKA
HTTP/1.1 302 Found
Location: http://epeus.blogspot.com

Apart from the extra delay this introduces, it also tells your browser and web crawlers not to cache the mapping, as bit.ly may change it in future. Compare tinyurl.com:

$ curl --head http://tinyurl.com/kevinm
HTTP/1.1 301 Moved Permanently
Location: http://epeus.blogspot.com

Google's advice for webmasters is to use 301 for redirects, as this signals the preferred URL.
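The practical difference for a client is a caching decision. Here's a small sketch (in Python; the function names are mine) of how a crawler might treat a chain of redirect hops, following HTTP's rule that a 301 target may be cached by default while a 302 should be re-checked each time:

```python
# A 301 Moved Permanently target may be cached by default;
# a 302 Found should be re-fetched from the original URL each time.
def may_cache(status):
    return status == 301

def resolve_chain(hops):
    """Follow a list of (status, location) redirect hops; return the
    final URL and whether the whole chain is safe to cache."""
    final, cacheable = None, True
    for status, location in hops:
        final = location
        cacheable = cacheable and may_cache(status)
    return final, cacheable

# The bit.ly custom-label case above: two 302 hops, so don't cache.
print(resolve_chain([(302, "http://bit.ly/fwNKA"),
                     (302, "http://epeus.blogspot.com")]))
# tinyurl's single 301: safe to cache.
print(resolve_chain([(301, "http://epeus.blogspot.com")]))
```

One 302 anywhere in the chain is enough to make the whole resolution uncacheable, which is why the double-302 custom labels are doubly expensive.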

Monday, 30 June 2008

Google as a restaurant? Watch Gordon Ramsay

Jeff Jarvis says he's writing a metaphorical application of Google principles to running a restaurant. Over the last few weekends, while sorting out stuff at home, I've been watching Gordon Ramsay's Kitchen Nightmares which BBC America seems to be playing continuously at weekends. If you haven't seen it, do watch some - each episode, Ramsay spends a week at a failing restaurant in the UK and tries to help them turn it around.

After seeing a few, there are recurrent themes that Ramsay comes up with: simple menus, built on good ingredients that local people understand, served promptly. Which fits well with Google's ten things - simple frontend, low latency results, user-focused. How he tries these out involves analogues for user testing, A/B experiments, and profiling under high load.

Of course, Google does run restaurants - so Jeff can read how they get built and tested directly.

Saturday, 14 June 2008

I'm with the stupid network

I'm looking forward to the Supernova conference next week, because Kevin Werbach always brings together an interesting group of people who care about the Internet and its future. We don't all agree on everything, which makes for some interesting debates, but we do tend to back the Open Web and the Stupid Network. It was the tenth anniversary of David Isenberg's 'Rise of the Stupid Network' paper this week, so I came up with this t-shirt design idea.

Sunday, 8 June 2008

How not to be viral

Graphing Social Patterns East is on tomorrow, and I'm sorry not to be there, though m'colleague Patrick Chanezon will be. However, reading the schedule I notice the word 'viral' is still much in evidence.

If you behave like a disease, people develop an immune system

At the Facebook developer Garage last week, I heard a developer say: when I hear 'viral' applied to software I replace it with 'cancerous' to clarify. A few months back I wrote that social Apps should be Organic, not Viral, and at Google I/O last week I expanded on this with m'colleagues Vivian Li and Chris Schalk. Here's an overview of the alternative reproductive strategies to being a virus that we came up with:

r-Strategy - scatter lots of seeds


Break free
Originally uploaded by aussiegall

Some plants and animals, like dandelions and frogs, rely on having huge numbers of offspring, with the hope that a few of them will survive - this is known as an r strategy. In application terms this is like wildly sending out invitations, or forcing users to invite their friends before showing them useful information. It may help you spread your seed, but most of them will die off rapidly.

K-Strategy - nurture your young

proud Mama

Proud mama
Originally uploaded by debschultz

Mammals take the opposite strategy; they have a few young, and nurture them carefully, expecting most of them to grow up to adulthood and reproduce themselves. This is known as a K strategy. This translates into software by following Kathy Sierra's principles to create passionate users who will share your application through word of mouth. Another way to nurture your users is to encourage them to use your application before they have to install it, as Jonathan Terleski describes.


Fruiting - delicious with a seed in


help yourself
Originally uploaded by *madalena-pestana*

Many plants encourage their seeds to be spread more widely by wrapping them in fruit, so that animals or birds will carry them further, eating the fruit and helping the seed to propagate. The analogy here is in making sure your invitations aren't just bald come-ons for your application "a friend said something - click here to find out what" - with a forced install on the way, but instead are clearly bearing gifts to the receiving user, so they will want to click on the link after seeing what is in store. This is one of Jyri Engström's principles for Web 2.0 success with Social Objects.


Rhizomatic - grow from the roots up


Sweetness / Dolcezza
Originally uploaded by WTL photos

Another reproductive strategy that many plants, including strawberries and ginger, use is to send out runners or shoots from the roots, so that they spread out sideways, from the bottom up; these are known as rhizomes or stolons. The analogy here is for social applications that spread through appearing in users' activity streams and via entries in application directories, growing outwards through the 'grass roots' runners that they send out as part of their normal usage.


Being dumb gets low CPMs

A lot of the debate around viral applications reminds me of a David Foster Wallace quotation:
TV is not vulgar and prurient and dumb because the people who compose the audience are vulgar and dumb. Television is the way it is simply because people tend to be extremely similar in their vulgar and prurient and dumb interests and wildly different in their refined and aesthetic and noble interests.

Social networks aren't like TV - everyone sees something different in them. If you want to gather engaged, inspired, interested and indeed valued users, write an application that speaks to their refined and aesthetic and noble interests, and see how they will spread it through their social networks to find the others who share their interests.


It was interesting to see Slide redirecting away from virality today. GSP West was on at the same time and place as eTech, and I heard some eTechies refer to it as 'Grasping Social Parasites'; I hope that the growing realisation that a disease is not a good model to base your business on means that tomorrow's conference will spread a better reputation for GSP East.

Tuesday, 27 May 2008

Miasma theory - wrong in the 1840s, wrong now

A couple of years ago I wrote:
My generation draws the Internet as a cloud that connects everyone; the younger generation experiences it as oxygen that supports their digital lives. The old generation sees this as a poisonous gas that has leaked out of their pipes, and they want to seal it up again.

Bill Thompson and Nick Carr are worried about governments interfering too:

In the real world national borders, commercial rivalries and political imperatives all come into play, turning the cloud into a miasma as heavy with menace as the fog over the Grimpen Mire that concealed the Hound of the Baskervilles in Arthur Conan Doyle's story.

Except, if you have read or listened to Steven Johnson's excellent The Ghost Map, you'll know that the miasma theory of disease was a fatal error for urban England in the 1840s - the real problem was not the bad smells in the air, but the diseases in the water. The fault, dear governments, lies not in our clouds but in your pipes.

Monday, 26 May 2008

An API is a bespoke suit, a standard is a t-shirt

Brad is calling for APIs, and even the NYT is proposing one, but there is a problem with APIs that goes beyond Dave's concern about availability.

When a site designs an API, what they usually do is take their internal data model and expose every nook and cranny in it in great detail. Obviously, this fits their view of the world, or they wouldn't have built it that way, so they want to share this with everyone. In one way this is like the form-fitting lycra that weekend cyclists are so enamoured of, but working with such APIs is like being a bespoke tailor - you have to measure them carefully, and cut your code exactly right to fit in with their shapes, and the effort is the same for every site you have to deal with (you get more skilled at it over time, but it is a craft nonetheless).

Conversely, when a site adopts a standard format for expressing their data, or how to interact with it, you can put your code together once, try it out on some conformance tests, and be sure it will work across a wide range of different sites - it's like designing a t-shirt for threadless instead.

Putting together such standards, like HTML5, OpenID, OAuth, OpenSocial or, for Dave's example of reviews, hReview, takes more thought and reflection than just replicating your own internal data structures, but the payoff is that implementations can interoperate without knowing of each others' existence, let alone having to have a business relationship.

I had this experience at work recently, when the developers of the Korean social network idtail visited. I was expecting to talk to them about implementing OpenSocial on their site, but they said they had already implemented an OpenSocial container and apps using OpenID login, and built their own developer site for Korean OpenSocial developers from reading the specification docs.

I'm looking forward to more 'aha' moments like that this week at I/O.

Tuesday, 6 May 2008

Portable Apps, not data?

Brad Templeton has a post on Data Hosting not Data Portability that fits in neatly with the VRM proposal I discussed yesterday. In fact, what he describes is a great fit for OpenSocial.

He says:

Your data host’s job is to perform actions on your data. Rather than giving copies of your data out to a thousand companies (the Facebook and Data Portability approach) you host the data and perform actions on it, programmed by those companies who are developing useful social applications.

Which is exactly what an OpenSocial container does - mediate access to personal and friend data for 3rd party applications.

This environment has complete access to the data, and can do anything with it that you want to authorize. The developers provide little applets which run on your data host and provide the functionality. Inside the virtual machine is a Capability-based security environment which precisely controls what the applets can see and do with it.

This maps exactly on to Caja, the capability-based Javascript security model that is being used in OpenSocial.

Your database would store your own personal data, and the data your connections have decided to reveal to you. In addition, you would subscribe to a feed of changes from all friends on their data. This allows applications that just run on your immediate social network to run entirely in the data hosting server.

Again, a good match for OpenSocial's Activity Streams (and don't forget persistent app data on the server).

Currently, everybody is copying your data, just as a matter of course. That’s the default. They would have to work very hard not to keep a copy. In the data hosting model, they would have to work extra hard, and maliciously, and in violation of contract, to make a copy of your data. Changing it from implicit to overt act can make all the difference.

The situation is worse than that; asking people for their logins to other sites is widespread and dangerous. I'd hope Brad would support OAuth as a step along the way to his more secure model - especially combined with the REST APIs that are part of OpenSocial 0.8.

If you're interested in these aspects of OpenSocial, do join in the linked mailing lists, and come along to the OpenSocial Summit on May 14th (just down the road from IIW).

Monday, 5 May 2008

Mixing degrees of publicness in HTTP

At the Data Sharing Workshop the other day, we had a discussion about how to combine OAuth and Feeds, which I was reminded of by Tim Bray's discussion of Adriana and Alec's VRM proposal today.
The session was tersely summarized here, but let me recap the problem.

When you are browsing the web, you often encounter pages that show different things depending on who you are, such as blogs, wikis, webmail or even banking sites. They do this by getting you to log in, and then using a client-side cookie to save you the bother of doing that every time. When you want to give a site access to another one's data (for example when letting Flickr check your Google Contacts for friends), you need to give it a URL to look things up at.

The easy case is public data - then the site can just fetch it, or use a service that caches public data from several places, like the Social Graph API. This is like a normal webpage, which is the same for everyone, returning an HTTP 200 response with the data.

The other common case is where the data is private. OAuth is a great way for you to delegate access to a web service for someone else, which is done by returning an HTTP 401 response with a WWW-Authenticate: OAuth header showing that authentication is needed. If the fetching site sends a valid Authorization header, it can have access to the data.

The tricky case is where there is useful data that can be returned to anyone with a 200, but additional information could be supplied to a caller with authentication (think of this like the social network case, where friends get to see your home phone number and address, but strangers just get your hometown). In this case, returning a 401 would be incorrect, as there is useful data there.

What struck me was that in this case, the server could return a 200, but include a WWW-Authenticate: OAuth header to indicate that more information is available if you authenticate correctly. This seems the minimal change that could support this duality, and much easier than requiring and signalling separate authenticated and unauthenticated endpoints through an HTML-level discovery model, or, worse, adding a new response to HTTP. What I'd like to know from people with deeper HTTP experience than me is whether this is viable, and is it likely to be benign for existing clients — will they choke on a 200 with a WWW-Authenticate header?
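To make the proposal concrete, here is a minimal sketch of such a handler in Python. The data, the token check and the handler shape are all hypothetical (real code would verify an actual OAuth signature); the point is just the response logic: anonymous callers get a 200 with the public subset plus a WWW-Authenticate hint, authenticated callers get a 200 with the full view.

```python
# Sketch of the "200 plus WWW-Authenticate" idea. All names and data
# here are made up for illustration; is_valid_oauth() stands in for
# real OAuth signature verification.

PUBLIC = {"name": "Kevin", "hometown": "London"}
PRIVATE = {"phone": "+1-555-0100", "address": "123 Example St"}

def is_valid_oauth(auth_header):
    # A real implementation would check the OAuth signature here.
    return auth_header == "OAuth valid-demo-token"

def handle_request(headers):
    """Return (status, response_headers, body) for a profile fetch."""
    auth = headers.get("Authorization")
    if auth and is_valid_oauth(auth):
        # Authenticated caller: full view, no hint needed.
        return 200, {}, {**PUBLIC, **PRIVATE}
    # Anonymous caller: still a successful response with useful public
    # data, plus a hint that more is available after authentication.
    return 200, {"WWW-Authenticate": 'OAuth realm="http://example.com/"'}, dict(PUBLIC)
```

Existing clients that ignore unknown headers on a 200 would see no difference, which is exactly the open question above.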

HTTP does have a 203 response meaning Non-Authoritative Information, but I suspect returning that is more likely to have side effects.

Tuesday, 29 April 2008

Digital publics, Conversations and Twitter

Last week, I left the Web 2.0 conference to listen to Mimi Ito, danah boyd and their colleagues talk about their research on Digital Publics.

Now if you haven't been paying attention, that plural of 'public' there may throw you. Surely things are either 'public' or 'private'? As danah explains:

Just as context is destabilized through networked publics, so is the meaning of public and private. What I learned from talking to teens is that they are living in a world where things are "public by default, private when necessary." Teens see public acts amongst peers as being key to status. Writing a public message to someone on their wall is a way of validating them amongst their peers. Likewise, teens make choices to go private to avoid humiliating one of their friends.

Yet, their idea of public is not about all people across all space and all time. They want publics of peers, not publics where creeps and parents lurk.

Bly Lauritano-Werner (17, Maine):

My mom always uses the excuse about the internet being 'public' when she defends herself. It's not like I do anything to be ashamed of, but a girl needs her privacy. I do online journals so I can communicate with my friends. Not so my mother could catch up on the latest gossip of my life.

Properties of technology have complicated what it means to be in public. We are all used to being in publics that don't include all people across all space and all time. Many of us grew up gossiping with friends out in public and stopping the moment that an adult walks over. This isn't possible when things are persistent. And it's really hard to be public to all peers and just keep certain people out. So teens are learning how to negotiate a world where the very meaning of public and private have changed. Again, this is a good thing. They're going to need these skills in the future.

The day before, at Web2Open, I had heard something similar in the Troll Whispering session. Christy Canida explained that when someone posts something trollish or otherwise dubious on her site, they get put in a state where only they can see their posts (along with Christy and the other conversation monitors), but no-one else can. This damps down the flame responses until Christy and co have time to review, and maybe release them; meanwhile, in the poster's view the post is on the site, but no-one is responding.

This varying view of the web, depending on who you are, seems odd at first, but it is in fact a recognition in code of what actually exists in human attention. We don't all read the same web, we see our own reflections in what we seek through searches or filtered by our homophily-led reading.

Which is where Twitter comes in. Like Jeff, I've been twittering more than blogging recently, and while immediacy is part of it, a far stronger thing is that I have a sense of public there - a public of people I choose to follow and who choose to follow me. Everyone who uses Twitter sees a different, semi-overlapping public, which maps more closely to our individual idea of the digital public we are speaking to, and listening to; one that matches what the sociologists and theorists have been describing for a while.

Wednesday, 16 April 2008

Comcast's Bialystock and Bloom Business Model?

Tomorrow, the FCC is holding a public hearing at Stanford on Broadband network management practices. With striking timing, Comcast today managed to announce an 'Internet Bill of Rights' without inviting any users, and simultaneously cut off Dave Winer's net connection for exceeding their secret usage limits. I can't link to Comcast's policy because their website munges the text in via javascript - here's what they say:
Excessive use means data usage that is not characteristic of a typical residential user of the service as determined by Comcast.[...]Comcast currently identifies well less than 1% of Comcast High-Speed Internet customers as excessive users each month. [...]Many excessive users consume more data than a business-class T1 line running at full capacity in a month. [T1 is 1.5 Mbit/sec - Comcast claims to offer 12 Mbit/sec for PowerBoost, and 6/8 Mbit/sec standard] [...] Currently, each month Comcast identifies the top bandwidth users of its High-Speed Internet service by determining aggregate data usage across its entire customer base nationwide.

What they are saying is that they use a crude averaging model, and penalize you if you don't fit, for example by using the connection capacity they promise more than 10% of the time. Now, this could be called Procrustean, but it reminds me of The Producers, where Bialystock and Bloom sold a hundred people 10% shares of the show, assuming it would fail. Sadly for Comcast, people like Dave are finding new uses for the net's bandwidth, and not just checking email sporadically any more.

Conventional internet service user models are based on users downloading more than they upload, from common big media sites that can be easily cached. However, as Odlyzko pointed out, citing Lesk's now decade-old work, the dominant form of data creation is photographs. Now all these photographs are actually digital, and we want to share them so others can see them. Because we aren't allowed to run our own servers by the likes of Comcast, we have to upload them to Flickr or Photobucket or Picasa to share them. This gives us an 'upload more than you download' network flow, as we send them up at full multi-megapixel resolution, but browse a few of each others' at thumbnail or reduced size. And that's before we even consider video uploading (which I've noticed Comcast throttles at 0.4 Mbit/sec for me).

Comcast hit the news before by sabotaging Bittorrent transfers by faking reset packets, but what Bittorrent is really doing is arbitraging around the asymmetric network bandwidth delivered by these outdated user models.

Bob Briscoe recently wrote an interesting proposal on handling congestion by TCP signalling to reveal the costs of congestion. This was spun by George Ou as an attack on P2P protocols, but the underlying principle of penalising those who cause congestion is an interesting one. The question I'd like answered is this: if I have a gigabit network at home, and the internet backbone is multi-terabit, when Comcast throttles my uploads to 400 kilobits, aren't they the ones causing the congestion?

Tuesday, 19 February 2008

Be Organic, not Viral

I just got back from the VLAB Multi-platform Social Networking event, which I thought was very interesting overall. Jeremiah Owyang did a great moderating job, and Jia Shen, Sourabh Niyogi, Ken Gullicksen and Steve Cohen brought lots of different viewpoints to the discussion. Growing and deriving value from Apps within Social Networks is still full of lots of unknowns, but it was good to hear some basic shared principles come through - my summary of one point was 'before you think about a Business Model, make sure you have a Pleasure Model'.

Another point well made by Steve Cohen of Bebo was something I've been thinking for a while too - the hunger for 'Viral' growth is a mistake - what you really need is 'Organic' growth. Just as we distinguish between Organic search results and bought or spammed ones, social network sites and their users are distinguishing between the viral apps that are essentially parasitic, using their hosts as a means to their propagation, and the ones that organically become part of the social ecology, making both the site and the users richer by their presence.
I spent the last weekend fighting off a flu virus, partly by eating lots of organic fruit. I expect social networks and their users will continue to do the same.

Monday, 11 February 2008

The Social Cloud

My talk from LIFT is here for you to watch below (20mins, needs flash):


The others are up at the LIFT Video site

Thursday, 7 February 2008

LIFT Conference starts


Geneva Sunrise
Originally uploaded by Kevin Marks
I'm in Geneva for the LIFT conference, watching Bruce Sterling riff on Carla Sarkozy as a black swan. The photo is what the sunrise looked like over the Alps at breakfast.

Saturday, 26 January 2008

Sheet music redux

I've long been involved with amateur theater and music performance (my boys are performing in a Schumann recital tomorrow with the rest of their piano teacher's pupils), and I grew up seeing double bass cases plastered with Musicians' Union "Keep Music Live" stickers around the place, but I always thought this was a luddite rearguard action against the tide of recorded media that began flowing about a century ago.
But this week, the news was that Rock Band had sold 2.5 million downloaded songs for you to play along with (it comes with 58).
Having researched this thoroughly with my boys, the fun of this game is more in the playing than the listening — the 'guitar' playing is clearly simplified, though the drumming is pretty close to reality, and the less said about my 'singing' the better.
Looking at this in the longer view, it can be seen as the return of sheet music in a new form; before recordings took over, sheet music sold for amateur performers was the dominant form. Here's Douglas Adams again:
during this century we have for the first time been dominated by non-interactive forms of entertainment: cinema, radio, recorded music and television. Before they came along all entertainment was interactive: theatre, music, sport – the performers and audience were there together, and even a respectfully silent audience exerted a powerful shaping presence on the unfolding of whatever drama they were there for. We didn’t need a special word for interactivity in the same way that we don’t (yet) need a special word for people with only one head.

I expect that history will show ‘normal’ mainstream twentieth century media to be the aberration in all this. ‘Please, miss, you mean they could only just sit there and watch? They couldn’t do anything? Didn’t everybody feel terribly isolated or alienated or ignored?’

‘Yes, child, that’s why they all went mad. Before the Restoration.’

‘What was the Restoration again, please, miss?’

‘The end of the twentieth century, child. When we started to get interactivity back.’

Friday, 18 January 2008

Fear of the new - the Internet, Tea, and MapReduce

Sir Richard Dearlove, former head of MI6, said:

“Al-Qa’eda has prospered and as it were regrouped largely because of the energy and effort it has put into its propaganda, largely through the internet.”

Sir Richard added that the internet had become the main channel for “radicalisation” and coordination between al-Qa’eda cells. He said: “In dealing with this problem, there is no alternative to imposing significant controls over the internet.”


This is what I call the "cup of tea" problem, after Douglas Adams:

Newsreaders still feel it is worth a special and rather worrying mention if, for instance, a crime was planned by people 'over the Internet.' They don't bother to mention when criminals use the telephone or the M4, or discuss their dastardly plans 'over a cup of tea,' though each of these was new and controversial in their day.

Some people have been surprised that tea was controversial, but William Cobbett's 1822 'The evils of tea (and the virtues of beer)' had this to say:

It must be evident to everyone, that the practice of tea drinking, must render the frame feeble and unfit to encounter hard labour or severe weather, while, as I have shown, it deducts from the means of replenishing the belly and covering the back. Hence, succeeds a softness, an effeminacy, a seeking for the fireside, a lurking in the bed, and in short, all the characteristics of idleness, for which, in this case, real want of strength furnishes an apology. The tea drinking fills the public-houses, makes the frequenting of it habitual, corrupts boys as soon as they are able to move from home, and does little less for the girls, to whom the gossip of the tea-table is no bad preparatory school for the brothel. At the very least, it teaches them idleness.

Which brings me to the attack on MapReduce today, which spectacularly misses the point by attacking a programming technique for not being a database and contains the striking line:

Given the experimental evaluations to date, we have serious doubts about how well MapReduce applications can scale.

(MapReduce is what Google uses to run complex data-manipulation problems on lots of computers in parallel to do things that databases fail at, like building an index for all the webpages it has found, or rendering map tiles for everywhere on earth in Google maps).
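For readers who haven't met the technique: here is a toy word-count in the MapReduce style, in Python. This is an illustrative single-machine sketch, not Google's implementation - the whole point of the real thing is that the map, shuffle and reduce steps are distributed across thousands of machines.

```python
from collections import defaultdict

# Toy MapReduce-style word count. The map step emits (key, value) pairs,
# the shuffle step groups values by key, and the reduce step combines
# each group into a result.

def map_phase(documents):
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(docs)))
# counts["the"] == 3, counts["fox"] == 2
```

Whether the resulting counts live in a database afterwards is beside the point; the technique is about expressing a computation so it can be parallelised.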

Wednesday, 9 January 2008

OpenSocial Hackathon next week in SF

OpenSocial Hackathon hosted by Six Apart


Wednesday, January 16, 2008 4:00 PM - 11:00 PM
Six Apart, 548 4th St, San Francisco, California

Find out the latest news about OpenSocial's 0.6 release and what Shindig and Cajoling can do for your next web application
Work with developers of OpenSocial Social Networks to get your applications up and running.
What to bring:
  • Your laptop
  • Your web application code or your social networking idea

What we provide:
  • Wifi and power
  • Help getting into OpenSocial 0.6 sandboxes
  • Developers from at least Google, MySpace, Hi5, Plaxo, and Six Apart
  • and don't forget pizza!

Hosted at Six Apart's 4th street offices, it's a short walk from Caltrain and indeed the Macworld Expo.
Six Apart's post
RSVP at Upcoming

Monday, 7 January 2008

Identity Theft is not a crime

Fraud however, is.

Jeremy Clarkson scoffed at the UK Government data leak debacle, and published his bank details in The Sun:

"All you'll be able to do with them is put money into my account. Not take it out. Honestly, I've never known such a palaver about nothing," he told readers.

But he was proved wrong, as the 47-year-old wrote in his Sunday Times column.

"I opened my bank statement this morning to find out that someone has set up a direct debit which automatically takes £500 from my account," he said.

"The bank cannot find out who did this because of the Data Protection Act and they cannot stop it from happening again.

"I was wrong and I have been punished for my mistake."

Police were called in to search for the two discs, which contained the entire database of child benefit claimants and apparently got lost in the post in October 2007.

They were posted from HM Revenue and Customs offices in Tyne and Wear, but never turned up at their destination - the National Audit Office.

The loss, which led to an apology from Prime Minister Gordon Brown, created fears of identity fraud.

Clarkson now says of the case: "Contrary to what I said at the time, we must go after the idiots who lost the discs and stick cocktail sticks in their eyes until they beg for mercy."

I'm amazed that the normally combative Clarkson has accepted this feeble excuse from his bank, when they have just handed out a huge sum of his money to someone else against his wishes, revealing that they are failing in their primary purpose of keeping money safely.

That their security process can fail spectacularly in this way, enabling fraudsters to siphon off money, is sadly all too common.

What is notable is that the banks have spent enormous sums of money promoting the concept of 'identity theft' through clever TV adverts, diverting their customers' attention from their security cock-ups, despite the fact that they are liable for the fraudulently dispersed funds. I don't understand why the banks continue to use "mother's maiden name" as a default password, and enable debits this way, then hide behind data protection legislation when their error is pointed out. Clarkson should be railing at the idiots at his bank, too.

Update:

Thanks to Kerry Buckley in the comments for this excellent comedy sketch that sums it up perfectly:

Thursday, 3 January 2008

memes, dreams and themes

Cameo pointed me at this dada album cover meme today:
  1. The first article title on the Wikipedia Random Articles page is the name of your band.

  2. The last four words of the very last quotation on the Random Quotations page is the title of your album.

  3. The third picture in Flickr's Interesting Photos From The Last 7 Days will be your album cover.

  4. Use your graphics programme of choice to throw them together, and post the result.

I got the following via this flickr image (which I hope counts as fair use - Suw uses CC images instead)

album_meme

I just found out via Twitter that my colleague from (mumble mumble) years ago, Nikki Barton, has a blog; she's wise - read her.

Rosie asked me "Was there a Solar eclipse in Yorkshire in 1967?" - the answer was No, but I found this great NASA site, which reminded me of my Astronomy tutor at Cambridge from 1987, who at the time had booked a hotel in Cornwall for the 1999 total eclipse (I hope the clouds lifted for him). I'm re-reading Neal Stephenson's Baroque Cycle at the moment, so I am tickled that I can look up an astronomical ephemeris this easily.

Finally, the Edge question this year is What have you changed your mind about? - a lot of food for thought there.

Wednesday, 2 January 2008

URLs are people too

There is an assumption buried in the collective mind of developers that is hard to remove, and it is that people are best represented by email addresses. Go to almost any website to sign up, and you are prompted for an email address and password. Signing up usually involves digging out the site's reply from your spam folder and clicking on a link to get confirmed, then giving it a password. Sometimes you get to pick a username too, from whatever stock of namespace is left at the site.


Elizabeth Churchill and Ben Gross looked into this and found out that people find it easier to remember passwords than usernames, because they use the same passwords everywhere, and they end up with multiple different email accounts to handle the problem of having handed them out to all these sites and getting spammed by them.


Meanwhile, over here in the blog world, we've been using blog URLs to refer to people for years, and social network sites have proliferated URLs that are people. I have several that refer to me, my events, my music, my twitters and my photographs linked from the sidebar here. We even have XFN's rel="me" to connect them together, and OpenID to allow them to be used as logins elsewhere, instead of emails.


The underlying thing that is wrong with an email address is that its affordance is backwards - it enables people who have it to send things to you, but there's no reliable way to know that a message is from you. Conversely, URLs have the opposite default affordance: people can go look at them and see what you have said about yourself, and computers can go and visit them and discover other ways to interact with what you have published, or ask you permission for more.
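That machine-visitable affordance is easy to demonstrate. Here is a small Python sketch of what a crawler (in the spirit of the Social Graph API) can do with a profile page: pull out the XFN rel="me" links that connect one person's URLs together. The profile markup below is invented for the example.

```python
from html.parser import HTMLParser

# Collect XFN rel="me" links from a page - the machine-readable trail
# that lets software connect a person's URLs together.

class RelMeParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.me_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rels = (attrs.get("rel") or "").split()
            if "me" in rels and attrs.get("href"):
                self.me_links.append(attrs["href"])

# Hypothetical profile page markup:
page = """
<a rel="me" href="http://twitter.com/kevinmarks">my twitters</a>
<a rel="me" href="http://flickr.com/photos/kevinmarks">my photos</a>
<a href="http://example.com/">an unrelated link</a>
"""

parser = RelMeParser()
parser.feed(page)
# parser.me_links now holds the two rel="me" URLs
```

No email address gives a stranger's software any comparable way to learn about you with your consent.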


So, developers, remember that URLs are people too.


Update: This tension between email-as-identifier and email-as-way-to-be-spammed is what makes Scoble's attempt to extract 5,000 people's emails from Facebook for his own use less defensible than it appears at first. Dare Obasanjo recognises the tensions, but strangely dismisses the OpenSocial attempt to abstract out this kind of data into a common API.

Tuesday, 1 January 2008

Tardy blogging

Between Twitter and my Reader shared items, my blogging impulses have been diverted elsewhere recently; I'll try to rectify this in the new year by writing here more often, but do follow those two links if you want to hear more of my brief observations and recent reading respectively.