Category: Sessions

  • Humanising The Enterprise Using Ambient Social Knowledge

    • Lee Bryant of Headshift, a social software consulting firm, spoke of the death of traditional enterprise software and the rise of a new breed of social software that is attenuating the torrent of information within large companies.
    • The lost world of IT dinosaurs – legacy systems that are too big and expensive to kill
    • In the outside world – the dotcom bust swept away software ‘predators’ and people began constructing their own tools.
    • Humans are back – People found that if their systems all connected up, they could share stuff!
    • Fear, enterprise=expensive – Managers continue to buy arguments about process, workflow, security and control that software vendors use to keep them in the Stone Age…rather than seeing IT as an enabler again.

    Information and attention overload is a consequence of this legacy…

    • Management by email limits peripheral vision, damaging individual decision making.
    • Information overload is worse for concentration than smoking dope.
    • Who controls your inbox, tasklist and your agenda?
    • Too much push, too little pull – bad signal-to-noise ratio.
    • Companies store millions of documents but have little idea which are important. People rarely find what they need. Knowledge cannot be codified as bits, but needs context.

    The emergence of social software…

    • Simple actions repeated at scale within a social network produce emergent network effects.
    • Easy interfaces and a low ‘cognitive footprint’ reduce barriers to participation.
    • Operate locally, aggregate globally.
    • del.icio.us is more effective for knowledge sharing than KM products.
    • Confluence provides a better intranet with its syndicatable wiki features.
    • Budgets are shifting to social tools such as blogs, wikis, social tagging, lightweight group tools.
    • IT departments are realising they need to loosen the reins and make it easier for people to get on with their jobs.
    • A new relationship with information – feeds, flows, syndication, subscriptions, social tagging, blogging, wiki co-production.
    • Continuous partial attention.
    • Variable interaction modes, depth, time relations.
    • Reuters Financial Glossary and WikiLaw
    • New behaviours – give people control over their information flows using aggregators.

    How do we process information?

    • Pattern matching and the ‘best-first-fit’…the only humans who analyse all the data and then make a rational choice are autistic.
    • The brain takes in more than we know, but filters and simplifies using archetypes and patterns…this is why we can read newspapers quickly.
    • Existing systems limit the diversity of inputs needed to stimulate intuitive decision-making.

    How do we innovate?

    • Innovation requires a problem or idea, a solution and a project evangelist – they may not be in the same place.
    • R&D is forward-facing and therefore not served by enterprise storage or classification.
    • These days, the best innovation often comes from passionate users or extra-firewall partnerships.

    The answer to information overload?

    • Simply, more information, consumed differently, with each piece taken less seriously. Diversity will compensate.
    • Build a better radar, use social tools and trust people to make decisions based on it.
    • More peripheral, contextual information flows.
    • Less dependency on email and task assignment.
    • Better findability, not storage.
    • Classification = calcification (tags are cheap and fun!)

    A new relationship with people…

    • We are hard-wired for socialisation, but enterprise tools are rooted in 1950s Fordist theories.
    • Complexity thinking is more useful here…
    • A social fabric for knowledge sharing, collaborative filtering and connected conversations.

    Small pieces

    • Break everything into small pieces.
    • Everything should have a URI and a feed.
    • Social bookmarking, tagging and selection.
    • Simple group spaces to share within a trusted context.
    • When it’s atomised, let people build it back up.
    • Pick. Mix. Share. Feed. Improve findability with user-driven metadata and organisation, bring old content to life with layers of usage and context metadata. Support remixing and mashups (see the sketch below).
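
    Not from the talk itself, but a minimal sketch of what ‘user-driven metadata’ can look like in practice: counting freeform tags across shared bookmarks to surface the most-used terms as navigation. The bookmark structure and field names are assumptions for illustration only.

    ```python
    # Minimal sketch (not from the talk): aggregate freeform user tags into a
    # simple ranking that can drive a tag cloud. Data structure is illustrative.
    from collections import Counter

    bookmarks = [
        {"url": "http://intranet.example/report.pdf", "tags": ["budget", "2006", "finance"]},
        {"url": "http://intranet.example/wiki/ProjectX", "tags": ["projectx", "wiki", "finance"]},
        {"url": "http://intranet.example/memo", "tags": ["budget", "projectx"]},
    ]

    tag_counts = Counter(tag for b in bookmarks for tag in b["tags"])

    # The most common tags become navigation - no taxonomy committee required.
    for tag, count in tag_counts.most_common(5):
        print(f"{tag}: {count}")
    ```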

    An information-rich environment

    • Remote-controlled shared displays
    • Ambient devices?
    • Smart walls and whiteboards
    • Architecture and office design.

    A sharing culture

    • Openness is an aspirational value.
    • An ecological approach to knowledge – not knowledge management (wiki gardening).
    • Self-directed support and peer-to-peer assistance.
    • Within a framework of objectives, let people find their own way.

    Build on what you’ve got

    • Use the heavy-lifting and storage of legacy systems, not the GUI.
    • Give people their own social UI on existing infrastructure to discover, store, share and create.
    • Bring out feeds from legacy systems by creating mediating services (see the sketch below).
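
    As a rough illustration of ‘bring out feeds from legacy systems’, here is a sketch of a mediating layer that reads records from an existing store and republishes them as a minimal Atom feed. The table and column names are assumptions; in practice this would sit on whatever query interface the legacy system already exposes.

    ```python
    # Sketch of a mediating service: pull records from a legacy store and
    # re-expose them as a minimal Atom feed. Table and column names are assumed.
    import sqlite3
    from xml.sax.saxutils import escape

    def legacy_records(db_path="legacy.db"):
        conn = sqlite3.connect(db_path)
        try:
            return conn.execute(
                "SELECT id, title, updated, summary FROM documents "
                "ORDER BY updated DESC LIMIT 20"
            ).fetchall()
        finally:
            conn.close()

    def as_atom(records, base_url="http://intranet.example/docs/"):
        entries = "".join(
            "<entry>"
            f"<id>{escape(base_url + str(doc_id))}</id>"
            f"<title>{escape(title)}</title>"
            f"<updated>{escape(updated)}</updated>"
            f"<summary>{escape(summary)}</summary>"
            "</entry>"
            for doc_id, title, updated, summary in records
        )
        return (
            '<?xml version="1.0" encoding="utf-8"?>'
            '<feed xmlns="http://www.w3.org/2005/Atom">'
            "<title>Legacy documents</title>"
            f"<id>{escape(base_url)}</id>"
            + entries +
            "</feed>"
        )

    # print(as_atom(legacy_records()))
    ```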

    TOOLS. CONTENT. CULTURE.

  • The Data Dump – Fun With Graphs & Charts

    The Data Dump has two strict rules:

    • the data must make a larger point about the Internet and its users, not just about the source company
    • since data visualization is as or more important than data collection, it’s gotta look good.

    State of the Blogosphere – March 2006, David Sifry

    • 30m blogs tracked, doubling in size every 5.5 months, consistent doubling for the previous 36 months.
    • 100’000 blogs created each day – roughly one every second (a quick sanity check on these rates follows this list).
    • 50% of bloggers still blogging after 3 months, 10% of blogs updated weekly, 9% are spam.
    • 60% of pings are from known spam sources – Technorati registers them as splogs.
    • 1.2m posts/day, 50’000 posts/hour.
    • Frequency should be measured in megahertz!
    • Blogging has brought a friction-free publication mechanism – trade magazines are being displaced by blogs.
    • 41% Japanese, 28% English, 14% Chinese, 3% Spanish – Japanese growth has come in the last four months.
    • 50% of blog posts use tags or categories.
    • 81m+ tagged posts, with 400’000 more each day.
    • It’s about exposing community and adding context.
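
    A quick back-of-the-envelope check on those rates (my arithmetic, not Sifry’s):

    ```python
    # Back-of-the-envelope check on the reported figures (my arithmetic, not Sifry's).
    blogs_per_day = 100_000
    print(blogs_per_day / 86_400)           # ~1.16 - a bit more than one new blog per second

    doubling_months = 5.5
    print(2 ** (1 / doubling_months) - 1)   # ~0.134 - roughly 13% growth per month

    print(2 ** (36 / doubling_months))      # ~93x growth over the previous 36 months
    ```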

    Feeds, Eric Lunt

    • Feedburner measures feed traffic for 200’000 feeds.
    • A ManifestDigital visualisation renders feed activity as drops of water on a pond; colour indicates media type and splash radius the peak subscription figure.

    Gauntlet Systems, Adam Messinger

    • Next-generation continuous source control system…no more broken builds or smoke tests…comprehensive analytics.
    • Do the few most productive developers account for the majority of code?
    • Does open source enable a long tail for porting and internationalisation?
    • Example – Two people do 80% of the coding, the rest make changes…the distribution falls away exponentially (see the sketch after this list).
    • Lucene – Three developers do most of the work.
    • Hibernate – Development is evenly distributed in this commercial project.
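
    Gauntlet’s own analytics weren’t shown in detail, but the ‘do a few developers account for most of the work?’ question can be approximated from version-control history alone. A sketch using git as an example VCS (an assumption on my part; the projects mentioned used CVS/Subversion at the time):

    ```python
    # Rough sketch: how concentrated is authorship in a repository?
    # Uses git purely as an example VCS - not Gauntlet's own analytics.
    import subprocess
    from collections import Counter

    def commit_share(repo_path="."):
        authors = subprocess.run(
            ["git", "-C", repo_path, "log", "--format=%an"],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()
        counts = Counter(authors)
        total = sum(counts.values())
        for author, n in counts.most_common(10):
            print(f"{author}: {n} commits ({n / total:.0%})")

    if __name__ == "__main__":
        commit_share()
    ```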

    Windows Live, The Year In Review

    • Queries from Live Virtual Earth and Live Search were rendered as a cloud of keywords.
    • ‘The diversity of activities is mind-boggling’…

    O’Reilly Radar, Roger Magoulas

    • Normalisation – how to spot emerging trends and weak signals.
    • How do you compare things with vastly different scales? (One common approach is sketched after this list.)
    • Correlating AJAX pages and AJAX jobs showed that SF, NYC and Boston were the key hiring regions and that hiring trends were as seasonal as for other languages.
    • Google book searches for O’Reilly titles are driven by searches originating in India (70%).
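
    Magoulas didn’t detail his method, but one common way to compare series with vastly different scales is to rescale each to its own range before overlaying them; a minimal sketch with made-up numbers:

    ```python
    # One common normalisation approach (not necessarily O'Reilly Radar's):
    # rescale each series to 0..1 so trends of different magnitude can be overlaid.
    def min_max_normalise(series):
        lo, hi = min(series), max(series)
        if hi == lo:
            return [0.0 for _ in series]
        return [(x - lo) / (hi - lo) for x in series]

    ajax_pages = [1200, 3400, 9800, 21000, 52000]   # illustrative numbers only
    ajax_jobs = [3, 7, 15, 31, 66]                  # illustrative numbers only

    print(min_max_normalise(ajax_pages))
    print(min_max_normalise(ajax_jobs))
    ```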

    Root.net, Jonas Goldstein

    • Goldstein founded the Attention Trust and built a clickstream recorder that streams data to a personal online vault.
    • Stamen-produced statistical visualisations of browsing activity.

    August Capital, David Hornik

    • Previous six months of Hornik’s email.
    • Email = Mtgs + Schmooze.
    • 17’779 received.
    • Thanksgiving and Chanukah were the only dips!
    • Holidays seem to be the only dips, even when normalised across all VCs.
    • 90 references to Cabo, 159 Hawaii, 198 Wine.
    • Most common email – titled ‘Introduction’ (979 times).
    • hornik@augustcap.com!
  • RFID: A Case Study of the Risks and Benefits of Location-Aware Technologies

    Jen King’s session on RFID (Yahoo!’s Marc Davis, at Berkeley, is King’s professor) began with a review of first principles and the two basic RFID components – a tag/chip/smartcard and a reader, communicating through radio signals. Most current applications aren’t consumer-facing, but largely enterprise logistics, supply chains and inventory control.

    The US E-Passport (containing an ISO14443 contactless 64Kb smartcard) is to be issued by all US passport agencies by the end of 2006. By 2008, the US, Canada and Mexico will require E-Passports for travel. King reviews these examples rather than consumer applications, simply because they are live, have direct impact on people’s lives and raise questions about some of the problems with RFID.

    RFID was selected for the passport because of the difficulty of counterfeiting, remote reading, inclusion of biometric data, ICAO adoption and heavy lobbying by the smartcard industry. Data is signed, but not encrypted, and includes some basic demographics and a JPEG passport photo.

    Security weaknesses with the E-Passport include skimming, eavesdropping and cloning. Originally the US State Department chose not to require encryption, as the information was in the printed copy anyway and encryption would require global infrastructure upgrades and slow the reading process. Following several studies and some criticism, State has now required all E-Passports to include anti-skimming material, though this is also problematic and King recommends an anti-static bag! Also, numbers in the machine-readable zone are now scanned for use as a PIN to maintain the document’s security.
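
    King didn’t walk through the key derivation, but the scheme referred to above (Basic Access Control) works roughly as follows: the printed machine-readable zone seeds the keys a reader needs before the chip will respond. A simplified sketch with made-up values; derivation of the actual 3DES/MAC session keys is omitted.

    ```python
    # Simplified sketch of ICAO Basic Access Control key seeding: the printed
    # machine-readable zone (document number, date of birth, date of expiry)
    # seeds the keys a reader must present. Values below are made up.
    import hashlib

    def mrz_check_digit(field):
        values = {c: i for i, c in enumerate("0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")}
        values["<"] = 0
        weights = [7, 3, 1]
        total = sum(values[c] * weights[i % 3] for i, c in enumerate(field))
        return str(total % 10)

    def bac_key_seed(doc_number, birth_yymmdd, expiry_yymmdd):
        doc_number = doc_number.ljust(9, "<")            # pad to 9 characters
        mrz_info = (
            doc_number + mrz_check_digit(doc_number)
            + birth_yymmdd + mrz_check_digit(birth_yymmdd)
            + expiry_yymmdd + mrz_check_digit(expiry_yymmdd)
        )
        # K_seed = first 16 bytes of SHA-1; K_enc and K_mac are then derived from it.
        return hashlib.sha1(mrz_info.encode("ascii")).digest()[:16]

    # Anyone who has seen the printed data page can compute the same seed.
    print(bac_key_seed("AB1234567", "800101", "100101").hex())
    ```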

    Incidentally, of the 2’335 comments received during the hearings on the development of the passport, 98.5% were negative!

    Despite these problems, RFID hacking is not as easy as might be imagined. ISO14443 readers and tags are not guaranteed to be fully compatible, read-range experiments (the current ceiling is 69ft) are still in progress and the equipment is not portable. However, demand for this type of equipment is likely to increase, and scanners can be located in fixed positions with high footfall, negating issues of portability.

    RFDump.org’s Lukas Grunwald has created an application that reads and writes RFID tags, demonstrated at Metro’s ‘Future Store’ in Germany, where Grunwald managed to swap the prices of cream cheese and DVDs!

    The US-VISIT I-94A forms, for transit through a land port, include an embedded RFID tag. Unfortunately, users have to hold the form against the window, as the metallic body of the car blocks the signal, negating the value of a border system with faster throughput.

    Both these case studies indicate that users, privacy impact and usability were considered as afterthoughts. In the case of the E-Passport, throughput is actually slower than in areas that use printed passports.

    The Real ID Act of 2005 now requires that all ID issued by 2008 include a machine-readable technology, most likely RFID (though barcodes could be employed). Ironically, stronger ID won’t prevent terrorism and makes ID theft more rewarding.

    In conclusion, RFID-enabled products need to be designed with usability in mind and privacy/security concerns cannot be taken lightly. Notably, RFID can be implemented securely with minimal impact on privacy. Most worryingly, it seems RFID adoption and national ID cards in both the US and UK have been driven largely by collusion between the smartcard industry and foreign ministries, with little to no regard for user-centric design…this is borne out in examining public records on the development of ID cards and passports in both countries.

  • Everybody’s It – Tagging With Identity

    Mary Hodder’s session on tagging and identity builds on some of the work from the Identity 2.0 movement, proposing that tagging has value for annotating rich media. Technorati’s tags provide a partial solution, but don’t address how people can include tags on their own site while still participating in communities.

    In usability studies, bloggers requested:

    • Trusted tags.
    • Tags that didn’t require links.
    • Tags with the flexibility to be moved off the issuing site.
    • Visibility vs. everybody.
    • Make their own tag clouds for their blogs.
    • Easier, automated systems.
    • Tagging objects separately from other posts.

    65% of Technorati tags are drawn from blog categories, running to about 10m a month. Users of Hodder’s own Dabble.com tag around 53% of their content. Media from third parties tends not to be tagged – also, the richer the media, the closer tagging draws to 100%.

    Dabble users tend to look at tags and the duration of video clips in order to make decisions about whether to view the content.

    iTags are Hodder’s solution – encompassing a subject+verb model:

    • Tags.
    • Identity (blog URL, known identity, pseudonym, privacy+permissions).
    • Creative Commons licensing.

    Hodder sees value in iTags for expressing licensing to aggregators, uncoupling an object from URLs, and moving to XRIs for structured identities that indicate media and licensing data.
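
    The iTag format itself wasn’t specified in the session notes; purely as a hypothetical illustration of the subject+verb model above, the pieces might combine into something like the structure below (the field names are mine, not the spec’s).

    ```python
    # Hypothetical illustration only - not the actual iTag specification.
    # An iTag couples a tag with who issued it and under what licence.
    itag = {
        "object": "http://dabble.com/video/12345",       # what is being tagged
        "tags": ["etech2006", "conference"],
        "identity": {
            "blog": "http://example.org/~mary/",         # known identity or pseudonym
            "visibility": "public",                      # privacy + permissions
        },
        "license": "http://creativecommons.org/licenses/by-nc/2.5/",
    }
    ```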

    Interestingly, the iTags development team includes Kaliya Hamlin, one of the contributors to YADIS, suggesting that our work on Simpatico (using LID) could be extended with iTags.

  • Feed To The Future

    Feedburner’s Eric Lunt opened by presenting the growth rate of RSS subscriptions – ranging from 221’375 feeds in January 2005 to 9’547’171 by February 2006 (source).

    Subscriptions are outpacing feed recognition: report after report shows that people are unfamiliar with the terms, but subscriptions continue to grow dramatically.

    Subscriptions are becoming embedded into more worlds – Democracy TV, iTunes’ podcasting directory, OS X screensavers, My Yahoo and Slide are examples of services where feeds are invisible to the user.

    In examining the question of whether publishers should offer full or partial feeds, Lunt relates that full feeds are out-subscribed by partial feeds by a factor of ten. However, partial feeds grow at the same pace, and experiments in reducing the content of a feed don’t substantially alter click-through rates to the parent site.

    Lunt recommends that publishers focus on items rather than feeds. Filtering content by tags and searches drives and increases item-level distribution.
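
    A rough sketch of what ‘focus on items rather than feeds’ can mean in practice: filtering a feed down to the items that carry a given tag, using the feedparser library (the filtering logic is my own illustration, not FeedBurner’s).

    ```python
    # Sketch: item-level filtering of a feed by tag (illustrative, not FeedBurner's code).
    import feedparser  # third-party: pip install feedparser

    def items_matching(feed_url, wanted_tag):
        feed = feedparser.parse(feed_url)
        for entry in feed.entries:
            terms = {t.get("term", "").lower() for t in entry.get("tags", [])}
            if wanted_tag.lower() in terms:
                yield entry.get("title", ""), entry.get("link", "")

    for title, link in items_matching("http://example.org/atom.xml", "rss"):
        print(title, link)
    ```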

    Several hundred clients in 2004 expanded to a thousand clients in 2005, and now thousands today. The progression from aggregators to filters, browsers and now AJAX home pages is continuing – the proliferation of readers is not leading to consolidation, but driving innovative approaches, indicating the market is still to play for.

    Subscriptions are growing to the point where feeds are becoming the principal mode of interaction with web content. This indicates that understanding the consumption of feeds, their content, audience, distribution, aggregation and usage is also growing in importance.

  • Rich Local & Social Experiences

    Jointly presented by Meetro and PlaceSite, this session explored various locative media developments. The central question posed, ‘Who are you NOT meeting right now?’, is particularly appropriate to the conference environment.

    Location is deconstructed as one of the principal driving factors in community. Regardless of the mediated nature of networked communities, physical presence and location largely shape our participation in the places we work, live and play. Interestingly, realtime social networks with a community dimension were seen as a critical carrier of locative media in the future.

    Interestingly, IBM’s experience of collaborative media in the enterprise has been very successful, but organising locative meetings, even in the same city, required ad-hoc organisational mechanisms that didn’t exist – Meetro addressed this need for IBM.

    Meetro has become a ‘neighbourly’ tool, enabling people within proximity to share resources – can I borrow your vacuum?

    Amongst the lessons learned by PlaceSite and Meetro in launching their services:

    • Give users options to represent themselves through photos or avatars.
    • Proximity drives meetups, particularly serendipitous, spontaneous meetings within a few blocks.
    • Absolute location shouldn’t be revealed; abstracting to a radial ‘aura’ around an actual location is more desirable (sketched below).
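
    Not from the session itself, but a minimal sketch of the ‘aura’ idea: report a point jittered uniformly within a chosen radius rather than the true coordinates.

    ```python
    # Sketch of the radial 'aura': never expose the true coordinates, only a point
    # jittered uniformly within a given radius. Radius and example values are illustrative.
    import math
    import random

    def blur_location(lat, lon, radius_m=500):
        distance = radius_m * math.sqrt(random.random())  # sqrt keeps density uniform over the disc
        bearing = random.uniform(0, 2 * math.pi)
        dlat = (distance * math.cos(bearing)) / 111_320   # metres per degree of latitude
        dlon = (distance * math.sin(bearing)) / (111_320 * math.cos(math.radians(lat)))
        return lat + dlat, lon + dlon

    print(blur_location(37.7749, -122.4194))  # somewhere within ~500m of the true point
    ```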

    Where Meetro is tied to the person, PlaceSite is largely linked to actual locations – this is a useful distinction in the models pursued by various locative media.

    PlaceSite’s location-centric approach enables existing social boundaries to be extended into locative media and ensures an intrinsic community is related to an actual location. PlaceSite also offers router code which can PlaceSite-enable a WiFi hotspot; partnerships are in place with wireless providers Sputnik and Wavestorm.

    Longer-term plans for PlaceSite include addressing housing developments, conferences and muniwireless zones, an open API, and the general transformation into a platform and network provider.

    As we consider FON-enabled Wanadoo Liveboxes to create a public wireless network for Wanadoo broadband subscribers, a Livebox+FON+PlaceSite combination offers a more compelling user proposition. Each Livebox could be transformed into a hub for local users, advertising, organisations, media and companies.

    A Wanadoo Placebox could illuminate neighborhoods with connectivity and information – a digital lamppost!

  • The New Community

    Communities occur when people have the ability to use their voice in a public and immediate way, forming intimate relationships over time.

    Web 1.0 communities were the era of company towns. You can use your voice, but only within the format and rules of the bossman…The Well, Salon Table Talk, Builder Buzz. The new generation of communities is increasingly self-powered and independent…Dooce, Kottke, BoingBoing. The differentiator is that no one can turn the new generation of communities off.

    There is a connective tissue that is powering distributed communities – blogs, comments, trackbacks, tags, APIs, blogrolls, referrers and links. Third-party aggregators such as Technorati, Bloggies, Photoblogs.org and ORblogs also have significance in the connectivity between communities. Indeed, this tissue forces better behaviour from all participants in an extended community.

    Memes are increasingly the fabric of communities online – where user contributions might lie buried within a forum post, connective tissue is enabling memes to spread further and wider, outside the constituencies where they would previously have remained…SelfPortraitday.com, Whiskerino, BlogThis quizzes.

    However, with no centralised authority or moderation, community scale that exceeds the personal, and complex tools, the new generation of communities is creating its own problems.

    Flickr, YouTube, MySpace, Friendster, LiveJournal, Typepad and Last.FM are the reference examples of the new generation of communities, with a mixture of 1.0 models which have evolved and pure models fashioned from current thinking.

    The session closed with some guides to community building…

    • Treat your community well, don’t prevent them from leaving.
    • Go to where your community is – create a group in Flickr rather than a new photo service.
    • Decentralised community mirrors real community more closely.
    • Move towards a community affiliation cycle – grow up in your parents’ house, then move out on your own and buy a house.
    • Blogs have forced older closed models to interact with the rest of the world.
  • Playsh, the playful shell

    Playsh is a ‘narrative-driven "object navigation" client, operating primarily on the semantic level, casting your hacking environment as a high-level, shell-based, social prototyping laboratory, a playground for recombinant network toys.’

    More literally, playsh is a command-line interface that uses MUD and text adventure conventions to navigate and manipulate the web. Features include:

    • looking for patterns in source code.
    • navigating URLs geographically through ‘rooms’ or as a deck of cards in your hand (!) – see the toy sketch after this list.
    • opening feed items as ‘doors’.
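
    To make the metaphor concrete, here is a toy sketch of my own (not playsh’s actual code) that treats a URL as a ‘room’ whose links are exits you can walk through.

    ```python
    # Toy sketch of the 'URL as room, links as exits' metaphor - not playsh itself.
    import re
    import urllib.request

    def enter_room(url):
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
        title = re.search(r"<title>(.*?)</title>", html, re.S | re.I)
        exits = re.findall(r'href="(https?://[^"]+)"', html)
        print("You are in:", title.group(1).strip() if title else url)
        for i, link in enumerate(exits[:10]):
            print(f"  exit {i}: {link}")
        return exits

    # exits = enter_room("http://example.com/")
    # enter_room(exits[0])   # 'go' through the first exit
    ```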

    Superficially, playsh appears to be a command-line interface for the web, lacking the intuitive nature of YubNub – though YubNub, in turn, lacks the ability to pipe data from one silo into another. While Webb’s motivations for exploring recombinant interfaces and playful metaphors are appropriate and valuable, playsh itself doesn’t seem to address them.

    Maybe I’m missing something, but it’s difficult to see the value here…