
Follow Everything by a Select Few, Select Content by Everyone

Item #1: Fred Wilson tweet:

@timoreilly i want to follow less people and more keywords in my twitter timeline. can’t wait for summize to get integrated into twitter

Item #2: Adam Lasnik FriendFeed post:

I switched over to reading mostly a ‘subgroup’ (“Favorites”) on FF, and was missing the serendipity of new voices. One way I’ve remedied that is to do searches on some of my favorite things (“a cappella”, “lindy hop”, etc.) and see who and what comes up.

Item #3: Steve Gillmor blog post:

A small number of Follows combined with Track produces a high degree of coverage on a daily basis.

The three items above share a common theme: limit the number of people you follow. At first, this sounds obvious. Isn’t that what people would normally do? Well no, it’s not. In social networks, there’s a dynamic whereby people tend to return the favor when someone follows them. This builds up your follows over time. As Louis Gray noted in a recent post:

While you might be following thousands of people and making new “friends” on Facebook, LinkedIn, Twitter, FriendFeed and all the other networks, you would likely hesitate before sending them an open invitation to your home.

“Thousands of people”? I’m doing it: following 1,000+ people on FriendFeed and 600+ on Twitter. For seeing a broad range of information and opinion sources, it’s great to track so many people.

But there is a big downside. Much of what I see doesn’t interest me. The greater the number of people you follow, the more content you will see that falls outside your areas of interest. Putting this into attention terms, for any given minute you spend on a site, what is the probability you will see something that interests you?

It’s an odd phenomenon. I actually like following a lot of people, because it increases the total number of interesting items that cross my screen. But it lowers the rate at which they appear. As you follow people who stretch outside your core interests, their streams carry a higher percentage of stuff you don’t care about, and the overall probability of seeing content that interests you declines.

I want to differentiate this idea from Dunbar’s number, which describes limits on people’s ability to maintain interpersonal relationships. I’m not talking about interpersonal relationships. I’m talking about information foraging.

What Are You Trying to Get from Your Social Media?

I enjoy following people that stream content outside my normal range of interests, such as Anna Haro on FriendFeed. It’s important to step outside the things that regularly occupy you, if you want to grow.

But the three items above show there is another rationale for people to participate in social media. Rather than seek content outside their interests, they want a concentrated dose. Personally, I’m finding I need this professionally. The Enterprise 2.0 space (my field) is fluid, and undergoing the stress of the global recession. Tracking the news, ideas, perspectives, trends and relationships is critical. For example, the microblogging trend (e.g. Yammer) is new and I’m interested in seeing how that plays out.

If you can see the point of that social media use case, you can understand the value of this idea:

Follow everything by a select few, select content by everyone

As I noted in my last blog post, I’m tracking everything for a select group of Enterprise 2.0 people, and keywords/tags for everyone.

In terms of the three items with which I started this post, Fred Wilson describes this approach. Adam Lasnik isn’t too far away. His manual searches for “a cappella”, “lindy hop”, etc. could be turned into persistent searches to find new content and people. Steve Gillmor hews closer to the social media whale philosophy, where he only wants to follow a specific set of users and then interact via @replies on Twitter. But even Steve could add keyword tracking via a FriendFeed Room as a way to improve his daily “coverage”.
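Mechanically, a persistent search doesn’t take much. Here’s a rough Python sketch (2.x-era syntax) using the third-party feedparser library to poll keyword search feeds; the feed URLs and the five-minute interval are illustrative assumptions, not a prescription:

    import time
    import feedparser  # third-party library: easy_install feedparser

    # Hypothetical search feeds -- swap in whatever service you actually track.
    FEEDS = {
        'a cappella': 'http://search.twitter.com/search.atom?q=%22a+cappella%22',
        'lindy hop': 'http://search.twitter.com/search.atom?q=%22lindy+hop%22',
    }

    seen = set()  # entry IDs we have already surfaced

    while True:
        for topic, url in FEEDS.items():
            for entry in feedparser.parse(url).entries:
                if entry.id not in seen:
                    seen.add(entry.id)
                    print '[%s] %s' % (topic, entry.title)
        time.sleep(300)  # re-run the searches every five minutes

Every new match surfaces once, whether or not you follow its author, which is exactly the serendipity Adam Lasnik was after.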

Will This Trend Grow?

I’m a fan of this use case. It fits my needs professionally. It’s almost like I have my 9-to-5 social media, and then my nighttime social media.

I suspect this use case will make more and more sense as social media expands its mainstream footprint. Information workers are the ones who will be most interested. The hardest part is figuring out which keywords/tags to follow, what sites to track and what mechanism to use for this tracking. I’d argue FriendFeed with its Rooms and Lists is perfect for this, but certainly there are other ways.

One final thought. If this trend takes hold out in the wider market, I can see people practicing a little SEO on their content. Get those hash tags in your tweets to make sure Fred Wilson will see your content (if he ever reveals what he tracks).

For kicks, I’m curious what you think of this idea. Please take a second to answer the poll below. If you’re reading this via RSS, click out to participate in the poll.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22Follow+Everything+by+a+Select+Few%2C+Select+Content+by+Everyone%22&who=everyone


Workplace Productivity vs. Tracking the “Flow”

I mentioned in an earlier blog post that at Defrag last week, Stowe Boyd gave a presentation on following a cascade of information, a flow. While I couldn’t attend it, what I heard from others was that Stowe argued there are no limits on people’s attention. People can get work done and track information in real-time simultaneously. It is all a matter of training.

A common opinion I heard from others was that this was BS. Workers have things to get done, and cannot spend their time watching a ticker of information going by.

I know there are dedicated social media/blogging types who do swim in the cascade of information via apps like Twhirl, TweetDeck and other clients. It makes sense for these folks – they live a life of staying up-to-date on what other social media types are talking about, and engaging others in real-time conversations.

But does that fly inside the workplace? It’s hard to imagine your average worker watching a constant stream of information. (a) They likely don’t care. (b) It seems to imperil their productivity.

Yet should information workers care about what’s happening in their field? And does this flow really affect their productivity?

I’m not one of those with an Adobe AIR client feeding me updates from Twitter and FriendFeed. I’m generally resistant to client-based apps, and I don’t feel the need to track the flows so much. But on FriendFeed, I found myself continually going back to the site to check my Enterprise 2.0 List. This list consists of entries from the Enterprise 2.0 Room plus the feeds of a number of people who are active in the space.

Well if I’m going to constantly go back to that List on FriendFeed, why not bring the real-time updates to me? So I’ve been experimenting with running my Enterprise 2.0 List in real-time on my work computer this week.

Getting Work Done and Enjoying the Flow

Here’s a picture of my screen, with a Word document open to the left, and FriendFeed real-time opened in a mini window to the right:

[Screenshot: Word document on the left, FriendFeed real-time window on the right]

I have two screens at my desk. A flat screen monitor, and my laptop screen. The graphic above is from the monitor, which is big enough to allow two windows.

Here are a few thoughts about adding flow to my daily work.

I already have an ADD work style: I’m probably not alone in this. Since way back when I was a banker writing client pitches and offering memoranda, I’ve had a hard time writing something straight through over the course of an hour. I just can’t do it. I’ll write something, then I need a break. I don’t know why that is. If I trudge through the writing without a break, the quality suffers.

Thus the real-time updates are a welcome break as I write.

The pace of updates isn’t too fast: Not that FriendFeed real-time couldn’t handle it. There are 33 people in my Enterprise 2.0 List currently (the Enterprise 2.0 Room is one of them). They tweet a lot, rarely interact on FriendFeed, post blogs and share/bookmark articles. The pace of updates seems to average once every couple of minutes, with a decent-sized standard deviation.

If I had real-time up for my FriendFeed home page, where I’m tracking over 1,000 people, I imagine the movement in the screen to the right would be constant. That would be too distracting.

I feel more on top of my game: Let’s talk about the reason you’d track the flow. By having this up, I’ve got a really good sense about the ideas, arguments, conferences, information and relationships that are going on out in the Enterprise 2.0 world. Professionally, I’ve never been so aware of the goings-on. A lot of this I feed back internally here at Connectbeam.

I also love seeing the @reply tweets of the people I’m following in the real-time view. I’m finding more interesting people to follow on Twitter as a result. Some of these folks end up on my Enterprise 2.0 List.

I’m still getting my work done: And this is the crux of the experiment. I’m still getting work done while that real-time window is up.

There Are Limits to Our Attention, But I’m Not Approaching Those

Probably the single biggest factor making this flow thing work for me is that I’m not bombarded with an update every second. I think the Defrag attendees who thought Stowe was talking crazy were probably picturing one-update-per-second flows. If that were the case, then yes, it would be a mistake to try this.

But a more limited flow built on a select group of people and a feed of keywords is quite manageable. And actually really beneficial.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22Workplace+Productivity+vs.+Tracking+the+%E2%80%9CFlow%E2%80%9D%22&who=everyone

Newsletters Are Still Viable? How I Approached My First Newsletter Email

In some ways, I’m the worst guy to be in charge of emailing a newsletter out on behalf of my company Connectbeam. I’m very dismissive of spam email. I hang up on telemarketers without guilt, after quickly saying “put me on your do not call list”. I ignore newspaper and website ads, and I don’t watch commercials.

I like my little cloistered world…

But newsletters are back in vogue it seems. Longtime blogger Jason Calacanis famously dropped his blog for an email newsletter. Chris Brogan maintains a newsletter. I subscribe to a newsletter provided by analyst relations firm SageCircle.

Clearly there continues to be life in newsletters despite the advent of RSS. I guess I should rephrase that. The dominant form of online information distribution is email, with RSS still a small part of the pie. And email does have some advantages – people spend more time there in a business setting. It also has an outreach aspect that bugs the hell out of A-List bloggers, but can be less intrusive for everyday people.

As a young company looking to expand its brand and message, Connectbeam needs to consider the newsletter a part of its overall engagement strategy.

So I recognize the importance of it, even as I’m probably the last person who would read anything like this. Which, in a way, made me well-suited for tackling this.

Making It More Than a Typical Company Marketing Piece

The email went out to 253 people – we didn’t spam some purchased list of thousands of names. The subject line was: “Social Software During a Recession – Connectbeam Nov 2008”. I wanted the email to be topical, not some spam about a product release.

Here’s the email (also available online):

[Image: the Connectbeam newsletter, November 2008]

One of our HTML guys put together a good-looking layout. The email went out yesterday morning, and here are two stats on it so far:

  • 28% opened
  • 2 unsubscribes

My overall objective is to make the email useful, and to build Connectbeam’s presence out in the market. If I’m successful in the former, I believe I’ll be successful in the latter. A third objective is to advertise upcoming webinars. That’s going to be an ongoing battle, as I’ll describe below.

There are four sections highlighted in the email graphic. Here’s what I was doing with each of those.

1. Opening Message

The first challenge is getting people to open the email. Once you’re through that hurdle, next you’ve probably got 5 seconds to catch their attention. My guess is that people will do a quick scan of the different sections, then read the opening sentence at the top of the newsletter.

In writing the opening message, I essentially wrote a mini-blog post. Readers of this blog know I’m a big fan of linking to others’ work, and this was no exception. I linked to a nice post by Jevon MacDonald on the FAST Forward blog. Then I added my two cents. Right off the bat, I wanted to give the reader something useful. A couple people clicked on the link to Jevon’s post.

Something else that seemed important – putting my name on the email. I’ve been immersed in social media enough to know that a soul-less corporate entity as the sender immediately loses some of the engagement. It comes across as a pure marketing exercise. So I wanted my name on there.

The other thing about putting your name on it? It raises your own expectations for the utility of the email. After all, people are going to associate its quality with you.

2. Three Things We’re Reading

There are three objectives with this section:

  1. Give the reader links to information that they may find useful
  2. Provide links that fit a theme for the newsletter
  3. Connectbeam is all about collaborative information sharing – so practice what we preach

Based on the click stats, this seems to have been a successful part. That first link, to the MIT study of a company’s implicit social network (I blogged about it here), has been clicked 18 times. Clearly people were digging that one. The IBM tagging savings story was clicked 7 times, which is not bad either.

My intention with this section is to design something that will be useful to recipients. Even if you currently have no interest in Connectbeam, you’ll find enough value in these links to continue receiving the email. That’s why I’m particularly attuned to the unsubscribe stats. Having only two people unsubscribe so far is a good start from my perspective.

There are no silver bullets with this newsletter program. It’s not like I expect people to sign contracts after reading the email. I’m looking at the newsletter as a long term brand-building exercise, and as a basis for increased engagement over time. But the only way that works is if they agree to continue receiving it.

3. Upcoming Webinar

Alas, this part of the email has not gotten a lot of love so far. I understand. Webinars are a time commitment. People have to make their choices.

Enterprise vendor webinars are a tough sell. I’m starting to appreciate the finer points of webinars. When I was at BEA, I led a webinar for social search inside the enterprise, talking about general issues and the Pathways application. We had 80 attendees, including many from the Fortune 500 set. It established a baseline for me on these things. But the driver of that level of attention? BEA was a significant presence in the market. Many, many companies had BEA portal software, and were curious about the new social computing applications available for that.

Connectbeam isn’t BEA. We don’t have nearly the presence. So a webinar by our company doesn’t yet have the fertile ground that BEA’s did.

One trick I’ve seen companies employ (which we even did at BEA) is to partner with a well-known analyst or consulting firm, or with a big-name vendor. SocialText has done these with Forrester. NewsGator has worked with Microsoft. There’s an upcoming $100 webinar (yes, attendees pay $100!) by market research firm Radicati with Atlassian, SocialText and Telligent.

This webinar partnering idea is one I’m going to look into more.

4. News and Events

In this section, I describe the recent 3.1 release of Connectbeam’s application. This is an area where I can give an update on what is happening with Connectbeam. It’s the closest thing we have to an annoying email PR blast about what we’re doing.

But integrated into a useful email with other parts, I think it works. This section will move up in the email when I don’t have any upcoming webinars to tout.

Any Suggestions?

You now know my approach and objectives with this email program. I’m the guy who doesn’t like these things, put into a position of sending them out. And Connectbeam isn’t a major name like Google or Oracle, so there isn’t a ready-to-read audience out there. This stuff takes some hustle and experimentation.

If you have any thoughts on what you see, or what’s worked well for you, I’d love to hear it.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22Newsletters+Are+Still+Viable%3F+How+I+Approached+My+First+Newsletter+Email%22&who=everyone

Social Media “Glue” and Gnip’s Co-opetition with FriendFeed

We believe that enabling web technologies are going through a similar development cycle as enterprise application integration technology did 10+ years ago. Companies are creating tools, applications and platforms to enable more productive and automated uses of resources that have become ubiquitous parts of the online ecosystem. We think about these enabling technologies as the glue that will increasingly hold together that ecosystem.

Seth Levine, Foundry Group

Venture capital firm Foundry Group, which includes partner Brad Feld, described an important investment theme for their firm. Titled Theme: Glue, the thesis is that the growing number of web services and content-generating sites are causing increased complexity, and that there is a need for an infrastructure to handle all this.

This can seem a bit dry (“I know this back end plumbing stuff is boring to most of you”, as Michael Arrington says), but it becomes quite relevant once you look at the problem Gnip was built to solve.

Foundry’s thesis extends beyond server-load management. But its initial investment in Gnip starts on that part of the “Glue” story.

The Problem Gnip Solves

The rise of user-generated content has made this problem particularly acute. We’re creating so much content, all over the place. Flickr, Del.icio.us, Digg, YouTube, Twitter, WordPress.com, Google Reader, etc. I mean, there’s a lot of stuff!

It turns out a lot of other sites want to consume this stuff – FriendFeed, Plaxo Pulse, Strands, SocialMedian and many, many others. And the production and consumption of all this content is only going one way: up.

The problem that arises is how consuming services, such as FriendFeed, find out whether you’ve got a new tweet, blog post, Digg, etc. The consuming services need to ping every individual user’s account on a given site, such as Flickr, with the query, “got anything new?” For most people, the answer is no. But those queries generate a lot of traffic and latency. Imagine all these new web services polling all the UGC sites en masse. It can be quite a load to handle. In Twitter’s case, it was too much to handle.
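To see why that gets expensive, here’s a rough sketch of the naive polling loop from a consuming site’s point of view. Every name below is a placeholder, not any real site’s API:

    # Naive polling: ask every producer site about every user, every cycle.
    # fetch_updates() stands in for an HTTP call like
    # "GET /users/<user>/activities?since=<timestamp>".

    def poll_everything(producer_sites, users, fetch_updates):
        new_items = []
        for site in producer_sites:        # Flickr, Digg, Twitter, ...
            for user in users:             # every registered user
                items = fetch_updates(site, user)  # one request per user per site
                new_items.extend(items)    # usually comes back empty
        return new_items

    # Ten producer sites and a million users, polled every few minutes,
    # means millions of requests per cycle, almost all answering "nothing new".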

Gnip addresses this issue, standing between UGC sites and consuming sites:

[Diagram: Gnip sits between the UGC producer sites and the consuming sites]

The UGC sites (aka producers) push updates for their various users to Gnip. These updates are either change notifications, or full content for each user. So Gnip becomes the reference layer for anything occurring for a UGC site’s users.

The consuming sites then look to Gnip as a “single throat to choke” in terms of updates. Gnip handles the updates, and gets them out to consuming sites in real-time. Gnip also removes the burden on consuming sites to write and maintain polling scripts for all the various UGC sites their users post to.

The idea is a good one, as it offloads a lot of the burden for both producers and consumers. Of course, it shines a light on Gnip’s scalability and uptime stats.

Gnip has already signed up an initial set of consuming sites.

Gnip is off to a nice start. But what about FriendFeed?

FriendFeed’s Ex-Googlers Roll Their Own

FriendFeed is one of those consuming sites. But they’ve not signed on for Gnip so far. Not surprising, considering their Google background. Lots of good knowledge about scalability to be learned from Google.

Rather than sign on to Gnip’s service, FriendFeed has proposed the Simple Update Protocol (SUP). What’s SUP?

SUP (Simple Update Protocol) is a simple and compact “ping feed” that web services can produce in order to alert the consumers of their feeds when a feed has been updated.

The idea is that the UGC sites provide a single point for posting notifications of new user activities. Rather than the consuming sites running the “got anything new?” query for every single user on their platform, they go to a single place to see what’s new. They have a list of the user IDs they want to check, which they run against the SUP location. Much more efficient.
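Here’s a rough Python sketch of the consuming side of SUP, based on my reading of the draft spec; treat the field names and the SUP-ID values as assumptions:

    import json
    import urllib2

    # SUP-IDs are opaque tokens the producer assigns to each user's feed.
    # These particular mappings are made up for illustration.
    TRACKED = {
        '45bd63f8': 'http://example.com/feeds/anna',
        '8fb2cd53': 'http://example.com/feeds/louis',
    }

    def changed_feeds(sup_url):
        """One request reveals every tracked feed that changed recently."""
        doc = json.loads(urllib2.urlopen(sup_url).read())
        # Per the draft spec, "updates" is a list of [sup_id, update_id] pairs.
        changed = set(sup_id for sup_id, _ in doc.get('updates', []))
        return [feed for sup_id, feed in TRACKED.items() if sup_id in changed]

    # Instead of polling every user's feed, poll one URL, then re-fetch
    # only the feeds that actually changed.
    for feed_url in changed_feeds('http://example.com/sup.json'):
        print 'fetch new entries from', feed_url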

Which does sound a little like Gnip, doesn’t it? Here’s a Q&A between Marshall Kirkpatrick of ReadWriteWeb and FriendFeed co-founder Paul Buchheit:

RWW: [Where is this relative to] Gnip? (See our coverage of Gnip, a startup that appears to be aiming to do what SUP will do and more.)

Buchheit: We’re talking with several companies about supporting SUP, but aren’t ready to announce anything.

On the TechCrunch post about Gnip 2.0, commenter Nikolay Kolev writes:

Even if I like Gnip’s concept a lot at this moment, I think it’s just a temporary solution of the real problem. It solves deficiencies of the vast majority of the data producers nowadays, yes, but if more implement XMPP PubSub, FriendFeed SUP and other similar technologies, there will be less incentives for data consumers to make their business rely on a single provider that supposedly aggregates and replicates all of the Web…

On FriendFeed, user Dani Radu writes this in response to Gnip:

pretty interesting, I mean making all this data handling drop dead simple is great – but this means they want to cache and route the direction of interest. Own the process and sell it. Again, perfectly sane – but I’d rather go for the SUP (simple update protocol) which in a way – if adopted widely – does the same but keeps the handling free (as free as the services are anyway) We shall see what future brings tho…

The Gnip vs. SUP question came up on Hacker News, which included this exchange with FriendFeed’s Paul Buchheit:

ryanwaggoner: Isn’t this what Gnip is doing, except that Gnip’s solution is readily available to anyone who wants it? In fact, I believe Gnip uses XMPP to push notifications to data consumers, which seems even more efficient. Am I missing something?

paul: No, Gnip is a complementary service and will likely consume SUP. SUP is intended to make it easier for feed publishers to expose information about which feeds have been updated. Without this information, Gnip can’t know when feeds have updated except by polling all of them. SUP allows them to poll a single URL instead.

ryanwaggoner: Got it. So this is designed to be the piece that allows publishers to easily integrate with intermediate services like Gnip, or with aggregation services like FF, SocialThing, etc.

paul: Exactly

Paul’s right, but the earlier comments are also right. Gnip may well use SUP to get updates from the UGC producer sites it covers. But there’s truth to the idea that if producers offer SUP, some of Gnip’s value proposition is eroded.

Gnip: More Than Real-Time Updates

But Gnip appears to provide a range of services above and beyond simple update notifications. My guess is that those extra services will be its secret sauce.

On the Gnip blog, product head Shane Pearson lists the four use cases on which Gnip is focused:

  1. Eliminate the need for developers to write dozens of pollers for UGC sites, all of which must be maintained and updated
  2. Target business-specific applications that need this data. There may be interesting functional or vertical applications that SUP won’t cover.
  3. Offload the overhead on UGC producers’ sites (which sounds like SUP). But beyond that, create an alternative channel for their content, provide analytics on the data consumed through Gnip, and add filters and target endpoints.
  4. Use Gnip as a source of market research and brand analysis for what consumers are saying about companies.

So what you see here is that the developer world sees SUP as competition for the infrastructure part of Gnip. Gnip, meanwhile, is looking beyond the developer world in terms of where it delivers value.

I wouldn’t be surprised if other companies enter the mix. “Glue” is an early, interesting space right now.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22Social+Media+%E2%80%9CGlue%E2%80%9D+and+Gnip%E2%80%99s+Co-opetition+with+FriendFeed%22&who=everyone

A Promising Future for Newspapers

[Image: New York Times front page, November 10, 2008]

Item #1: FriendFeed Widget Motivates Reporters to Use Social Media:

“This last week, I have been busy reorganizing our major financial blog, Bear&Bull, adding FriendFeed widgets in hopes of encouraging more audience interaction. The results have been surprising — although the audience has been slow to react, the changes have motivated many of my normally technophobic colleagues to start using video, pictures and live-blogging techniques.”

Item #2: Al Gore speaking at Web 2.0 Summit (thanks to a Dion Hinchcliffe tweet):

“Gore says regulate the Internet as little as possible and says there is a future for journalists in curating content/new media. #web2summit”

Item #3: Forrester analyst Jeremiah Owyang on a “freemium” business model for analysts:

“Talking to @susanmernit about analysts sharing. I told her I give the appetizers away for free –but still charge for entres. It’s working”

Newspapers continue to suffer declining readership, hitting their bottom lines hard. Robert Scoble started a good FriendFeed/blog post around this. Two ideas I read there were:

  • A la carte funding – you only pay for the specific categories of news you like
  • Crowd funded reporting – consumers pay upfront for specific stories to be created by journalists

A la carte is interesting, and is worthy of further exploration. Crowd funding won’t make it. A critical mass of people will not take the time to fund specific stories. Forget that idea – it requires too much engagement from an audience that would just turn its attention elsewhere.

I’d like to suggest a different possibility that builds on the existing advertising and subscription models, while leveraging journalism’s historic role in the context of modern social media. Journalists have traditionally played a role as information filters. That is, they are dedicated practitioners of finding information, evaluating what’s true, determining what’s relevant and providing it to a wide audience.

Using that definition of journalism, the items at the start of this post point toward a promising future for journalism. Think about it. Journalists are the original information junkies. They have to be. Their livelihood depends on being better informed than most of us.

This positions them well to provide a stream of content to readers beyond the normal daily articles that are the staple of newspapers. Rather than only the single daily articles they deliver today, here’s what a future set of content looks like for reporters:

  1. Longer, well-developed articles
  2. Quick blog posts
  3. Twitter messages
  4. Sharing content created by others

#1 above is the stuff of today’s newspapers. It doesn’t go away. Look how much power a daily has – New York Times and Wall Street Journal articles drive a lot of linking, as seen in the Techmeme Leaderboard. That’s just the online effect. And unlike much social media content, newspaper articles still adhere to high standards for sourcing, find nuggets from people most of us don’t have access to, and bring a wealth of facts and voices to the stories. This type of content continues to have value.

#2 and #3 are the lighter-weight stuff. This is flow information. The tidbits that a reporter gets after talking to a source. The legislative maneuver that will affect how new laws will look. The dissatisfaction expressed by a customer. The filling of a key company or government position.

#4 is a nod to the research and content that informs the worldview of the reporter. Reporters find useful information for the beat they cover, and would be great sources for Del.icio.us bookmarks and Google Reader shares.

The Bear & Bull blog is part of the Mediafin publishing company in Belgium. The FriendFeed widget is a great example of #2 – #4 above. Sounds like reporters are intrigued with it.

Combining Flow with Subscription-Based Revenues

Two revenue models are available:

  • Lightweight flow = advertising
  • Articles = advertising, subscriptions

I can see a newspaper’s website filled during the course of a day with content generated by reporters. A lot of that content will be great standalone stuff. It should make readers want to come back to the site to see what’s new. Tweets, blog posts and shared items all displaying on the newspaper’s web page.

The Jeremiah Owyang tweet above points to another element of the future newspaper. He describes providing appetizers to potential customers. Enough to give them some information. But if they want to know the full story, they need to pay Forrester. This idea applies to newspapers as well. Reporters will reveal just enough to give a sense of a story, but not so much that readers really know it. Readers will need to read the newspaper article for that. Note that the article need not wait until the next morning. It goes live when it’s ready.

One area that benefits from this approach is the important but less popular beats. These may not get as much attention, but newspapers can retain reporters to continue an important role in recording society’s history. A lot of the less popular beats may “just” get coverage via blog posts and tweets. But that still provides visibility for them.

Curated Sources of Information

As Al Gore opined, the future of journalism includes a vibrant role in curating the chaotic mass of data out there. This view appears to be shared by watchers of the newspaper space. On the Printed Matters blog, here’s a quote from Journalism is important:

In a world where anyone can post, use and re-use the news, what is the role of the professional?

Professional journalists are more important than ever in a world of oversupply. We need credible people, people we can trust, to sort the wheat from the chaff, to make sense of the barrage, to order things.

That statement appears to rally around traditional newspaper articles, but I think it applies to an expansion of journalism’s mission. Newspapers are a huge attention platform. Entrepreneurs try to get the attention of TechCrunch, ReadWriteWeb, Mashable and Robert Scoble. Why? Because they command a huge audience. Well, so do newspapers. People and organizations from all parts of society – business, government, fashion, etc. – will continue to be interested in getting coverage by newspapers. Of course there’s a need for the continuing role of sorting “the wheat from the chaff”.

And lest we forget, mainstream consumers don’t hang on every utterance of Steve Jobs or what Google is releasing today. I like the way Rob Diana put it on his Regular Geek blog:

People have been calling for the death of newspapers for quite some time. In their current printed form, they may be dying. However, we are already starting to see the evolution from a printed newspaper to the online version. Who is going to be leading the charge of RSS content for the mainstream user? Newspapers. Why? They understand what the mainstream user wants. I think we, the techies, have forgotten that.

His post focused on adoption of RSS, but I think he’s hit on an important piece of the puzzle. Newspapers are way ahead of everyone else in understanding what interests the mainstream. As the public moves to the web for news, sure they’ll go on Facebook and Twitter. But their core interests haven’t changed.

If newspapers can adapt social media tools to their (1) historic information filtering role; and (2) understanding of the interests of the mainstream, I’m betting on a bright future.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22A+Promising+Future+for+Newspapers%22&who=everyone

Defrag 2008 Notes – Picasso, Information Day Trading, Stowe “The Flow” Boyd


One of the most consistently provocative conferences I attended last year — my own Money:Tech 2008 aside, of course — was Eric Norlin’s Defrag conference. Oodles of interesting people, lots of great conversation and all of it aimed at one of my favorite subjects: How we cope with the information tsunami.

Paul Kedrosky, Defrag 2008 Conference

I spent two days out in Denver earlier this week at Defrag 2008 with Connectbeam. As Kedrosky notes above, the conference is dedicated to managing the increasing amount of information we’re all exposed to. Now my conference experience is limited. I’ve been to five of them, all in 2008: Gartner Portals, BEA Participate, TechCrunch50, KMWorld, Defrag.

Defrag was my favorite by far, both for the subject matter discussed and for the people. The conference has an intimate feel to it, but a high-wattage set of attendees.

In true information overflow style, I wanted to jot down some notes from the conference.

Professor William Duggan: He’s a professor at Columbia Business School. He gave the opening keynote, “Strategic Intuition”, which is the name of his book. Duggan talked about how studies of the brain show that we can over-attribute people’s actions as being left-brained or right-brained. Scientists are seeing that both sides of the brain are used in tackling problems.

He then got into the meat of his session – that people innovate by assembling unrelated data from their past experience. For example, he talked about how Picasso’s style emerged. Picasso’s original paintings were not like those for which he became famous. The spark? First, meeting with Henri Matisse, and admiring his style. In that meeting, Picasso happened to become fascinated with a piece of African sculpture. In one of those “aha!” moments, he combined two unrelated influences, the styles of Matisse and African folk art, into his own distinctive style.

Duggan also described how all innovation is fundamentally someone “stealing” ideas from others. By “stealing”, he means that people assemble parts of what they’re exposed to. This is opposed to imitating, which is copying something in whole. That’s not innovation.

Re-imagining the metaphors behind collaborative tools: This session examined whether we need new ways of thinking about collaboration inside the enterprise. The premise here is that we need to come up with new metaphors that drive use cases and technology design. I’ll hold off on describing most of what was said. My favorite moment was when Jay Simons of Atlassian rebutted the whole notion of re-imagining the metaphors. He said the ones we have now are fine, e.g. “the water cooler”. What we need is to stop chasing new metaphors, and execute on the ones we have.

Rich Hoeg, Honeywell: Rich is a manager in Honeywell’s corporate IT group (and a Connectbeam customer). He talked about the adoption path of social software inside Honeywell, going from a departmental implementation to much wider implementation, and how his own career path mirrored that transition. He’s also a BarCamp guy. Cool to hear an honest-to-goodness geek making changes in the enterprise world.

Yatman Lai, Cisco: Yatman discussed Cisco’s initiatives around collaboration and tying together their various enterprise 2.0 apps. I think this is something we’ll see more of as time goes along. Companies are putting in place different social software apps, but they’re still siloed. Connecting these social computing apps will become more important in the future.

Stowe “The Flow”: Stowe Boyd apparently gave quite the interesting talk. I didn’t attend it, because Connectbeam had a presentation opposite his. But from what I gather, the most memorable claim Stowe made was that there’s no such thing as attention overload. That we all can be trained to watch a constant flow of information and activities go by, and get our work done. I think there will be a segment of the population that does indeed do this. If you can swing it, you’re going to be well-positioned to be in-the-know about the latest happenings and act on them.

But in talking with various people after the presentation, there was a sense that Stowe was overestimating the general population’s ability and desire to train their minds to handle both the work they need to do for their employers, and to take in the cascade of information flowing by (e.g. Twitter, FriendFeed). Realistically, we’ll asynchronously take in information, not in constant real-time.

We’re Becoming Day Traders in Information: I heard this quote a few times, not sure who said it (maybe someone from Sxipper or Workstreamr). It’s an intriguing idea. Each unit of information has value, and that value varies by person and circumstances. Things like Twitter are the trading platform. Of course, the problem with this analogy is that actual day traders work with stocks, cattle futures, options, etc. Someone has to actually produce something. If all we do is trade in information and conversations, who’s making stuff?

Mark Koenig: Mark is an analyst with Saugatuck Technology. He gave the closing keynote for Day 1, Social Computing and the Enterprise: Closing the Gaps. What are the gaps?

  1. Social network integration
  2. Information relevance
  3. Integration with enterprise applications
  4. The culture shift

Mark also believes that, in the enterprise market, externally focused social computing will grow more than internally focused. Why? Easier ROI, more of a sales orientation.

Charlene Li: Former Forrester analyst Charlene Li led off Day 2 with her presentation, Harnessing the Implicit Value of the Social Graph. Now running her own strategic consulting firm, Altimeter Group, Charlene focused on how future applications will weave “social” into everything they do. It will be a part of the experience, not a distinct, standalone social network thing. As she says, “social networks will be like air”. She ran the gamut of technologies in this presentation. You can see some tweets from the presentation here.

One thing she said was to “prepare for the demise of the org chart”. When I see things like that, I do laugh a bit. The org chart isn’t going anywhere. Enterprises will continue to have reporting structures for the next hundred years and beyond. What will change is the siloed way in which people only work with people within their reporting structures. Tearing down those walls will be an ongoing theme inside companies.

Neeraj Mathur, Sun Micro: Neeraj talked about Sun’s internal initiatives around social computing in his session, “Building Social Capital in an Enterprise”. Sun is pretty advanced in its internal efforts. One particular element stuck with me: the rating that each employee receives based on their participation in Sun’s social software. Called Community Equity, the personal rating is built on these elements (thanks to Lawrence Liu for tweeting them):

Contribution Q + Skills Q + Participation Q + Role Q = Personal Q
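Sun’s actual weights and scales weren’t spelled out, but mechanically it’s a simple sum. A toy version, with invented numbers:

    # A toy Community Equity-style score. The component values below are
    # invented; Sun's real formula presumably weights and decays them.
    def personal_q(contribution_q, skills_q, participation_q, role_q):
        return contribution_q + skills_q + participation_q + role_q

    # e.g. a heavy blogger (contribution), tagged as an expert (skills),
    # frequent commenter/rater (participation), community moderator (role):
    print personal_q(contribution_q=42, skills_q=18, participation_q=27, role_q=10)
    # -> 97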

Sun’s approach is an implementation of an idea that Harvard Professor Andrew McAfee put out there, Should Knowledge Workers Have Enterprise 2.0 Ratings? It’s an interesting idea – companies can gain a lot of value from social computing, so why not recognize those that do it well? Of course, it’s also got potential for unintended consequences, so it needs to be monitored.

Laura “Pistachio” Fitton: Twitter-ologist Laura Fitton led a panel called “Finding Serendipitous Content Through Context”. The session covered the value of serendipity, and the ways in which it happens. The panel included executives from Aggregate Knowledge and Zemanta, as well as Carla Thompson from Guidewire.

What interested me was the notion of what serendipity really is. For example, Zemanta does text matching on your blog post to find other blog posts that are related. So there’s an element of structured search in bringing up related articles.

So I asked this question: does persistent keyword search, delivered as RSS or email, count as “serendipity”? Carla’s response was: no, it doesn’t. Serendipity is based on randomness. It’s an interesting topic, potentially worth a future blog post.

And of course, Laura encouraged people to tweet during the session, using the hash tag #serendip. The audience tweets are a good read.

Daniela Barbosa, Dow Jones, DataPortability.org: Daniela works for Dow Jones, with coverage of their Synaptica offering. She’s also an ardent supporter of data portability, serving as Chairperson of DataPortability.org. Her session was titled Pulling the Threads on User Data. She’s a librarian by training, but she kicks butt in leading edge thinking about data portability and organization. In her presentation, she says she’s just like you. She then pops up this picture of her computer at work:

[Screenshot: Daniela Barbosa’s work computer]

Wow – now that’s some flow. Stowe Boyd would be proud.

Wrapping up: Those are some notes from what I heard there. I couldn’t get to everything, as I had booth duties for Connectbeam. Did plenty of demos for people. And got to meet many people in real life that I have followed and talked with online. Looking forward to Defrag 2009.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22Defrag+2008+Notes+-+Picasso%2C+Information+Day+Trading%2C+Stowe+%E2%80%9CThe+Flow%E2%80%9D+Boyd%22&who=everyone

Greed, Competition, Investor Expectations: Some Things Social Media Will Never Change

Courtesy Bryan Maleszyk on Flickr

In my previous post, I wrote about the Paul Kedrosky session at Defrag 2008, Around the Horn. It was a free-form session in which he queried several panelists on a range of subjects. Lots of good discussions came from that.

One topic that got some extended discussion both on the panel and in audience questions was this:

Could social media and better information awareness tools have prevented the financial meltdown?

The basis of the question, in the Defrag context, was that there were signs and data that pointed to the implosion. The argument on the table was that there was a failure of information systems, and of social media, to alert the world to what was happening. And the follow-up: how can we improve this?

You can’t. Don’t even bother.

Because the problem isn’t one of not seeing the warning signs. See Morgan Stanley analyst Mary Meeker’s slides about the various financial ratios at the time of the financial collapse. We were clearly running things in the red zone when you see that data. And the data was there to see.

The problem is that people will never change. They will ignore any system telling them that things might be getting out of hand. Why? Tragedy of the commons.

Financial Meltdown = Tragedy of the Commons

In economics and game theory, there is the notion of the commons dilemma. This is the idea that when there is a common good, people will act upon their own interests in consuming that common good:

  • Person 1: consumes proportional share of large resource
  • Person 2: takes an outsized portion of resource, which by itself doesn’t destroy the resource
  • Person 3: sees Person 2 take a larger share, matches that or even increases the amount consumed
  • And on and on…

The problem is that as people do this, they are not acting as stewards of the common good. The result is that the common good ends up entirely consumed, as each person acts in their own self-interest.
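You can watch the dynamic play out in a toy simulation (all numbers invented):

    # Toy tragedy-of-the-commons: each person matches the biggest share
    # taken so far, plus a little more.
    resource = 100.0   # the common good
    share = 5.0        # Person 1's proportional share
    person = 0

    while resource > 0:
        person += 1
        resource -= share
        print 'Person %d takes %.1f; %.1f left' % (person, share, max(resource, 0.0))
        share *= 1.3   # each newcomer escalates on what they saw taken

    # The pool is exhausted within eight rounds, though no single
    # escalation looked fatal on its own.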

With regard to the financial crisis, what was the common good that was overconsumed? Home ownership. The graphic below is from Mary Meeker’s presentation:

[Chart: U.S. home ownership rate over time, from Mary Meeker’s Morgan Stanley presentation]

Consider the dotted line on the above graph to be the normal consumption rate for home ownership. From 2000 onward, an increasing number of people purchased their own homes. Turns out, this put a mighty strain on the financial system. A lot of people purchased homes who shouldn’t have.

Now I don’t blame people for wanting homes. But the effect was to drive the prices up incredibly, which caused more desire for home ownership. Home ownership is sustained by a number of factors: steady incomes, mortgage tax breaks, personal financial management, a robust collateralized mortgage market, mortgage insurance, etc. All of those are part of the common resources. They were undermined by too many people partaking in home ownership.

The Banker’s Life

So in the case of home ownership, mortgage bankers ended up destroying the various shared resources that make up the market for home ownership. Each banker acted independently to get as much as he could from the system: loosening lending rules overall, financing build-and-flip construction, pushing home ownership into markets with lower financial means (a.k.a. subprime).

I understand the mentality. Once upon a time, I was an investment banker with Bank of America, in the syndicated loan group. Syndicated loans are like bonds, with the risk spread across many lenders. Here are two examples of how things work in banking.

First, I was part of a deal team trying to win a mandate with HMO company Humana. We were trying to unseat Humana’s incumbent bank, Chase Manhattan. We went in aggressively, with some pretty cut-rate financing terms. Chase did not want to lose the deal, and so they went in even more aggressively. In the syndicated loan market, you need a bunch of lenders to participate. Which means you need realistic pricing to sell the deal, just like in the bond or stock markets. Turns out, in its effort to keep Humana, Chase went too low in its pricing. They couldn’t syndicate the deal. Chase got caught up in the competition to keep Humana.

Second, at a dinner with the head of our group, the subject of minding the bank’s credit position came up. As in, what was the role of the syndicated loans group in being stewards of the bank’s balance sheet? Should we use our superior market knowledge to alert the credit underwriters about risks we’re seeing, and deals from which we should walk away? I was a brash young guy, and provocatively opined that we were all about winning and syndicating deals. We shouldn’t focus on the credit risks, as that was the job elsewhere in the bank. To which the head of the group replied, “For the sake of your career, I’m going to pretend I didn’t hear that.” He was right – we really did have a responsibility. But I was reflecting the behaviors I’d seen out there in the market.

I bring up these examples to give you a sense of what it can be like inside banking. There is money splashing around everywhere, and when everything is up-up-up, you really can’t turn off the motivations of people.

Social Media Will Never Change This

What was happening in the mortgage industry was that the motivations were all one way: make the deal! You see your colleague cranking on getting deals done, and getting recognition internally. Your compensation is based on how many mortgages you get done. Competitor banks were reporting higher revenues and earnings. If you don’t match the growth of your peers, Wall Street dings your stock. The real estate market just kept going up, up, up.

Social software wasn’t going to change these dynamics. The current financial crisis is just the latest in a string of such events. And there will be more. It’s just human nature.

Professor Andrew McAfee tweeted a musing about Enterprise 2.0 and the financial collapse. I responded with my own thoughts:

amcafee: Could E2.0 have saved Lehman and Merrill? No. :)

bhc3: @amcafee – I used to work in banking. E2.0 would have made the banks better at achieving their growth goals. But those goals hurt the banks.

Social software is a powerful tool for organizations to get better in terms of innovation, productivity and responsiveness. But companies are still run by humans, and we’ll never be rid of that, for both the good and the bad.

*****

See this post on FriendFeed: http://friendfeed.com/search?q=%22Greed%2C+Competition%2C+Investor+Expectations%3A+Some+Things+Social+Media+Will+Never+Change%22&who=everyone
