Examining Google's algorithm change and how it affects searchers and site owners.
There was one big story in search last week: Google's move to purge low-quality "content farm" material from its search results. Here's a look at the impact, the response, and what site owners should address.
Dig into the Smithsonian Commons and you'll find Gov 2.0 in action.
This Smithsonian Commons project is a marriage of government resources and the web's capabilities. It combines offline and online information, makes experts available on any topic you could want, provides global collaboration, and gives everyone access to valuable knowledge. And since it's driven by iteration and immediate feedback, the Commons is bringing a Web 2.0 approach to the Gov 2.0 world.
When Americans want to know about health care reform, they don't go to opencongress.org and search for "H.R.3200" or "H.R.4872". They go to Google and type in "health care reform". One key to making sure that the information you are working so hard to surface makes its way to the citizens who are looking for it? Use free search data to find out the language people are using to refer to that information. At Transparency Camp, I demonstrated a number of these tools.
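The idea behind that keyword research can be sketched in a few lines: given a sample of query strings (the queries below are invented for illustration, standing in for an export from a free keyword tool), tally which phrasings people actually use, so you can label your content in the language citizens search with rather than in bill numbers.

```python
from collections import Counter

# Invented sample queries; the specific strings are illustrative only,
# standing in for real exported search data.
queries = [
    "health care reform",
    "health care reform bill",
    "hr 3200",
    "health care reform summary",
    "obamacare",
    "health care reform",
]

# Normalize crudely and count, so the most common citizen phrasing
# rises to the top of the list.
counts = Counter(q.lower().strip() for q in queries)
for phrase, n in counts.most_common(3):
    print(f"{n}x {phrase}")
```

With real data the sample would be thousands of queries, but the principle is the same: the top phrases, not the official titles, are what your pages should speak.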
When we want to find more information about something, hear about something interesting from our friends, see a compelling television commercial, or need a local mechanic, chances are the first place we turn is the Google search box. Fifty percent of us in the United States use search engines every day, and over 90% of us search every month. No matter what kind of web site you have, whether it’s a media property like a blog, an ecommerce site, or the online arm of a multinational corporation, you want to connect with as much of your potential audience as possible, and organic search can help make that happen.
In an earlier post, I said that the key to government opening its data to citizens, being more transparent, and improving the relationship between citizens and government in our web 2.0 world was ensuring that content on government sites can be easily found in search engines. Architecting sites to be search engine friendly, particularly sites with as much content and legacy code as those the government manages, can be a resource-intensive process that takes careful long-term planning. But two keys are assessing who the audience is and what they’re searching for, and ensuring the site architecture is easily crawlable…
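A first, concrete check on crawlability is whether robots.txt accidentally blocks the content you want found. Here's a minimal sketch using Python's standard-library robots.txt parser; the rules and paths are hypothetical, meant only to show how a blanket Disallow on a dynamic section can hide public data from every search engine.

```python
from urllib import robotparser

# Hypothetical robots.txt for a government site; the paths are illustrative.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /search
Allow: /data/
"""

parser = robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check which sections a general-purpose crawler may fetch.
for path in ("/data/health-care-reform.html", "/cgi-bin/lookup.pl"):
    print(path, parser.can_fetch("*", path))
```

If a page citizens need lives behind a disallowed path (say, only reachable through the blocked /cgi-bin/ scripts), no amount of good content will surface it in search results.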
Thursday on this blog, Congressman Honda asked, “how can congress take advantage of web 2.0 technologies to transform the relationship between citizens and government?” He asked for input on what web 2.0 features he should add to his website to take advantage of today’s online world. The most important feature government web sites can add isn’t really a feature at all. But it would absolutely transform the relationship between citizens and government and make an amazing array of public data available. What’s this magic feature?
Guest blogger Vanessa Fox is co-chair of the new O'Reilly conference Found: Search Acquisition and Architecture. Find more from Vanessa at ninebyblue.com and janeandrobot.com. Vanessa is also entrepreneur in residence at Ignition Partners, and Features Editor at Search Engine Land. Yesterday, as President-elect Obama became President Obama, we geeky types filled the web with chatter about change. That change of…