Wiki software has been around since 1995, but a relatively recent variant is something called "enterprise wiki software". Heightened interest comes in response to the increasing number of organizations that are turning to wikis as a way to improve internal efficiency.
But if we look at wikis in an enterprise context, we have to confront two important questions:
- What's the difference between a wiki and a CMS? Couldn't we just press our existing CMS into service as a wiki?
- Are wiki tools enterprise-ready? That is, would they pass muster before my CIO?
Below I'll try to answer both questions.
A wiki is a collaborative website where users can create and edit pages. Wikis fall conceptually under the broad concept of content management, and you could certainly use your existing CMS to create a wiki-like site. However, wikis bring unique characteristics that differentiate them from a run-of-the-mill content management system.
Wikis emphasize ease of content creation. This simplicity comes from several sources:
- A wiki markup language that provides a short-hand way of formatting text and linking documents.
- The ability of users to create and edit pages directly and independently.
- A bottom-up approach to site structure and navigation.
- Very simple templating.
- A conscious decision to eschew workflow or even simple approval steps.
Let's look at each issue in turn.
Wiki software empowers users to create and edit their own pages, but content management systems provide tools for creating and editing content, too. The difference is in approach. When wikis first came out (in 1995), there were not a lot of options for WYSIWYG editing from within a browser, so the wiki markup language (sometimes called "wikitext") provided a particularly valuable short-hand for formatting text that was much easier to learn than pure HTML.
A good CMS will offer a WYSIWYG interface that makes writing content for the web a lot like using a word processor. These days, more wikis have WYSIWYG editing features as well, so the wiki markup language becomes a less interesting feature in terms of formatting, although it does provide the benefit of being supported by all browsers on all platforms, something that is typically not the case with rich-text editors. Many wikis support both wikitext and rich-text editors. The figure below shows an example of the editor from Wikipedia, which supports both forms of content formatting.
However, there is one area where wikitext still retains its power and where wiki software is different from a CMS: linking. Wiki software still provides a much easier way to link pages within the wiki to each other. Links are made based on the title of a page, so the author does not need to use, remember, or type long URLs in order to link one page to another.
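As a rough illustration, title-based linking can be sketched in a few lines of Python. The [[Page Title]] syntax and the /wiki/ URL prefix below follow a common convention but are assumptions for this sketch, not features of any particular package:

```python
import re

def render_links(wikitext: str) -> str:
    """Turn [[Page Title]] into an HTML link derived from the title itself,
    so authors never need to know or type a URL."""
    def to_link(match: re.Match) -> str:
        title = match.group(1)
        url = "/wiki/" + title.replace(" ", "_")  # illustrative URL scheme
        return f'<a href="{url}">{title}</a>'
    return re.sub(r"\[\[([^\]|]+)\]\]", to_link, wikitext)
```

The key point is that the link target is computed from the page title, which is exactly why linking stays effortless even as the site grows.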
Because contributors can readily create new pages and can easily link one page to another, wikis take a unique approach to site structure and navigation.
A CMS usually takes a more formal approach to site structure and navigation, with the site organized into a hierarchy by an information architect. User-created pages in a wiki mean that the hierarchy and structure of the site are created in an ad hoc way. Navigation tends to be simple, and the hierarchies are flat. For example, the online encyclopedia Wikipedia has hundreds of thousands of articles on a broad range of topics, but these topics are not arranged in any conceptual hierarchy. The entry for dogs serves as a good illustration. The URL for the article about dogs is:

https://en.wikipedia.org/wiki/Dog
The URL for the pug entry, meanwhile, is this:

https://en.wikipedia.org/wiki/Pug
Since a pug is a kind of dog, you might expect to see a hierarchical URL for pugs, something like:

https://en.wikipedia.org/wiki/Dog/Pug
But it's not there. Some wiki packages do support more complex categorization of content, but many are totally flat, just like Wikipedia. Even if the software does support subpages, contributors are still allowed to create subpages in an ad hoc fashion and there is no systematic approach to the site's information architecture.
An experienced system administrator or architect will ask of any content technology, "what does the repository look like?" And for good reason. They have to care about compatibility, performance, back-up, and a raft of similar issues.
Wikis historically have taken a very simple approach to data storage. The first wikis stored content in plain text files that were written using the wiki markup language. When a user requested a page, the page was rendered. This was not speedy, but it worked. These days, wiki packages employ one of several different back-ends, with many housing their content in databases.
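The original flat-file approach is simple enough to sketch in a few lines. The one-file-per-page layout and the function names below are illustrative, not drawn from any specific wiki engine:

```python
from pathlib import Path

def save_page(wiki_dir: Path, title: str, wikitext: str) -> None:
    # One text file per page, named after the page title --
    # the classic flat-file wiki storage scheme.
    wiki_dir.mkdir(parents=True, exist_ok=True)
    (wiki_dir / f"{title}.txt").write_text(wikitext, encoding="utf-8")

def load_page(wiki_dir: Path, title: str):
    # Return the raw wikitext, or None for a page that doesn't exist yet.
    path = wiki_dir / f"{title}.txt"
    return path.read_text(encoding="utf-8") if path.exists() else None
```

This simplicity is also why enterprise questions such as backup, indexing, and integration loom larger for wikis: the repository may be nothing more than a directory of text files.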
One important consideration is whether the system supports automatic back-ups (commercial wiki applications often do). Another thing to think about is what this means in terms of integrating wiki content with content managed by other systems. For example, will the enterprise search system be able to index wiki content? Will the content indexed be raw wikitext, or rendered HTML pages?
This brings us to the issue of APIs. Most wikis don't have one. Want to access a wiki through your portal, or integrate it with an intranet CMS or collaboration system? Today, you typically have to purchase that integration from the vendor. Going forward, I expect more wikis to open up their systems for integration with other enterprise packages.
When a page of wikitext is requested, it gets rendered into HTML in a two-part process. First, the wiki markup is converted to HTML, and links are created between pages. Then, this content is wrapped by a template that provides a consistent look to all the pages in the wiki.
Compared to a CMS, most wikis have simple templating systems, often only enabling one template for the entire site. Wiki templates (and page rendering in general) often are not cached, so the page is rendered with each request. From an enterprise perspective, a lack of caching can obviously limit the scalability of the system. On the other hand, there's no finicky caching mechanism to deal with.
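The two-step rendering described above, with an optional cache in front of it, can be sketched as follows. The single site-wide template and the stand-in markup converter are illustrative assumptions, not any particular engine's design:

```python
# One template for the whole site -- typical of simple wiki templating.
TEMPLATE = "<html><body><h1>{title}</h1>{body}</body></html>"

_cache: dict[str, str] = {}

def render_markup(wikitext: str) -> str:
    # Stand-in for a real wikitext-to-HTML converter.
    return f"<p>{wikitext}</p>"

def render_page(title: str, wikitext: str, use_cache: bool = True) -> str:
    # Step 1: convert markup; step 2: wrap in the site template.
    # With no cache, both steps run on every request.
    if use_cache and title in _cache:
        return _cache[title]
    html = TEMPLATE.format(title=title, body=render_markup(wikitext))
    if use_cache:
        _cache[title] = html
    return html
```

Note that a real cache would also need invalidation whenever a page is edited; the sketch above deliberately omits that to keep the two rendering steps visible.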
Wikis turn the idea of workflow on its head. They are decentralized and typically lack the controlling mechanism of a workflow system with a formal approval process.
The fact that wikis are decentralized and lack sophisticated workflow systems and approval processes is considered a feature, not a fault. This is contrary to the basic philosophy of many content management systems, which emphasize control over empowerment.
Despite wikis' decentralized approach, there is one important thing to remember: the anyone-can-edit policy is just that -- a policy -- and not an inherent feature of the software. At the same time, wikis don't handle content control in the same way that a content management system does, so you will need to take a different approach with wikis.
In CMS software, as in life, there is a classic trade-off between control and flexibility. With a traditional CMS, decision-making is often centralized by an editor of some sort that reads and approves content prior to publishing. With a wiki, the writer writes, then publishes without editorial oversight or approval. This direct channel to publication is what makes wikis so wonderful in scenarios that emphasize speed and flexibility.
But what if the enterprise does want to exercise at least some control? In the absence of traditional workflow controls, content creation in a wiki is managed through change monitoring, automated spam prevention and user access control. Let's look at each one in turn.
As one might expect, one layer of defense is to simply monitor changes that have been made to the wiki. This makes the most sense for wikis that reside exclusively within the firewall.
In addition to monitoring changes, you will want the ability to fix unwanted changes, for example by rolling a page back to a previous version. In short, the "change monitoring" approach requires two basic features: the ability to monitor recent changes, plus some kind of version control.
Recent changes can be monitored as follows:
- Most wikis have a "Recent Changes" page that lists all the pages that have been changed, which is illustrated in figure 2. If the wiki supports registration, then it also will identify who made the change.
- E-mail notification of changes is just an e-mail version of the "recent changes" page, but with the convenience of notification.
- A variant of e-mail notification is support for RSS syndication, enabling you to monitor a wiki for recent changes using your favorite RSS reader.
- More sophisticated systems identify and differentiate "trivial" changes from more substantive ones. For example, you may not want to be notified by e-mail every time someone fixes a spelling error.
- If more than one person has been tasked with monitoring changes, some wikis offer the capability to track whether a recently changed page has been checked yet, reducing the chances of duplicated work.
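The RSS route is straightforward to automate. The sketch below parses a minimal recent-changes feed with Python's standard library; the feed content is an illustrative sample (a real monitor would fetch the wiki's actual feed over HTTP):

```python
import xml.etree.ElementTree as ET

# Illustrative sample of a recent-changes RSS feed; element names follow RSS 2.0.
SAMPLE_FEED = """<rss version="2.0"><channel>
<title>Recent Changes</title>
<item><title>HomePage</title><author>alice</author></item>
<item><title>Roadmap</title><author>bob</author></item>
</channel></rss>"""

def recent_changes(feed_xml: str):
    # Return (page title, author) pairs for each changed page in the feed.
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("author"))
            for item in root.iter("item")]
```

A monitoring script could poll such a feed periodically and alert on pages or authors of interest.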
I once encountered a philosophical debate about whether wikis should have version control. The idealist in the conversation argued that version control was against the "Wiki Way" and somehow lacked philosophical purity. The realist argued that people make mistakes and sometimes deliberately do bad things, so the ability to roll back changes was, indeed, a good thing. The realist won the argument in the broader marketplace of ideas and many (if not most) versions of wiki software have version control. Features to look for include capabilities similar to what you would find in a CMS, including:
- The ability to roll back changes to the previous version.
- The ability to compare different versions side-by-side.
- The use of diffs between versions so that specific differences between them can be easily identified.
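Diffing between versions needs no special machinery; Python's standard difflib module produces the familiar unified-diff format that many wiki engines expose in their history views:

```python
import difflib

def page_diff(old: str, new: str) -> str:
    """Return a unified diff between two versions of a page's text."""
    return "".join(difflib.unified_diff(
        old.splitlines(keepends=True),
        new.splitlines(keepends=True),
        fromfile="version 1", tofile="version 2"))
```

Lines prefixed with "-" come from the older version and lines prefixed with "+" from the newer one, which makes it easy to spot exactly what an edit changed.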
Another approach is to monitor the content of changes programmatically. This is sometimes referred to as spam prevention. This differs from user access control in the sense that it monitors wiki edits based on the content itself, or patterns of user behavior. Some systems can block access to IP addresses and URLs, or they can block the posting of individual changes based on the following:
- Restricting the use of certain words or phrases, using word lists or regular expressions.
- Blocking access based on excessive activity.
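Both filters above can be combined into one gate in front of every edit. The patterns and the five-edits-per-minute threshold below are illustrative assumptions, not values from any real product:

```python
import re
import time
from collections import defaultdict

# Illustrative blocklist: word/phrase patterns that flag an edit as spam.
BLOCKED_PATTERNS = [re.compile(p, re.IGNORECASE)
                    for p in (r"cheap\s+pills", r"casino")]
MAX_EDITS_PER_MINUTE = 5  # assumed rate-limit threshold

_edit_log = defaultdict(list)  # user -> timestamps of recent edits

def edit_allowed(user, text, now=None):
    """Reject an edit that matches the blocklist or exceeds the rate limit."""
    now = time.time() if now is None else now
    if any(p.search(text) for p in BLOCKED_PATTERNS):
        return False
    recent = [t for t in _edit_log[user] if now - t < 60]
    if len(recent) >= MAX_EDITS_PER_MINUTE:
        return False  # excessive activity from this user
    recent.append(now)
    _edit_log[user] = recent
    return True
```

Real systems layer more signals on top (IP blocks, URL blacklists), but the shape is the same: inspect the content and the behavior, not the identity of the author.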
When a wiki software package bills itself as an "enterprise wiki", it usually means that it has user access control. Most wikis can differentiate between registered and non-registered users and will let you keep non-registered users from making changes. An increasing number of wiki software projects are now offering more sophisticated user access control through the use of access control lists that assign rights at a more granular level. Users and groups can be assigned rights to tasks such as reading a page, writing to it, editing it and rolling it back to a previous version.
There is a lot of variance among wiki packages in terms of how those rights are applied to the site. Some wikis let you restrict access to certain sections of a site, while others let you restrict access to individual pages. A less common but useful feature is the ability to restrict access to parts of pages. For example, you might not allow everyone the ability to post a comment to an article.
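An access control list of this kind reduces to a simple lookup: for each page, which users or groups hold which rights. The page names, groups, and rights below are illustrative:

```python
# Hypothetical groups and per-page ACLs; "*" grants a right to everyone.
GROUPS = {"editors": {"alice", "bob"}}

ACL = {
    "HomePage":       {"read": {"*"},       "write": {"editors"}},
    "Finance/Budget": {"read": {"editors"}, "write": {"alice"}},
}

def allowed(user: str, page: str, right: str) -> bool:
    """Check whether a user holds a right directly or via group membership."""
    principals = ACL.get(page, {}).get(right, set())
    if "*" in principals or user in principals:
        return True
    return any(user in GROUPS.get(g, set()) for g in principals)
```

The granularity question in the paragraph above is then just a question of what the ACL's keys are: whole sections, individual pages, or even parts of pages.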
The most sophisticated enterprise wikis work with single sign-on security systems like Kerberos, or offer network and directory integration (LDAP and Active Directory) for user authentication and authorization.
Contrary to their reputation, wikis are content management systems that can be managed. They simply take a different approach to content management, choosing to emphasize speed and flexibility rather than strict controls. To successfully implement a wiki software package, you will need to look at workflow from a different perspective and select wiki software that provides the right level of content monitoring and access control for your organization.