The Forestry Commission and associated organisations used a bespoke content management system for their intranet and it was due to drop out of support early in 2019. The search for a new intranet platform was on, and I first heard from the Forestry project team towards the end of 2016 when they were exploring options.
The course is an introduction to the analysis not only of social networks but of the many other types of network that exist in the real world, such as transport, power, disease, biological and information networks. It has also given me some practical skills that I can now turn to analysing intranets.
Networks are made up of nodes and edges. In our way of thinking, nodes equate to pages or documents on the intranet, and the edges equate to the hyperlinks from one page or document to another.
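As a quick sketch of that mapping (the page names here are invented, and I'm using the networkx library as stand-in tooling), a fragment of an intranet becomes a directed graph:

```python
import networkx as nx

# Hypothetical intranet fragment: nodes are pages, directed edges are hyperlinks.
G = nx.DiGraph()
G.add_edges_from([
    ("home", "hr"),
    ("home", "news"),
    ("hr", "annual-leave"),
    ("hr", "payroll"),
    ("news", "home"),              # a link back to the homepage
    ("annual-leave", "payroll"),
])

print(G.number_of_nodes())  # → 5 pages (nodes)
print(G.number_of_edges())  # → 6 links (edges)
```

Directed edges matter here: a link from page A to page B does not imply a link back from B to A.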
Most intranet managers could probably state, or at least have a stab at guessing, how many pages and documents (nodes) exist in the intranet. But I would guess that this is as far as it goes. If I asked you how many links (edges) exist in your intranet you’d probably be stumped. I had to get my hands on some CMS data exports and do a fair bit of data manipulation in order to get the answer to this.
The number of incoming links (edges) to a page (node) is known as the page’s in-degree. The number of outbound links is known as the out-degree. And edges can be weighted depending on the strength of the connection. Google’s PageRank builds on a node’s in-degree – the number of pages linking to a page – weighting each incoming link by the importance of the page it comes from.
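To make those definitions concrete, here is a minimal sketch (invented pages again, networkx assumed as the tooling) showing in-degree, out-degree and PageRank:

```python
import networkx as nx

G = nx.DiGraph([
    ("home", "hr"), ("home", "news"),
    ("hr", "payroll"), ("news", "payroll"),
])

# In-degree counts incoming links; out-degree counts outgoing links.
print(G.in_degree("payroll"))  # → 2, two pages link here
print(G.out_degree("home"))    # → 2, the homepage links out twice

# PageRank weights each incoming link by the rank of the page it comes from.
ranks = nx.pagerank(G)
print(max(ranks, key=ranks.get))  # → payroll, the most linked-to page
```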
It’s the statistical analysis of the number of nodes and edges and the relationships between them that gives us key metrics such as the average degree, closeness, average shortest path, network density, betweenness centrality, modularity and clustering. These metrics can give us insight into the structure, character, effectiveness and efficiency of the network.
I created a dataset from the current intranet that I’m working on and imported it into Gephi which I used to visualise the network. So far I’ve only managed to recreate the hierarchical structure of the intranet, which in itself can give some meaningful visualisations. But I hope to be able to get my hands on some enhanced data including links that cross-reference sections of the intranet.
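For anyone attempting the same thing: the dataset boils down to an edge list, and Gephi can import a plain CSV of Source/Target pairs. A minimal sketch (the file name and links are made up):

```python
import csv

# Hypothetical (source page, target page) links pulled from a CMS export.
links = [
    ("home", "hr"),
    ("home", "news"),
    ("hr", "annual-leave"),
]

# Gephi's spreadsheet importer recognises Source/Target column headers.
with open("intranet-edges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target"])
    writer.writerows(links)
```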
Using Gephi it’s possible to run algorithms on the data and specify colours and sizes for the nodes and edges in order to produce visualisations.
Here is my gallery of visualisations showing some zoomed-in areas and some bird’s-eye views of the network.
And while these visualisations are all very pretty, I’ve actually used them to support the work that I’m doing. I’ve been working on a content migration mapping document and by watching the algorithms in action I have been able to pinpoint errors in the data that would have taken ages to spot if simply working with the core spreadsheet. And this is a whole different way to show clients what the structure of their intranet looks like.
In addition to staff satisfaction surveys, user tests and usability benchmarks, if network analysis lets us put numbers on an intranet, is it also possible to measure its effectiveness based on those network metrics? For example, if I hypothesise that there is an optimum number of links within an intranet in relation to the number of pages, then we could analyse any intranet and judge whether it would be a good or poor experience to navigate.
If we combine actual click-through data with the intranet network we could highlight problems with the information architecture or menu navigation systems. If I know that the shortest path between the homepage and a lower level of content is 4 hops, but usage analytics shows staff taking more hops to find the content, then we can work to improve the IA and how we signpost links to content.
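That comparison is easy to sketch in code (the page names and the observed hop count below are invented for illustration; networkx assumed):

```python
import networkx as nx

# Hypothetical navigation hierarchy from the homepage down to a deep page.
G = nx.DiGraph([
    ("home", "services"), ("services", "forestry"),
    ("forestry", "grants"), ("grants", "woodland-grant"),
])

ideal = nx.shortest_path_length(G, "home", "woodland-grant")  # → 4 hops
observed = 7  # hypothetical average hop count from usage analytics

if observed > ideal:
    print(f"Staff take {observed - ideal} more hops than the IA requires")
```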
Gephi in action
And lastly, for those who are interested in Gephi here are some video guides for creating your own site visualisation.
A-Z index pages are very popular on our intranet. More popular than search. Our staff like to find what they want by looking it up in a list. If they don’t find it in the list they may resort to the search box. By looking at what phrases staff use when they perform a search from an A-Z index page, we can get a good idea of what’s missing from the A-Z listing.
Google Analytics has an inbuilt function to segment visits where staff used the search function. It was this functionality that highlighted our A-Z index pages.
I like to visualise a visit to the intranet as a trip to the convenience store. A member of staff needs something. Some people enter the store and walk up and down the aisles trying to locate what they need (menu navigation). Some look up where to go in a catalogue (A-Z index). Some enter the store and immediately ask for help (search) and others ask for help as a last resort. It is this asking for help as a last resort (searching) that can benefit us.
When I looked at the segmented analytics showing only those visits in which a search occurred, it showed, unsurprisingly, that lots of people search from the homepage, equivalent to entering the store and immediately asking for help. Equally unsurprisingly, lots of our popular pages are showing up. But also high in the reports were our A-Z index pages, indicating that lots of staff are going to the A-Z listings and then having to search.
Having highlighted a problem, I can now take action. I can target the A-Z pages and produce analytics that will tell me which search terms are being used when staff resort to using search from these pages. I can then feed these terms back into the A-Z listings, if appropriate, and over time, improve our offering.
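The mechanics of that feedback loop are simple. Assuming an analytics export with one row per internal search, recording the page the search was made from and the term entered (the column names and rows below are made up):

```python
from collections import Counter

# Hypothetical analytics export: page the search was made from, term entered.
rows = [
    {"page": "/a-z/f", "term": "flexi time"},
    {"page": "/a-z/f", "term": "flexi time"},
    {"page": "/a-z/p", "term": "parking permit"},
    {"page": "/home", "term": "payslip"},
]

# Count only the terms used when searching from A-Z index pages.
az_terms = Counter(
    row["term"] for row in rows if row["page"].startswith("/a-z/")
)
print(az_terms.most_common())  # candidates to feed back into the A-Z listings
```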
Staff who search from the homepage are simply using their preferred method of getting to content. But staff who repeatedly search from an area of content deeper within the intranet structure may be showing a symptom of a problem with that content.
At the start of October we introduced the next step in our strategy to improve engagement with news stories on the intranet. A month later we are seeing a 53% increase in traffic.
I had a good hunch that introducing the box would generate *some* interest, but I was amazed by the results at the end of just one month. Pageviews for news stories climbed from 44,185 to 67,872. Similar to Google AdWords, simple text adverts, when placed in context, can be effective.
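For completeness, the headline figure follows directly from those pageview numbers:

```python
# Pageview figures quoted above, before and after the "Related stories" box.
before, after = 44_185, 67_872

increase_pct = int((after - before) / before * 100)
print(increase_pct)  # → 53
```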
This increase in news story traffic started when we introduced a “Related stories” advert box, placed top-right of feature news pages. Nothing clever. It’s a simple text box containing a bulleted list of links to past stories. A maximum of 3 links, with something in common between them all.
It’s a manual process for our intranet news editor (yes we just have the one!) to link up the relevant back-stories. We publish at least one, usually two feature stories every day, timed to coincide with our peak news readership periods (elevenses and late lunch) aiming to give a sense of steady momentum to the homepage news stream. Our feature news is varied, with stories from the front-line to seasonal pieces to interviews with board members.
The recent enhancement is a great success for my internal communications colleagues. For them, it suggests an increase in reach and shows that staff are interested enough to want to browse through back-stories to get the news behind the news, creating a richer picture. Being practical, we’d like it if staff had already read these stories, but related stories give us a second chance to increase coverage and help staff to discover articles that they would not otherwise find. Over time, the ripples should start to run through our news collection as more and more stories backlink and crosslink to each other.
Last weekend I went to Bletchley Park, the home of the code-breakers and Top Secret government facility during WW2. The tour included a visit to the National Museum of Computing, which evoked some old memories. Since returning from the trip I’ve been reminiscing about the world of computers and information when I was a kid and what it’s like for me now.
|Old photo of Bletchley Park|
I consider myself blessed to have gone to a school where I could take Computer Studies as an O Level. I remember our computer room with one Commodore PET and Computer Club kids fighting over time on the machine during lunch break. A quarter of a century on and it was so funny to see one on show at the museum. One of my nieces, along for the day, remarked about the inbuilt cassette player and wondered if it was there so you could listen to music. She couldn’t believe that we used to have to *load* programs and *save* our data.
I consider myself lucky enough to remember working with Winchester hard drives and mainframe systems. It was interesting when our guide remarked at one point during the tour that we could fit the entire contents of the data stored in the room onto one of our smartphones.
But since leaving school, I’ve always been in the business of digital information. From my first job as a database programmer to working on management information systems and humongous data-warehousing projects, my whole career has been about wrestling with information and attempting to portray it usefully to someone at the end.
I didn’t call myself an information architect or a usability guy until the internet came along. But that’s what I’ve always done. I’ve coded, queried, designed, tested and analysed. I’ve sat with many users, watching them try to complete tasks, seeing the same old design mistakes again and again. Somewhere in my head, the rules, standards and best practice guidelines are stored. But although a lot of the usability rules that I’ve learned since the web thing happened have remained valid and may continue to be valid, the landscape is changing. Fast.
Back to the present day. As the landscape becomes more social or as social becomes more integrated into our online experiences, so I have to be aware of social and be able to integrate it into my solutions. Jakob Nielsen says that you have to do 10 years in the field before you can consider yourself a usability professional. But even if this social thing has been around that long, it’s constantly changing. How can anyone be an expert? By jumping into the water, learning to swim and getting into the flow.
While having a mildly amusing double-entendre as a title for the UK audience, Chris Dixon’s post: “You need to use social services to understand them” is so true. You have to use social, get involved and experience it. I’ve been tweeting for over a year. I started my blog in February. I’m linked in, facebooked and delicious up to the hilt. I have foursquared, yammered and dug. I’ve got glued, poked, bumped and ground. I’ve commented and voted; been rated and liked. I get it. I get the potential of it.
The challenge and next stage in my evolution as an information architect and usability pro is to design the path and encourage the flow of social data through the intranet and the public website. To work with this new form of data within and outside the organisation, attempting to portray it usefully to someone…