EBSE: Structured Abstracts

Anyone who has been through university has likely had to write an abstract at some point. For such a short piece of writing they can prove remarkably tricky to produce, and it is genuinely difficult to compress the basis, goals, methods and conclusions of your research into as few as 150 words. Making matters worse, after the title your abstract is the first point of contact a reader will have with your work. If your abstract is poorly written, a reader might skip over it completely. If it doesn’t clearly or accurately describe your research, the reader may dismiss it as irrelevant and move on.

In short, good abstracts are both hard and necessary.

This is where structured abstracts come in. A traditional abstract is the familiar “blob of text” at the front of a paper or in a list of search results. A structured abstract instead divides that text into a series of short sections under headings, typically along the lines of Context, Aims, Method, Results and Conclusion. You can probably already see the benefits of a structured abstract.

Ease of Writing

Firstly, they are easier to write, as they prompt you to put one or two sentences under each heading. The structure discourages you from adding unnecessary or distracting information and makes it much harder to leave out anything vital.

Ease of Reading

As mentioned above, the abstract is one of the first parts of your research a reader will see. The better your abstract, the less likely it is to be the last. If you are flicking through a set of studies trying to find something relevant, a structured abstract significantly improves comprehension: does the area or goal sound interesting? Is it the kind of study you were hoping for? Are the conclusions interesting? Instead of wading through (or, more likely, skimming over) a wall of text, the information is available at a glance.

Secondary Studies

Maybe not relevant to everyone, but anybody who has done a secondary study understands the horror of being faced with hundreds, possibly thousands, of titles and abstracts. Maintaining focus and discipline while reading through the seemingly endless papers, and making informed and repeatable judgements about each one, is extraordinarily difficult. Structured abstracts can make this process much easier. Far too often an abstract will neglect to mention whether the paper is an experiment, a case study, a survey, an opinion piece, or something else entirely. I’ve lost count of the times I found an abstract that sounded promising, then spent days acquiring the full paper only to find that it was a technical demonstration and not a case study, or a secondary study, or in a depressing number of cases the wrong field entirely.

Structured abstracts aren’t a magical cure for poor reporting, but they do make it easier to write clear, complete and accurate abstracts.

A number of studies have been published on the subject, and a sample of them appears below. The first was published for the EASE conference, which uses structured abstracts in its proceedings. You decide whether it’s easier to read than the others…

Preliminary results of a study of the completeness and clarity of structured abstracts

Context: Systematic literature reviews largely rely upon using the titles and abstracts of primary studies as the basis for determining their relevance. However, our experience indicates that the abstracts for software engineering papers are frequently of such poor quality they cannot be used to determine the relevance of papers. Both medicine and psychology recommend the use of structured abstracts to improve the quality of abstracts.

Aim: This study investigates whether structured abstracts are more complete and easier to understand than non-structured abstracts for software engineering papers that describe experiments.

Method: We constructed structured abstracts for a random selection of 25 papers describing software engineering experiments. The original abstract was assessed for clarity (assessed subjectively on a scale of 1 to 10) and completeness (measured with a questionnaire of 18 items) by the researcher who constructed the structured version. The structured abstract was reviewed for clarity and completeness by another member of the research team. We used a paired ‘t’ test to compare the word length, clarity and completeness of the original and structured abstracts.

Results: The structured abstracts were significantly longer than the original abstracts (size difference = 106.4 words with 95% confidence interval 78.1 to 134.7). However, the structured abstracts had a higher clarity score (clarity difference = 1.47 with 95% confidence interval 0.47 to 2.41) and were more complete (completeness difference = 3.39 with 95% confidence interval 4.76 to 7.56).

Conclusions: The results of this study are consistent with previous research on structured abstracts. However, in this study, the subjective estimates of completeness and clarity were made by the research team. Future work will solicit assessments of the structured and original abstracts from independent sources (students and researchers).

Presenting Software Engineering Results using Structured Abstracts: A Randomised Experiment

When conducting a systematic literature review, researchers usually determine the relevance of primary studies on the basis of the title and abstract. However, experience indicates that the abstracts for many software engineering papers are of too poor a quality to be used for this purpose. A solution adopted in other domains is to employ structured abstracts to improve the quality of information provided. This study consists of a formal experiment to investigate whether structured abstracts are more complete and easier to understand than non-structured abstracts for papers that describe software engineering experiments. We constructed structured versions of the abstracts for a random selection of 25 papers describing software engineering experiments. The 64 participants were each presented with one abstract in its original unstructured form and one in a structured form, and for each one were asked to assess its clarity (measured on a scale of 1 to 10) and completeness (measured with a questionnaire that used 18 items). Based on a regression analysis that adjusted for participant, abstract, type of abstract seen first, knowledge of structured abstracts, software engineering role, and preference for conventional or structured abstracts, the use of structured abstracts increased the completeness score by 6.65 (SE 0.37, p < 0.001) and the clarity score by 2.98 (SE 0.23, p < 0.001). 57 participants reported their preferences regarding structured abstracts: 13 (23%) had no preference; 40 (70%) preferred structured abstracts; four preferred conventional abstracts. Many conventional software engineering abstracts omit important information. Our study is consistent with studies from other disciplines and confirms that structured abstracts can improve both information content and readability. Although care must be taken to develop appropriate structures for different types of article, we recommend that Software Engineering journals and conferences adopt structured abstracts.

Reporting computing projects through structured abstracts: a quasi-experiment

Previous work has demonstrated that the use of structured abstracts can lead to greater completeness and clarity of information, making it easier for researchers to extract information about a study. In academic year 2007/08, Durham University’s Computer Science Department revised the format of the project report that final year students were required to write, from a ‘traditional dissertation’ format, using a conventional abstract, to that of a 20-page technical paper, together with a structured abstract. This study set out to determine whether inexperienced authors (students writing their final project reports for computing topics) find it easier to produce good abstracts, in terms of completeness and clarity, when using a structured form rather than a conventional form. We performed a controlled quasi-experiment in which a set of ‘judges’ each assessed one conventional and one structured abstract for its completeness and clarity. These abstracts were drawn from those produced by four cohorts of final year students: two preceding the change, and the two following. The assessments were performed using a form of checklist that is similar to those used for previous experimental studies. We used 40 abstracts (10 per cohort) and 20 student ‘judges’ to perform the evaluation. Scored on a scale of 0.1–1.0, the mean for completeness increased from 0.37 to 0.61 when using a structured form. For clarity, using a scale of 1–10, the mean score increased from 5.1 to 7.2. For a minimum goal of scoring 50% for both completeness and clarity, only 3 from 19 conventional abstracts achieved this level, while only 3 from 20 structured abstracts failed to reach it. We conclude that the use of a structured form for organising the material of an abstract can assist inexperienced authors with writing technical abstracts that are clearer and more complete than those produced without the framework provided by such a mechanism.


Empirical evidence about the UML: A Systematic Literature Review

As part of my work with EPIC I contributed to a systematic literature review of the empirical evidence concerning the UML. The study recently appeared in the journal Software: Practice and Experience, published by Wiley.

The study set out to assess the current state of research into the UML – specifically in the areas of metrics, comprehension, model quality, methods and tools, and adoption. We identified and reviewed nearly 50 publications and arrived at the conclusion that “[d]espite indications that a number of problems exist with UML models, researchers tend to use the UML as a ‘given’ and seem reluctant to ask questions that might help to make it more effective.”

Abstract:

The Unified Modeling Language (UML) was created on the basis of expert opinion and has now become accepted as the ‘standard’ object-oriented modelling notation. Our objectives were to determine how widely the notations of the UML, and their usefulness, have been studied empirically, and to identify which aspects of it have been studied in most detail. We undertook a mapping study of the literature to identify relevant empirical studies and to classify them in terms of the aspects of the UML that they studied. We then conducted a systematic literature review, covering empirical studies published up to the end of 2008, based on the main categories identified. We identified 49 relevant publications, and report the aggregated results for those categories for which we had enough papers: metrics, comprehension, model quality, methods and tools and adoption. Despite indications that a number of problems exist with UML models, researchers tend to use the UML as a ‘given’ and seem reluctant to ask questions that might help to make it more effective.


Perforce Woes

Quick tip from a Perforce newbie…

Sometimes, for whatever reason, Perforce might miss updates – additions, deletions or edits – and they simply won’t appear in any changelists. The changes show up in a diff, but good luck trying to submit them.

The solution: right-click on the file or folder in the workspace browser and select Reconcile Offline Work. A warning, though – this diffs every single file in your selection, so an entire project or workspace may take several coffees.
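
If you prefer the command line, more recent Perforce clients ship a p4 reconcile command that does much the same thing (with -n for a dry run). A rough Python sketch of scripting it – the depot path is a placeholder, and this assumes p4 is on your PATH – would look something like this:

    # Sketch only: preview and then perform a reconcile via the p4 CLI.
    # Assumes a Perforce client new enough to have "p4 reconcile".
    import subprocess

    DEPOT_PATH = "//depot/my_project/..."  # placeholder path

    # Dry run: list the adds/edits/deletes that reconcile would open.
    preview = subprocess.run(
        ["p4", "reconcile", "-n", DEPOT_PATH],
        capture_output=True, text=True,
    )
    print(preview.stdout)

    # If the preview looks sane, open the missed changes in a changelist.
    subprocess.run(["p4", "reconcile", DEPOT_PATH])

Like the GUI route, this walks every file under the path, so expect it to take a while on a large workspace.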


Inevitable Picsie

Alpha 1.9: Since it’s been a while, have a quick Picsie update – images now show tooltips when you hover over them.

Available, as always, from the main Picsie page.


Picsie bug fix

I just noticed that animated GIFs weren’t being released when you moved on to another image. Fixed now.
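
Picsie isn’t written in Python, but the general shape of the fix – explicitly release the previous image before loading the next one – looks something like this sketch (Pillow stands in for the real image library, and all names here are hypothetical):

    # Sketch only, not Picsie's actual code: close the previously shown
    # image (animated GIFs in particular keep their file open) before
    # loading the next one.
    from PIL import Image

    current_image = None

    def show_image(path):
        """Load the image at `path`, releasing whatever was shown before."""
        global current_image
        if current_image is not None:
            current_image.close()  # release the old image and its file handle
        current_image = Image.open(path)
        current_image.load()       # force the first frame to be decoded
        return current_image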

Alpha 1.8 (30 November 2010)

  • Installer (~75KB). This will set up your file associations for you.
  • Standalone .exe (~200KB). Use this if you already have Picsie installed and you just want the latest version. Just drop it over the existing one.

Changelog

  • Now properly releases animated GIFs when you switch to a different image.

For a full changelog, see the main Picsie page.


Quick Picsie update

Picsie now has a taskbar icon in Windows 7, to take advantage of the shiny new taskbar system.

Alpha 1.7 (18 November 2010)

  • Installer (~75KB).  This will set up your file associations for you.
  • Standalone .exe (~200KB).  Use this if you already have Picsie installed and you just want the latest version.  Just drop it over the existing one.

Changelog

  • Has a taskbar icon in Windows 7

For a full changelog, see the main Picsie page.


Of Windows and Minecraft

A Minecraft addiction seems to be this year’s must-have psychosis, and I jumped on that bandwagon with gusto.  Earlier today, thanks to a combination of a loose power cable and wildly flailing feet, I managed to switch off my computer several hours into playing Minecraft.  No harm done, switch the computer on, reload Minecraft, click on “World 2”… no World 2.

A lot of time has gone into World 2.  I’ve made bases, sculptures,  gardens, forests, mines and towers.  A lot of time.

I could accept losing several hours of progress, but for the world to be gone completely?  No.  The files were still there, but level.dat and level.dat_old were corrupt.  The internet informed me that with a bit of trickery I could get the world working again by dropping the data into a different world, but there would be… issues.

And here is where Windows comes to the rescue.  While I back things up at irregular intervals and rely on Dropbox for the important stuff, I’m essentially terrible at it.  Windows, bless its unsung soul, has been doing it for me, and I never even knew.

If you’ve managed to corrupt your world, right-click on level.dat, go to Properties, then to the “Previous Versions” tab.  If the winds are right, you’ll have a bunch of versions you can restore to – thankfully Windows had made one yesterday as part of a System Restore Point.  Restore that version, reload Minecraft…

And voila, World 2 was back.  Sure, I’d lost a couple of hours of work (but not as much as I thought), and my inventory had gone back to yesterday, but World 2 lives!

Thank you Microsoft, for your Minecraft autobackup feature.

tl;dr: You can fix corrupted Minecraft saves using Windows’ “Previous Versions” feature – since the Minecraft data lives in your AppData folder, it gets backed up automatically whenever a restore point is made.
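
Better still, don’t rely on restore points at all. A quick preventive backup script is trivial – here’s a rough Python sketch, assuming the default Windows save location (%APPDATA%\.minecraft\saves); the world name and backup folder are placeholders:

    # Sketch: copy a Minecraft world folder to a timestamped backup directory.
    import os
    import shutil
    from datetime import datetime

    WORLD = "World 2"                                   # placeholder world name
    SAVES = os.path.join(os.environ["APPDATA"], ".minecraft", "saves")
    BACKUPS = os.path.expanduser("~/MinecraftBackups")  # placeholder destination

    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    src = os.path.join(SAVES, WORLD)
    dst = os.path.join(BACKUPS, f"{WORLD}-{stamp}")

    shutil.copytree(src, dst)   # copies level.dat, the region files, the lot
    print(f"Backed up {src} to {dst}")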


Picsie: Now with less broken zooming

Well, that was a bit embarrassing…  All that effort to pre-compute a decent zoom level for images when they load, and then I went and completely ignored it.  Oops, fixed now.
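
For the curious, the calculation itself is nothing exotic. Picsie isn’t written in Python, but a rough sketch of the fit-to-window logic (names and numbers here are illustrative only) looks like this:

    # Sketch of a fit-to-window zoom calculation (not Picsie's actual code).
    def initial_zoom(image_w, image_h, screen_w, screen_h, max_zoom=1.0):
        """Return a zoom factor that fits the image inside the screen,
        without scaling small images up past max_zoom."""
        fit = min(screen_w / image_w, screen_h / image_h)
        return min(fit, max_zoom)

    # e.g. a 3000x2000 photo on a 1920x1080 screen:
    print(initial_zoom(3000, 2000, 1920, 1080))   # prints 0.54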

Alpha 1.6 (13 October 2010)

  • Installer (~75KB).  This will set up your file associations for you.
  • Standalone .exe (~200KB).  Use this if you already have Picsie installed and you just want the latest version.  Just drop it over the existing one.

Changelog

  • Fix: Sets the zoom level properly when you load an image
  • Fix: Now consistently applies the minimum zoom level

For a full changelog, see the main Picsie page.


Picsie Update

Nice little update to Picsie for you.

Alpha 1.5 (26 September 2010)

  • Installer (~75KB).  This will set up your file associations for you.
  • Standalone .exe (~200KB).  Use this if you already have Picsie installed and you just want the latest version.  Just drop it over the existing one.



Minor Picsie update

Just made a quick update to Picsie – larger images wouldn’t zoom to fill the screen properly; this has been fixed.  At some point I’ll stop being lazy and deal with the “forms can’t be much bigger than the screen” limitation…

Alpha 1.4 (4 September 2010)

  • Installer (~75KB).  This will set up your file associations for you.
  • Standalone .exe (~200KB).  Use this if you already have Picsie installed and you just want the latest version.  Just drop it over the existing one.
