wiki:Evaluation
Last modified 2008-03-18T15:39:16Z

Phoebe Phase 1 Evaluation Plan

Document history

  • Initial version: 10/08/06
  • Revised in light of meeting with Glenaffric: 25/08/06
  • Additional revisions in light of Glenaffric evaluation report: 05/09/06, 06/09/06
  • Revisions after development of Phase 2 work packages: 12/02/07, 10/04/07
  • Renamed "Phase 1 Evaluation Plan": 28/09/07
  • Further minor amendments in light of Phase 2 evaluation: 18/03/08

Aims of project (recap)

  1. Develop a prototype online planning tool to guide practitioners working in post-16 and higher education in designing effective and pedagogically sound learning activities.
  2. User-test the planning tool for functionality and usability.
  3. Investigate the feasibility of further development and the integration of the planning tool into pedagogic practice by:

a) Linking the planning tool to specific guidance, models of practice, case studies, learning designs and other appropriate support material;
b) Embedding use of the planning tool in specific contexts for piloting and evaluation, e.g. continuing professional development and initial teacher training.

Concept underlying Phoebe and its rationale (recap)

Basic concept: An online planning tool to guide practitioners working in post-16 and higher education in designing effective and pedagogically sound learning activities.
(Do we need to elaborate on what we mean by 'pedagogically sound'?)

Specific instantiation of that concept: A tool that propagates the principles of effective practice to as wide an audience as possible, by allowing practitioners to develop new pedagogical approaches while still using the planning tools with which they are familiar.

Rationale: We believe that successful innovations in IT reflect, and build on, the ways in which users actually work, rather than requiring them to adapt their practices. Therefore a planning tool such as Phoebe should take as its starting-point the tools and processes in current use. By meeting practitioners on their 'home ground', we can then introduce them to new, more effective, tools and processes and thereby lead them to adopt emerging technologies where these are appropriate to their situations. While acknowledging the power and potential of the new generation of learning design tools, we note that a) they constitute only one of a repertoire of tools at teachers’ disposal and b) they still have only a limited user base, yet a large number of practitioners need assistance in getting started with e-learning.


Research questions for the evaluation

  1. Is Phoebe, as an embodiment of the concept outlined above, a tool that practitioners in post-compulsory education find usable, helpful and relevant to their needs, whether they are a) beginning or experienced teachers looking to use technology in their teaching for the first time or b) practitioners already familiar with e-learning who are looking for new ideas re technology and pedagogy? Specifically, does it encourage practitioners to think about their practice in a structured way?
  2. Can one tool address all sectors of post-16 education, or are separate tools required for, say, ACL, WBL and FE versus HE? (Reasons: the different nature of planning in the sectors; differences in terminology and linguistic style.)
  3. To what extent can practitioners relate to the concept of an abstract learning design, or does a design require contextualisation and specific content to be meaningful?
  4. Is Phoebe suitable as a tool for teacher education and/or a tool for supporting everyday practice?
  5. What additional features and functionality are required to turn Phoebe from a proof-of-concept (PoC) prototype into a tool for general use?
  6. What is needed to support the community dimension of using learning design and make it possible to sustain learning designs as community artefacts?
  7. What other potential issues of sustainability exist, and how might these be resolved?

Critical point: the brief is to design a proof-of-concept tool, which means that the vision underlying the design is on trial as much as the design itself. Of course, all designs embody their underlying vision, but in a PoC tool that vision is necessarily much larger than its embodiment, and so we have to make this larger vision explicit to our evaluators.


Specific questions to ask (stemming from RQs)

Note: 'AT: Xxxxx' refers to a question derived from the Activity Theory analysis of the LD Tools data (section 4.3.2 of the LD Tools report).

Key aspects of usability

  • Ease of learning to use
  • Ease of use once learned
  • Productivity/usefulness/effectiveness (task accomplishment) (also covered in practice-related issues below)
  • Affective response (enjoy using it/choose to use it/recommend to others)
  • Sustainability? (Not just new features, but also projections re effort likely to be involved in maintaining currency of content and examples.)

Integration into individual pedagogic practice

  • Mapping to their established practice (processes, functionality)
  • Usefulness at different levels: course/scheme of work planning vs topic planning vs session planning
  • Relevance to their established practice (content) (focus on generic vs examples vs institution-specific examples)
  • Extent to which they feel they have learned something new/been inspired to experiment/innovate in their own practice
  • Relationship between their use of Phoebe and their continued use of existing tools: Do practitioners use Phoebe's notebook to start creating their plans in Phoebe, or do they use it purely for reference? (AT: Subject, Tools)
  • Use again in own practice? If no, why not? What, if anything, would?
  • Useful to others? At what stage of career/at what level of experience wrt e-learning?

Integration into pedagogic practice of the institution (community dimension)

  • Staff trainers: does it map to the models of pedagogy which they are trying to communicate to their trainees? (AT: Rules) If not, how much effort is required to make it map? Could simple customisation options work (e.g. the ability to edit the "default" guidance, add their own case studies and examples, or adjust the components of a plan and the terminology), or would they need a totally customised tool?
  • Can the tool be used by practitioners working alone or does it work best in an institutional setting, as a community artefact? (AT: Subject, Community)
  • To what extent does Phoebe facilitate or, conversely, impede established practice relating to session planning in the user's institution? (AT: Rules)

Proof of concept

A principal outcome of this project is proving the concept and developing the specification for a pedagogical planner that will be useful in a variety of contexts, not the development of a fully functioning practitioner-ready system (Glenaffric report 11/08/06).

  • Do evaluators endorse the underlying concept?
  • Does the tool operationalise the concept?
  • What would be useful/desirable/essential for its (further) acceptance? (Give list of possible features + invite user to suggest others)


Evaluators

We envisage that three groups of users will be represented:

  1. TS: The 'broad church' of teaching staff
    • TS-F: Teaching staff who are already familiar with the use of technology in their teaching/learning
    • TS-N: Teaching staff who are new to technology
  2. TD: Teaching staff responsible for the dissemination of technology in pedagogic practice:
    • TD-T: Those with teacher-training or staff-development roles
    • TD-S: IT support officers and learning technologists who are responsible for disseminating the use of technology in their institution/department and/or are involved in specific 'projects' (e.g. creating an image bank; redesigning an F2F course to run in a blended or fully online environment)
  3. PDT: Teachers and students undergoing continuing professional development (CPD) and initial teacher training (ITT) respectively. (NB Although we dropped the proposal [at the meeting with JISC and Glenaffric, 11/08/06] to evaluate embedding the planner in training contexts through workshops with trainees, it has emerged from our meetings with PIs that these are the groups who would benefit the most from a pedagogic planner tool. There will therefore be dedicated workshops for representatives from these groups in Phase 2.)

The PoC evaluation requires a specific type of evaluator: one who can see past the present limitations and imperfections of the tool to the tool as it might be. In other words, the tool itself is not the object of the evaluation; rather, it serves to mediate the evaluator's mental projections. In this respect, novice users of e-learning technology may not be the most appropriate evaluators: as Rogers and Scaife suggest in their original work with informants, it can be more productive to work with people who are a few stages above the target audience, as they can recall, and reflect critically on, their own experiences as novices. (For ideas, see Glenaffric's evaluation form for the EUVLE Toolkit -- stored in Liz's Phoebe\Materials from Glenaffric folder.)

Possible sources for recruitment of expert evaluators

NB This is just a list of 'possibles': none of these groups has yet been approached.

  • Members of JISC Pedagogy Experts group
  • Selected participants in LD Tools project
  • Colleagues of PIs (esp. where originally approached to take part in 'embedding' workshops)
  • TechDis
  • Serendipitous contacts


Evaluation techniques

We will use techniques that focus on capturing a rich set of qualitative data from a small group of (experienced) people. The proposed techniques are shown in the following table:

Instrument | Subjects | Purpose | Qual/Quant data? | Notes & issues
Questionnaires | All | Affective responses to Phoebe; other attitudinal data | Qual + Quant
Observation, optionally with recorded think-alouds | PIs | Affective responses to experience; usability problems | Qual
Semi-structured group discussions | Teachers, Learners | Affective responses to experience (reflective) | Qual
Session plans created during workshops? | TS; TD | ???? | Qual
Usage data | TS; TD | Determine patterns of usage during workshop | Quant
Heuristic evaluation | Project team; PIs | Optimise usability of the interface so as to eradicate unwanted factors in the evaluation of the pedagogical aspects of the tool | Qual + Quant | Base on user-centred and learner-centred principles; modify, extend and/or replace the Nielsen and Sharples & Beale guidelines. (See other literature on UI design in e-learning, incl. the Hedberg et al. paper.)


Phase 1 evaluation schedule

The project plan identified four key 'events' in the evaluation schedule: the programme of interviews with practitioner-informants, the walkthroughs by PIs, the practitioner workshops and the embedding sessions. In retrospect, however, the material collected from the PIs belongs more to the 'requirements gathering' activity, and so the interviews are now excluded from the evaluation.

Timing | Factor to Evaluate | Questions to Address | Method(s) | Measure of Success
Aug/Sept? 2006 | UI and functionality of prototype tool | Consistency, usability, error-free functioning | Walkthrough by project team members | Number of issues identified and resolved
26 Oct 2006 | Acceptability of proposed tool to wider user community | Proposed overall vision and functionality of tool | Set of activities at JISC ELP Experts' Group meeting | Extent to which the tool is judged to be of use to the community
Nov 2006 | Usability of revised prototype | Usability, error-free functioning | Walkthrough by project team members | Extent to which issues raised in evaluation with PIs have been resolved satisfactorily
23 Jan 2007 | Acceptability of tool to wider D4L community | Overall vision and functionality of tool | Demonstrate tool at D4L programme meeting | Extent to which the tool is judged to be of use to the community
Feb 2007 | Usability and usefulness of prototype tool | Consistency, usability, mapping to real-world task | Walkthroughs by practitioner-informants; feedback via interviews with structured questions similar to those used at the review meeting | Quality of feedback and suggestions for improvement; minimal number of bugs and UI/functionality issues
22 Feb 2007 | Embedding | Suitability for embedding in staff development and/or initial teacher-training context | Workshop at University of Greenwich; observations, discussion | Quality of feedback re suitability

The following events were rescheduled to Phase 2:

Jan 2007 | Embedding: acceptability of tool as support for design for learning (also usability issues) | Mapping to real-world task | 2 workshops with max. 10 (expert) practitioners at each; feedback via questionnaires and interviews; some observation may also be carried out | Quality of feedback re usefulness/acceptability
Mar-Apr 2007 | Embedding (?and sustainability) | Suitability for embedding in staff development and/or initial teacher-training context; suitability across sectors and domains | Individual meetings with representatives from HEA, Becta, ACLearn | Quality of feedback re suitability

The following event was omitted on account of the shortcomings of the tool uncovered in face-to-face meetings:

Apr 2007 | Usability and usefulness of prototype tool | Consistency, usability, mapping to real-world task | Online (remote) evaluation by practitioners both inside and outside the 'D4L' community; feedback via online questionnaire | Quality of feedback and suggestions for improvement; minimal number of bugs and UI/functionality issues
