Decision Support - Sample Protocol for Technology Adoption (Part II)
Please review last week’s blog before reading on. This week I build on where I left off on the topic of technology acceptance as a decision-support approach, offering examples from an actual field test that I conducted to give you a sense of the results. The focus of this week's blog is steps 6-9 of the field test process:
- Purpose of the testing
- Techniques/technologies to be tested
- Testing time frame
- Description of participant responsibilities
- Description of support offered during the testing period
- Feedback collection process and procedure
- Participant orientation
- Collection of feedback based on feedback collection process
- Analysis and reporting of results
6. Feedback collection process and procedure
Considerations for this step in the process include:
- Instrument creation/identification to capture technology acceptance before, during, and after the testing
- Activities identified for collecting feedback such as focus groups, surveys, etc.
I recently tested a tool that I believed would help my students engage in online discourse; my expectation was that it would support the complexity of the discussion. The goal was to determine whether the tool would be a viable alternative to the more typical online discussion boards, which have a linear, threaded structure. In my experience, traditional discussion boards do not support in-depth learner engagement with complex problems. To field test the tool and gather feedback, I followed this procedure:
- Identify the type of technology user
- Self-guided tutorial (a how-to of the tool, completed through the same kinds of tasks participants would later conduct on their own as part of their assignment)
- Survey on perceived ease of use and usefulness of the tutorial (immediately following the tutorial)
- Assignment over a period of 6 weeks with weekly tasks
- In-class debriefs of experience every other week (focus group)
- Same survey on perceived ease of use and usefulness (at the end of the 6 weeks)
This procedure could be used in other contexts; however, it might be too time-intensive when targeting faculty. If faculty are the field testers, consider simplifying the procedure.
7. Participant Orientation
Considerations for this step in the process include:
- Baseline information on participant experience with technologies, using the instrument created or identified
- Communication expectations throughout the testing period
For the recent field test I conducted using a discourse-based tool, I created the following agenda for the participant orientation session:
- Welcome and introductions
- Participants are asked to complete the “What type of technology user are you?” questionnaire from Pew Internet and to note their own user type.
- Participants introduce themselves and share their technology user type based on the questionnaire.
- The facilitator notes the participants’ type and shares his/her type.
- The facilitator explains the reason for testing the tool and explains the type of data that will be collected and how.
- The facilitator also explains that collected data will be anonymized and any reports will only include aggregated data
- Self-guided tutorial using the actual tool
- Survey on technology acceptance (download PDF attachment)
8. Collection of feedback based on feedback collection process
- Type of user was collected at the orientation session while each participant was introducing him/herself
- The self-guided tutorial was provided as a handout with instructions on how to create an account on the tool
- Individual interaction was captured within the tool.
- The perceived ease of use and usefulness questionnaire was administered during the orientation session within the Blackboard learning management system.
- Focus groups were conducted every other week (three in total) to gauge the depth of the interaction with the discourse. Summaries were created from each session.
- The post-test perceived ease of use and usefulness questionnaire was administered at the end of the six weeks within the Blackboard learning management system.
9. Analysis and reporting of results
Depending on how much detail you would like to include in your report, you may choose different approaches to analyzing the collected data. In my case, the results of the field test were submitted as a poster (download PDF attachment) to a Learning Analytics conference. However, this will not always be the aim of a field test. Two years ago, I conducted a semester-long field test of smartpens in a course. The resulting report was a brief narrative of my expert assessment, including recommendations for adoption, based on the overall experience.
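For quantitative instruments like the pre- and post-test perceived ease of use and usefulness surveys, even a lightweight summary of score changes can support the narrative report. Below is a minimal sketch, assuming 5-point Likert responses; all names and numbers are illustrative placeholders, not data from my field test.

```python
from statistics import mean

# Hypothetical 5-point Likert responses (1 = strongly disagree, 5 = strongly agree)
# collected before and after the six-week testing period. These values are
# placeholders for illustration only.
pre_survey = {
    "perceived_ease_of_use": [3, 4, 2, 3, 4],
    "perceived_usefulness":  [4, 3, 3, 4, 3],
}
post_survey = {
    "perceived_ease_of_use": [4, 4, 3, 4, 5],
    "perceived_usefulness":  [5, 4, 4, 4, 4],
}

def summarize(pre, post):
    """Return mean pre score, mean post score, and change for each construct."""
    summary = {}
    for construct in pre:
        pre_mean = mean(pre[construct])
        post_mean = mean(post[construct])
        summary[construct] = {
            "pre": round(pre_mean, 2),
            "post": round(post_mean, 2),
            "change": round(post_mean - pre_mean, 2),
        }
    return summary

for construct, scores in summarize(pre_survey, post_survey).items():
    print(f"{construct}: pre={scores['pre']} post={scores['post']} "
          f"change={scores['change']}")
```

A summary table of mean changes like this can sit alongside the focus-group summaries in a brief narrative report, or feed into a more formal analysis if the field test is headed toward a conference submission.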
In last week's post, I emphasized that the suggested 9-step process for field testing technologies is a macro-level process protocol. Each of the sub-steps requires its own sub-process for operationalization, and these sub-processes have to take into account the realities of the specific context. By providing you with a recent example from my own work, I wanted to offer you a glimpse into how steps 6-9 could be operationalized. I hope you have found it useful. If you are planning to put a field test in place at your MEPI school and require assistance, do not hesitate to get in touch.
Next week’s topic will focus on instructional methodologies for e-learning.