Background
A survey was undertaken to evaluate the use of free online decision and report writing tools designed to promote compliance with the Mental Capacity Act 2005 for England and Wales. In total, 29 individual staff members working across a wide range of health and social care settings completed the survey. A response rate could not be established because the survey invitation was open to anyone with access to the tools, regardless of whether they had used them.
Results
A total of 5 forms were removed because the respondents stated that they had not actually used the tools, which made them of questionable value for evaluating the tools' usability. Within this group, 2 respondents clearly stated that the tools were not relevant to their role; the remaining 3 did not state why they had not used the tools.
All but one of the respondents who had used the tools reported that using them had increased their knowledge of the Mental Capacity Act, with more than half saying that their knowledge had increased quite a lot or a great deal.
Respondents were asked to select from a range of options regarding the usability and thoroughness of the Mental Capacity Assessment tool. Considerably more than half felt that the tool made conducting assessments easier, and 7 responses described the tools as faster. Only one respondent felt the tool made the process slower and one felt it was more complicated. No respondents felt that the tool made the process harder. Additional comments about the tools included "good evidence" and "excellent".
A total of 96% of respondents said that they would recommend the tools to others.
The providers distinguish the tools from e-learning systems and describe them primarily as 'real-time' decision aids and report writing tools. To assess their value as a training resource, users were asked how the tools compared to e-learning systems they had used. The majority (75%) of respondents believed that the tools provided a more effective means of learning about the Mental Capacity Act than e-learning.
Respondents were invited to provide additional feedback using free text. The feedback fell into three areas: benefits, possible improvements and problems.
Benefits
"This tool has assisted with ensuring that completing a MCA and best interest decisions are both and agree with the MCA."
"A great tool – has helped us a lot already in meeting the requirements as set out in Law."
"… a way to balance developing knowledge v dealing with the increasing demands placed on care staff due to reduced budgets."
"I have asked a couple of colleges – a social worker and residential care provider to try them out. if the tool means people are documenting assessments and decisions this is a step forward as many people are not currently! "
Improvements
Main themes included points regarding the wording of some questions, e.g.
"Some of the words can be a little confusing."
"Improve the way the questions are asked."
Some comments suggested improvements to the content, additional support material or guidance, or modifications for expert users, e.g.
"Reference to cases in my view helps."
"A clarification as to whether psychological care care interventons are 'treatments', as we had a scenario where if it was and there was a major dissension then it was potentially a contested Medical Decision and should go to the COP, it was a very minor situation that did not require this but that was where the program led us from the earlier choice, remedied by not choosing that it was a Medical Treatment. "
"Workers bombarded with information these days, application must be as user friendly as possible, the DOLS tool felt quite long, and it would be extra specifically useful if it was obvious that information from the report would feed into the DOLS application forms."
"A modification for Best Interests Assessors in the Doors and MCA processes."
Some responses found the restrictions on free text, or on the ability to move back within the tools, limiting, e.g.
"Allow information to be input on what has already been completed."
"I do think free text is necessary to augment replies, and make assessments more person centered, and my concern would be people may not add any further detail."
"Allow information regarding who you have spoken to.
Problems
A number of respondents (n = 5) described having technical difficulties when using the site, including crashes and loss of connection. There are a number of possible explanations for these problems, and it is difficult to establish the extent to which they are due to the tools themselves, the web hosting service, the equipment and browser being used, or the reliability of the respondent's Internet connection. The providers acknowledge the possibility that these issues are due to the site technology.
Conclusion
The survey findings are positive and provide support for the tools. The tools appear to be valued by the majority of those who use them and demonstrate a number of strengths in terms of their usability, their utility as real-time decision aids and also as educational tools. Some technical aspects were identified that could benefit from improvement, particularly technological improvements to provide a smoother and faster user experience.