Developing assessments for the National Assessment and Accreditation System (NAAS)

Hannah Rowe and Priya Dutta, Assessment Managers at AlphaPlus, talk about their experience working with the Department for Education on the National Assessment and Accreditation System (NAAS).

Background

AlphaPlus worked with the Department for Education from 2019 to 2023 on the National Assessment and Accreditation System (NAAS) for child and family social workers. NAAS enabled child and family social workers to develop skills and knowledge to improve outcomes for children and families; it was therefore crucial that the assessment was valid and reliable, while still being manageable for the sector.

Purpose

The government’s Munro Report was published in 2011, following serious failures in high-profile child protection cases (Victoria Climbié, Baby P). The report aimed to effect change in the children’s social care sector, moving it away from being overly focused on procedure towards a more holistic and child-centred approach. NAAS came about following this review as a means to strengthen social work practice, restore faith in the profession and recognise performance.

Assessment developed

Multiple approaches were used to assess candidates:

  1. A multiple-choice assessment completed on an online platform. This lasted 1 hour 30 minutes and comprised 60 multiple-choice questions, including short general knowledge questions and applied knowledge questions covering all the professional areas defined for practitioners (known as the Knowledge and Skills Statements).
  2. Two simulated practice scenarios – role-play/OSCE stations, each comprising a 15-minute simulated meeting between the social work candidate and an actor trained to play the role of a family member, young person or other professional.
  3. A 15-minute verbal reflection based on one of the simulated practice scenarios.
  4. A 30-minute written task based on the other simulated practice scenario.

AlphaPlus worked with industry experts to produce the knowledge assessment items, updating them annually. Sector experts worked with our in-house assessment specialists and proofreaders to produce high-quality materials.

To create the scenarios needed for the role plays, we worked with sector experts to identify suitable assessment themes from the Knowledge and Skills Statements that would be accessible to all candidates, irrespective of which Local Authority and team they were in. We produced the scenario information presented to the candidate, as well as a detailed set of guidelines for the actors, giving them the required background on the social work context along with prompts and directions on how to respond depending on the approach taken by the candidate.

For the reflective and written exercises, we called on the professional judgement of senior social workers to redesign these components, ensuring that they were both relevant and up to date.

Focus on valid assessment

Validity was of the utmost importance for these assessments. Social workers deal with a wide range of high-pressure situations and work closely with society’s most vulnerable people; those who pass these assessments must be prepared.

To ensure validity, we established a test specification which laid out the details and conditions of the assessments, stating the required balance of content.
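
As an illustration only – the domains and item counts below are invented, not the actual NAAS specification – a content blueprint of this kind can be captured as a simple structure that drafted test forms are checked against:

```python
# Hypothetical content blueprint: assessment domains mapped to the number
# of items required on each 60-item test form. Domain names and counts are
# invented for illustration, not the actual NAAS specification.
BLUEPRINT = {
    "relationships and direct work": 12,
    "child development": 12,
    "abuse and neglect": 12,
    "adult mental ill health and substance misuse": 12,
    "law and the family justice system": 12,
}

def check_form(form_domains):
    """Return True if a drafted form matches the blueprint exactly."""
    counts = {d: form_domains.count(d) for d in BLUEPRINT}
    return counts == BLUEPRINT and len(form_domains) == sum(BLUEPRINT.values())

assert sum(BLUEPRINT.values()) == 60  # matches the 60-question paper
```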

We set up robust quality assurance processes to make sure that the assessments would be accepted by the profession. This included the establishment of a Quality Assurance Group (QAG) and a Content Advisory Group (CAG) – both made up of sector experts – to ensure that the content of the tests was authentic to the experiences of those sitting the assessments. All assessments and all standard-setting exercises were overseen by these subject matter experts, ensuring validity was at the forefront of this work.

We also commissioned an independent validity report, working with the social work sector to examine whether the assessments were doing what they aimed to do.

Test maintenance/item banking

One of AlphaPlus’ main responsibilities was to oversee the maintenance and development of several hundred live assessment items in the item bank. This included statistical analysis of the performance of pre-existing items and refreshment of the item bank as necessary.
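
As a rough sketch of the kind of analysis involved – the response data and review thresholds here are invented – classical item statistics such as facility (proportion correct) and discrimination (item–total correlation) can be used to flag items for review or retirement:

```python
# Illustrative sketch only: classical item statistics of the kind used to
# monitor a live item bank. Data, thresholds and flags are hypothetical.
from statistics import mean, pstdev

# responses[c][i] = 1 if candidate c answered item i correctly, else 0
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
]

totals = [sum(row) for row in responses]
n_items = len(responses[0])

for i in range(n_items):
    scores = [row[i] for row in responses]
    facility = mean(scores)  # proportion of candidates answering correctly
    # Point-biserial discrimination: correlation of item score with total score
    sd_item, sd_total = pstdev(scores), pstdev(totals)
    if sd_item == 0 or sd_total == 0:
        discrimination = 0.0
    else:
        cov = mean(s * t for s, t in zip(scores, totals)) - mean(scores) * mean(totals)
        discrimination = cov / (sd_item * sd_total)
    # Hypothetical review thresholds: very easy/hard or weakly discriminating
    flag = facility < 0.2 or facility > 0.9 or discrimination < 0.2
    print(f"item {i}: facility={facility:.2f} "
          f"discrimination={discrimination:.2f} review={flag}")
```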

New knowledge assessment items were authored using our in-house authoring platform, tagged with metadata and imported into the NAAS item bank. This meant that we had clear oversight of the number and types of questions in the bank. Items were used to create varying test forms with common anchor items.
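
A minimal sketch – with made-up item IDs and metadata fields – of how tagged items might be drawn from a bank into parallel test forms that share a common set of anchor items for linking:

```python
# Hypothetical item bank entries; IDs, domains and the anchor flag are
# invented for illustration.
items = [
    {"id": "Q001", "domain": "relationships", "anchor": True},
    {"id": "Q002", "domain": "child development", "anchor": False},
    {"id": "Q003", "domain": "adult mental ill health", "anchor": False},
    {"id": "Q004", "domain": "abuse and neglect", "anchor": True},
    {"id": "Q005", "domain": "child development", "anchor": False},
]

anchors = [i["id"] for i in items if i["anchor"]]
unique = [i["id"] for i in items if not i["anchor"]]

# Both forms share the anchor items; the remaining items are split between
# them, so performance on either form can be placed on a common scale.
form_a = anchors + unique[0::2]
form_b = anchors + unique[1::2]
print(form_a)  # ['Q001', 'Q004', 'Q002', 'Q005']
print(form_b)  # ['Q001', 'Q004', 'Q003']
```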

Statistics and standard setting

Statistical analysis was a key component of AlphaPlus’ work on the project, and we produced reports on the assessment data at key junctures. The ongoing analysis of live assessment data was then aligned with standard-setting activities. It was decided that the Hofstee method was most suitable for the simulated practice assessments, with the Angoff method being used for the knowledge assessments.
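
To illustrate the two methods – all judge ratings, candidate scores and panel bounds below are invented – under Angoff, each panellist estimates the probability that a minimally competent candidate answers each item correctly, and the cut score is the sum of the mean estimates; under Hofstee, the panel’s acceptable bounds on the cut score and failure rate are intersected with the observed score distribution:

```python
# Hedged sketch of the two standard-setting methods; all judge ratings,
# candidate scores and panel bounds are invented.

# --- Angoff (knowledge assessment) --------------------------------------
# Each judge estimates P(minimally competent candidate answers item i).
ratings = {
    "judge_1": [0.60, 0.70, 0.50, 0.80],
    "judge_2": [0.50, 0.80, 0.40, 0.70],
    "judge_3": [0.70, 0.60, 0.50, 0.90],
}
n_items = 4
item_means = [sum(r[i] for r in ratings.values()) / len(ratings)
              for i in range(n_items)]
angoff_cut = sum(item_means)  # expected raw score of the borderline candidate
print(f"Angoff cut score: {angoff_cut:.2f} out of {n_items}")

# --- Hofstee (simulated practice) ----------------------------------------
scores = [2.0, 2.5, 3.0, 3.0, 3.5, 4.0]  # observed candidate scores
k_min, k_max = 2.0, 3.5                   # panel's acceptable cut-score range
f_min, f_max = 0.10, 0.50                 # panel's acceptable failure-rate range

def fail_rate(cut):
    """Proportion of candidates scoring below the cut."""
    return sum(s < cut for s in scores) / len(scores)

def hofstee_line(cut):
    """Straight line from (k_min, f_max) down to (k_max, f_min)."""
    return f_max + (f_min - f_max) * (cut - k_min) / (k_max - k_min)

# The Hofstee cut is where the observed fail-rate curve crosses that line;
# here we search a coarse grid for the closest crossing point.
grid = [k_min + 0.05 * i for i in range(int((k_max - k_min) / 0.05) + 1)]
hofstee_cut = min(grid, key=lambda k: abs(fail_rate(k) - hofstee_line(k)))
print(f"Hofstee cut score: {hofstee_cut:.2f}")
```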

A standard-setting panel was recruited from the sector and worked by consensus to compile profiles of the minimally competent candidate. This work aligned the NAAS assessments with similar assessments in other professions.

Working with partners and other stakeholders

We took over as the assessment partner from the previous contract providers, facilitating a seamless transition with no interruption to delivery.

We worked extensively with the delivery partner on the contract, with whom we had a positive working relationship – delivering materials on time, visiting assessment days to see the assessments in action, and holding frequent catch-up and reflective meetings on how the assessments were performing, to maximise assessment quality and the candidate experience. This included meetings with the acting agency, a subcontractor of the delivery partner, as the quality of the actors’ performances was key to the validity of the assessment experience.

We also maintained a positive working relationship with the Department for Education, the government department responsible for NAAS. This relationship was fostered through frequent catch-ups (in-person workshops and online meetings), detailed reporting, shared risk logs and clear milestone mapping, supporting robust project planning and timely delivery.

Researching new approaches

Since the DfE’s decision to cease the delivery of NAAS in 2022, we have worked with the department to research what assessment for social workers might look like moving forward.

Our role included leading workshops for groups of up to 30 senior civil servants from various divisions within the DfE. The series of workshops explored themes and approaches in early-years education, and included delivering presentations and producing research reports for the DfE and other stakeholders. These considered the future iteration of this assessment, prioritising validity and reliability whilst remaining manageable for a larger cohort. Our proposal emphasised the importance of a clear assessment specification from the outset, whilst also considering various new ways to assess content in future.