Dialogue as psychological method – a study of training interviewing and communication skills in psychology students

Abstract
Verbal interaction is at the heart of many helping professions. This is often reflected in the curriculum of professional training. Somewhat surprisingly, there is a dearth of research regarding the assessment and training of students in these skills, not least those about to become psychologists. The present study focused on communicational microskills in psychology students. We were interested in generic skills that should be observable across different training tasks focused on training interviewing skills. Video-taped interview segments, in which the students acted as interviewers, were rated by their instructors using an eleven-item assessment of active listening skills. A total of 206 students were included, of whom 63 were assessed on five consecutive occasions over three semesters. The assessment showed satisfactory overall homogeneity. An investigation of the factor structure identified three principal components: Attention and exploration; Structuring skills; and Direct relating. In the subsample that was studied over repeated sessions, a marked progression was observed over time. The results are discussed in terms of implications for assessment and training within an educational context.

training of psychology students were identified (Kuntze et al., 2007; Smit & Van der Molen, 1996) among the 100 most relevant publications. By contrast, we found several studies on medical students (e.g. Brown et al., 2005; Deveugele et al., 2005; Makoul et al., 2007; Yedidia et al., 2003; Zick et al., 2007) and also on social workers and general clinical professionals (e.g. Ammentorp et al., 2014; Dixon, 2013; Tompsett et al., 2017). Further consecutive searches were performed with a variety of keywords, and one further study was identified on self-assessment of communication skills in medical and psychology students (Tiuraniemi et al., 2011). Although this was not a systematic and exhaustive search, the picture was consistent. Within medical education, the training of communication skills is fairly well studied (Gilligan et al., 2021; Kurtz & Silverman, 1996), but within the training of psychology students this is not the case.
In the medical professions, interviewing and communication skills are generally considered to be core competencies (Rider & Keefer, 2006; Suchman, 2003; Yedidia et al., 2003). The emphasis on these skills is in line with the idea that communicating and establishing a good relationship with the patient is an essential part of the professional repertoire. The practice of training medical students in communication skills is based on the premise that these skills have been shown to improve patient satisfaction and physicians' empathy and self-efficacy, and, moreover, reduce physicians' burnout (Boissy et al., 2016). In the literature, there seems to be a reasonable consensus that communication skills include empathy, active listening, maintaining contact, and the ability to use open questions, as well as reflections and summations (Bachmann et al., 2013; Ivey et al., 2018; Levitt, 2002; Pedersen, 2010; Winefield & Chur-Hansen, 2000).
To identify literature relevant for the training of psychology students, one viable approach is to search for the topic of counseling skills (Fyffe & Oei, 1979; Hill et al., 1981; Kuntze et al., 2007). However, training in counseling skills tends to be at an applied level. This implies that the skills tend to be more complex, being composed of several microskills (Ivey et al., 2018). Some studies have been conducted on developing core competencies, focusing on students' ability to aggregate theoretical knowledge and interpersonal skills in approaching and assessing patients with specific psychiatric diagnoses (Sheen et al., 2015; Yap et al., 2012). Such research was part of a pre-clinical exam in preparation for clinical internship, where an established assessment method adapted from medical education, the Objective Structured Clinical Examination (OSCE), was used. Nylund and Tilsen (2006) describe skills training on a continuum ranging from obtaining practical and theoretical knowledge, to rehearsing and reflecting, and onwards to integration and automatization, as a useful model for this process. We followed this line of reasoning when developing a training format for psychology students based on the work of the first author (Ljunggren, 2014). We have not been able to find any empirically validated methods for assessing or training psychology students regarding generic interviewing and communication skills. However, there is a multitude of available assessment formats for interviewing and communication skills (e.g. Gude et al., 2005; Makoul et al., 2007; Russel-Chapin & Sherman, 2000). Generally, these assessment formats are focused on the interviewer's skills in relating to the person being interviewed, and sometimes also on the interviewer's clinical assessment skills in these situations. However, our interest was specifically in assessing microskills rather than more complex, composite skills.
The available research indicates that skills training should recur throughout professional training to maintain skill levels and achieve progression (Deveugele et al., 2005; Koprowska, 2003; Lunt, 2002). Using this perspective, we identified one study that assessed active listening and self-efficacy (Levitt, 2002). The instrument used in that study also focused on composite counseling skills. However, they were broken down into different constituent skills that fall under the heading of active listening. Both from a theoretical and from a pedagogical perspective, we found this approach promising for our purposes.
This instrument, which has not yet been empirically validated, was originally developed as a formative assessment instrument to be used in counseling training to assess students' progressive development. Students' performance on eleven items during counseling sessions was rated on a Likert-type scale, where 1 indicated insufficient, 2 = further training required, 3 = satisfactory, and 4 = skillful. The skills identified were based on the counseling microskills originally developed by Allan Ivey and coauthors in 1967 (Ivey et al., 2018). Ivey's microtraining approach introduces trainees to discernable skills that are essential to the counseling relationship and for effective counseling.
There seems to be general agreement on the necessary skills in counselor training. The microskills are: Attending – demonstrating interest in, and focusing on, the client; Active listening – demonstrating the ability to follow and understand the client; Reflecting on content – clarifying, paraphrasing, and summarizing; Reflecting emotion – demonstrating and communicating empathy by reflecting the client's emotions; Probing/questioning – demonstrating the use of purposeful questions to keep the session on track and encourage further communication; Challenging/confronting – identifying discrepancies and having the ability to challenge or confront when necessary; Using non-verbals – exhibiting appropriate use of body language, vocal tone, facial expressions, and eye contact; Opening the session; Closing the session; Having immediacy – using "I-you" statements and process-related questions when appropriate; and Using silence – allowing appropriate silence and demonstrating the ability to tolerate silence during the session. Levitt (2002) assessed counseling students on each of these skills in a repeated-measures study to explore how students developed the microskills during training, and in her conclusion emphasized active listening as a foundation. Since the microskills are predicated on attending and listening, Levitt's study and instrument sought to explore the effects of a training approach on the development of what Ivey et al. (2018) identified as foundational counseling competencies. This approach is similar to instruments used in other studies (e.g. Kuntze et al., 2007), where communication skills were assessed during counseling training, as described by researchers such as Hill as early as 1978.
In Sweden, the education to become a psychologist is a five-year master's degree program, and a license to practice can be applied for from the National Board of Health and Welfare after one year of additional practice. The present study was conducted in the context of revising the curriculum of the Psychology program at Stockholm University. We wanted to study and develop the format used for the training of interviewing and communication skills. Apart from developing and refining progressive and viable teaching methods, we wanted to implement and validate a formative assessment procedure for generic microskills in interviewing and communication that could be used both by instructors and by the students themselves. One of the objectives was to identify an assessment format that could be used for a range of different applications in interviewing and communication skills training. We were also interested in identifying the internal structure between the different constituent skills that were assessed by the instructors.

Method
The assessment format used for active listening, emanating from Levitt's (2002) study, was translated and in part adapted to the training format in this study. Participants were recruited from among the psychology students from their first semester to their third. The training consisted of video feedback, a generally recommended method for learning interviewing and communication skills (Cartney, 2006; Fukkink et al., 2011). Each participating student met with another person not known to the interviewer (generally another student doing "interview time" as a course requirement). A 15-minute authentic interview on an everyday topic, perceived to be reasonably challenging and interesting but not provocative, was performed and video-recorded. These short interviews (interview "segments") were then watched and discussed in small groups, generally consisting of four persons, together with an instructor. The instructors were all clinical psychologists or teachers at the department, involved in various parts of the education program. They rated the interview on the eleven items of the assessment form from 1 to 4 (see Appendix A, supplementary material). If a certain skill was not observed, it was rated N/A (not available), which was coded 0. Some modifications of the original instrument were made, in order to focus more closely on microskills that could be used in any communicative setting, not solely in counseling sessions as in Levitt's (2002) study.
Data on 245 unique individuals were collected from the fall of 2016 to the fall of 2018. Three consecutive course groups were assessed at five measurement points in the first, second, and third semesters, within their first and second year of the program. Two course groups were assessed at three or four measurement points in semesters one and two, or two and three, and two groups were assessed at only one or two points (in the first and/or third semester). This was due to the two-and-a-half-year time frame of the project and the ambition to include as many students as possible across the three semesters. The first measurement was made during a listening exercise in the first semester, when the students were practicing maintaining focus on another person's narrative. During the second semester, there were two training sessions in which the students were instructed to apply an overarching plan for the interview and to explore the content in more depth while maintaining focus. Finally, during the third semester, there were two sessions in which students were learning how to conduct a qualitative research interview, and the focus was slightly shifted towards gathering data; focusing on the interviewee was still emphasized. All sessions shared the same format, in which the interviews were video-recorded and discussed in small groups, and formative feedback was given to spur further development of skills.
Along with this data collection, a randomly selected subset of the recorded interviews was rated by the instructors as a group and discussed among them. Initially (before the actual data collection), this was used as pilot testing and as a basis for revising the instrument and the instructions given. A brief manual was developed to clarify the questions, and we conducted conjoint sessions in which the instructors assessed the same interview together and discussed their ratings. These occasions also served as an opportunity to develop acuity and improve agreement among the group of instructors, and were greatly appreciated as a training opportunity.
The study was approved by the regional ethics board in Stockholm County (dnr. 2016/1114-31/5). Participation in the study was voluntary and all participants (students, interviewees, instructors) signed an informed consent form. Participation could be withdrawn at any time during the study without any consequences for the individual. Data were analyzed using bivariate correlations, principal component analysis (PCA), and paired-samples t-tests in IBM SPSS Statistics, version 27 (IBM Corporation, Armonk, NY, USA).

Results
The sample originally consisted of 245 individuals, but due to incomplete data, 39 cases were excluded, and the reported analyses are based on 206 individuals, 136 women and 70 men, with a mean age of 29.28 years (standard deviation (SD) 8.10). For the longitudinal analysis of development over the five measurement points, 63 cases with complete data on all occasions were identified.
To analyze the internal structure between the different constituent skills that were assessed by the instructors, a PCA was performed. As a first step, the suitability of the data for factor analysis was assessed. The correlation matrix revealed the presence of many coefficients of .3 and above (Table 1). Bartlett's test of sphericity (Bartlett, 1954) reached statistical significance (p < .001), and a Kaiser-Meyer-Olkin value of .84 indicated good factorability, supporting the results of the correlation matrix.
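The suitability checks above were run in SPSS, but neither statistic is tied to that software. As an illustration only, the sketch below computes Bartlett's test of sphericity and the overall KMO measure directly from a score matrix; the function names and the synthetic data are our own assumptions, standing in for the actual ratings.

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(data):
    """Bartlett's test: H0 = the correlation matrix is an identity matrix."""
    n, p = data.shape
    corr = np.corrcoef(data, rowvar=False)
    # Chi-square approximation based on the determinant of the correlation matrix
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(corr))
    df = p * (p - 1) / 2
    return chi2, df, stats.chi2.sf(chi2, df)

def kmo(data):
    """Overall Kaiser-Meyer-Olkin measure of sampling adequacy."""
    corr = np.corrcoef(data, rowvar=False)
    inv = np.linalg.inv(corr)
    # Anti-image (partial) correlations from the inverse correlation matrix
    d = np.sqrt(np.diag(inv))
    partial = -inv / np.outer(d, d)
    np.fill_diagonal(partial, 0.0)
    np.fill_diagonal(corr, 0.0)  # keep only off-diagonal terms in the sums
    r2, p2 = (corr ** 2).sum(), (partial ** 2).sum()
    return r2 / (r2 + p2)
```

A KMO value above roughly .8, as reported here (.84), is conventionally read as good factorability; values below .6 would argue against factoring the matrix at all.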
The eleven items of the assessment instrument for active listening were analyzed by means of a PCA (N = 206) using direct oblimin rotation. Cronbach's alpha for the scale in the current sample was .714, indicating acceptable internal consistency of the items. Three components with an eigenvalue of >1.0 were found: (1) Attention and exploration; (2) Structuring skills; and (3) Direct relating. The three-component solution explained a total of 59.01% of the variance, with component 1 contributing 38.33%, component 2, 11.06%, and component 3, 9.62%. The component loadings are shown in Table 2. The three components differed with regard to score ranges, depending on the number of items in each component: component 1 consisted of seven items with a min/max value of 0-28; component 2 contained six items with a min/max value of 0-24; and component 3 included two items with a min/max value of 0-8.
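Both Cronbach's alpha and the component-retention rule used here (eigenvalue > 1.0, the Kaiser criterion) can be reproduced from raw item scores. The following is a minimal sketch under assumed synthetic data, not the study's actual analysis, which additionally applied direct oblimin rotation to obtain the loadings in Table 2.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

def kaiser_retained(items):
    """Eigenvalues of the item correlation matrix, percent variance explained
    per component, and the number retained under the eigenvalue > 1.0 rule."""
    corr = np.corrcoef(items, rowvar=False)
    eig = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending eigenvalues
    explained = 100 * eig / eig.sum()
    return eig, explained, int((eig > 1.0).sum())
```

Because the eigenvalues of an 11-item correlation matrix sum to 11, the percentages reported above (38.33 + 11.06 + 9.62 = 59.01) correspond to the first three eigenvalues divided by 11.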
To evaluate the effect of the training in interviewing and communication skills carried out over three semesters, paired-samples t-tests were performed on each of the components, comparing training sessions 1 and 5 in the subsample (N = 63) that was assessed at five measurement points. There was a statistically significant increase in component 1 scores from training session 1 to 5, t(62) = −6.68, p < .001. The eta squared (.42) indicated a large effect size (Cohen, 1988). Likewise, component 2 displayed a significant increase, t(62) = −4.65, p < .001, and a large effect size, eta squared .26. Finally, in component 3, there was also a significant effect of the training, t(62) = −2.99, p < .001, and a moderate effect size, eta squared .12. Progression over time is displayed in Figure 1, and complementary statistics concerning component means and SDs are presented in Table 3.
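The effect sizes above follow the usual conversion from a paired t statistic, eta squared = t² / (t² + df). As an illustration (our own function name and synthetic scores, not the study data), the session 1 versus session 5 comparison could be sketched as:

```python
import numpy as np
from scipy import stats

def paired_t_with_eta_squared(session1, session5):
    """Paired-samples t-test plus eta squared, computed as t^2 / (t^2 + df)."""
    t, p = stats.ttest_rel(session1, session5)
    df = len(session1) - 1
    eta_sq = t ** 2 / (t ** 2 + df)
    return t, p, df, eta_sq
```

With scores entered in the order (session 1, session 5), an increase over training yields a negative t, matching the sign convention in the results above; by Cohen's (1988) guidelines, eta squared values of about .01, .06, and .14 mark small, medium, and large effects.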

Discussion
The purpose of the present study was twofold. We were interested in the assessment of interviewing and communication skills, especially the assessment of active listening skills and their constituents. But our interest was also in finding a procedure that could be used to track progression over training in formats that teach interviewing skills in different contexts. A central question was whether listening skills were approached by instructors as a monolithic construct or whether these skills could be broken down into components. What we found was an overall high degree of homogeneity in the assessment instrument. This is in accordance with previous research on the assessment of communication skills (Kuntze et al., 2007; Makoul et al., 2007; Russel-Chapin & Sherman, 2000). However, the PCA found three components with an eigenvalue >1.0. When looking at the items, the first seemed to capture attention and exploration, pertaining to items concerning listening, reflecting, and asking questions. We labeled the second Structuring skills, since the two most important items concerned opening and closing the interview. The third and last component we labeled Direct relating, since it was dominated by the item "immediacy," i.e. involving the interviewer as a person in the questions and comments posed. Kuntze et al. (2007) found that a two-factor structure was a good fit in the assessment of communication skills, consisting of a basic skills factor and an advanced skills factor. However, the learning context in our study differed from theirs in substantial ways. We studied the continuous progression in skill over several terms. Our ambition was also to identify the constituents of active listening, and our findings are more in line with the aim of Newman et al. (2022) and Hill et al. (2020), which was to focus on microskills (i.e. paraphrasing, clarifying, reflecting emotion, and immediacy), skills that are in general agreement with the components we identified. It could also be argued that these three components could be meaningfully treated and trained as separate areas of microskills. However, in an actual interview, we found that these components are intertwined with each other.
It should be noted that the training, with regard to both contexts and instructors, differed over the three semesters; however, a progression in listening skills was still evident in the material. There was a dip in the ratings at measurement point four, which is interesting because the students were then instructed to use a more research-like format when conducting the interviews. This may have impacted the skills rated in the active listening assessment. It should also be noted that, due to the time frame and the decision to involve all first- and second-year students, only a minority were assessed at all five measurement points. We can only assume that their progression was representative of the student group at large.
Data were collected as an integral part of the regular training, and the ratings were made after the interview segments had been discussed in the group with the instructor. This is a weakness, since it may have impacted the ratings, but it was inevitable when using the standard training procedure. Overall, the educational aspect had priority over methodological considerations. While this may have compromised methodological rigor, it could be argued that it may have increased the ecological validity. Two recent studies (Hill et al., 2020; Newman et al., 2022) that were conducted in actual professional training contexts (counseling and consulting) used deliberate practice (Ericsson & Harwell, 2019) in the training of specific microskills. Both studies are in accordance with our own in underscoring the importance of studying training in an actual educational context.
We observed a progression in active listening skills over the semesters when the students were assessed on consecutive occasions. Previous research has failed to show effects of training in communication skills in occupational therapy students (Jeffery & Hicks, 1997). In some respects, our results strengthen the validity of the present procedure of formative assessment of listening skills. However, validity cannot be taken for granted. One difficulty when assessing progression over time is that there are no absolute measures of the students' achievements. We strongly suspect that the instructors took the students' lack of initial experience, as well as their greater experience in the later assessments, into account (though not deliberately). This would imply that the progression may have been larger than captured in these assessments. Still, it is promising that the assessments do seem to capture a progressive development of listening skills over the different training sessions.
It should also be noted that the instructors themselves underwent change during the three semesters. Striving for consistency, they all went through the same introductory training. The manual that we developed may have contributed to increased interrater reliability, but this was not systematically assessed. The training was much appreciated by the instructors and perceived as very helpful, both in guiding the observation of, and feedback given to, the students, and as a learning experience. Overall, it must be taken into consideration that the communication and interviewing skills training program underwent development and further refinement during the data collection period. From an organizational point of view this was, of course, desirable; from a scientific perspective, it may have hampered the validity of our results. The ultimate value of any training resides in its long-term impact, i.e. in contributing to making competent professional psychologists. From that perspective, it must be acknowledged that the time frame of the present study is limited. Despite this, we would still argue that the study has important strengths, in that it was conducted longitudinally in an important context that is under-researched with regard to the progressive development of listening skills. Furthermore, it has pragmatic value in providing an assessment format that can easily be used by instructors and provide a basis for consensus among instructors on the nature of the skill of active listening. Further strengthening the deliberate-practice format should be an important contribution to this kind of skill development (Hill et al., 2020; Newman et al., 2022).
Future research should focus on investigating the interrater reliability of the assessment format. Such a study could readily be included in the naturalistic and pragmatic format used in the present project. Further on the matter of reliability, Cronbach's alpha for the scale indicated acceptable internal consistency of the items. We used an existing instrument, and since the aim of the study was not to assess the scale itself, no additional reliability analyses were performed. Having said that, the scale was translated into Swedish, and an important future topic would be to proceed with more comprehensive reliability analyses.
We would also suggest that the field would benefit from looking further at students making self- or peer assessments, and at how these assessments relate to the instructors' assessments of the same skills. Some studies have already done this (Tompsett et al., 2017; Winefield & Chur-Hansen, 2000); however, they have focused on perceived self-efficacy regarding communication skills rather than on the students' assessment of their competence in actually performing these skills. Studying self-assessment of skills would create a more student-active training format, where the feedback becomes even more accessible to the students, as well as spurring critical assessment of, and reflection on, their own behavior in situations that have relevance for a vast number of applied settings in their future careers as psychologists.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Figure 1. Progression of interviewing and communication skills over three semesters, assessed at five consecutive measurement points (Training 1-5) and analyzed based on three components from the principal component analysis (PCA).

Table 1. Correlation matrix for items in the instrument for assessment of active listening.

Table 2. Results from a principal component analysis (PCA) of the instrument for assessment of active listening (IAAL).

Table 3. Mean values and standard deviations (SD) for each component and training session.