
Exploring the Relationship of a Gamified Assessment with Performance

Published online by Cambridge University Press:  01 March 2019

Ioannis Nikolaou*
Affiliation:
Athens University of Economics and Business (Greece)
Konstantina Georgiou
Affiliation:
Athens University of Economics and Business (Greece)
Vasiliki Kotsasarlidou
Affiliation:
Athens University of Economics and Business (Greece)
*Correspondence concerning this article should be addressed to Ioannis Nikolaou, Athens University of Economics and Business, Department of Management Science and Technology, 104 34 Athens (Greece). E-mail: inikol@aueb.gr

Abstract

Our study explores the validity of a game-based assessment method for assessing candidates’ soft skills. Using self-reported measures of performance (job performance, Organizational Citizenship Behaviors (OCBs), and Grade Point Average (GPA)), we examined the criterion-related and incremental validity of a game-based assessment above and beyond the effect of cognitive ability and personality. Our findings indicate that a game-based assessment measuring soft skills (adaptability, flexibility, resilience, and decision making) can predict self-reported job and academic performance. Moreover, a game-based assessment can predict academic performance above and beyond personality and cognitive ability tests. The effectiveness of gamification in personnel selection is discussed, along with research and practical implications that introduce recruiters and HR professionals to an innovative selection technique.

Research Article

Copyright © Universidad Complutense de Madrid and Colegio Oficial de Psicólogos de Madrid 2019

In order to gain a competitive advantage and make a profit from their activities, organizations need a good strategy. But to gain a sustainable competitive advantage, one that lasts and is not easily imitated by competitors, organizations must have the human resources in place to successfully implement that strategy. Along these lines, the need to identify talented prospective employees who possess the skills required to fit the job and meet performance standards is apparent for every business. Traditional selection methods, such as general mental ability and personality tests, predict job performance to some extent (Ryan & Ployhart, 2014). A number of researchers have recently suggested that the use of gamification in personnel selection, such as game-based assessments, might predict job performance beyond traditional selection methods (e.g., Armstrong, Landers, & Collmus, 2016; Fetzer, McNamara, & Geimer, 2017). Game-based assessment is a new assessment method that incorporates game elements in employee selection and is now widely applied in personnel selection practice, raising questions about its ability to predict job performance. To the best of our knowledge, no published empirical research has established the effectiveness of game-based assessments in the employee selection process. Our study is designed to examine the potential of a game-based assessment in predicting a number of performance measures. Specifically, we test the relationship between a game-based assessment and performance criteria (e.g., perceived job performance, Grade Point Average-GPA, perceived Organizational Citizenship Behavior-OCB) to explore its criterion-related validity. We also explore the extent to which a game-based assessment predicts performance beyond traditional selection methods (personality measures and cognitive ability).

Traditional selection tests and performance

Cognitive ability and personality tests are widely used by organizations in an effort to predict future work performance. Several studies and meta-analyses support not only the validity of cognitive ability and personality tests but also their effective combination in predicting job performance (Schmitt, 2014). Cognitive ability tests measure levels of general cognitive ability or intelligence, as well as specific aspects of it (e.g., numerical, verbal, abstract, and spatial ability). Meta-analytic findings indicate that both general cognitive ability and specific cognitive abilities successfully predict performance and work-related outcomes (e.g., Ones, Dilchert, & Viswesvaran, 2012). Moreover, cognitive ability has been shown to be the single best predictor of performance at work, as well as of performance outcomes, in the majority of job positions and situations (Schmitt, 2014). As far as personality is concerned, the most popular personality model is the five-factor model of personality (FFM), studied extensively in diverse countries and cultures around the world. The predictive validity of at least two key factors of the FFM (especially conscientiousness but also neuroticism) has been well established across different job positions and organizations, while meta-analytic findings (Barrick, Mount, & Judge, 2001) have also supported the predictive validity of most FFM personality dimensions.

In the performance domain, criterion measures such as academic attainment and OCB are often studied alongside job performance. OCBs, or extra-role performance, are defined as the voluntary, non-mandatory employee behaviors that positively influence organizational effectiveness and contribute to the overall productivity of the organization (Smith, Organ, & Near, 1983). Both emotional and cognitive intelligence have been found to be related to organizational citizenship behaviors (e.g., Côté & Miners, 2006), and personality traits, such as agreeableness and conscientiousness, have been found to predict OCB as well (e.g., Chiaburu, Oh, Berry, Li, & Gardner, 2011). Similarly, academic performance has been found to be significantly predicted by personality and cognitive ability. Academic performance is usually measured with student grades or grade point average (GPA), which has been shown to predict performance at work (Roth, BeVier, Switzer, & Schippmann, 1996). Meta-analytic studies exploring the relationship between personality and academic performance have found that agreeableness, conscientiousness, and openness to experience, as well as intelligence, predict academic performance (Poropat, 2009; Strenze, 2007). The relationship between cognitive ability and academic performance is also well established (Chamorro-Premuzic & Furnham, 2008). “Academic performance has been the criterion for validating IQ tests for over a century, and one would hardly refer to these tests as “intelligence” measures if they did not correlate with academic performance” (Chamorro-Premuzic & Furnham, 2008, p. 1597). It is worth noting that both general cognitive ability and specific cognitive abilities (working memory, processing speed, spatial ability) can predict academic performance, and that specific cognitive abilities can predict academic performance beyond general cognitive ability (Rohde & Thompson, 2007).

To sum up, a large body of research identifies general mental ability and personality tests as important predictors of performance. However, traditional selection methods, such as personality tests, predict job performance only to some extent and are prone to faking and social desirability (e.g., Morgeson et al., 2007; Ryan & Ployhart, 2014). These are phenomena that the application of gamification in employee testing might restrain, thus increasing the assessment’s predictive validity and utility in practice. Moreover, the advent of technology has started to render traditional selection methods obsolete, paving the way for more technologically advanced methods capable of reducing the cost of hiring and improving applicant reactions.

Game-based assessment methods and performance

Gamification, the application of game-design elements in non-game contexts (Armstrong et al., 2016), has recently caught the attention of researchers and practitioners in Work/Organizational Psychology and Human Resources Management as a promising tool in employee selection. Employee testing methods have started to incorporate game elements and designs, turning into assessments that are likely to be more fun and attractive to candidates, as well as more difficult to fake (Armstrong et al., 2016). The addition of game elements might make assessments more difficult for candidates to decode and to identify the correct answer, as personality traits, intentions, and behaviors are assessed indirectly. For example, in a gamified Situational Judgement Test (SJT), dressing the scenarios and response options in game elements might make the desirable behaviors less obvious to candidates; because the setting is removed from real-life situations, it becomes more difficult for them to distort, intentionally or unintentionally, what their reactions would be in a given situation.

Moreover, building on the concept of “stealth assessment”, Fetzer et al. (2017) highlighted the potential of game-based assessments in predicting job performance. Stealth assessments can accurately and efficiently diagnose the level of students’ competencies by continuously extracting performance data gathered during the course of playing/learning (Shute, Ventura, Bauer, & Zapata-Rivera, 2009). In other words, stealth assessment is an assessment that is “seamlessly woven into the fabric of the learning or gaming environment so that it’s virtually invisible…reducing thus test anxiety while not sacrificing validity and consistency” (Shute, 2015, p. 63). Along these lines, a gamified assessment environment might distract candidates from the fact that they are being assessed, reducing test anxiety and eliciting behaviors that are more likely to appear unconsciously rather than the desirable or socially acceptable ones. Game engagement and the use of contexts diagnosing how an individual handled a given problem, similar to work-sampling techniques, might lead to more robust inferences about performance than traditional selection inventories that rely on self-reported measures (Fetzer et al., 2017). Taking all of the above into consideration, we aim to explore the effectiveness of a game-based assessment method measuring four soft skills (i.e., resilience, adaptability, flexibility, and decision-making) by testing whether its dimensions are related to performance measures over and above traditional selection measures.

A major challenge that employers face nowadays when hiring young graduates is the lack of applicants with the right skills and competencies (Picchi, 2016, August 31). Among the most desirable soft skills that employers are looking for are adaptability, flexibility, decision-making, and resilience (e.g., Gray, 2016; McKinsey & Company, 2017). Resilience, the ability to bounce back from adversities (Luthans, 2002), might be vital for both personal and job effectiveness, with numerous positive outcomes in work and academic settings. For example, resilient individuals are likely to have higher levels of job performance, job satisfaction, and organizational commitment (e.g., Avey, Reichard, Luthans, & Mhatre, 2011), as well as OCB (Paul, Bamel, & Garg, 2016). Moreover, students with higher levels of resilience are likely to demonstrate increased academic performance, as well as higher class participation, enjoyment, and self-esteem (Martin & Marsh, 2006, 2008). Similarly, adaptability, the “response or people’s adjustment to changing environmental situations” (Hamtiaux, Houssemand, & Vrignaud, 2013, p. 130), has positive outcomes in both academic and work contexts. For example, successful students (GPA of 80% or more) were found to have high levels of interpersonal, adaptability, and stress management skills (Parker et al., 2004). Moreover, high adaptability is related to positive relationships and behaviors in school, such as studying, leadership, and reduced school problems (Brackett, Rivers, Reyes, & Salovey, 2012). In the work context, adaptability is important for performing well, handling ambiguity, and dealing with uncertainty and stress (Kehoe, 2000), while “volunteering to help co-workers (an aspect of OCB) might require one to adapt to changing co-worker behaviour” (Ployhart & Bliese, 2006, p. 11). Similarly to adaptability, flexibility, defined as the individual’s capacity to adapt, is likely to have positive outcomes in work, academic, and job-seeking settings (Golden & Powell, 2000). Individuals with high levels of flexibility are able to address different situations, thus creating value for organizations rather than harming them through an inability to adjust to change (Bhattacharya, Gibson, & Doty, 2005). Moreover, OCB performers are likely to increase their flexibility in order to adjust to the requirements of various roles and settings at work, thus displaying behaviors that contribute to organizational effectiveness (Kwan & Mao, 2011). Organizational success, especially in changing environments, also depends largely on effective decision-making, defined as an intellectual process leading to a response to circumstances through the selection among alternatives (Nelson, 1984). Employees who are capable of effective decision-making devote effort to analyzing information to better understand a company’s threats, opportunities, and options, consult other people and collaborate in making decisions, and act proactively in getting things done, thus enhancing organizational performance (Miller & Lee, 2001). Participation in decision-making also leads to positive outcomes within educational settings, such as OCB (Somech, 2010).

Taking all of the above into consideration, we aim to examine the effectiveness of the gamified selection method that we developed by testing whether the gamified SJT dimensions are related to performance measures, namely self-reported job performance, OCB, and GPA, over and above traditional selection measures (e.g., personality and cognitive ability tests). We therefore state the following hypotheses.

  • H1: Game-based assessment dimensions will be positively associated with participants’ job performance scores.

  • H2: Game-based assessment dimensions will be positively associated with participants’ GPA.

  • H3: Game-based assessment dimensions will be positively associated with participants’ OCB.

  • H4: Game-based assessment dimensions will provide incremental validity above and beyond the effect of cognitive ability and personality in predicting participants’ job performance scores.

  • H5: Game-based assessment dimensions will provide incremental validity above and beyond the effect of cognitive ability and personality in predicting participants’ GPA.

  • H6: Game-based assessment dimensions will provide incremental validity above and beyond the effect of cognitive ability and personality in predicting participants’ OCB.

Method

Sample & Procedure

The study was conducted in Greece during the last months of 2017. Participants were recruited via the authors’ university career office and included final-year undergraduate students, post-graduate students, and recent graduates, who were invited to take part in a survey about a selection method. We targeted this population because they were approaching graduation and were likely to search for employment soon (e.g., van Iddekinge, Lanivich, Roth, & Junco, 2016).

Data collection took place in two phases. In the first phase, participants were invited to complete the self-reported measures of cognitive ability, personality, performance, and OCB. Three to four weeks after completion, the first-phase participants were invited to play the game-based assessment. A total of 193 participants took part in the first phase and 120 of them also participated in the second phase, a response rate of 62%. The majority were female (64%), with a mean age of 26 years. Regarding education level, 46% of the participants were final-year undergraduates, 15% were post-graduate students, another 15% were university graduates, and 24% had already acquired a post-graduate degree. Most of them (55%) were currently employed, working in entry-level (57.5%) or middle-level positions (27.5%).

Measures

Cognitive ability. This was measured with items taken from the International Cognitive Ability Resource (ICAR; 2014). ICAR is a public-domain and open-source tool created by Condon and Revelle (2014), aiming to provide a large and dynamic bank of cognitive ability measures for use in a wide variety of applications, including research. The test includes four item types: Three-Dimensional Rotations, Letter and Number Series, Matrix Reasoning, and Verbal Reasoning. We used the 11 Matrix Reasoning items, which contain stimuli similar to those used in Raven’s Progressive Matrices and are most closely related to abstract reasoning. “The stimuli are 3x3 arrays of geometric shapes with one of the nine shapes missing. Participants are instructed to identify which of six geometric shapes presented as response choices will best complete the stimuli” (ICAR, 2014, p. 2). Only one answer is correct, while the options “None of the above” and “Do not know” are also available. An overall score is calculated, with higher scores indicating higher levels of cognitive ability (see Footnote 3).
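As an illustration of this scoring, here is a minimal sketch in Python of sum-scoring the 11 Matrix Reasoning items. The column names and the answer key are hypothetical (not the actual ICAR key), and “None of the above”/“Do not know” are assumed to score zero unless keyed.

```python
import pandas as pd

# Hypothetical answer key: item name -> keyed response option (illustrative only).
ILLUSTRATIVE_KEY = {f"mr{i}": "A" for i in range(1, 12)}

def icar_matrix_score(responses: pd.DataFrame) -> pd.Series:
    """One total score per participant: 1 point per keyed response, 0 otherwise."""
    scored = pd.DataFrame(
        {item: (responses[item] == key).astype(int) for item, key in ILLUSTRATIVE_KEY.items()}
    )
    return scored.sum(axis=1)  # range 0-11; higher = higher cognitive ability
```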

Personality. Participants completed the 50-item International Personality Item Pool (IPIP; Goldberg et al., 2006) to assess the five-factor model of personality. Each scale consisted of 10 items. Standard IPIP instructions were presented to participants, who responded on a 5-point Likert-type scale ranging from 1 (inaccurate) to 5 (accurate). Research has reported good internal consistencies for the IPIP factors (see, for example, Lim & Ployhart, 2006). In our study, reliability estimates were .81 for conscientiousness, .83 for emotional stability, .83 for extroversion, .79 for agreeableness, and .75 for openness to experience.
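A minimal sketch of how internal consistencies (Cronbach’s alpha) such as those reported for the IPIP factors could be computed for a 10-item scale; the data frame and column names are hypothetical.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a participants-by-items matrix of Likert responses
    (reverse-keyed items are assumed to be recoded beforehand)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the scale total
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# e.g., alpha for ten hypothetical conscientiousness columns c1..c10:
# alpha_consc = cronbach_alpha(df[[f"c{i}" for i in range(1, 11)]])
```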

Performance measures. Overall job performance was self-evaluated by the working individuals only, using a measure from Nikolaou and Robertson (2001). It consists of six items for which the individual indicates agreement or disagreement with the behavior described on a five-point scale ranging from 1 (strongly disagree) to 5 (strongly agree). An overall job performance score was calculated by averaging the scores of the six items, yielding an internal consistency reliability of .91. Example items include “Achieve the objectives of the job” and “Demonstrates expertise in all aspects of the job”. We also asked participants to indicate the GPA from their first degree, in order to use it as an alternative to job performance for non-working individuals. The range of the grading system in Greek public universities is 0.00–10.00 (Excellent = 8.50–10.00, Very Good = 6.50–8.49, Good = 5.00–6.49, and Fail = 0.00–4.99). The GPA reported by participants was the average grade awarded over the duration of their bachelor studies.

Organizational Citizenship Behavior (OCB). OCBs were self-evaluated by the working individuals only, using a measure developed by Smith et al. (1983). It consists of 16 items for which the individual indicates agreement or disagreement with the behavior described on a five-point scale ranging from 1 (strongly disagree) to 5 (strongly agree). The original scale measures two subscales: altruism and generalized compliance. For the purposes of the current study, however, we used only the overall OCB score, which yielded an internal consistency reliability of .70. Example items include “I help other employees with their work when they have been absent” and “I exhibit punctuality in arriving at work on time in the morning and after lunch breaks”.

Soft skills. We used a Game-Based Assessment (GBA) developed by Owiwi to measure the four soft skills evaluated by the game, namely resilience, adaptability, flexibility, and decision-making. The four skills are evaluated following an SJT methodology converted into an online game environment with fictional characters. The Owiwi game has demonstrated satisfactory psychometric properties and equivalence with the originally developed SJT measuring the four soft skills (Georgiou, Nikolaou, & Gouras, 2017). Resilience is defined as “the developable capacity to rebound or bounce back from adversity, conflict, and failure or even positive events, progress, and increased responsibility” (Luthans, 2002, p. 702). “Adaptability is related to change and how people deal with it; that is to say, people’s adjustment to changing environments” (Hamtiaux et al., 2013, p. 130). Flexibility is defined as the demonstration of “adaptable as opposed to routine behaviors; it is the extent to which employees possess a broad repertoire of behavioral scripts that can be adapted to situation-specific demands” (Bhattacharya et al., 2005, p. 624). Finally, decision-making is defined as an intellectual process leading to a response to circumstances through selection among alternatives (Nelson, 1984). Individualized feedback is provided to all participants upon completion of the game.

Results

Table 1 presents the inter-correlation matrix of the study’s variables. An interesting pattern in the inter-correlation matrix is that the cognitive ability measure is not associated with any of the other scales measured here. The self-reported job performance measure correlates significantly with conscientiousness, emotional stability, and openness to experience from the five-factor model of personality. Moreover, the OCB measure is associated with agreeableness, in line with past research on the relationship between agreeableness and OCB, but not with conscientiousness. Finally, the soft skills assessed by the game-based assessment, which are the main focus of the current study, are not correlated with any of the criterion measures, with the exception of a positive correlation between GPA and decision making, thus rejecting H1 and H3 and only partially supporting H2.

Table 1. Inter-Correlation Matrix of Study’s Variables (N = 63–120)

Note: *p < .05. **p < .01. ***p < .001.
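The inter-correlations in Table 1 were computed on pairwise-complete cases, which is why N ranges from 63 to 120. A minimal sketch of such a pairwise correlation matrix with significance tests follows; the data frame and variable names are hypothetical.

```python
import pandas as pd
from scipy import stats

def pairwise_correlations(df: pd.DataFrame) -> pd.DataFrame:
    """Pearson r, p value, and pairwise-complete N for every pair of columns."""
    rows = []
    cols = list(df.columns)
    for i, x in enumerate(cols):
        for y in cols[i + 1:]:
            pair = df[[x, y]].dropna()          # pairwise deletion of missing cases
            r, p = stats.pearsonr(pair[x], pair[y])
            rows.append({"x": x, "y": y, "r": round(r, 2), "p": round(p, 3), "n": len(pair)})
    return pd.DataFrame(rows)

# pairwise_correlations(df[["cognitive_ability", "conscientiousness", "agreeableness",
#                           "job_performance", "ocb", "gpa", "decision_making"]])
```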

Next, we proceed to the examination of our research hypotheses. Our main focus in this study is the suitability of the game-based assessment as a selection tool, above and beyond the well-established effect of cognitive ability and personality, especially conscientiousness. Our first three hypotheses concern the association between the game-based assessment and the three performance criteria. To explore these hypotheses, we ran three separate multiple regression analyses, one for each of the three criterion measures. The results of these analyses are presented in Table 2.
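For illustration, a minimal sketch of one of these three regressions, with a criterion regressed on the block of four GBA skills; the data frame and variable names are hypothetical.

```python
import statsmodels.api as sm

GBA_SKILLS = ["resilience", "adaptability", "flexibility", "decision_making"]

def regress_on_skills(df, criterion):
    """OLS regression of one criterion on the four GBA skill scores."""
    data = df[GBA_SKILLS + [criterion]].dropna()
    X = sm.add_constant(data[GBA_SKILLS])
    model = sm.OLS(data[criterion], X).fit()
    return model.rsquared, model.params, model.pvalues

# r2, betas, pvals = regress_on_skills(df, "job_performance")
```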

Table 2. Hierarchical Regression Analysis of the GBA on the Three Criterion Measures

Note: OCB = Organizational Citizenship Behavior; GPA = Grade Point Average.

* p < .05. **p < .01. ***p < .001.

The results of the regression analyses show that flexibility and decision-making are positively associated with self-reported job performance and GPA, respectively. The block of the four skills predicts 13%, 7%, and 10% of the total variance in job performance, OCB, and GPA, respectively. Therefore, H1 and H2 are partially confirmed, whereas H3 is rejected. Subsequently, we explored the incremental validity of the game-based assessment. To explore H4–H6, we conducted a series of hierarchical regression analyses, controlling for the effect of cognitive ability and the five-factor model of personality. The results of these analyses are presented in Table 3.
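A minimal sketch of this incremental-validity test: Step 1 enters cognitive ability and the five personality factors, Step 2 adds the four GBA skills, and the change in R² is evaluated with an F test. The variable names are hypothetical.

```python
import statsmodels.api as sm

STEP1 = ["cognitive_ability", "conscientiousness", "emotional_stability",
         "extroversion", "agreeableness", "openness"]
STEP2 = STEP1 + ["resilience", "adaptability", "flexibility", "decision_making"]

def incremental_validity(df, criterion):
    """Delta R^2 (and its F test) for the GBA skills over cognitive ability and personality."""
    data = df[STEP2 + [criterion]].dropna()
    y = data[criterion]
    step1 = sm.OLS(y, sm.add_constant(data[STEP1])).fit()
    step2 = sm.OLS(y, sm.add_constant(data[STEP2])).fit()
    f_value, p_value, df_diff = step2.compare_f_test(step1)  # F test of the R^2 change
    return step2.rsquared - step1.rsquared, f_value, p_value

# delta_r2, f, p = incremental_validity(df, "gpa")
```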

Table 3. Hierarchical Regression Analysis of the GBA on the Three Criterion Measures controlling for Cognitive Ability and Personality

Note: OCB = Organizational Citizenship Behavior; GPA = Grade Point Average.

* p < .05. **p < .01. ***p < .001.

The results of these analyses show that the soft skills measured by the game-based assessment do not predict additional variance in either job performance or OCB for the working individuals in our sample, above the effect of cognitive ability and personality, thus rejecting H4 and H6. However, they do appear to have an important effect on GPA. More specifically, the skills as a block, and adaptability and decision making individually, demonstrate a statistically significant relationship with GPA above and beyond the effect of cognitive ability and personality. These results support the usefulness of game-based assessments in predicting educational attainment, as measured by GPA, both as a block and, in the case of adaptability and decision making, individually.

Discussion

Our study explores the effectiveness of a game-based assessment in employee selection. Extending previous research in Work/Organizational Psychology on traditional selection methods, we introduce a game-based assessment designed to measure candidates’ soft skills (e.g., adaptability, flexibility, decision-making), which we found to be associated with self-reported measures of performance. Our study contributes to employee selection research by providing some support for the use of gamification in soft skills assessments and their ability to predict performance in work and academic settings. For example, a game-based assessment measuring soft skills such as decision-making and flexibility can predict test-takers’ self-reported job performance and GPA. By incorporating game elements into assessments that do not rely on self-reported measures but instead assess behavioral intentions, the attractiveness of the assessment to test-takers and their engagement with it might be enhanced, while it might be more difficult for them to work out what is being assessed and what the correct answer is (Armstrong et al., 2016; Fetzer et al., 2017). As such, the use of game elements and designs might improve the validity of assessments.

Moreover, Armstrong et al. (2016) suggested that game-based assessments, such as gamified simulations, might be employed to assess important predictor constructs like learning agility in employee selection settings where survey methodology may not be adequate. Along these lines, our study extends research on traditional selection methods by exploring the incremental validity of a game-based assessment measuring soft skills. Game-based assessments measuring soft skills such as adaptability and decision making can predict academic performance (e.g., GPA) above and beyond traditional selection methods (e.g., cognitive ability and personality tests). However, the soft skills measured by the game-based assessment did not predict additional variance in either job performance or OCB above the effect of cognitive ability and personality.

To sum up, both personality and intelligence tests have been extensively tested in academic contexts, and their validity in predicting GPA is well established. The spread of the internet and new technologies, together with younger generations’ familiarity with games, is likely to fuel increasing interest in the validity of game-based assessments in predicting academic performance beyond traditional selection methods. The added value of a game-based assessment measuring adaptability and decision making, both as a block and individually, in predicting GPA beyond personality (e.g., FFM) and cognitive ability tests (e.g., ICAR) has been demonstrated here.

Our results are of interest to researchers and practitioners of Work/Organizational Psychology interested in the prediction of work and academic performance, in that they support the incremental validity of a game-based assessment over and above traditional selection methods. They help address open empirical questions about the psychometric properties and effectiveness of game-based assessments in employee selection.

Game-based assessments might be used as a supplement to, or replacement for, traditional selection methods, as they add to the prediction of candidates’ or students’ performance. However, it is highly important to test the effectiveness of game-based assessments using objective measures of performance, such as supervisors’ ratings, and a test-retest reliability methodology, to establish further the psychometric properties of this new assessment method. Moreover, similar to SJTs, game-based assessments might improve the information gathered about applicants during the selection process as well as applicant reactions (Armstrong et al., 2016). Gamification might increase engagement levels, which in turn might lead to retention and motivation during the selection process, as well as to better predictions about person-job fit (e.g., Chamorro-Premuzic, Akhtar, Winsborough, & Sherman, 2017). By using new technologies and game elements in assessments, recruiters and HR professionals might improve selection decisions, making more robust inferences about candidates’ performance, since game-based assessments do not use the self-reported measures that applicants are likely to fake (Fetzer et al., 2017).

Another reason that the use of traditional selection methods might be reconsidered and replaced by new game-based tools is that the latter are popular among younger generations. Organizations that include game-based assessments in the employee selection process might provide a new, technologically advanced experience to applicants, thus sending signals about organizational attributes (e.g., innovation) and making the process more fun.

The present study is not without limitations. First of all, performance outcomes were assessed via self-report measures. Although objective measures are considered the best indicators of individual employee performance, the unavailability of such measurements has forced many previous studies to use self-reported measures of performance (Pransky et al., 2006). The use of objective measures or supervisors’ reports of employee performance would lead to more robust findings about the predictive validity of the game-based assessment. Also, some of the GBA’s dimensions were not found to predict performance. One reason might be the use of self-reported measures of performance. “It is likely that self-report and objective measures provide information on distinct, different aspects of work performance. Objective measures, even in jobs that are apparently routine and straightforward, can present challenging levels of complexity, and may provide an estimation of only one dimension of actual job performance.” (Pransky et al., 2006, p. 396). Future research should explore the ability of each GBA dimension (e.g., resilience or adaptability) to predict performance using supervisory ratings or objective performance data.

To establish further the effectiveness of gamification in employee selection, future research should also explore applicants’ reactions. For example, candidates perceive multimedia tests as more valid and enjoyable and, as a result, are more satisfied with the selection process, while organizational attractiveness and positive behavioral intentions increase (Oostrom, Born, & van der Molen, 2013). The impact of game-based assessments on perceived fairness, organizational attractiveness, and job pursuit behaviors should also be investigated to further support their suitability in the selection process. In addition, the current study does not account for competence with, and previous experience of, technology, which might influence test-takers’ performance. For example, candidates who have experience with online games and/or feel competent using new technology might experience less anxiety when new technology is used (Cascio & Montealegre, 2016) and, as a result, perform better in a game-based assessment. In general, the limited knowledge and lack of empirical research on the use of gamification in employee selection make the establishment of game-based assessments as an effective selection method even more challenging.

Future research should also explore the role of demographic variables in individuals’ performance in game-based assessments. Rather than using demographics simply as control variables in theory testing, Spector and Brannick (2011) suggest rethinking their use in the first place, focusing on the mechanisms that explain relations with demographics rather than treating demographic variables as proxies for the real variables of interest.

Finally, the study might suffer from common method variance, since we only used self-reported measures. In order to reduce this effect, we asked participants to complete the measures on two separate occasions. Moreover, Harman’s single-factor test, conducted following the guidelines of Podsakoff, MacKenzie, Lee, and Podsakoff (2003), did not indicate a substantial impact of common method variance on our results.
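Harman’s single-factor test loads all self-report items onto a single unrotated factor and checks whether that factor accounts for the majority of the variance. A minimal sketch is shown below, using the first eigenvalue of the item correlation matrix as a common approximation of the unrotated solution; the item column selection is hypothetical.

```python
import numpy as np
import pandas as pd

def harman_first_factor_share(items: pd.DataFrame) -> float:
    """Proportion of total item variance captured by the first unrotated factor/component."""
    corr = items.corr().to_numpy()                 # item inter-correlation matrix
    eigenvalues = np.linalg.eigvalsh(corr)[::-1]   # eigenvalues, sorted descending
    return eigenvalues[0] / corr.shape[0]

# Common method variance would be a concern if this share dominated (e.g., exceeded ~.50):
# share = harman_first_factor_share(df[self_report_item_columns])
```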

Game-based assessments have recently appeared in employee selection, calling for further research on their validity. Our study contributes to research on employee selection methods by examining the criterion-related validity of a game-based assessment measuring soft skills. Our findings indicate that assessments incorporating game elements might predict self-rated job performance and academic performance, as measured by GPA. Moreover, in exploring the incremental validity of the game-based assessment method, we provided evidence that it can predict GPA above and beyond the effect of traditional selection methods, such as personality and cognitive ability tests. These results could change the way organizations and colleges approach traditional assessment methods, making the use of gamification in work and academic contexts more widespread in the future.

How to cite this article:

Nikolaou, I., Georgiou, K., & Kotsasarlidou, V. (2018). Exploring the relationship of a game-based assessment with performance. The Spanish Journal of Psychology, 21, e6. https://doi.org/10.1017/SJP.2019.5

Footnotes

3. For an example item, visit https://icar-project.com/ICAR_Catalogue.pdf

References

Armstrong, M. B., Landers, R. N., & Collmus, A. B. (2016). Gamifying recruitment, selection, training, and performance management: Game-thinking in human resource management. In Davis, D. & Gangadharbatla, H. (Eds.), Emerging research and trends in gamification (pp. 140–165). Hershey, PA: IGI Global.
Avey, J. B., Reichard, R. J., Luthans, F., & Mhatre, K. H. (2011). Meta-analysis of the impact of positive psychological capital on employee attitudes, behaviors, and performance. Human Resource Development Quarterly, 22(2), 127–152. https://doi.org/10.1002/hrdq.20070
Barrick, M. R., Mount, M. K., & Judge, T. A. (2001). Personality and performance at the beginning of the new millennium: What do we know and where do we go next? International Journal of Selection and Assessment, 9(1–2), 9–30. https://doi.org/10.1111/1468-2389.00160
Bhattacharya, M., Gibson, D. E., & Doty, D. H. (2005). The effects of flexibility in employee skills, employee behaviors, and human resource practices on firm performance. Journal of Management, 31(4), 622–640. https://doi.org/10.1177/0149206304272347
Brackett, M. A., Rivers, S. E., Reyes, M. R., & Salovey, P. (2012). Enhancing academic performance and social and emotional competence with the RULER feeling words curriculum. Learning and Individual Differences, 22(2), 218–224. https://doi.org/10.1016/j.lindif.2010.10.002
Cascio, W. F., & Montealegre, R. (2016). How technology is changing work and organizations. Annual Review of Organizational Psychology and Organizational Behavior, 3(1), 349–375. https://doi.org/10.1146/annurev-orgpsych-041015-062352
Chamorro-Premuzic, T., Akhtar, R., Winsborough, D., & Sherman, R. A. (2017). The datafication of talent: How technology is advancing the science of human potential at work. Current Opinion in Behavioral Sciences, 18, 13–16. https://doi.org/10.1016/j.cobeha.2017.04.007
Chamorro-Premuzic, T., & Furnham, A. (2008). Personality, intelligence and approaches to learning as predictors of academic performance. Personality and Individual Differences, 44(7), 1596–1603. https://doi.org/10.1016/j.paid.2008.01.003
Chiaburu, D. S., Oh, I.-S., Berry, C. M., Li, N., & Gardner, R. G. (2011). The five-factor model of personality traits and organizational citizenship behaviors: A meta-analysis. Journal of Applied Psychology, 96, 1140–1166. https://doi.org/10.1037/a0024004
Condon, D. M., & Revelle, W. (2014). The international cognitive ability resource: Development and initial validation of a public-domain measure. Intelligence, 43, 52–64. https://doi.org/10.1016/j.intell.2014.01.004
Côté, S., & Miners, C. T. H. (2006). Emotional intelligence, cognitive intelligence, and job performance. Administrative Science Quarterly, 51(1), 1–28. https://doi.org/10.2189/asqu.51.1.1
Fetzer, M., McNamara, J., & Geimer, J. L. (2017). Gamification, serious games and personnel selection. In Goldstein, H. W., Pulakos, E. D., Passmore, J., & Semedo, C. (Eds.), The Wiley Blackwell handbook of the psychology of recruitment, selection and employee retention (pp. 293–309). West Sussex, UK: John Wiley & Sons Ltd.
Georgiou, K., Nikolaou, I., & Gouras, A. (2017). Serious gaming in employees’ selection process. In I. Nikolaou, Alliance for Organizational Psychology Invited Symposium: The impact of technology on recruitment and selection: An international perspective. Paper presented at the 32nd Annual Conference of the Society for Industrial and Organizational Psychology, Orlando, USA.
Goldberg, L. R., Johnson, J. A., Eber, H. W., Hogan, R., Ashton, M. C., Cloninger, C. R., & Gough, H. G. (2006). The international personality item pool and the future of public-domain personality measures. Journal of Research in Personality, 40(1), 84–96. https://doi.org/10.1016/j.jrp.2005.08.007
Golden, W., & Powell, P. (2000). Towards a definition of flexibility: In search of the Holy Grail? Omega, 28(4), 373–384. https://doi.org/10.1016/S0305-0483(99)00057-2
Gray, A. (2016). The 10 skills you need to thrive in the Fourth Industrial Revolution. Retrieved from the World Economic Forum website: https://www.weforum.org/agenda/2016/01/the-10-skills-you-need-to-thrive-in-the-fourth-industrial-revolution
Hamtiaux, A., Houssemand, C., & Vrignaud, P. (2013). Individual and career adaptability: Comparing models and measures. Journal of Vocational Behavior, 83(2), 130–141. https://doi.org/10.1016/j.jvb.2013.03.006
International Cognitive Ability Resource (ICAR) (2014). [Public-domain assessment tool]. Retrieved from https://icar-project.com/
Kehoe, J. F. (2000). Managing selection in changing organizations: Human resource strategies. San Francisco, CA: Jossey-Bass.
Kwan, H.-K., & Mao, Y. (2011). The role of citizenship behavior in personal learning and work–family enrichment. Frontiers of Business Research in China, 5(1), 96–120. https://doi.org/10.1007/s11782-011-0123-6
Lim, B.-C., & Ployhart, R. E. (2006). Assessing the convergent and discriminant validity of Goldberg’s international personality item pool: A multitrait-multimethod examination. Organizational Research Methods, 9(1), 29–54. https://doi.org/10.1177/1094428105283193
Luthans, F. (2002). The need for and meaning of positive organizational behavior. Journal of Organizational Behavior, 23(6), 695–706. https://doi.org/10.1002/job.165
Martin, A. J., & Marsh, H. W. (2006). Academic resilience and its psychological and educational correlates: A construct validity approach. Psychology in the Schools, 43(3), 267–281. https://doi.org/10.1002/pits.20149
Martin, A. J., & Marsh, H. W. (2008). Academic buoyancy: Towards an understanding of students’ everyday academic resilience. Journal of School Psychology, 46(1), 53–83. https://doi.org/10.1016/j.jsp.2007.01.002
McKinsey & Company (Producer). (2017). The digital future of work: What skills will be needed? [Video]. Available from https://www.youtube.com/watch?v=UV46n44jnoA
Miller, D., & Lee, J. (2001). The people make the process: Commitment to employees, decision making, and performance. Journal of Management, 27(2), 163–189. https://doi.org/10.1177/014920630102700203
Morgeson, F. P., Campion, M. A., Dipboye, R. L., Hollenbeck, J. R., Murphy, K., & Schmitt, N. (2007). Are we getting fooled again? Coming to terms with limitations in the use of personality tests for personnel selection. Personnel Psychology, 60(4), 1029–1049. https://doi.org/10.1111/j.1744-6570.2007.00100.x
Nelson, G. D. (1984). Assessment of health decision making skills of adolescents. Retrieved from ERIC database (ED252774).
Nikolaou, I., & Robertson, I. T. (2001). The Five-Factor model of personality and work behavior in Greece. European Journal of Work and Organizational Psychology, 10(2), 161–186. https://doi.org/10.1080/13594320143000618
Ones, D. S., Dilchert, S., & Viswesvaran, C. (2012). Cognitive abilities. In Schmitt, N. (Ed.), The Oxford handbook of personnel assessment and selection (pp. 179–224). New York, NY: Oxford University Press.
Oostrom, J. K., Born, M. P., & van der Molen, H. T. (2013). Webcam tests in personnel selection. In Derks, D. & Bakker, A. (Eds.), The psychology of digital media at work (pp. 166–180). USA & Canada: Psychology Press.
Parker, J. D. A., Creque, R. E., Barnhart, D. L., Harris, J. I., Majeski, S. A., Wood, L. M., ... Hogan, M. J. (2004). Academic achievement in high school: Does emotional intelligence matter? Personality and Individual Differences, 37(7), 1321–1330. https://doi.org/10.1016/j.paid.2004.01.002
Paul, H., Bamel, U. K., & Garg, P. (2016). Employee resilience and OCB: Mediating effects of organizational commitment. Vikalpa, 41(4), 308–324. https://doi.org/10.1177/0256090916672765
Picchi, A. (2016, August 31). Do you have the “soft skills” employers badly need? CBSNews.com. Retrieved from https://www.cbsnews.com/news/do-you-have-the-soft-skills-employers-badly-need/
Ployhart, R. E., & Bliese, P. D. (2006). Individual adaptability (I–ADAPT) theory: Conceptualizing the antecedents, consequences, and measurement of individual differences in adaptability. In Shawn Burke, C., Pierce, L. G., & Salas, E. (Eds.), Advances in human performance and cognitive engineering research (Vol. 6, pp. 3–39). Bingley, UK: Emerald Group Publishing Limited.
Podsakoff, P. M., MacKenzie, S. B., Lee, J.-Y., & Podsakoff, N. P. (2003). Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology, 88(5), 879–903. https://doi.org/10.1037/0021-9010.88.5.879
Poropat, A. E. (2009). A meta-analysis of the five-factor model of personality and academic performance. Psychological Bulletin, 135(2), 322–338. https://doi.org/10.1037/a0014996
Pransky, G., Finkelstein, S., Berndt, E., Kyle, M., Mackell, J., & Tortorice, D. (2006). Objective and self-report work performance measures: A comparative analysis. International Journal of Productivity and Performance Management, 55(5), 390–399. https://doi.org/10.1108/17410400610671426
Rohde, T. E., & Thompson, L. A. (2007). Predicting academic achievement with cognitive ability. Intelligence, 35(1), 83–92. https://doi.org/10.1016/j.intell.2006.05.004
Roth, P. L., BeVier, C. A., Switzer, F. S. III, & Schippmann, J. S. (1996). Meta-analyzing the relationship between grades and job performance. Journal of Applied Psychology, 81(5), 548–556. https://doi.org/10.1037/0021-9010.81.5.548
Ryan, A. M., & Ployhart, R. E. (2014). A century of selection. Annual Review of Psychology, 65(1), 693–717. https://doi.org/10.1146/annurev-psych-010213-115134
Schmitt, N. (2014). Personality and cognitive ability as predictors of effective performance at work. Annual Review of Organizational Psychology and Organizational Behavior, 1(1), 45–65. https://doi.org/10.1146/annurev-orgpsych-031413-091255
Shute, V. (2015, August). Stealth assessment in video games. Paper presented at the Australian Council for Educational Research conference “Learning Assessments: Designing the Future”, Melbourne, Australia.
Shute, V. J., Ventura, M., Bauer, M., & Zapata-Rivera, D. (2009). Melding the power of serious games and embedded assessment to monitor and foster learning. In Ritterfeld, U., Cody, M., & Vorderer, P. (Eds.), Serious games: Mechanisms and effects (pp. 295–321). New York and London: Routledge, Taylor & Francis.
Smith, C. A., Organ, D. W., & Near, J. P. (1983). Organizational citizenship behavior: Its nature and antecedents. Journal of Applied Psychology, 68(4), 653–663. https://doi.org/10.1037/0021-9010.68.4.653
Somech, A. (2010). Participative decision making in schools: A mediating-moderating analytical framework for understanding school and teacher outcomes. Educational Administration Quarterly, 46(2), 174–209. https://doi.org/10.1177/1094670510361745
Spector, P. E., & Brannick, M. T. (2011). Methodological urban legends: The misuse of statistical control variables. Organizational Research Methods, 14(2), 287–305. https://doi.org/10.1177/1094428110369842
Strenze, T. (2007). Intelligence and socioeconomic success: A meta-analytic review of longitudinal research. Intelligence, 35(5), 401–426. https://doi.org/10.1016/j.intell.2006.09.004
van Iddekinge, C. H., Lanivich, S. E., Roth, P. L., & Junco, E. (2016). Social media for selection? Validity and adverse impact potential of a Facebook-based assessment. Journal of Management, 42(7), 1811–1835. https://doi.org/10.1177/0149206313515524