Abstract

Objective:

The current investigation sought to define the relationship between established performance validity tests and measures of memory via a factor analytic strategy first published by Heyanka, Thaler, Linck, Pastorek, Miller, Romesser, and Sim (2015).

Method:

The full range of Medical Symptom Validity Test (MSVT) and Non-Verbal Medical Symptom Validity Test (NV-MSVT) subtests were factor analyzed with the memory scales of the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS) in a sample of 346 service members with a history of concussion.

Results:

A two-factor solution was extracted with the MSVT and NV-MSVT effort and paired associate subtests loading on one factor and the RBANS subtests loading on a second factor.

Conclusions:

Results support the conclusion that the effort subtests of the MSVT and NV-MSVT tap a different construct from established memory measures.

Introduction

Performance validity tests (PVTs) are increasingly recognized as an essential element of neuropsychological assessment (Bush et al., 2005; Heilbronner, Sweet, Morgan, Larrabee, & Millis, 2009). The foundational premise of these instruments is that they are reasonably unaffected by brain dysfunction and, as a result, gauge the test taker's effort and motivation to perform well across the administered cognitive battery. A number of stand-alone PVTs are available, with an increasing number of studies demonstrating acceptable classification statistics for these measures across various patient populations (Flaro, Green, & Robertson, 2007; Gervais, Rohling, Green, & Ford, 2004; Green, 2007; Hartman, 2002; Williamson et al., 2004; Wynkoop & Denney, 2005). Despite this literature, there continues to be some dissension in the field, with a minority of research indicating that PVTs are unduly influenced by test-taker ability (e.g., Bowden, Shores, & Mathias, 2006; Willis, Farrer, & Bigler, 2011).

In an effort to further inform this debate, Heyanka and colleagues (2015) employed an exploratory factor analytic (EFA) strategy to investigate the relationship between select subtests on the Word Memory Test (WMT; Green, 2003), Test of Memory Malingering (TOMM; Tombaugh, 1996), and the California Verbal Learning Test-Second Edition (CVLT-II; Delis, Kramer, Kaplan, & Ober, 2000) in a sample of 160 Veterans with a history of mild Traumatic Brain Injury (mTBI). More specifically, the researchers examined the Word Memory Test Immediate Recognition (IR), Delayed Recognition (DR), and Consistency (CNS) subtests; TOMM Trials 1 and 2; and the CVLT-II Trials 1–5, Long Delay Free Recall, Long Delay Recognition, and Forced Choice. A two-factor solution was extracted with the five PVT variables from the TOMM and WMT loading on one factor and the four CVLT-II subtests loading on a second. The authors concluded that the analyzed WMT and TOMM subtests “measured effort independent of memory” within their veteran sample.

The first aim of the current study was to extend the Heyanka and colleagues (2015) findings by evaluating the relationships between PVT and memory measures in an independent sample composed of active duty military service members. Given the data available in this retrospective study, different PVTs and memory measures were employed. More specifically, the Medical Symptom Validity Test (MSVT; Green, 2004) and Non-Verbal Medical Symptom Validity Test (NV-MSVT; Green, 2008) represented the PVTs and select subtests of the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS; Randolph, 1998) represented the memory measures.

A secondary aim of the current study was to evaluate the full range of MSVT and NV-MSVT subtests. In addition to the IR, DR, and CNS subtests, the WMT also has Multiple Choice (MC), Paired Associates (PA), and Free Recall (FR) subtests. As noted above, IR, DR, and CNS are considered relatively easy and generally impervious to brain dysfunction. However, the MC, PA, and FR subtests were originally designed to be measures of verbal memory (Green, 2003). A recent study by Armistead-Jehle, Green, Gervais, and Hungerford (2015), which evaluated the WMT, CVLT, and CVLT-II in a sample of over 3,000 subjects, confirmed this assertion. Heyanka and colleagues (2015) did not include the WMT MC, PA, and FR subtests in their EFA. However, the MSVT and NV-MSVT have PA and FR subtests that can be included in a factor analysis. We hypothesized that the factor analysis would result in a two-factor solution, with the MSVT and NV-MSVT effort subtests (i.e., MSVT IR, DR, and CNS, and NV-MSVT IR, DR, CNS, Delayed Recognition Variations [DRV], and Delayed Recognition Archetypes [DRA]) constituting one factor and the RBANS memory subtests (i.e., RBANS List Learning, Story Learning, List Recall, Story Recall, and Figure Recall), as well as the FR and PA subtests of the MSVT and NV-MSVT, constituting a second factor.

Methods

Sample

The study included 346 United States active duty military service members evaluated in an outpatient Neuropsychology Clinic located at a Midwest US Army Health Center between June 2009 and July 2015. Subjects were referred for neuropsychological evaluation primarily by family practice or behavioral health providers as a consequence of their reported or documented history of concussion and presenting symptoms. A small subsample (6.1%, N = 21) was in the midst of a medical board evaluation at the time of assessment. Based on self-report, all subjects met criteria for mild TBI per the American Congress of Rehabilitation Medicine criteria (ACRM, 1993). The average time since the most recent reported concussion was 49.83 months (SD = 37.20). The average number of reported lifetime concussions was 4.6 (SD = 5.0). Unfortunately, data related to the severity of mTBI (e.g., duration of alteration of consciousness, post-traumatic amnesia, or loss of consciousness) were not available for this study. Regarding psychopathology, based on clinical evaluation, 20.1% of the participants had an anxiety disorder/PTSD diagnosis, 15.1% had a depressive disorder diagnosis, 8.1% had comorbid diagnoses of anxiety and depression, 4.1% had an adjustment disorder diagnosis, 4.4% had other psychiatric diagnoses, and 16.9% had no psychiatric diagnosis. There were no indications of other notable neurological or neurodevelopmental illness, injury, or diagnoses in the sample, and as such no related exclusionary criteria were employed in subject selection.

The average age of the sample was 35.4 years (SD = 7.8) with an average education of 15.0 years (SD = 2.3). The majority of the sample was male (91.3%). Ethnic breakdown was as follows: Caucasian (79.2%), African American (13.0%), Hispanic (5.2%), Native American (0.9%), Asian (1.4%), and Pacific Islander (0.3%). This retrospective analysis of clinical data was approved by the Institutional Review Board at Madigan Army Medical Center.

Measures

The MSVT (Green, 2004) is a brief, automated verbal memory screening measure with several subtests designed to assess verbal memory and response consistency. Ten easy-to-remember word pairs, each representing a single common object (e.g., Ballpoint-Pen), are shown across two trials. Afterward, two forced-choice recognition subtests are administered, and the measure concludes with Paired Associates and Free Recall trials. In addition to data presented in the manual, a number of studies have demonstrated the utility of this measure in discriminating between those with genuine memory impairment and those simulating impairment across a range of patient samples (see Carone, 2009 for a review).

The NV-MSVT (Green, 2008) is a brief, automated nonverbal memory screening measure with several subtests designed to assess nonverbal memory and response consistency. Ten artist-drawn colored images representing intuitive pairs of items (e.g., a baseball and a baseball bat) are shown across two trials. Afterward, a series of forced-choice trials of varying difficulty is presented. These are followed by a Paired Associates recall subtest, and the measure concludes with a Free Recall task. The sensitivity and specificity of the NV-MSVT as a measure of performance validity have been demonstrated in a number of studies (see Wager & Howe, 2010 for a review).

The RBANS (Randolph, 1998) is a brief battery composed of 12 subtests that yield five index scores and a total summary score. The psychometric properties and clinical utility of this measure have been well established across various patient populations, including TBI (e.g., McKay, Casey, Wertheimer, & Fichtenberg, 2007; Moser & Shatz, 2002).

Data Analysis

Data were analyzed using SPSS 20. The original intent was to closely follow the analysis of Heyanka and colleagues (2015); unfortunately, Principal Axis Factoring (PFA) produced a Heywood case that prevented initial extraction. Instead, the initial solution was developed with principal components analysis (PCA), which has slightly different assumptions that can yield small differences in factor loadings but provides very similar solutions to PFA when factors are reliable (Kline, 2013). In general, PCA assumes communalities of 1, whereas PFA estimates communalities from the data. Consequently, PCA is much less likely to produce a Heywood case, particularly in datasets where assumptions of normality are violated. The number of factors in the initial solution was determined by eigenvalues over 0.80, contribution of more than 5% of unique variance to the solution, inspection of the scree plot, and unique loading of items on factors. After the number of factors was established through the initial PCA (see Results), extraction was again conducted via PFA, as in Heyanka and colleagues, limited to two factors, and an oblique rotation was then applied.
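For readers who wish to approximate this two-step workflow outside of SPSS, the following Python sketch illustrates the general analytic logic: an initial principal components screening of the eigenvalues, followed by a two-factor principal axis extraction with an oblique Promax rotation. The DataFrame name `scores` and the third-party `factor_analyzer` package are assumptions for illustration; this is a minimal sketch of the approach, not a reproduction of the SPSS output.

```python
# Minimal sketch of the analysis workflow: PCA screening, then principal axis
# factoring with an oblique (Promax) rotation. Assumes a pandas DataFrame
# `scores` whose columns hold the MSVT, NV-MSVT, and RBANS subtest scores.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer  # third-party package, assumed installed


def screen_components(scores: pd.DataFrame) -> pd.DataFrame:
    """Eigenvalues and percent variance from the correlation matrix (PCA screening)."""
    corr = scores.corr()                                 # pairwise Pearson correlations
    eigenvalues = np.linalg.eigvalsh(corr)[::-1]         # sorted in descending order
    pct_variance = 100 * eigenvalues / eigenvalues.sum()
    return pd.DataFrame({"eigenvalue": eigenvalues, "pct_variance": pct_variance})


def extract_factors(scores: pd.DataFrame, n_factors: int = 2) -> pd.DataFrame:
    """Principal axis extraction with a Promax rotation; returns the pattern matrix."""
    fa = FactorAnalyzer(n_factors=n_factors, method="principal", rotation="promax")
    fa.fit(scores.dropna())
    return pd.DataFrame(fa.loadings_, index=scores.columns,
                        columns=[f"Factor {i + 1}" for i in range(n_factors)])


# Example usage with real data:
# screening = screen_components(scores)
# retained = screening[(screening.eigenvalue > 0.80) & (screening.pct_variance > 5)]
# pattern = extract_factors(scores, n_factors=2)
```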

Results

Correlations between subtests of the RBANS, MSVT, and NV-MSVT ranged from 0.146 to 0.897, with subtests of the MSVT and NV-MSVT more highly correlated with each other than with RBANS subtests (Table 1). The PCA yielded five variables with eigenvalues over 0.80, four that accounted for more than 5% of solution variance, and three based on the scree plot slope. When the initial solution was reviewed, only two factors had more than one item loading uniquely, and thus a two-factor model was accepted as the best explanation of these data. A two-factor solution was then evaluated via PFA with an oblique Promax rotation, following the Heyanka and colleagues solution, and accounted for 51.65% of the variance.

In review of the pattern matrix (see Table 2), Factor 1 (Effort) consisted of the MSVT IR, DR, CNS, and PA and the NV-MSVT IR, DR, CNS, DRA, DRV, and PA. Factor 2 (Memory) consisted of all RBANS subtests (with List Recognition loading on both factors, but more heavily on the memory factor), the NV-MSVT FR subtest, and the MSVT FR (which also loaded on both factors, but more heavily on the memory factor). This factor structure largely supports the hypothesis that a memory factor and an effort factor would emerge. In review of the structure matrix (Table 3), variables on the Effort factor showed moderate cross-loadings (.316 to .648), as did those on the Memory factor (.376 to .656), and the correlation between the two factors was likewise moderate (.649).

Of note, the CNS variables of the MSVT and NV-MSVT are not independent measures but are derived from the respective IR and DR scores. As such, one could argue that including the CNS subtests in the factor analysis introduces a part-whole correlation issue. When the factor analysis was re-run without the MSVT and NV-MSVT CNS subtests, the analysis converged and there was no Heywood case; the resulting pattern matrix produced the same two-factor solution with nearly identical factor loadings to those reported above with the CNS subtests included.
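Because Promax is an oblique rotation, the pattern loadings in Table 2 and the structure loadings in Table 3 are algebraically related: the structure matrix is the pattern matrix post-multiplied by the factor correlation matrix. The minimal numpy sketch below illustrates that relationship; the loadings are hypothetical placeholders, and only the factor correlation of .649 comes from the text, so this is an illustration of the relationship rather than a reproduction of the tabled values.

```python
# Illustration of the link between a pattern matrix and a structure matrix under an
# oblique rotation: structure = pattern @ phi, where phi is the factor correlation matrix.
import numpy as np

phi = np.array([[1.000, 0.649],
                [0.649, 1.000]])            # factor intercorrelation matrix

pattern = np.array([[0.80, -0.10],          # placeholder subtest loading mainly on Factor 1 (Effort)
                    [0.05,  0.70]])         # placeholder subtest loading mainly on Factor 2 (Memory)

structure = pattern @ phi                   # structure loadings mix in the correlated factor
print(np.round(structure, 3))
# Even a near-zero pattern loading yields a sizeable structure loading on the other
# factor, which is why moderate cross-loadings appear in the structure matrix.
```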

Table 1.

Correlation matrix of MSVT, NV-MSVT, and RBANS memory subtests

 MSVT IR MSVT DR MSVT CNS MSVT PA MSVT FR NV-MSVT IR NV-MSVT DR NV-MSVT CNS NV-MSVT DRA NV-MSVT DRV NV-MSVT PA NV-MSVT FR RBANS List Learning RBANS Story Memory RBANS List Recall RBANS List Recognition RBANS Story Recall RBANS Figure Recall 
MSVT IR                  
MSVT DR 0.83                 
MSVT CNS 0.72 0.89                
MSVT PA 0.67 0.77 0.75               
MSVT FR 0.50 0.61 0.63 0.68              
NV-MSVT IR 0.63 0.51 0.35 0.37 0.28             
NV-MSVT DR 0.49 0.52 0.49 0.50 0.42 0.38            
NV-MSVT CNS 0.59 0.60 0.51 0.53 0.44 0.55 0.97           
NV-MSVT DRA 0.61 0.65 0.66 0.58 0.51 0.35 0.49 0.53          
NV-MSVT DRV 0.56 0.59 0.59 0.61 0.51 0.45 0.55 0.57 0.60         
NV-MSVT PA 0.43 0.42 0.38 0.42 0.31 0.54 0.40 0.45 0.37 0.45        
NV-MSVT FR 0.33 0.45 0.49 0.44 0.59 0.19 0.31 0.33 0.35 0.34 0.28       
RBANS List Learning 0.42 0.46 0.47 0.47 0.57 0.31 0.27 0.31 0.38 0.42 0.20 0.43      
RBANS Story Memory 0.32 0.42 0.40 0.40 0.46 0.25 0.17 0.22 0.27 0.29 0.17 0.35 0.52     
RBANS List Recall 0.28 0.32 0.30 0.33 0.42 0.23 0.29 0.32 0.25 0.27 0.20 0.34 0.56 0.34    
RBANS List Recognition 0.39 0.51 0.44 0.49 0.41 0.29 0.45 0.47 0.46 0.45 0.36 0.31 0.37 0.31 0.56   
RBANS Story Recall 0.33 0.43 0.43 0.42 0.45 0.22 0.27 0.30 0.30 0.36 0.21 0.28 0.46 0.65 0.51 0.52  
RBANS Figure Recall 0.20 0.23 0.24 0.27 0.24 0.15 0.24 0.25 0.17 0.20 0.16 0.26 0.25 0.26 0.41 0.26 0.55 

Note: MSVT = Medical Symptom Validity Test; NV-MSVT = Non-Verbal Medical Symptom Validity Test; RBANS = Repeatable Battery for the Assessment of Neuropsychological Status; IR = Immediate Recognition; DR = Delayed Recognition; CNS = Consistency; DRV = Delayed Recognition Variations; DRA = Delayed Recognition Archetypes; PA = Paired Associates; FR = Free Recall.
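For reference, a matrix of this form can be computed directly from the subtest scores. The short pandas sketch below is one way to produce the lower triangle shown in Table 1; it assumes, as above, a hypothetical DataFrame `scores` whose columns follow the order of the table.

```python
# Lower-triangular Pearson correlation matrix in the layout of Table 1.
import numpy as np
import pandas as pd


def lower_triangle_corr(scores: pd.DataFrame) -> pd.DataFrame:
    """Pairwise Pearson correlations with the diagonal and upper triangle blanked."""
    corr = scores.corr().round(2)                      # pairwise Pearson correlations
    mask = np.triu(np.ones(corr.shape, dtype=bool))    # True on and above the diagonal
    return corr.mask(mask)                             # replace masked cells with NaN


# Example usage with real data:
# print(lower_triangle_corr(scores).to_string(na_rep=""))
```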

Table 2.

Pattern matrix of the RBANS, MSVT, and NV-MSVT

Subtest Factor 1 (Effort) Factor 2 (Memory) 
MSVT IR 0.834 −0.103 
MSVT DR 0.785 0.138 
MSVT CNS 0.668 0.213 
MSVT PA 0.628 0.247 
MSVT FR 0.342 0.484 
NV-MSVT IR 0.663 −0.103 
NV-MSVT DR 0.824 −0.145 
NV-MSVT CNS 0.921 −0.176 
NV-MSVT DRV 0.701 0.056 
NV-MSVT DRA 0.785 0.028 
NV-MSVT PA 0.612 −0.082 
NV-MSVT FR 0.225 0.389 
RBANS List Learning 0.064 0.659 
RBANS Story Learning −0.165 0.733 
RBANS List Recall −0.079 0.708 
RBANS List Recognition 0.302 0.403 
RBANS Story Recall −0.100 0.899 
RBANS Figure Recall −0.081 0.535 

Note: MSVT = Medical Symptom Validity Test; NV-MSVT = Non-Verbal Medical Symptom Validity Test; RBANS = Repeatable Battery for the Assessment of Neuropsychological Status; IR = Immediate Recognition; DR = Delayed Recognition; CNS = Consistency; DRV = Delayed Recognition Variations; DRA = Delayed Recognition Archetypes; PA = Paired Associates; FR = Free Recall.

Table 3.

Structure matrix of the RBANS, MSVT, and NV-MSVT

Subtest Factor 1 (Effort) Factor 2 (Memory) 
MSVT IR 0.819 0.518 
MSVT DR 0.875 0.648 
MSVT CNS 0.807 0.647 
MSVT PA 0.789 0.655 
MSVT FR 0.656 0.709 
NV-MSVT IR 0.597 0.328 
NV-MSVT DR 0.730 0.390 
NV-MSVT CNS 0.807 0.647 
NV-MSVT DRV 0.737 0.511 
NV-MSVT DRA 0.725 0.487 
NV-MSVT PA 0.559 0.316 
NV-MSVT FR 0.477 0.535 
RBANS List Learning 0.492 0.701 
RBANS Story Learning 0.376 0.668 
RBANS List Recall 0.381 0.657 
RBANS List Recognition 0.564 0.599 
RBANS Story Recall 0.381 0.657 
RBANS Figure Recall 0.266 0.482 

Note: MSVT = Medical Symptom Validity Test; NV-MSVT = Non-Verbal Medical Symptom Validity Test; RBANS = Repeatable Battery for the Assessment of Neuropsychological Status; IR = Immediate Recognition; DR = Delayed Recognition; CNS = Consistency; DRV = Delayed Recognition Variations; DRA = Delayed Recognition Archetypes; PA = Paired Associates; FR = Free Recall.

Discussion

The primary aim of this investigation was to extend the previous work of Heyanka and colleagues (2015), who factor analyzed select WMT, TOMM, and CVLT-II subtests in an effort to determine whether the measures represented unique constructs. The current study employed different PVTs and memory measures but found similar results. More specifically, the MSVT IR, DR, and CNS and the NV-MSVT IR, DR, CNS, DRV, and DRA loaded on a single factor, while the RBANS List Learning, Story Learning, List Recall, List Recognition, Story Recall, and Figure Recall subtests loaded on a second factor together with the MSVT and NV-MSVT FR subtests. The results extend the Heyanka and colleagues study by demonstrating that, like the WMT, the MSVT and NV-MSVT effort subtests do in fact measure effort as a unique construct separate from memory. Like the Heyanka and colleagues data, the current data lend support to the assertion that PVTs measure effort independent of memory abilities.

In addition to employing different PVTs, the current study also extended the Heyanka and colleagues (2015) work by investigating the full range of available subtests on these measures. We initially hypothesized that the PA and FR subtests of the MSVT and NV-MSVT would load on the memory factor. By demonstrating that the FR subtests of the MSVT and NV-MSVT aligned most closely with the RBANS memory subtests, the current data provide partial support for this hypothesis. However, it is acknowledged that the FR subtests also loaded, though to a lesser degree, on the effort factor. As such, it appears that both constructs could be measured to some extent by the FR subtests. Our hypothesis that the PA subtests of the MSVT and NV-MSVT would also load with the memory scales was not supported, as both PA subtests loaded on the effort factor. Thus, it appears that these subtests function more as effort measures than as ability measures in the current sample.

To the authors' knowledge, no previous study has investigated whether the MSVT and NV-MSVT PA and FR subtests operate as effort or as ability measures. A few studies have, however, examined the PA and FR subtests of the WMT, which are similar in format to, but more challenging than, those of the MSVT. Donders and Strong (2013) concluded, in their sample of 107 subjects with a history of traumatic brain injury, that the WMT PA and FR subtests are not sensitive to memory impairment. However, Armistead-Jehle and colleagues (2015) found, in a sample of over 3,000 disability-seeking cases, that the WMT PA and FR subtests functioned as measures of episodic memory in a manner similar to the California Verbal Learning Test, First and Second Editions.

The difference between the current results and those of the Armistead-Jehle and colleagues (2015) study could potentially be a function of the difficulty of the WMT stimuli relative to the MSVT and NV-MSVT stimuli. More specifically, the WMT has more items, and it could be argued that its stimulus pairs are not as intuitive as those of the MSVT and NV-MSVT. As such, the MSVT and NV-MSVT FR and PA subtests are objectively less difficult and thus require less memory ability, which in turn could have produced the factor loadings that were demonstrated. The same argument could be made for the RBANS List Recognition subtest, which also loaded to a relative degree on both factors. Recognizing the 10 list items among 20 is less difficult than freely recalling those 10 items after an approximately 20-min delay. It could then be hypothesized that less memory ability is necessary for the recognition subtest, which is why this subtest showed a notable, but not predominant, loading on the effort factor. It is, however, readily acknowledged that these are empirical questions that require future investigation.

Limitations of the current study include a convenience sample of subjects who were relatively homogeneous in terms of both demographics and diagnosis. Future investigations with a dissimilar diagnostic group would extend the external validity of the current results and those of the Heyanka and colleagues (2015) work. Next, although all subjects reported a history of mTBI, data on the severity of these injuries (e.g., duration of alteration of consciousness, post-traumatic amnesia, or loss of consciousness) were not available for this investigation. This lack of descriptive data could hinder the generalizability of the findings. The current study also employed a screening measure of memory; more robust instruments of this cognitive domain (e.g., the CVLT-II or Rey Complex Figure Test) would strengthen the conclusions. Finally, by design, PVTs have a higher floor and thus produce more skewed distributions than other cognitive measures. Conceivably, this could artificially inflate shared variance among these measures in factor analysis, which is a limitation of this study and of this line of research in general. Despite these limitations, the investigation served to extend the factor analytic findings of the Heyanka and colleagues study and to provide support for the conclusion that the MSVT IR, DR, CNS, and PA and NV-MSVT IR, DR, CNS, DRV, DRA, and PA subtests in fact measure effort independent of memory functioning.

Authors’ Contributions

The views, opinions, and/or findings contained in this article are those of the authors and should not be construed as an official Department of the Army, Department of Defense, Department of Veterans Affairs, or U.S. Government position, policy or decision unless so designated by other official documentation.

References

American Congress of Rehabilitation Medicine. (1993). Definition of mild traumatic brain injury. Journal of Head Trauma Rehabilitation, 8, 86–87.
Armistead-Jehle, P., Green, P., Gervais, R. O., & Hungerford, L. (2015). The examination of the Word Memory Test as a measure of memory. Applied Neuropsychology: Adult, 22, 415–426.
Bowden, S. C., Shores, E. A., & Mathias, J. L. (2006). Does effort suppress cognition after traumatic brain injury? A re-examination of the evidence for the Word Memory Test. The Clinical Neuropsychologist, 20, 858–872.
Bush, S. S., Ruff, R. M., Troster, A. I., Barth, J. T., Koffler, S. P., Pliskin, N. H., et al. (2005). Symptom validity assessment: Practice issues and medical necessity. NAN Policy and Planning Committee. Archives of Clinical Neuropsychology, 20, 419–426.
Carone, D. A. (2009). Test review of the Medical Symptom Validity Test. Applied Neuropsychology, 16, 309–311.
Delis, D. C., Kramer, J. H., Kaplan, E., & Ober, B. A. (2000). California Verbal Learning Test – Second Edition. San Antonio, TX: Psychological Corporation.
Donders, J., & Strong, C. H. (2013). Does Green's Word Memory Test really measure memory? Journal of Clinical and Experimental Neuropsychology, 35, 827–834.
Flaro, L., Green, P., & Robertson, E. (2007). Word Memory Test failure 23 times higher in mild brain injury than in parents seeking custody: The power of external incentives. Brain Injury, 21, 373–383.
Gervais, R. O., Rohling, M. S., Green, P., & Ford, W. (2004). A comparison of the WMT, CARB, and TOMM failure rates in non-head injury disability claimants. Archives of Clinical Neuropsychology, 19, 475–487.
Green, P. (2003). Manual for the Word Memory Test for Windows. Edmonton, Canada: Green's Publishing.
Green, P. (2004). Green's Medical Symptom Validity Test (MSVT) for Microsoft Windows: User's manual. Edmonton, Canada: Green's Publishing.
Green, P. (2007). The pervasive influence of effort on neuropsychological tests. Physical Medicine and Rehabilitation Clinics of North America, 18, 43–68.
Green, P. (2008). Manual for the Nonverbal Medical Symptom Validity Test. Edmonton, Canada: Green's Publishing.
Hartman, D. E. (2002). The unexamined lie is a lie worth fibbing: Neuropsychological malingering and the Word Memory Test. Archives of Clinical Neuropsychology, 17, 709–714.
Heilbronner, R. L., Sweet, J. J., Morgan, J. E., Larrabee, G. L., Millis, S. R., & Conference Participants. (2009). American Academy of Clinical Neuropsychology consensus conference statement on the neuropsychological assessment of effort, response bias, and malingering. The Clinical Neuropsychologist, 23, 1093–1129.
Heyanka, D. J., Thaler, N. S., Linck, J. F., Pastorek, N. J., Miller, B., Romesser, J., et al. (2015). A factor analytic approach to the validation of the Word Memory Test and Test of Memory Malingering as measures of effort and not memory. Archives of Clinical Neuropsychology, 30, 369–376.
Kline, R. B. (2013). Exploratory and confirmatory factor analysis. In Y. Petscher & C. Schatsschneider (Eds.), Applied quantitative analysis in the social sciences (pp. 171–207). New York: Routledge.
McKay, C., Casey, J. E., Wertheimer, J., & Fichtenberg, N. L. (2007). Reliability and validity of the RBANS in a traumatic brain injured sample. Archives of Clinical Neuropsychology, 22, 91–98.
Moser, R. S., & Shatz, P. (2002). Enduring effects of concussion in young athletes. Archives of Clinical Neuropsychology, 17, 91–100.
Randolph, C. (1998). Repeatable Battery for the Assessment of Neuropsychological Status manual. San Antonio, TX: The Psychological Corporation.
Tombaugh, T. N. (1996). The Test of Memory Malingering. Los Angeles: Western Psychology Corporation.
Wager, J. G., & Howe, L. L. S. (2010). Nonverbal Medical Symptom Validity Test: Try faking now! Applied Neuropsychology, 17, 305–309.
Williamson, D. J., Drane, D. J., Stroup, E. S., Miller, J. W., Holmes, M. D., & Wilensky, A. J. (2004). Detecting cognitive differences between patients with epilepsy and patients with psychogenic seizures: Effort matters. Epilepsia, 45, 179.
Willis, P. F., Farrer, T. J., & Bigler, E. D. (2011). Are effort measures sensitive to cognitive impairment? Military Medicine, 176, 1426–1431.
Wynkoop, T. F., & Denney, R. L. (2005). Test review: Green's Word Memory Test (WMT) for Windows. Journal of Forensic Neuropsychology, 4, 101–105.