Cognition Package is a Flutter package for building cognitive tests in research apps on Android and iOS, built on top of the Research Package.
The overarching goal of Cognition Package is to enable developers and researchers to design and build cross-platform (iOS and Android) cognitive assessment applications that rely on validated gold-standard cognitive tests. When combined with Research Package, Cognition Package meets the requirements of most scientific research, including capturing participant consent, extensible input tasks, and the security and privacy needs necessary for IRB approval.
Cognition Package is a Flutter implementation of a cognitive test battery including 14 validated gold-standard cognitive tests spanning all 8 neurocognitive domains:
- Sensation
- Perception
- Motor skills and construction
- Attention and concentration
- Memory
- Executive functioning
- Processing speed
- Language and verbal skills
Each test in Cognition Package is implemented as an `RPActivityStep` from Research Package. As such, the tests can be used inside an `RPTask` along with other types of `RPStep`s.
Each test consists of 3 key parts: the instructions for the test, the test itself, and the test results. Hence, each test includes 3 classes (sketched below) that define:
- The model class, which extends `RPActivityStep` and defines the parameters available for the specific test (e.g., the length of the test or the number of repetitions), as well as the function that calculates the final score of the test.
- The UI class, which describes how the test is rendered on the screen and the logic of running the test.
- The `RPResult` class, which describes the data collected from the test and adds it to the list of all results from the task.
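For illustration, a hypothetical new test could be structured along the lines of the sketch below. All class names (`RPMyTestActivity`, `RPUIMyTestActivityBody`, `RPMyTestResult`) are made up for this example, and the constructor and method signatures assumed for `RPActivityStep` and `RPResult` (a named `identifier` parameter, a `calculateScore` method) are assumptions; consult the cognition_package source for the actual base-class APIs.

```dart
import 'package:flutter/material.dart';
import 'package:research_package/research_package.dart';

/// 1. The model class: extends RPActivityStep, defines the parameters of the
/// test, and calculates the final score from the collected data.
/// (Hypothetical sketch; base-class signatures may differ.)
class RPMyTestActivity extends RPActivityStep {
  RPMyTestActivity({required super.identifier, this.numberOfTrials = 5});

  /// How many trials the participant must complete.
  final int numberOfTrials;

  /// Calculate the final score; here simply the number of correct trials.
  @override
  int calculateScore(dynamic result) =>
      (result as List<bool>).where((correct) => correct).length;
}

/// 2. The UI class: renders the test on screen and runs the test logic,
/// collecting the raw data that the model class turns into a score.
class RPUIMyTestActivityBody extends StatefulWidget {
  const RPUIMyTestActivityBody(this.activity, {super.key});

  final RPMyTestActivity activity;

  @override
  State<RPUIMyTestActivityBody> createState() =>
      _RPUIMyTestActivityBodyState();
}

class _RPUIMyTestActivityBodyState extends State<RPUIMyTestActivityBody> {
  /// Outcome (correct/incorrect) of each completed trial.
  final List<bool> _trialOutcomes = [];

  @override
  Widget build(BuildContext context) {
    // The actual trial UI goes here; each trial appends to _trialOutcomes.
    return const Placeholder();
  }
}

/// 3. The result class: describes the data collected from the test so it can
/// be added to the list of all results from the task.
class RPMyTestResult extends RPResult {
  RPMyTestResult({required super.identifier, required this.score});

  /// The final score calculated by the model class.
  final int score;
}
```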
The current set of cognitive tests in the Cognition Package are:
- Multiple Object Tracking
- Corsi Block Tapping
- Verbal Recognition Memory
- Delayed Recall
- Flanker
- Letter Tapping
- Paired Associative Learning
- Picture Sequence Memory
- Rapid Visual Information Processing
- Reaction Time
- Stroop Effect
- Finger Tapping
- Trail Making
- Visual Array Change
Cognition Package is part of the overall Copenhagen Research Platform (CARP), which also provides a Flutter package for mobile and wearable sensing called CARP Mobile Sensing.
There is a set of tutorials, describing:
- the overall software architecture of Research Package
- the overall software architecture of Cognition Package
- how to create a cognitive test
- the cognition_package Flutter API, which is available (and maintained) as part of the package release at pub.dev
- localization support in Research Package, which also applies to Cognition Package
There is an example app which demonstrates the different features of Cognition Package as implemented in a Flutter app.
The cognitive tests shown in the example app can be configured in the `cognition_config.dart` file:
// Here the list of cognitive tests is added to an RP ordered task.
// Uncomment the ones you want to see a demo of.
RPOrderedTask cognitionTask = RPOrderedTask(
  identifier: 'cognition_demo_task',
  steps: [
    reactionTime,
    pairedAssociatesLearning,
    tapping,
    corsiBlockTapping,
    stroopEffect,
    rapidVisualInfoProcessing,
    trailMaking,
    continuousVisualTracking,
    wordRecall,
    pictureSequenceMemory,
    letterTapping,
    flanker,
    visualArrayChange,
    delayedRecall,
    completionStep,
  ],
);
The `cognitionTask` defines the list of cognitive tests to be shown, and you can include only the ones you want to see.
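As a minimal, illustrative sketch (not taken from the example app), such a task can then be presented using Research Package's `RPUITask` widget, which runs the steps and hands back the collected results when the participant finishes. The `onSubmit` callback and the access to `RPTaskResult.results` below are assumptions based on the Research Package documentation and should be verified against the version you use.

```dart
import 'package:flutter/material.dart';
import 'package:research_package/research_package.dart';

/// A page that runs the configured cognition task and prints the results.
class CognitionTaskPage extends StatelessWidget {
  const CognitionTaskPage({super.key, required this.task});

  /// The ordered task holding the selected cognitive tests.
  final RPOrderedTask task;

  @override
  Widget build(BuildContext context) {
    return RPUITask(
      task: task,
      // Called when the whole task (all selected tests) has been completed.
      onSubmit: (RPTaskResult result) {
        // Each test's result is assumed to be available in result.results,
        // keyed by the step identifier; here we simply print them.
        debugPrint('Task done: ${result.results}');
      },
    );
  }
}
```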
Cognition Package supports localization via the localization support in Research Package. Currently, the package supports English (`en`), Danish (`da`), French (`fr`), and Portuguese (`pt`).
Note: The sounds used in the Letter Tapping and Word Recall tests currently only use English letters and words. These may be translated in a future version of the package, and PRs for this are most welcome.
In order to support localization in your app, add the `RPLocalizations.delegate` and the `CPLocalizations.delegate` delegates to the list of localization delegates in your `MaterialApp` configuration. See `main.dart` in the example app for how this can be done.
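A minimal sketch of such a setup is shown below, assuming the standard `flutter_localizations` delegates are also registered; the app widget, home page, and explicit locale list are illustrative, and the example app's `main.dart` may differ.

```dart
import 'package:flutter/material.dart';
import 'package:flutter_localizations/flutter_localizations.dart';
import 'package:research_package/research_package.dart';
import 'package:cognition_package/cognition_package.dart';

void main() => runApp(const MyStudyApp());

class MyStudyApp extends StatelessWidget {
  const MyStudyApp({super.key});

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      localizationsDelegates: [
        // Localization delegates from Research Package and Cognition Package.
        RPLocalizations.delegate,
        CPLocalizations.delegate,
        // Standard Flutter localization delegates.
        GlobalMaterialLocalizations.delegate,
        GlobalWidgetsLocalizations.delegate,
        GlobalCupertinoLocalizations.delegate,
      ],
      // The locales currently supported by Cognition Package.
      supportedLocales: const [
        Locale('en'),
        Locale('da'),
        Locale('fr'),
        Locale('pt'),
      ],
      home: const Scaffold(
        body: Center(child: Text('Cognition Package demo')),
      ),
    );
  }
}
```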
Cognition Package is made by the Copenhagen Center for Health Technology (CACHET) and is a component in the Copenhagen Research Platform (CARP), which is used in a number of applications and studies.
We are more than happy to take contributions and feedback. Use the Issues page to file an issue or feature request. Besides general help for enhancement and quality assurance (bug fixing), we welcome input on new cognitive tests.
Note that the tests in this package may be subject to different copyright terms. It is your responsibility to investigate if you can use these tests for your specific purpose and application, and if you need to obtain a permission from the copyright holder(s).
In the table below, we have provided links to copyright statements (where applicable), which you may want to consult if you're using a test. If it states (c) CACHET, the test was designed by us and is hence copyrighted by CACHET under the MIT license.
Note, however, as per the MIT license, this software is provided "as is" and in no event shall the authors (i.e., us) be liable for any claim - including copyright issues - arising from the use of this software.
Test | Copyright |
---|---|
Multiple Object Tracking | (c) CACHET |
Corsi Block Tapping | PsyToolkit |
Verbal Recognition Memory | MoCA |
Delayed Recall | MoCA |
Flanker | (c) CACHET |
Letter Tapping | MoCA |
Paired Associative Learning | Cambridge Cognition Ltd |
Picture Sequence Memory | NIHTB-CB |
Rapid Visual Information Processing | Cambridge Cognition Ltd |
Reaction Time | (c) CACHET |
Stroop Effect | (c) CACHET |
Finger Tapping | (c) CACHET |
Trail Making | public domain |
Visual Array Change | (c) CACHET |
This software is copyright (c) Copenhagen Center for Health Technology (CACHET) at the Technical University of Denmark (DTU). This software is available 'as-is' under an MIT license.