Active tasks requiring motion sensor data #1
Comments
Hi @davwillev - collection of sensor data is supported in the CARP Mobile Sensing framework (which I am responsible for). This framework is, however, designed for continuous, long-term sensing in the background. What is your exact use case?
Hi @bardram My personal interest is to use motion data to assess aspects of body movement (range, smoothness, etc.). There are several 'walking' active tasks in RK which provide some useful data relating to gait. There are also some 'range of motion' tasks in RK (knee and shoulder), which I have contributed to and have since created versions of my own (back and neck). It would be nice to reproduce all of these in RP. However, I think background sensing would also offer some excellent possibilities for research data.
Hi @davwillev
Thanks for the input. I am aware of the mPower study, but have not looked into the details or replicated it in CAMS. It would be a good idea to see whether this is doable, so we can tell if CAMS can support such a study. I have tried to sketch how a simple task might look in CAMS. Here is the code in Flutter:
This is taken from the Pulmonary Monitor app, which allows us to set up a set of tasks for the user to do. What this code does is basically to:
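The general pattern — a user-facing task that bundles the measures it needs, with an executor that starts and stops sampling around the user's performance — can be sketched in self-contained Dart. The class names below (`AppTask`, `Measure`, `TaskExecutor`) are illustrative stand-ins, not the actual CAMS API:

```dart
/// Illustrative stand-ins for the CAMS concepts; not the real API.

/// A single type of data to collect (e.g. accelerometer readings).
class Measure {
  final String type;
  const Measure(this.type);
}

/// A user-facing task that bundles the measures it needs.
class AppTask {
  final String title;
  final String description;
  final List<Measure> measures;
  const AppTask(this.title, this.description, this.measures);
}

/// Starts and stops the measures around the user's performance of the task.
class TaskExecutor {
  final AppTask task;
  final List<String> log = [];

  TaskExecutor(this.task);

  void start() {
    for (final m in task.measures) {
      // Real code would subscribe to the corresponding sensor stream here.
      log.add('sampling ${m.type}');
    }
  }

  void stop() {
    // Real code would cancel the subscriptions and persist the data.
    log.add('stopped');
  }
}
```

A 30-second walk test would then be something like `AppTask('Walk', 'Walk for 30 seconds', [Measure('accelerometer'), Measure('gyroscope')])`, with `start()` and `stop()` called when the user begins and ends the step.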
I hope this helps to explain how CAMS and Cognition Package work together.
Hi @bardram This example makes it look fairly straightforward to implement sensor data, which is good to know. To match RK's functionality and architecture, and to make future active tasks easy to add, we would probably have to reproduce the Device Motion Recorder, which provides a broad range of processed sensor data and can easily be called within any active task requiring motion sensor data. This is one of several subclasses of the ORKRecorder class within RK. These data recorders are crucial to the structure and behaviour of the active tasks. Importantly, all of the recorders automatically record the data in JSON (see dependencies within ORKRecorder) and allow a data file to be saved (which is arguably the most useful output for upload and analysis). The active task's StepViewController accesses the data during these recordings via a delegate method, which permits calculations to be made on the fly (e.g. conversion from quaternions to degrees, then calculating min/max angles, etc.). See line 161 onwards here, for example.

So, following this, I think the first stage is to make the parent recorder class, and then we can try creating a device motion recorder subclass from it. After that, we can try to access this data stream from within the active task steps.

Best wishes
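The recorder design described above — a parent class that buffers samples, serializes them to JSON, and notifies a delegate as data arrives, plus a device-motion subclass doing the quaternion-to-degrees maths on the fly — could be sketched in Dart roughly as follows. All names here are assumptions for illustration, not RK or RP API:

```dart
import 'dart:convert';
import 'dart:math';

/// One device-motion sample: timestamp plus a unit attitude quaternion.
class MotionSample {
  final double t; // seconds since recording started
  final double qw, qx, qy, qz;
  const MotionSample(this.t, this.qw, this.qx, this.qy, this.qz);

  Map<String, double> toJson() =>
      {'t': t, 'qw': qw, 'qx': qx, 'qy': qy, 'qz': qz};
}

/// Parent recorder, loosely analogous to RK's ORKRecorder: buffers
/// samples, serializes them to JSON, and notifies a delegate-style
/// callback as each sample arrives.
abstract class Recorder<T> {
  final List<T> samples = [];
  void Function(T sample)? onSample; // hook for the active task step

  void record(T sample) {
    samples.add(sample);
    onSample?.call(sample);
  }

  String toJsonString();
}

/// Device-motion subclass that also computes an angle on the fly and
/// tracks its min/max, as a range-of-motion step might.
class DeviceMotionRecorder extends Recorder<MotionSample> {
  double minDeg = double.infinity;
  double maxDeg = double.negativeInfinity;

  @override
  void record(MotionSample s) {
    super.record(s);
    final deg = rotationAboutXDegrees(s);
    if (deg < minDeg) minDeg = deg;
    if (deg > maxDeg) maxDeg = deg;
  }

  /// Rotation about the device x-axis, in degrees (ZYX Euler convention).
  static double rotationAboutXDegrees(MotionSample s) {
    final sinR = 2 * (s.qw * s.qx + s.qy * s.qz);
    final cosR = 1 - 2 * (s.qx * s.qx + s.qy * s.qy);
    return atan2(sinR, cosR) * 180 / pi;
  }

  @override
  String toJsonString() => jsonEncode(samples); // uses MotionSample.toJson
}
```

The string from `toJsonString()` could then be written to a data file for upload, mirroring the role the JSON output plays in RK.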
Hi @MadsVSChristensen - I have just had a quick look, and this package looks very exciting.
Continuing our discussion from Research Package, I noticed that none of the current activities/active tasks in CP seem to use motion sensor data (e.g. accelerometer, gyroscope, etc.). However, the tasks that I am interested in creating will need this. From experience, once I can get hold of the sensor data, I can create the tasks relatively quickly.
From working on ResearchKit and ResearchStack previously, I know that there are big differences between how iOS and Android capture device motion. Is this something (e.g. a hook?) that will have to be built into RP or CP?
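One way such a hook could look is a platform-neutral sensor interface that RP/CP code depends on, with the CoreMotion (iOS) and SensorManager (Android) differences hidden behind platform-specific implementations (in practice, via a Flutter plugin). This is a self-contained Dart sketch with a fake implementation standing in for real hardware; all names are hypothetical:

```dart
import 'dart:async';

/// Platform-neutral accelerometer sample (units assumed to be m/s^2).
class Acceleration {
  final double x, y, z;
  const Acceleration(this.x, this.y, this.z);
}

/// The 'hook': task code depends only on this interface, while each
/// platform supplies its own implementation behind it.
abstract class MotionSensor {
  Stream<Acceleration> accelerometer();
}

/// Fake implementation that replays a fixed series - useful for tests
/// and for developing active tasks without device hardware.
class FakeMotionSensor implements MotionSensor {
  final List<Acceleration> series;
  FakeMotionSensor(this.series);

  @override
  Stream<Acceleration> accelerometer() async* {
    for (final a in series) {
      yield a;
    }
  }
}
```

An active task step would simply listen to `accelerometer()` and never touch platform code, so the same task logic runs on both iOS and Android.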
Thanks (and great work)!