(Week 3) Mobile Detection for the Onset of Dementia

This past Saturday, some of the math and CS REU folks (myself included) took a trip into D.C. Since I’m basically a NoVA native who’s a hop, skip, and a metro ride away from the place, and my grandparents often have former foreign exchange students and their families visit them here in the U.S., I’ve gone into D.C. numerous times in my life, especially since my older brother graduated about five years ago and decided that he occasionally wanted some quality sibling time when he was home. So I didn’t really feel like being a tourist and going to the places I normally go, like the Smithsonian Natural History Museum, the Capitol Building, or even the National Zoo. Instead, I tagged along with a group of recently-made friends from the math REU, whom I met while Nhung and I were tossing a frisbee one day. Lucky for me, we started with places I seldom visit. First was the Smithsonian Air and Space Museum, where I saw a really cool exhibit on how we historically kept time and bearing while traveling (they’ve got whozits and whatzits galore!). Next was the National Gallery of Art, where I saw some pretty humorous paintings of nobility, social gatherings, small children, fruits and bugs, spiritual scenes (many of them of baby Jesus, his mother, or Jesus’ baptism), and landscapes. Last was the Lincoln Memorial, where I made it about halfway up the stairs, saw an insanely tall flight of steps still ahead of me, noticed that the misty rain might have made them slick, saw how many people were already crowded around the top, and was like, “haha, nope! Y’all go on ahead. I’mma stay here and enjoy the view.” All in all, though, I enjoyed the outing much more than I thought I would. I guess the good company encouraged my good day!

 

Progress Report

I finally managed to figure out how to activate the accelerometer motion sensor and process its data values! The Android developer guides, helpful though they are, were cryptic to me when I first started looking at sensors. They’re still confusing, since there are so many classes and types, but after taking a substantial amount of time to figure out what one means in the context of another, I think I’ve gotten to the point where I can more or less understand how they work on the surface. At the very least, I now have a button in my app that starts sensing and shows which direction the phone is moving along each of the three axes. I tried adding the gyroscope to the app, but since I’m still unsure what the difference between it and a rotation vector sensor is, I don’t think it shows directional data accurately yet. Maybe I’m testing it incorrectly? At any rate, it’s definitely my next sensor to focus on, since I can use the two together to more accurately characterize the user’s motion.
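In case it helps to see it in code, here’s a bare-bones sketch of the pattern I’m describing, pieced together from the developer guides. The activity name and the log tag are made up for this post, and I’ve left out the button wiring, so treat it as a simplified illustration rather than my actual app code:

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;

    public class MotionActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private Sensor accelerometer;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
            accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // values[] holds acceleration in m/s^2 along the device's x, y, z axes.
            // The sign on each axis says which way the phone is accelerating;
            // z includes gravity (about 9.81) when the phone is lying flat.
            float x = event.values[0];
            float y = event.values[1];
            float z = event.values[2];
            Log.d("Motion", "x=" + x + " y=" + y + " z=" + z);
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Not needed for this sketch.
        }

        @Override
        protected void onResume() {
            super.onResume();
            // Start sampling only while this screen is in the foreground...
            sensorManager.registerListener(this, accelerometer,
                    SensorManager.SENSOR_DELAY_NORMAL);
        }

        @Override
        protected void onPause() {
            super.onPause();
            // ...and stop when it isn't, so we don't waste battery.
            sensorManager.unregisterListener(this);
        }
    }

Registering the listener in onResume() and unregistering it in onPause() is also exactly what makes the sensor sample only while that part of the app is active, which leads into the next problem.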

 

For the time being, the sensors run only when that particular part of the app is active and quit processing input when it isn’t. Aside from getting the sensors right, managing when they turn on and off is the biggest hurdle right now. Our goal is to start the sensors and have them stop sampling data after “X” amount of time (for example, fifteen hours from the start time). I attempted to solve this by first looking to see whether the SensorManager class had any methods available to developers looking to do just this. Unfortunately, that led nowhere useful, at least from what I could see. I tried searching the Internet with keywords I thought might help, but to no avail. Then I considered that what I really wanted was to keep track of time using a start time and an end time. I looked for something related to system clocks in the Android Developer Guides and, lo and behold, I stumbled on the AlarmManager class! I remembered from a systems course I took last semester that an alarm is a signal a process can ask the system to deliver when a time condition is met, much like an alarm clock signals a person to wake up in the morning. I started browsing the class and I feel like I found something I can use, but at the moment I’m still unsure which method would be most appropriate for our project and how to implement it. That’s definitely what I’ll be studying over the next week. Until then, laters, baby!
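If I’m reading the docs right, the idea would look something like the sketch below. Fair warning: the receiver class and the scheduleStop() helper are names I invented for illustration (and a real receiver would also need to be declared in the app’s manifest); nothing here is settled for our project yet.

    import android.app.AlarmManager;
    import android.app.PendingIntent;
    import android.content.BroadcastReceiver;
    import android.content.Context;
    import android.content.Intent;
    import android.os.SystemClock;

    // Hypothetical receiver that the system "wakes up" when the alarm fires.
    public class StopSensingReceiver extends BroadcastReceiver {

        @Override
        public void onReceive(Context context, Intent intent) {
            // This is where we'd unregister the SensorEventListener
            // (or stop whatever background component is doing the sampling).
        }

        // Ask AlarmManager to deliver the broadcast once, delayMillis from now.
        public static void scheduleStop(Context context, long delayMillis) {
            AlarmManager alarmManager =
                    (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);
            Intent intent = new Intent(context, StopSensingReceiver.class);
            PendingIntent pending = PendingIntent.getBroadcast(context, 0, intent, 0);
            // ELAPSED_REALTIME_WAKEUP counts time since boot and wakes the
            // device if it's asleep, so the "stop" alarm isn't missed.
            alarmManager.set(AlarmManager.ELAPSED_REALTIME_WAKEUP,
                    SystemClock.elapsedRealtime() + delayMillis, pending);
        }
    }

So, in theory, starting the sensors and then calling something like StopSensingReceiver.scheduleStop(this, 15L * 60 * 60 * 1000) would give us the “stop after fifteen hours” behavior. Whether set() is actually the right method here (versus setExact() or one of the others) is part of what I still need to figure out.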

 



2 thoughts on “(Week 3) Mobile Detection for the Onset of Dementia”

  1. Hello Susan! It sounds like your project is going well overall. I dabbled a bit in app development for a course I taught a year or so ago. Android Studio definitely has a bit of a learning curve, in addition to having to remember Java. Glad you were able to enjoy the D.C. trip even if you have been there many times.

    My daughter is also very interested in ASL. I have learned the alphabet and a few words, but am very rusty now. I think learning it would be great too.

    I am coming in on Monday to meet with Dr. Simmons and Dr. Rahman around noon. I hope to catch up with you in person then.

    Dr. Weikl

