Much more has happened this week than last week. For example, I got really hooked on java, in addition to getting hooked on Java. So much so that during my most recent trip to the local Walmart, a 15-pack of iced vanilla Frappuccinos managed to find its way into my cart and follow me home to my apartment. I blame it on having to read Greek. Literally. Some of the statistics presented in my article readings used Greek symbols in the math to explain their results, so I found myself wanting to crash after lunch, right when I usually went on an article-reading spree. I’ve immersed myself in Java so much that at trivia night at this local dining place called “Clementines,” when the answer to one of their questions was the programming language, I forgot to filter myself and basically gave the answer to every team in the room. Whoops… Reason number twenty-four why I should be last pick for us-versus-them-type knowledge games.
Another super awesome thing that happened this past week: the math and chemistry REU programs started. Hooray! More human life! Conversations! Except, you know, for the fact that chemistry is on the other side of our building and math works closer to the apartments than we do… and the fact that some of the students in one of the programs seem to primarily speak ASL. Although I know a little bit of ASL, I don’t think I know enough to really hold a conversation at this point. I’m kind of hoping to learn more, but they seem a little unapproachable and I rarely see anyone outside of the computer science (CS) program. Breaking the ice as an introvert is actually really hard; I don’t think y’all understand! I’ve always lamented not having the right resources to really learn ASL growing up, since having someone on the inside who’s willing to practice with me is the biggest hurdle. There’s only so much the Internet can teach before you need more… Unfortunately for me, I was never taught ASL, since my hearing loss has always been in the mild to moderate range and ‘The Adults’ didn’t deem it necessary. However, as I get older, I find myself in more environments in which ASL would have been my communication method of choice. So conversations, not so much, but given the chance and a person or two willing to let me practice with them, I really want to learn more ASL from the people who use it.
Continuing where I left off last week, I’ve been reading more journal articles to get a better sense of what worked and what didn’t in studies on dementia, stress, and activity sensing. I’ve learned that people with early-stage dementia are more likely to wander frequently than their sound-minded counterparts. Additionally, while repetitive questions are prominent in individuals with early-stage dementia, repetitive actions (for example, wiping a table) were less common. However, since repetitive actions increased in frequency as an individual’s dementia progressed, the absence of such behavior in a person suspected of having dementia doesn’t completely clear them of suspicion. I also read about a study in which the authors attempted to classify real-time sensor data into categories based on where the sensor was triggered and what occurred in the previous time window. Interestingly, rather than ignore unknown activity like most activity recognition classifiers do, they purposefully gave their classifier an “Other Activity” category to sort sensor activity into. In doing this, however, the authors heavily skewed their data set: their study subjects were told to go about their normal days, unscripted, and the classifier was told to sort everything it received. Since many things in an unscripted environment couldn’t be anticipated in training, about half of the activity windows were labeled “Other Activity.” All of these articles are definitely more in-depth than my extremely short summaries, but these are a few things that really stood out to me in relation to this project’s goals.
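The “Other Activity” idea is simple enough to sketch in Java: score each sensor window against the known activity classes and fall back to the catch-all class when nothing scores confidently enough. This is only an illustrative sketch under my own assumptions — the class names, scores, and threshold below are made up, not taken from the paper:

```java
import java.util.Map;

public class WindowClassifier {
    // Hypothetical confidence threshold; the study's actual model is more involved.
    private static final double THRESHOLD = 0.6;

    /** Returns the best-scoring known activity for one time window,
        or "Other Activity" when no class scores above the threshold. */
    static String classify(Map<String, Double> classScores) {
        String best = "Other Activity";
        double bestScore = THRESHOLD;
        for (Map.Entry<String, Double> e : classScores.entrySet()) {
            if (e.getValue() > bestScore) {
                best = e.getKey();
                bestScore = e.getValue();
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // A confident window maps to a known activity...
        System.out.println(classify(Map.of("Cooking", 0.85, "Sleeping", 0.10)));
        // ...while an ambiguous one lands in the catch-all class,
        // which is how roughly half the unscripted windows ended up "Other."
        System.out.println(classify(Map.of("Cooking", 0.35, "Sleeping", 0.30)));
    }
}
```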
Speaking of goals, ours has changed a bit since last week. We still intend to build a mobile phone application for individuals diagnosed with dementia. However, Dr. Rahman and I are now also considering a companion application for the individual’s caretaker. This week, I drew out a (very messy) flow sketch to illustrate the basic user interface for each application, along with what the user should do to signal communication between the two applications. I’m still unsure at what point mobile sensing should start in the patient’s application, but my inclination is to start it on initial app launch. I think, for both applications, I’ll have a sort of “setup” step, which will only appear on launch if no “about me” information is stored yet. I’d include a photo of my flow chart here, but it’s a seemingly haphazard myriad of boxes, arrows, and chicken scratch, likely unreadable to anyone but me.
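That first-run check is simple enough to sketch in plain Java. On Android the stored profile would live somewhere persistent like SharedPreferences; the `"aboutMe"` key and screen names below are placeholders of my own, not anything we’ve actually built yet:

```java
import java.util.Map;

public class StartupRouter {
    /** Decide which screen to show on launch: the one-time setup page
        if no "about me" profile is stored yet, otherwise the home page.
        (Placeholder key and screen names; on a real device the store
        would be persistent, e.g. SharedPreferences.) */
    static String firstScreen(Map<String, String> store) {
        String profile = store.get("aboutMe");
        if (profile == null || profile.isEmpty()) {
            return "SetupScreen";
        }
        return "HomeScreen";
    }

    public static void main(String[] args) {
        System.out.println(firstScreen(Map.of()));                 // SetupScreen
        System.out.println(firstScreen(Map.of("aboutMe", "Pat"))); // HomeScreen
    }
}
```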
I don’t know for sure yet how or where we’ll process the sensors’ raw data, but I’m thinking of storing the information in an SQL-type database on the backend. As for which sensors we’ll employ, I think we’re going to start with motion and location data collection, since those two seem like they’ll be the most influential in detecting phases of dementia relapse. I’ve begun setting up the primary application (the one for the dementia-afflicted individuals), getting some basic page changes and buttons in and making use of the department’s latest Android phone to test-run my app. I’m still struggling with some aspects of Android Studio, like inserting a table layout and filling its data fields, but I’m definitely getting more comfortable with it overall. Over the next week, I want to toy around with the motion sensors and figure out how to use them in the background.
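I haven’t committed to a processing pipeline, but as a rough sketch of the kind of preprocessing a motion stream needs before it lands in a database: collapse each three-axis accelerometer sample to a single magnitude, then summarize a fixed window of samples as one value (which could become one database row). All names and the sample values here are my own placeholders:

```java
public class MotionWindow {
    /** Magnitude of one 3-axis accelerometer sample. */
    static double magnitude(double x, double y, double z) {
        return Math.sqrt(x * x + y * y + z * z);
    }

    /** Mean magnitude over a window of samples — a simple summary
        feature that could be stored as one row per time window. */
    static double windowMean(double[][] samples) {
        double sum = 0;
        for (double[] s : samples) {
            sum += magnitude(s[0], s[1], s[2]);
        }
        return sum / samples.length;
    }

    public static void main(String[] args) {
        // Two fake samples near resting gravity (~9.81 m/s^2).
        double[][] window = { {0, 0, 9.81}, {0.3, 0.1, 9.7} };
        System.out.printf("mean magnitude: %.2f%n", windowMean(window));
    }
}
```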
I’ve also been introduced to the wonders of LaTeX (pronounced “LAH-TEHK”) via an online platform called “Overleaf.” Whether or not LaTeX is “wonderful” remains to be seen, but I’ll be learning how to use that while I write our project proposal, which is a new task that I’ve picked up this week. At the very least, I’m thankful for my high school dual-enrollment English class, since I don’t need to also learn how to use Zotero, which, by the way, is pretty much a bibliographer’s dream.
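For anyone curious what I’m getting myself into, a minimal Overleaf-ready skeleton looks something like this (the title and section names are placeholders, not our actual proposal):

```latex
\documentclass{article}
\title{Project Proposal}
\author{...}

\begin{document}
\maketitle

\section{Motivation}
% Prose goes here; LaTeX handles numbering and layout for you.

\section{Related Work}
% Citations pull from a .bib file (e.g. exported from Zotero)
% via \cite{someKey}, with \bibliography{references} at the end.

\end{document}
```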
So while I’ve been doing all of this over the past week, I’ve also mostly finished reviewing Java, and my goodness, you should see the number of Post-it flags in this textbook. I’m glad I put them in and color-coded them, though; it definitely makes quick referencing super easy. As for the upcoming week: I’ve already mentioned my goal of incorporating motion sensing into the primary application, I want to finish the project proposal, and I still need to summarize this week’s papers before I get more reading material next week. Since I’ve finished reading my current article assignments, going back through them to summarize and analyze them should be much easier. Until next week, so long, and thanks for all the fish!