In-Depth on Our Latest Paper: Who Benefits Most from Mindfulness-Based Stress Reduction (MBSR) Training?


If you’ve read our previous posts, you know that our recent work has focused on social comparison, physical activity, and women’s health. Ultimately, this work is intended to help us design treatment programs that will promote physical activity and improve other physical and emotional health outcomes. But we’re also interested in how factors such as gender and pre-treatment health experiences can help us target existing, effective treatments to the right people. Our most recent publication, in the Journal of Health Psychology, focuses on this topic: specifically, identifying pre-treatment characteristics that predict outcomes after the popular Mindfulness-Based Stress Reduction (MBSR) program.

What is MBSR?

Mindfulness, or non-judgmental awareness of the present moment, is positively associated with a number of health outcomes. MBSR is an 8-week program that teaches participants about the influence of mindfulness on stress and health, guides mindfulness meditation practice, and facilitates group discussion. The program also includes a full day of meditation (seven hours) following the sixth week of the program. Participants are expected to continue meditation outside of class, as well as to practice mindfulness during daily activities (such as eating and communication with others). Given the overlapping research interests of the Mindfulness, Stress & Health Lab and CHASE Lab, such as identifying ways to improve mind-body health, we teamed up on this paper with the goal of better understanding the characteristics that predict who benefits most from mindfulness training.

What did we do?

We conducted a secondary analysis on data from an existing project. Our goal was to examine whether baseline anxiety, sleep quality, or gender could predict change in two different outcomes, from before to after participating in MBSR. The first outcome was emotion regulation, which is how an individual controls (or regulates) their emotions. Two emotion regulation techniques we examined were emotion suppression (a way to manage your feelings by essentially muting them) and cognitive reappraisal (changing your thinking to change how you feel). The second outcome was physical symptoms of stress, such as headaches and fatigue. For this paper, we chose to include only the people who completed both the pre- and post-treatment questionnaires (203 participants).
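
If you’re curious what this kind of analysis can look like in code, here is a minimal sketch in Python. The data, variable names, and model below are invented for illustration only; they are not the paper’s code, dataset, or exact statistical approach.

```python
# A minimal, hypothetical sketch of the kind of model described above: predicting
# post-treatment emotion suppression from baseline anxiety, sleep quality, and gender,
# controlling for the pre-treatment score. All data and variable names are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 203  # number of completers in the paper

df = pd.DataFrame({
    "suppression_pre": rng.normal(4.0, 1.0, n),
    "anxiety_pre": rng.normal(10.0, 3.0, n),
    "sleep_quality_pre": rng.normal(8.0, 3.0, n),  # hypothetical sleep quality scale
    "gender": rng.choice(["man", "woman"], n),
})
# Invented post-treatment scores: a general drop in suppression, slightly larger for men
df["suppression_post"] = (
    df["suppression_pre"] - 0.5
    - 0.3 * (df["gender"] == "man")
    + rng.normal(0.0, 0.8, n)
)

# Baseline characteristics predict post-treatment scores, adjusting for pre-treatment scores
model = smf.ols(
    "suppression_post ~ suppression_pre + anxiety_pre + sleep_quality_pre + C(gender)",
    data=df,
).fit()
print(model.summary())
```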

What did we find?

We found improvements in both of our outcomes from pre- to post-treatment among both men and women, with men showing greater improvement in emotion suppression than women. In other words, when it came to emotion suppression, men benefited the most from the MBSR program by showing a bigger drop in suppressing their emotions from the beginning to the end of the program. In addition, people who started the program with higher levels of anxiety and worse sleep quality actually saw the most improvement in physical symptoms of stress. Those who started with higher anxiety also showed greater improvements in cognitive reappraisal. Overall, it appeared that men, individuals with high anxiety, and those with poor sleep quality ended up benefiting the most from the MBSR program.

What does this tell us?

Our findings suggest that evaluating baseline characteristics may be an important first step in identifying who can benefit the most from mindfulness training. This information may help clinicians refer people to MBSR who have a high likelihood of benefiting from it, and steer other people toward different treatments. 

What was it like to work on this paper?

Dr. Greeson and I started developing the idea for this paper back in 2018, when I graduated from Rowan. I took a health psychology class with Dr. Greeson as an undergraduate and loved the topics we learned, especially the introduction to mindfulness. During a meeting with Dr. Greeson, we discussed my interest in research and mindfulness, and that is where the idea for this paper began. Then, once I began working for Dr. Arigo in the CHASE Lab in 2018, we were able to identify a direction we could take the paper that overlapped with all of our interests, so Drs. Arigo, Greeson, and I decided to work together as co-authors.

Writing this paper as first author was intimidating, but also a huge learning experience for me. I could not have asked for better mentors to help guide me through this process, and both Drs. Arigo and Greeson have taught me so much when it comes to developing and taking the lead on a paper. I also appreciated the collaborations we had with the co-authors on this brief report (at Duke University and the University of Pittsburgh), and how everyone was able to contribute information based on their own expertise. Overall, we saw this paper evolve from an idea, to a conference presentation, and now to a brief report in the Journal of Health Psychology, which is an exciting accomplishment.

 – Megan Brown, CHASE Lab Research Coordinator

This project is a great example of cooperation between research groups to learn from an existing dataset. It’s not easy to manage a project with so many contributors, and Megan did a great job with guidance from us. I enjoyed the opportunity to work with and learn from the Mindfulness Lab and I hope that this is the start of a long-term partnership.

– Dr. Dani Arigo, CHASE Lab Director

As a professor, researcher, and mentor, you always want to support bright, eager students who show interest and initiative. Megan was one of the top students in my Health Psychology class her senior year, and I knew she had the aptitude for graduate school. Working on this paper together, in partnership with Dr. Arigo, Megan was able to grow as a scholar, to develop her scientific writing, and to integrate feedback from collaborators across Psychology and Medicine. Perfect preparation for a PhD program!*

– Dr. Jeff Greeson, Mindfulness Lab Director

Next Steps

For this paper, we focused on whether pre-treatment characteristics predict outcomes after a standard MBSR program. While there is plenty of evidence that mindfulness training can reduce stress and enhance mental health, far less is known about the impact of mindfulness training on objective measures of physical health (like obesity, blood pressure, or blood sugar), or whether mindfulness training benefits diverse groups of people. Future research is needed not only to identify factors that best predict outcomes of mindfulness training, but also to directly compare how helpful mindfulness programs are for different types of people, facing different types of stressors. New, adapted Mindfulness-Based Interventions (MBIs) aim to go beyond the general MBSR program and target specific symptoms or conditions like depression, chronic pain, and heart disease. This raises an important, ongoing question: what works best, for whom, and why? By studying predictors of outcomes – in addition to outcomes per se – we can better tailor existing mindfulness programs, increase access, and improve outcomes for as broad a group of people as possible.

*And Megan is joining the CHASE Lab as a clinical psychology Ph.D. student this fall! Like what we do? See here for more information about doctoral training with us.

@RowanCHASELab’s First Virtual Conference


On May 14th, UConn’s Center for mHealth and Social Media hosted a virtual version of its annual conference. This year’s theme was “Building an Evidence Base for Commercially Available Technology,” a topic that our group has some experience with (see here for an example). Sessions took place on the Center’s site and on Zoom, and several presenters incorporated interactive elements using Slido (polls, Q&A). Instead of poster sessions – and instead of making traditionally formatted posters available electronically – presenters were asked to create 1-minute video summaries of their studies.

Our group had three “posters” at the conference this year. Check out what our presenters had to say about the experience of conceptualizing and creating a short video – and watch their videos at the links below:

  • Kristen: Overall, I really enjoyed pushing myself to be creative in this process. I was excited to see that the video posters were intended to have language more suited for a lay audience. This language, coupled with the visually driven format, could be a great opportunity for dissemination to a wider audience and the general public. Looking back on my process creating the video poster, my goal was to take advantage of the expanded format options. I had seen some YouTube videos titled “Draw My Life,” in which individuals took out a whiteboard and markers and drew scenes illustrating key moments in their life. I really enjoyed the simplicity of this video format and wondered if anyone had ever created a “Draw My Research” video before. Thinking that it might be worth a try, I drew some preliminary sketches that were paired with a script and asked for feedback from my labmates. Once I knew that the sketches made sense with the accompanying lines of script, I practiced drawing the designs on a whiteboard while recording video. This allowed me to ensure that the designs could be drawn well enough and sped up to fit into the 1-minute timeframe. Next, I had to create a makeshift tripod using items in my house (as I don’t have video recording equipment) and record the video. Lastly, I downloaded the video so that I could fast-forward and trim, overlay audio, and use additional software to add captions. Though this was not my first experience with video-editing software, there was much I had to learn along the way and I enjoyed doing it.
    Arigo, D., Pasko, K.P., Brown, M.M., Vendetta, E., Travers, L., Gupta, A.A., Ainsworth, M.C., Symons Downs, D., & Smyth, J.M. (2020, May). Daily Social Influences on Physical Activity among Midlife Women with CVD Risk: An Ecological Momentary Assessment Study. Poster presented at the 2020 annual meeting of the UConn Center for mHealth and Social Media (virtual).

In addition to Megan, Kristen, and Cole, Laura and Dr. Dani Arigo also attended the conference. Here are some of our reflections on the experience as a whole:

  • Laura: Attending a conference virtually was definitely interesting. I enjoyed being able to interact with polls and questions throughout the presentations, and having the ability to pose questions during the Q&As at the end. I had been worried it was going to sound robotic during the Q&A, especially if questions were just answered from a screen, so the fact that they included a designated speaker/host was wonderful! The biggest difficulty I had was sitting for such a long period of time. I had to make sure I got up and walked around between presenters, either around the house or a quick break outside (I didn’t realize how much I appreciated walking to different rooms during an in-person conference until now!). Overall, it was a great experience and I believe they did a fantastic job!
  • Dani: This conference was a model for what all virtual professional meetings should be – extremely well-run by people who understand technology and how to use it to engage attendees remotely. I loved the interactions using Slido and the integration of Zoom interactive sessions when appropriate. I also attended all three post-conference workshops (Research Designs for Testing Commercially Available Technology, Introduction to Social Network Analysis, and How To Write an Effective Seed Grant) and benefitted from each one in a different way. Although there are advantages to face-to-face meetings that virtual formats can’t yet replicate, it’s friendly to the wallet AND the environment to avoid flying and driving to meet in person. I would definitely attend a virtual conference again, if it were organized like this. 
  • Cole: I enjoyed the virtual conference format. Although it was less interactive than a traditional in-person conference, I found the Slido Q&A and polling functions very useful. It also helped to have a Slack chat open with other CHASE lab members, so we could stay connected during the presentations. The ability to screenshot slides definitely beats hurried notetaking, too!
  • Megan: A virtual conference was new for me (as I’m sure it was for many people attending), so I wasn’t sure what to expect. While I think having been able to interact with people and presenters in person would have allowed for more opportunities to ask questions, I did enjoy this experience! I liked that I was able to tune in for so many great talks, which I know probably would have been hard to do in person. Even though the poster videos were brief, I thought it was neat seeing how other presenters used this opportunity to disseminate their research in creative ways!   
  • Kristen: I was surprised to enjoy the virtual conference as much as I did! Of course there were a few times that technology did not want to cooperate, and I did not get to experience one of my favorite parts of research conferences (speaking to others who are interested in similar research topics and generating new ideas), though the poll feature and general interactive nature of the conference still allowed for an exciting dialogue. Specifically, I enjoyed learning about recent evidence on the active ingredients of behavior change within apps.

Did you attend UConn or another virtual conference this year? Let us know your thoughts!

CHASE Lab Plays Work From Home Bingo


We’ve now been working from home for almost eight weeks, and although our tips to stay sane and productive have worked well, we’re still itching for some new ways to connect with each other and infuse some new energy into our work. A few weeks ago, lab member Bernard Kwiatek (junior Psychology major) suggested that we all play Work From Home Bingo, using a bingo card he made. 

We selected a date (Tuesday, 4/28) and everyone kept track on their bingo card. At our lab meeting that week (Thursday, 4/30), we discussed the activity as a group. Although Dr. Arigo was the only one to get BINGO, we had a lot of fun using a new way to track and share our work experiences with each other. Check out what our members had to say about it:

  • Megan: I really enjoyed playing this game, and it made me laugh realizing how many of the boxes I was able to relate to. I think it’s especially important right now to have lighthearted activities like this to do during the day, which may help take your mind off of other serious matters.  
  • Emily: It was fun reading through the board and I’d get so excited when I could mark off a square. It was a nice little break to take during work hours. 
  • Kristen: I tried to check off the board at the end of the day so that I didn’t influence my chances, and unfortunately, I did not get BINGO. But, it was still a fun activity to do during lab hours that made me smile, which goes a long way during these difficult times. 
  • Cole: This was the only time in my professional career that I had wished for technical difficulties to happen (so I could mark it on my bingo card). Sadly, my laptop performed perfectly that day. Despite not winning, it was an entertaining break from my usual work routine.
  • Laura: It was the perfect distraction throughout the day and served as a nice reminder to not take things so seriously. 
  • Bernard: Going through the day and trying to hit each spot was hectic, but in a good way. It was interesting to see which spots I normally hit, versus whether I could potentially fill the entire board.

Want to play along, or make your own bingo board? Visit Bingo Baker!

#HealthyFinalsWeek Tips


Finals week is stressful, and it may be even more stressful this semester as we take finals remotely. Below are some tips from Rowan CHASE Lab to help you stay healthy, manage stress, and finish the semester strong.

DO these things:

  • Make a plan for studying so that you can prioritize effectively. Take into consideration that some finals will be earlier in the week than others, and some will require you to devote more time.
  • Take breaks! Overworking is the best way to get burnt out, so plan a 15-minute break after every hour. At the end of each day, plan a slightly longer break as a reward for hard work. Breaks can include walks outside, coloring, crocheting, or any other activity that lets your brain rest for a few minutes.
  • Get extra sleep. Research shows that memories are consolidated and integrated during sleep, and fewer hours lead to worse academic performance. 
  • Try to recall what you have learned as you study – real-world examples, how to apply concepts in new situations, etc. This will be more useful than attempting to memorize all of the course material. Your professors care more about learning than memorization!
  • Create your own study guides. Research shows that generating your own study guide instead of using your professors’ allows you to engage with the material more.
  • Quiz yourself. Prompting yourself to recall the information in a manner similar to the exam will help you prepare.
  • Work with others virtually. Seek out students in your classes who are motivated and doing well – check class discussion boards and/or post a thread about virtual study sessions (Google Hangouts, Facetime, etc.). Quiz each other, support each other, pool resources. If you can explain challenging material effectively to someone who is not in your class, you can be confident that you know it well. Everyone wins.
  • Drink LOTS of water. Staying hydrated will reduce discomfort and distraction. (One of our team members goes without coffee for the whole week! He swears that he feels just as energized with water.) 
  • Plan meals ahead of time. Stock your room/apartment with healthy food, bring meals and snacks with you when you study, or have a plan for where to access healthy options near your study location. Without a plan, it’s easy to get stuck with junk food options that won’t give you the energy you need to power through and perform well. 
  • Spread out your unpacking over the week. If you moved off campus, unpacking offers a productive break from studying.
  • Take off from work. If you are still working during this time, and can afford to schedule a week off from your job, this will free you up to focus on studying and reduce your stress level. 
  • Plan a way to reward yourself for your hard work. An evening spent video chatting with friends, ordering a big item you’ve been waiting to splurge on, diving into that new show you’ve been waiting to start…

DON’T do these things:

  • Start a new series on Netflix this week. You will get addicted and not study. Use a new series as a reward at the end of the week!
  • Skip meals, or skip exercise, to study. Research shows that giving yourself a short break and engaging with social support and/or physical activity is best for performance. But make sure you’re doing it virtually, or from the safety of your home! 
  • Forget about professors’ office hours. Make a list of questions for review, and set up an online WebEx meeting with your professor. 
  • Overdo it on coffee and/or energy drinks. Energy drinks in particular will disrupt your sleep and result in worse performance. 
  • Study in your bed. WAY too tempting to lie down, which makes it easy to fall asleep and lose valuable study time. Make your bed and study sitting up (if you have to study in bed) to limit the temptation. (This will also make it easier to fall asleep in bed at night.) 
  • Plan on studying during breaks between exams. Something always comes up, and you may simply be too exhausted to study during short breaks. 
  • Neglect your personal hygiene. Taking care of yourself is just as important as acing your exams! 
  • Ask your professors the day before what is on the final – they will not be happy, and they’re not likely to be able to get back to you in time.
  • Leave studying for the night before a final. You’ll perform better if you review a little bit each day for a few days before. 
  • Drink alcohol the night before a final. Enough said. 

Share your tips with us at @RowanCHASELab on Twitter!

What We’re Working On, Remotely


Like most academic research groups, we’re working from home these days to prevent the spread of COVID-19. We’re really lucky to be able to continue much of our work from home, and this transition has come with some unique opportunities and challenges. In this post, we share some of what we’re working on lately, and how we’re coping with being away from campus.

Ongoing Research and Response to COVID-19

Our primary ongoing research study, Project WHADE, requires that we meet with participants face-to-face (to take measurements, calibrate physical activity monitors, and get the monitors back when participation is over). For women who were actively participating when Rowan’s campus closed, we were able to set up a mail-in system to get the monitor back, and we conducted exit interviews via phone. But we had to pause recruitment and enrollment of new participants, as the patterns we’re studying are likely to look really different now than they did before March. We hope to start up again over the summer.

Yet, we see this as a unique opportunity to understand more about our patterns of interest. We think they’ve changed, but we won’t know how much or in what ways unless we re-assess them. We’re in the process of inviting previous participants to enroll in “Part 2,” which re-uses remote survey technology to capture daily experiences. We won’t have activity monitor data, but we’re hoping to learn as much as we can about how COVID-19 precautions have affected our participants’ daily lives. Stay tuned for more about this new venture!

In the meantime, we’re doing tons of behind-the-scenes work: 

  • Searching and summarizing existing literature on topics of interest
  • Managing, coding, and analyzing existing data
  • Preparing virtual presentations for the UConn Center for mHealth and Social Media Conference (May 14-15, 2020)
  • Drafting professional articles to describe our recent findings 
  • Doing our best to stay healthy and sane

Remote Teamwork

Rowan uses the WebEx platform for virtual meetings, and we’re using these to stay connected. We still have our weekly lab meeting and regular individual meetings with Dr. Arigo, weekly or as needed. For times when we want some company while working on projects, we’ll set up WebEx meetings, allowing us to virtually work alongside each other.

Slack is another tool that we use a lot. It’s a chat platform that allows for communication between individuals and groups, and you can create “channels” for specific topics. Slack gives us the opportunity to ask each other questions and receive answers quickly, and to create various channels where members can share updates about life and work, as well as anything that might help us all stay motivated and upbeat. 

What We’re Reading

For multiple projects related to health among midlife and older adults, including Project WHADE and several lines of inquiry from RowanSOM’s ORANJ BOWL study, we’re reading about physical activity, weight change, pain experiences, and social support in this population.

How We’re Staying Sane

As clinical health psychology/behavioral medicine professionals (in training), we’re trying to practice what we preach to get us through this difficult time. Our most effective methods so far:

  • Kristen: Working on remote tasks outside when the weather allows, getting in a Facebook Live workout whenever I can, virtual game nights with friends, and spending time with/helping out family while I’m temporarily back in Pennsylvania. 
  • Megan: FaceTiming with friends and family, online workouts (3-5 times a week), teaching myself yoga, and starting a new show on Netflix (Outlander).
  • Bernard: Talking and playing videogames with friends online, trying out new things like baking, and reorganizing my entire Spotify playlist and finding new types of music to enjoy.
  • Dr. Arigo: Running outside 3-4 times per week (and walking other days), virtual yoga classes, spending time with my cats, re-watching every season of The Great British Baking Show, and reading for fun (when I’m up to it). (Dani’s personal reading list: The Broken Earth Trilogy by N.K. Jemisin, The Road to Little Dribbling by Bill Bryson.)
  • Laura: Beginning a morning mindfulness practice, creating a family “Must See” movie list (randomly choosing a title from a jar each week), talking with friends and family either on the phone or FaceTime, getting outside between work tasks.
  • Emily: Working out every morning (even if it’s just a quick, 20-minute workout), going for walks and spending time with my 3-month-old, watching new shows at night (recently finished Money Heist and Ozark on Netflix).

Let us know how you’re staying in touch and staying sane these days – we’re always looking for tips!

Meet @RowanCHASELab: Interview with Undergraduate RA Bernard Kwiatek


Bernard is a junior at Rowan who has worked with the CHASE team since Fall 2019. He was interviewed by postdoctoral fellow Cole Ainsworth.

RowanCHASE Lab: Let’s start off with the basics! Tell us about your undergraduate experience and how you were introduced to psychology research.

BK: Coming into college, I had no idea undergrads could help with research like this. It was only at the end of my second year that my advisor told me that this was a potential opportunity. 

RowanCHASE Lab: What initially got you excited to work in the CHASE lab as a research assistant?

BK: I was initially excited to just experience research. I had no idea what to expect and really wanted to try it out and see if it was something that I would enjoy or not.

RowanCHASE Lab: What are some valuable skills you have learned while working in the CHASE lab?

BK: I’ve learned how to effectively search for and dissect scientific papers in order to cite or learn from them, which was something that took me forever before I started working with a research team. I’m also pretty good at coding data now – I can do it quickly with very few mistakes, which is useful for a lot of other work I might do.

RowanCHASE Lab: What is some advice you would give other students at Rowan interested in pursuing a research assistant position?

BK: Get started as early as possible, even if you don’t think you’ll be qualified. The earlier you start, the more time you have to train and be more effective in future years. Think of it like reserving your spot for the future.

RowanCHASE Lab: Lastly, what are your plans after you graduate and how will working in the CHASE lab support your future endeavors?

BK: I plan to attend graduate school and continue on with research after that. The CHASE Lab plays a big part in helping me reach that goal because it has already provided me with contacts, experience, and advice from coworkers on how to look appealing to graduate schools all over.

Close-Up on Our Latest Paper – Social Comparison Features in Physical Activity Apps: Scoping Meta-Review


In a few of our recent posts, we (re)introduced you to the concept of social comparison and described our efforts to understand how it influences health and health behavior. CHASE lab’s newest paper is an extension of this previous work, focused on the potential to use social comparisons in physical activity interventions. This systematic scoping review of existing review papers is now published in Journal of Medical Internet Research.

A number of research studies show that social comparison can prompt people to be physically active. For example, when we see other people like us being more active than we are, this can motivate us to keep up with or do better than them, and that motivation can lead to activity engagement. We might also be motivated to stay ahead of people who we see as less active than we are. Evidence showing that social comparison can motivate physical activity has led researchers and app developers to include features such as leaderboards and challenges (competitions). These are included to prompt users to make comparisons, as comparisons should lead to increases in activity.

But people who study the effects of social comparisons understand that comparisons are not always motivating:

  • Seeing someone doing better than we are can be discouraging – it shows us that we’re not achieving as much as we could and that we’re being outperformed by others
  • Seeing someone doing worse than we are can show us the worst-case scenario – this can activate anxiety or a sense that effort is pointless

It’s not clear whether satisfaction, anxiety, hope, frustration, or some combination of these experiences is the best immediate consequence of comparison, because any of these experiences could motivate someone to increase their physical activity. And most importantly, the “optimal” consequence of a comparison can differ between people, and within the same person over time. (For more details about these ideas, see Dr. Arigo’s 2018 post for UCL’s Digi-Hub and her 2018 publication with Dr. Jerry Suls in mHealth.) So it’s pretty likely that just giving all users the same physical activity-based social comparison opportunities isn’t going to work equally well for all of them. This means that personalizing the social comparison features of apps might work better than what we’re currently doing.

What Did We Do?

One of our overarching research goals is to determine how best to harness the power of social comparison and other social processes to promote healthy behavior. For this project, which spanned more than a year of work, CHASE Lab teamed up with Dr. Jerry Suls, a longtime colleague and expert in social comparison processes and health. 

Because social comparison is a complicated process, we wanted to understand how apps currently prompt comparison. And because researchers have already published more than 100 reviews (or overviews/summaries) of physical activity app features and related topics, we took a step back to look at what’s already been done. We summarized how other researchers have defined, classified, and attempted to personalize social comparison features of physical activity apps, and compared these to evidence of attempts to engage or personalize other processes (such as goal-setting or feedback).

To do this, we began by developing inclusion criteria. Existing publications were eligible if they:

  1. Were available in English
  2. Were published on or before May 31, 2019
  3. Conducted a systematic or narrative review, or meta-analysis
  4. Reviewed the features of commercially available smartphone apps or included formal intervention programs delivered via smartphone apps 
  5. Used increasing physical activity or reducing sedentary time as a key behavioral outcome. 

We then searched publication databases such as PubMed using specific key terms, and pulled in any publications related to using smartphone apps for physical activity. Our initial search returned 3,743 articles. After removing duplicates and reviewing the remaining 1,496 publications, we were left with 26 reviews that met our inclusion criteria. Co-authors Megan Brown and Kristen Pasko then went through each review and extracted specific data points, such as whether the reviews included social comparison as a category, what they used as their definition of social comparison, and which features they classified as prompting comparison processes.

What Was It Like to Work on This Project?

“This was my first time being a part of a systematic review project, and this experience has made me so much more appreciative of the work and time that goes into a paper like this. At first it was intimidating knowing we would have to code so many publications, but having a team that encouraged communication and questions made the process much easier. I also found it rewarding to be a part of the extraction process for the final 26 reviews, where we were able to gather all of this valuable information and answer some very important questions with it. I’m looking forward to seeing how our review contributes to future research aiming to use social comparison in physical activity apps.”

— Megan Brown, CHASE Lab Research Coordinator

“I’m grateful that Dr. Arigo invited me to assist with this project. By and large, when health psychologists have studied social comparison or tested a comparison intervention, there has been little recognition or appreciation of the nuances associated with comparison. It has been treated as a concept that can just be taken off the shelf. This scoping review confirms that impression and leads the way to testing social comparison interventions with more attention to the factors influencing comparison choice and outcomes. The physical activity apps context is really an excellent one to examine these issues. A very rewarding collaboration for me!”

— Dr. Jerry Suls, Northwell Health

“This has been one of my passion projects for a long time – we even presented an early version of it at a conference in 2017! It went through several iterations and updates, and it seemed that there always was more to do before we had a final product. The author team did a great job of staying committed to the work, and we really benefited from having Dr. Suls’s expertise. He and I have worked together for about 10 years on understanding social comparisons among adults with chronic illness, but social comparison features of apps were new to him. It was fun to be able to introduce him to this new area. The final version is something I’m really proud of. It ties together several lines of our work and paves the way for our upcoming projects.”

— Dr. Dani Arigo, CHASE Lab Director

What Did We Find?

Of the dozens of reviews we found, 26 met our criteria, and 8 of those included social comparison as a process underlying various app features. Across these 8 reviews, researchers used different definitions of social comparison and classified different features as using vs. not using comparison:

  • Definitions: some authors counted only features that allowed comparisons between users, rather than comparisons to experts like fitness instructors (this was called “modeling”); others allowed comparisons with anyone
  • Features: some authors counted only direct exposures to others’ data in a ranked format (leaderboards or challenges), whereas others counted any social networking (where users could share progress in other forms, such as via message boards); some were even more restrictive and counted challenges as “gamification” rather than comparison

Social comparison was described just as often as social networking (i.e., using message boards), but less often than behavioral modeling (i.e., providing examples of behavior engagement to encourage others to engage). And although we found evidence of personalizing features such as goal-setting and feedback, we found no evidence that (the potential for) personalization had been addressed with respect to social comparison features.

What Does This Mean?

Research is inconsistent about what constitutes social comparison in physical activity apps. This makes it difficult to draw conclusions about the utility or benefit of social comparison processes in these apps, or how to improve these features to make apps more effective. Further, existing work shows that people respond to social comparison differently (from each other and from themselves over time), but we found no evidence that physical activity apps have taken these differences into account. Together, this means that there is a huge opportunity to better understand how social comparison processes can be used to promote physical activity and other healthy behaviors – which is what CHASE Lab will continue to work on!

Close-Up on Our Newest Paper: Accelerometer Cut Point Methods for Midlife Women with Cardiovascular Risk Markers


Our research team takes a specific interest in women who are between the ages of 40 and 60, a period often called “midlife.” Women in this age range have elevated age-related risk for cardiovascular disease (CVD), are beginning menopause, and are experiencing health conditions such as type 2 diabetes and high cholesterol – all of which independently increase CVD risk. Therefore, midlife women have a lot to gain from physical activity, as it can protect against CVD even when other risk factors are present. So health professionals have spent a good bit of effort on promoting physical activity in this group. A focus has been on getting women to meet U.S. Department of Health and Human Services recommendations for moderate-to-vigorous physical activity (MVPA), or activity at an intensity that gets the heart rate up.

If you’re someone who tries to follow public health recommendations for physical activity (or you do research in the area of physical activity), you may be aware that recommendations changed last year. Specifically, the U.S. Department of Health and Human Services changed the way it defines MVPA. For several years prior to 2019, guidelines indicated that MVPA should happen in “bouts” (or episodes) sustained for at least 10 minutes at a time, and that adults should get 150 minutes of this kind of activity per week. The most recent report has removed the requirement that MVPA happen in 10-minute bouts, indicating that all MVPA is helpful for accruing health benefits. Although this is good news, as it means that shorter bouts of MVPA now count toward the 150-minute total, it raises important questions about population-level activity engagement. For example, most U.S. adults fail to meet the old guidelines; is that true now that shorter bouts count?

To make matters even more complicated, measurement of physical activity engagement isn’t entirely consistent across research studies. There are several methods for calculating whether activity reaches the threshold to be considered MVPA, and it’s not clear whether these methods give the same answers about how much time midlife women spend in MVPA. In other populations (such as among children and pregnant women), different methods give wildly different answers about how much MVPA participants get – differences of up to 100 minutes.

In our new publication (currently in press at Menopause), we took a closer look at two questions about midlife women’s MVPA:

(1)  How different are estimates of MVPA between considering only 10-minute bouts and considering all minutes?

(2)  How different are estimates of MVPA (bouted and all minutes) between different calculation methods?

What did we do?

We looked at four popular calculation (or “cut point”) methods for MVPA: Freedson et al. (1998), Swartz et al. (2000), Matthews et al. (2008), and Troiano et al. (2008) in two separate studies. The first was an observation-only study conducted by our CHASE team at The University of Scranton (before we moved to Rowan University in 2018), and the second was part of a weight loss clinical trial conducted by our collaborators at Drexel University’s WELL Center. This two-study approach allowed us to replicate our initial findings in a separate sample and confirm that findings were consistent across contexts.
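
If you’d like a concrete sense of how much these choices matter, here is a rough sketch in Python (written for this post, not the code used in the paper). The counts-per-minute thresholds are the values commonly associated with each method in the literature (please verify them against the original papers), and the bout rule is simplified to 10 or more consecutive minutes, so treat the numbers it produces as illustrative only.

```python
# Rough illustration (not the paper's code) of how cut point choice and bout rules
# change MVPA estimates from minute-level accelerometer counts. Thresholds are the
# counts-per-minute values commonly cited for each method; check the original papers
# before relying on them. The bout rule is simplified to "10+ consecutive minutes";
# published bout definitions often allow brief interruptions.
import numpy as np

CUT_POINTS = {  # counts/min at or above which a minute is classified as MVPA
    "Freedson et al. (1998)": 1952,
    "Swartz et al. (2000)": 574,
    "Matthews et al. (2008)": 760,
    "Troiano et al. (2008)": 2020,
}

def mvpa_minutes(counts_per_min, threshold, bout_length=None):
    """Total MVPA minutes; if bout_length is given, count only runs at least that long."""
    is_mvpa = np.asarray(counts_per_min) >= threshold
    if bout_length is None:
        return int(is_mvpa.sum())          # non-bouted: every qualifying minute counts
    total, run = 0, 0
    for minute_is_mvpa in is_mvpa:
        run = run + 1 if minute_is_mvpa else 0
        if run == bout_length:             # run just reached the minimum bout length
            total += bout_length
        elif run > bout_length:            # keep extending an already-counted bout
            total += 1
    return total

# One hypothetical day (~14 waking hours) of minute-level counts, for demonstration only
rng = np.random.default_rng(1)
day = rng.gamma(shape=1.2, scale=600.0, size=840)

for method, cut_point in CUT_POINTS.items():
    print(f"{method}: total = {mvpa_minutes(day, cut_point)} min, "
          f"10-min bouts = {mvpa_minutes(day, cut_point, bout_length=10)} min")
```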

What did we find?

In both studies, we met with midlife women at our research center for brief interviews and trained them to use a research-grade physical activity monitor, which they wore during waking hours for the following 7 days. Both studies showed that (1) using non-bouted (total) minutes of MVPA resulted in significantly more minutes than using 10-minute bouts only (across calculation methods), and (2) calculation methods meaningfully differed in the number of MVPA minutes they estimated (across non-bouted and bouted MVPA). Additionally, two of the methods (Freedson et al. and Troiano et al.) showed that midlife women did not meet MVPA recommendations using either bouted or non-bouted minutes, while the other two methods (Matthews et al. and Swartz et al.) showed that midlife women met or exceeded MVPA recommendations if non-bouted minutes were considered.

What does this tell us?

Overall, our series of studies seems to be the first of its kind to focus on differences between cut point methods for physical activity among midlife women with elevated CVD risk, and to compare MVPA bouts with total (non-bouted) minutes. Findings suggest that different cut point methods provide different answers, and researchers should keep in mind the respective strengths and weaknesses of each method. This work is not only timely considering recent changes in physical activity recommendations, but also necessary for understanding how to estimate MVPA toward the goal of reducing CVD risk in midlife women.

What was it like to work on this study?

“It is amazing to think about how far the lab has come with various iterations of this [observational] study. When it first started, Dr. Arigo and I were at The University of Scranton running a pilot for our WHADE project, which is now in its full form. At that time, we were just beginning to learn the ins and outs of recruiting through primary care. I still remember being excited at the thought of getting any experience in this setting. This was my first research experience recruiting outside of the college population. It was thrilling to be recruiting people out in the community, trying to meet them where they were.”

– Kristen Pasko, CHASE Lab Member

“Collaborating with Dr. Arigo and her team at Rowan University was an incredible experience. I processed some of the accelerometers from Drexel University that were used as part of this larger study. Working on this project allowed me to see the research process through from start to finish, from assisting with analyzing the raw data to the writing of the manuscript. Before this project, I had never worked on research specifically relevant to the question of women’s health and physical activity. It was a pleasure to work with Dr. Arigo and her students to answer such an important research question that has clear clinical implications for how women are advised to engage in physical activity.”

– Savannah Roberts, former research coordinator at Drexel’s WELL Center (current Ph.D. student at the University of Pittsburgh)

“This was a pretty large project that involved a number of team members, for two different studies, across three different universities (including Rowan, Drexel, and Penn State). So it took a lot of open communication and teamwork to bring the project together and communicate what we found. Our group was fantastic and stayed focused on learning what we could from the project. It’s been fun and rewarding to do this work and see it published in a journal that focuses on women’s health.”

– Dr. Dani Arigo, CHASE Lab director

Next Steps

If you follow our posts, you’ll remember that we recently summarized our review of studies that assess social comparison using within-person methods – those that capture comparisons repeatedly for the same person over days or weeks. That review and the physical activity study described in this post were designed to help us make informed decisions about how to estimate midlife women’s physical activity in our women’s health study, which is running now. The goal of this work is to understand the circumstances that contribute to changes in midlife women’s physical activity from day to day, and ultimately, to design better activity interventions for midlife women. Stay tuned as we work toward these goals!

Inside our Newest Paper – Methods to Assess Social Comparison Processes within Persons in Daily Life: A Scoping Review


Have you ever had the experience of comparing yourself to others? For example, learning that someone in your work unit got a raise (typically a positive for them) or got written up (typically a negative for them), and thinking about your situation in comparison to theirs? This experience, called social comparison, is extremely common. It can happen in response to conversations with close others and information we get about other people through social media or TV, or by simply imagining someone in a particular situation. More than sixty years of research on social comparison suggests that this process can make us feel good or bad and that it can affect our self-perceptions and behaviors (for better or worse). 

Our team is particularly interested in understanding how social comparisons can affect health behaviors such as eating and engaging in physical activity. We’ve done a lot of work in this area (see our list of publications) and we have multiple ongoing studies devoted to understanding particular aspects of these associations. A consistent challenge for this research is selecting which tools and methods to use to assess comparison, as these decisions can affect the answers we get. For instance, asking someone how often they make comparisons or how interested they are in making comparisons requires people to consider their thoughts and behaviors over long stretches of time (we’re not good at doing this accurately!) and over different situations (which could affect our responses – yes in some situations, no in others).

Recently, we’ve been asking questions about the best way to assess social comparison – as in, how to get the most accurate information about how and when comparisons happen and how people respond. To avoid the problems associated with a person indicating how much they make comparisons overall (called the “between person” method), we’ve considered asking the same person to report their comparisons as they happen in daily life, repeating the same assessment for each person multiple times (called the “within-person” method). 

Repeated, within-person assessment should allow us to map how often comparisons happen and any changes in how a given person makes or responds to comparisons with greater accuracy. But because this approach is relatively new, there hasn’t been much work to provide guidance on how to conduct within-person assessments of social comparison or how to report findings from these studies. Our group wanted to meet these needs by giving an overview of existing social comparison studies that use within-person methods and identifying next steps for this type of research. 
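
If the between-person vs. within-person distinction feels abstract, here is a tiny, hypothetical example (not data from any of the studies we reviewed) showing how repeated within-person records differ from a single per-person summary.

```python
# A small, hypothetical illustration of "between-person" vs. "within-person" views of
# social comparison data. These are invented records, not data from any reviewed study.
import pandas as pd

# Within-person (repeated) data: several prompts per day for each participant
ema = pd.DataFrame({
    "person":          ["A", "A", "A", "A", "B", "B", "B", "B"],
    "day":             [1,   1,   2,   2,   1,   1,   2,   2],
    "made_comparison": [1,   0,   1,   1,   0,   0,   1,   0],  # report at each prompt
})

# Between-person summary: one number per person, roughly what a single
# "how often do you compare yourself to others?" questionnaire item approximates
between_person = ema.groupby("person")["made_comparison"].mean()

# Within-person view: how each person's reports fluctuate around their own average
ema["person_mean"] = ema.groupby("person")["made_comparison"].transform("mean")
ema["within_person_deviation"] = ema["made_comparison"] - ema["person_mean"]

print(between_person)
print(ema)
```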

To do this, CHASE lab teamed up with members of the ReMind and SHADE labs at Penn State University for a large-scale project. We conducted a systematic scoping review (now published in Frontiers in Psychology), which involves a process of carefully searching for and identifying existing research on a topic and summarizing what this research can tell us, using pre-identified research questions and selection criteria. (Our review questions and criteria were preregistered with the Open Science Framework.) We searched the databases PubMed, PsycInfo, and CINAHL for studies of social comparison that used within-person assessment methods. This resulted in 621 potential articles that we could include, which we evaluated with respect to our inclusion criteria. In the end, we included and reviewed 36 studies; we coded these studies on a range of variables, including how participants recorded their comparisons (via paper vs. technology such as smartphones), how often they were asked to record comparisons (how many times per day), and what other experiences were assessed.

What was it like to work on this project?

“Social comparison is my primary research interest, and a key training goal of my current K23 grant is to learn more about using within-person methods to study it. So working with Dr. Mogle to coordinate a multi-lab review of what we know in this area was a dream come true. Our teams worked really well together, as usual. Most papers don’t have submission deadlines, but this one did [as part of a Frontiers in Psychology special issue], and everyone stepped up to overcome some logistical setbacks and keep us on track to finish and submit on time. It’s exciting to see the final product after months of intense focus to get us here.”

— Dr. Dani Arigo, CHASE Lab, Rowan University

“Working on this review opened my eyes to the extent of differences between study methods, even when the studies have similar goals. I enjoyed working collaboratively with colleagues from another institution, and was impressed by how easy the collaboration was. This was mostly due to our lead author’s consistent and clear communication with the rest of the team. This was the first paper I have had the opportunity to work on, and the process was extremely rewarding. I’m excited for future work!”

— Laura Travers, M.S., second-year Ph.D. student

“Conducting this review with the CHASE lab was fascinating! My area of expertise is in methodology, and I didn’t know as much about social comparison measurement. There are so many ways researchers are trying to capture this experience in the real world, which all get at different aspects of the experience. We worked together to create a method for coding and summarizing the differences across studies so we could synthesize and make sense of the scope of this literature in the paper. I enjoyed working with the team and together we generated an exciting product; we’re hoping that our conclusions and recommendations will be helpful to other researchers.”

— Dr. Jacquie Mogle, ReMind Lab, Penn State University

What did we find?

  • Most studies assessed only appearance comparisons and included only college students or young women.
  • The majority of studies collected information in response to signals (rather than initiated by participants). 
  • Studies meaningfully differed in the number of assessments of comparison per day, the number of days of assessment, how participants recorded comparisons, and even how “comparison” was defined.

From this and other information we summarized, some of our recommendations for future work are:

  • Conducting more work to understand social comparisons that occur in understudied groups, such as men, older adults, people with chronic illnesses, and people who attempt to change their behavior
  • Spelling out several aspects of the method more clearly in future publications, including: the rationale for selecting the number of assessment days, the total number of assessments, the timing of assessments, item wording, the specific definition of comparison, and the instructions provided to participants (regarding what “counts” as a comparison and how to recognize one)
  • Including estimates of within-person fluctuation in the number of, and responses to, comparisons during the study

Next steps

We combined what we learned from this review with findings from two studies of midlife women’s physical activity.* This work contributed to the design of a within-person study on associations between social comparisons (and other experiences) and physical activity among midlife women (currently underway), which will help us better understand how to deliver physical activity interventions in this population. Stay tuned for updates as we move this work forward!

*See our upcoming post on this paper!

MEET @ROWANCHASELAB: Interview with Kristen Pasko


Kristen Pasko is a second-year student in Rowan’s clinical psychology Ph.D. program, and she’s been with CHASE Lab since 2014. She was interviewed by first-year Ph.D. student and lab member Laura Travers.

@RowanCHASELab: Let’s start off with a broad question. When did you first know you wanted to focus on the field of psychology?

KP: I think my first recollection of wanting to be in this field came at the end of high school. At that point, I had seen the impact that mental illness could have on others around me – how it could truly prevent individuals from doing what they wanted in life. I decided that at the very least, I wanted to be a mental health advocate.

@RowanCHASELab: What made you choose to work with the CHASE lab and to have Dr. Arigo as your mentor?

KP: I actually have quite a bit of history with CHASE. Dr. Arigo has been my mentor since my sophomore year at The University of Scranton. I began working with her as a research assistant, then as her research coordinator after graduation. At that point, I fell in love with health psychology through everything I learned from her in that lab. When it came time to apply for graduate school, she brought my attention to Rowan University’s Ph.D. in Clinical Psychology for its emphasis on health psychology and integrated healthcare. Little did I know that she would get a job offer at Rowan around the same time. I was fortunate enough to interview with Dr. Arigo and had the realization that many of my research interests were born out of my training with her, and that she was the best fit for a grad mentor.

@RowanCHASELab: Could you tell us about your research experience so far? What you think has helped you be a good researcher?

KP: One of my favorite things about research is that it feeds my never-ending curiosity. I’ve always thought about the process like completing a puzzle. There are so many pieces missing and it takes a great amount of focus, dedication, and curiosity to collect pieces to understand the bigger picture. Once one piece is found, then you have a clue about the next piece. Once one research question is answered, you’re left with the next question.

Focusing on enjoying this process has been important. Much of the time, research can feel drawn out, and setting little goals for yourself can provide that small reward. One of the biggest takeaways for my personal success has been to force myself to engage in constant critical thinking. As a researcher, there has to be a reason why you’re studying that specific topic, with that specific method to answer that specific research question.

@RowanCHASELab: What are your research interests? How have they changed from your undergraduate career to now?

KP: That’s actually a bit of a loaded question! Much of my initial work focused on health behavior and social influence broadly. Towards the end of my undergraduate career and time as a research coordinator, I realized that my true interest lies in health behavior within the context of chronic illness and how social influences through family, friends, and other individuals with illness directly and indirectly help and hinder healthy behavior.

@RowanCHASELab: What professional goals do you have for yourself this year?

KP: For the remainder of my second year in the Ph.D. program, I hope to push myself to even higher levels of quality and efficiency in my work. I also want to remember to enjoy myself.

@RowanCHASELab: How has what you’ve learned so far affected your plans for after graduate school?

KP: I have many answers for this. However, one of the biggest things is giving myself time to meet my long-term goals. I have so many research ideas that I am itching to pursue now, but a career in clinical psychology means lifelong learning. Trying to do everything at once usually means that you don’t do anything well. So I want to pace myself and see how each step informs the next.